Sample records for model evaluation tool

  1. THE ATMOSPHERIC MODEL EVALUATION TOOL

    EPA Science Inventory

    This poster describes a model evaluation tool that is currently being developed and applied for meteorological and air quality model evaluation. The poster outlines the framework and provides examples of statistical evaluations that can be performed with the model evaluation tool...

  2. THE ATMOSPHERIC MODEL EVALUATION TOOL (AMET); AIR QUALITY MODULE

    EPA Science Inventory

    This presentation reviews the development of the Atmospheric Model Evaluation Tool (AMET) air quality module. AMET is being developed to aid in model evaluation. This presentation focuses on the air quality evaluation portion of AMET. Presented are examples of the...

  3. Agent-based modeling as a tool for program design and evaluation.

    PubMed

    Lawlor, Jennifer A; McGirr, Sara

    2017-12-01

    Recently, systems thinking and systems science approaches have gained popularity in the field of evaluation; however, there has been relatively little exploration of how evaluators could use quantitative tools to assist in the implementation of systems approaches therein. The purpose of this paper is to explore potential uses of one such quantitative tool, agent-based modeling, in evaluation practice. To this end, we define agent-based modeling and offer potential uses for it in typical evaluation activities, including engaging stakeholders, selecting an intervention, modeling program theory, setting performance targets, and interpreting evaluation results. We provide demonstrative examples from published agent-based modeling efforts both inside and outside the field of evaluation for each of the evaluative activities discussed. We further describe potential pitfalls of this tool and offer cautions for evaluators who may choose to implement it in their practice. Finally, the article concludes with a discussion of the future of agent-based modeling in evaluation practice and a call for more formal exploration of this tool as well as other approaches to simulation modeling in the field. Copyright © 2017 Elsevier Ltd. All rights reserved.
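
    The evaluation activities above can be illustrated with a minimal agent-based model. The sketch below is purely illustrative (all parameters, the peer-network structure, and the adoption rule are invented for demonstration, not drawn from the article): agents adopt a program behavior once enough of their peers have.

```python
import random

def run_adoption_model(n_agents=200, n_seeds=10, threshold=0.25,
                       n_neighbors=6, steps=30, seed=42):
    """Toy agent-based model: each agent adopts a program behavior once the
    share of adopters among its randomly assigned peers reaches a threshold."""
    rng = random.Random(seed)
    # each agent gets a fixed random set of peers
    peers = [rng.sample([j for j in range(n_agents) if j != i], n_neighbors)
             for i in range(n_agents)]
    adopted = [False] * n_agents
    for i in rng.sample(range(n_agents), n_seeds):   # seed initial adopters
        adopted[i] = True
    history = [sum(adopted)]
    for _ in range(steps):
        snapshot = adopted[:]                        # synchronous update
        for i in range(n_agents):
            if not snapshot[i]:
                share = sum(snapshot[j] for j in peers[i]) / n_neighbors
                if share >= threshold:
                    adopted[i] = True
        history.append(sum(adopted))
    return history

history = run_adoption_model()
print(history[0], history[-1])
```

    An evaluator could vary `threshold` or `n_seeds` to explore how sensitive program uptake is to peer influence before committing to an intervention design.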

  4. Bayesian cross-validation for model evaluation and selection, with application to the North American Breeding Bird Survey

    USGS Publications Warehouse

    Link, William; Sauer, John R.

    2016-01-01

    The analysis of ecological data has changed in two important ways over the last 15 years. The development and easy availability of Bayesian computational methods has allowed and encouraged the fitting of complex hierarchical models. At the same time, there has been increasing emphasis on acknowledging and accounting for model uncertainty. Unfortunately, the ability to fit complex models has outstripped the development of tools for model selection and model evaluation: familiar model selection tools such as Akaike's information criterion and the deviance information criterion are widely known to be inadequate for hierarchical models. In addition, little attention has been paid to the evaluation of model adequacy in the context of hierarchical modeling, i.e., to the evaluation of fit for a single model. In this paper, we describe Bayesian cross-validation, which provides tools for model selection and evaluation. We describe the Bayesian predictive information criterion (BPIC) and a Bayesian approximation to the BPIC known as the Watanabe-Akaike information criterion. We illustrate the use of these tools for model selection, and the use of Bayesian cross-validation as a tool for model evaluation, using three large data sets from the North American Breeding Bird Survey.
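
    As a rough illustration of the second criterion the abstract mentions, the sketch below computes the Watanabe-Akaike information criterion from a matrix of posterior pointwise log-likelihood draws. The toy normal model (known unit variance) and all numbers are invented for demonstration and have nothing to do with the Breeding Bird Survey analysis.

```python
import numpy as np

def waic(log_lik):
    """WAIC from an (S posterior draws x N observations) matrix of
    pointwise log-likelihood values, on the deviance scale."""
    # lppd: log pointwise predictive density
    lppd = np.sum(np.log(np.mean(np.exp(log_lik), axis=0)))
    # effective number of parameters: posterior variance of the
    # log-likelihood at each observation
    p_waic = np.sum(np.var(log_lik, axis=0, ddof=1))
    return -2.0 * (lppd - p_waic)

# toy example: y ~ Normal(mu, 1), with simulated posterior draws of mu
rng = np.random.default_rng(0)
y = rng.normal(0.0, 1.0, size=50)
mu_draws = rng.normal(0.0, 0.1, size=1000)
log_lik = -0.5 * np.log(2 * np.pi) - 0.5 * (y[None, :] - mu_draws[:, None]) ** 2
w = waic(log_lik)
print(w)
```

    Lower WAIC indicates better expected predictive performance, so competing hierarchical models can be ranked by this value.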

  5. Atmospheric Model Evaluation Tool for meteorological and air quality simulations

    EPA Pesticide Factsheets

    The Atmospheric Model Evaluation Tool compares model predictions to observed data from various meteorological and air quality observation networks to help evaluate meteorological and air quality simulations.

  6. Model Evaluation of Continuous Data Pharmacometric Models: Metrics and Graphics

    PubMed Central

    Nguyen, THT; Mouksassi, M‐S; Holford, N; Al‐Huniti, N; Freedman, I; Hooker, AC; John, J; Karlsson, MO; Mould, DR; Pérez Ruixo, JJ; Plan, EL; Savic, R; van Hasselt, JGC; Weber, B; Zhou, C; Comets, E

    2017-01-01

    This article represents the first in a series of tutorials on model evaluation in nonlinear mixed effect models (NLMEMs), from the International Society of Pharmacometrics (ISoP) Model Evaluation Group. Numerous tools are available for evaluation of NLMEM, with a particular emphasis on visual assessment. This first basic tutorial focuses on presenting graphical evaluation tools of NLMEM for continuous data. It illustrates graphs for correct or misspecified models, discusses their pros and cons, and recalls the definition of metrics used. PMID:27884052

  7. Development and validation of a nursing professionalism evaluation model in a career ladder system.

    PubMed

    Kim, Yeon Hee; Jung, Young Sun; Min, Ja; Song, Eun Young; Ok, Jung Hui; Lim, Changwon; Kim, Kyunghee; Kim, Ji-Su

    2017-01-01

    The clinical ladder system categorizes and rewards degrees of nursing professionalism and is an important human resource tool for nursing management. We developed a model to evaluate nursing professionalism, which determines the clinical ladder system levels, and verified its validity. Data were collected using a clinical competence tool developed in this study, and existing methods such as the nursing professionalism evaluation tool, peer reviews, and face-to-face interviews to evaluate promotions and verify the presented content in a medical institution. Reliability and convergent and discriminant validity of the clinical competence evaluation tool were verified using SmartPLS software. The validity of the model for evaluating overall nursing professionalism was also analyzed. Clinical competence was determined by five dimensions of nursing practice: scientific, technical, ethical, aesthetic, and existential. The structural model explained 66% of the variance. Clinical competence scales, peer reviews, and face-to-face interviews directly determined nursing professionalism levels. The evaluation system can be used for evaluating nurses' professionalism in actual medical institutions from a nursing practice perspective. A conceptual framework for establishing a human resources management system for nurses and a tool for evaluating nursing professionalism at medical institutions is provided.

  8. A GIS Tool for evaluating and improving NEXRAD and its application in distributed hydrologic modeling

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Srinivasan, R.

    2008-12-01

    In this study, a user-friendly GIS tool was developed for evaluating and improving NEXRAD using raingauge data. This GIS tool can automatically read in raingauge and NEXRAD data, evaluate the accuracy of NEXRAD for each time unit, implement several geostatistical methods to improve the accuracy of NEXRAD through raingauge data, and output spatial precipitation maps for distributed hydrologic models. The geostatistical methods incorporated in this tool include Simple Kriging with varying local means, Kriging with External Drift, Regression Kriging, Co-Kriging, and a geostatistical method newly developed by Li et al. (2008). This tool was applied in two test watersheds at hourly and daily temporal scales. The preliminary cross-validation results show that incorporating raingauge data to calibrate NEXRAD can markedly change the spatial pattern of NEXRAD and improve its accuracy. Using different geostatistical methods, the GIS tool was applied to produce long-term precipitation input for a distributed hydrologic model - the Soil and Water Assessment Tool (SWAT). Animated video was generated to vividly illustrate the effect of using different precipitation input data on distributed hydrologic modeling. Currently, this GIS tool is developed as an extension of SWAT, which is used as a water quantity and quality modeling tool by USDA and EPA. The flexible, module-based design of this tool also makes it easy to adapt to other hydrologic models for hydrological modeling and water resources management.
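
    A gauge-based radar adjustment of the general kind described can be sketched with inverse-distance weighting of gauge-minus-radar residuals. This is a simpler stand-in for the kriging methods the tool actually implements, and every coordinate, value, and parameter below is invented for illustration.

```python
import numpy as np

def idw_correct_radar(radar, gauge_xy, gauge_vals, cell_size=1.0, power=2.0):
    """Adjust a radar (e.g. NEXRAD-like) precipitation grid with raingauge
    data: compute gauge-minus-radar residuals at gauge locations, spread
    them over the grid by inverse-distance weighting, and add them back."""
    ny, nx = radar.shape
    # residual at each gauge, using the nearest radar cell
    resid = []
    for (gx, gy), gv in zip(gauge_xy, gauge_vals):
        i, j = int(round(gy / cell_size)), int(round(gx / cell_size))
        resid.append(gv - radar[i, j])
    # inverse-distance interpolation of residuals onto the full grid
    yy, xx = np.mgrid[0:ny, 0:nx] * cell_size
    accum = np.zeros((ny, nx))
    weights_sum = np.zeros((ny, nx))
    for (gx, gy), r in zip(gauge_xy, resid):
        d2 = (xx - gx) ** 2 + (yy - gy) ** 2 + 1e-9   # avoid divide-by-zero
        w = d2 ** (-power / 2.0)
        accum += w * r
        weights_sum += w
    corrected = radar + accum / weights_sum
    return np.clip(corrected, 0.0, None)              # rain is non-negative

radar = np.full((20, 20), 5.0)            # radar estimates 5 mm everywhere
gauges = [(2.0, 3.0), (15.0, 12.0)]       # gauge (x, y) locations
vals = [8.0, 6.0]                         # gauges read higher than radar
out = idw_correct_radar(radar, gauges, vals)
```

    Near each gauge the corrected field approaches the gauge reading, while far from all gauges it relaxes toward a weighted average of the residuals, which is the basic behavior the kriging variants refine with spatial covariance models.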

  9. 77 FR 4638 - Defense Federal Acquisition Regulation Supplement; Performance-Based Payments (DFARS Case 2011-D045)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-30

    ... tool. The PBP analysis tool is a cash-flow model for evaluating alternative financing arrangements, and... PBP analysis tool is a cash-flow model for evaluating alternative financing arrangements, and is... that reflects adequate consideration to the Government for the improved contractor cash flow...

  10. An Object-Based Approach to Evaluation of Climate Variability Projections and Predictions

    NASA Astrophysics Data System (ADS)

    Ammann, C. M.; Brown, B.; Kalb, C. P.; Bullock, R.

    2017-12-01

    Evaluations of the performance of earth system model predictions and projections are of critical importance to enhance usefulness of these products. Such evaluations need to address specific concerns depending on the system and decisions of interest; hence, evaluation tools must be tailored to inform about specific issues. Traditional approaches that summarize grid-based comparisons of analyses and models, or between current and future climate, often do not reveal important information about the models' performance (e.g., spatial or temporal displacements; the reason behind a poor score) and are unable to accommodate these specific information needs. For example, summary statistics such as the correlation coefficient or the mean-squared error provide minimal information to developers, users, and decision makers regarding what is "right" and "wrong" with a model. New spatial and temporal-spatial object-based tools from the field of weather forecast verification (where comparisons typically focus on much finer temporal and spatial scales) have been adapted to more completely answer some of the important earth system model evaluation questions. In particular, the Method for Object-based Diagnostic Evaluation (MODE) tool and its temporal (three-dimensional) extension (MODE-TD) have been adapted for these evaluations. More specifically, these tools can be used to address spatial and temporal displacements in projections of El Niño-related precipitation and/or temperature anomalies, ITCZ-associated precipitation areas, atmospheric rivers, seasonal sea-ice extent, and other features of interest. Examples of several applications of these tools in a climate context will be presented, using output of the CESM large ensemble. In general, these tools provide diagnostic information about model performance - accounting for spatial, temporal, and intensity differences - that cannot be achieved using traditional (scalar) model comparison approaches. Thus, they can provide more meaningful information that can be used in decision-making and planning. Future extensions and applications of these tools in a climate context will be considered.
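
    The object-based idea can be sketched in a few lines: threshold each gridded field, label contiguous objects, and compare object centroids. This is a toy stand-in for MODE (which additionally smooths, matches, and merges objects), with invented fields.

```python
import numpy as np
from scipy.ndimage import label, center_of_mass

def object_displacement(field_a, field_b, threshold):
    """Threshold two gridded fields, keep the largest contiguous object
    in each, and return the (row, column) displacement between the two
    object centroids."""
    centroids = []
    for field in (field_a, field_b):
        labels, n = label(field >= threshold)
        if n == 0:
            return None                    # no object found in this field
        sizes = [(labels == k).sum() for k in range(1, n + 1)]
        biggest = 1 + int(np.argmax(sizes))
        centroids.append(center_of_mass(labels == biggest))
    (ya, xa), (yb, xb) = centroids
    return yb - ya, xb - xa

# a blob of "precipitation" shifted 3 cells east between the two fields
a = np.zeros((30, 30)); a[10:15, 5:10] = 2.0
b = np.zeros((30, 30)); b[10:15, 8:13] = 2.0
print(object_displacement(a, b, threshold=1.0))   # → (0.0, 3.0)
```

    A scalar score like RMSE would simply report large errors for this pair, whereas the object view reveals that the feature is well simulated but displaced, which is exactly the diagnostic distinction the abstract argues for.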

  11. Evaluating models of healthcare delivery using the Model of Care Evaluation Tool (MCET).

    PubMed

    Hudspeth, Randall S; Vogt, Marjorie; Wysocki, Ken; Pittman, Oralea; Smith, Susan; Cooke, Cindy; Dello Stritto, Rita; Hoyt, Karen Sue; Merritt, T Jeanne

    2016-08-01

    Our aim was to provide the outcome of a structured Model of Care (MoC) Evaluation Tool (MCET), developed by an FAANP Best-practices Workgroup, that can be used to guide the evaluation of existing MoCs being considered for use in clinical practice. Multiple MoCs are available, but deciding which model of health care delivery to use can be confusing. This five-component tool provides a structured assessment approach to model selection and has universal application. A literature review using CINAHL, PubMed, Ovid, and EBSCO was conducted. The MCET evaluation process includes five sequential components with a feedback loop from component 5 back to component 3 for reevaluation of any refinements. The components are as follows: (1) Background, (2) Selection of an MoC, (3) Implementation, (4) Evaluation, and (5) Sustainability and Future Refinement. This practical resource considers an evidence-based approach to use in determining the best model to implement based on need, stakeholder considerations, and feasibility. ©2015 American Association of Nurse Practitioners.

  12. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Science Inventory

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit m...
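
    Goodness-of-fit metrics of the kind such tools report can be sketched as follows. These are the standard textbook formulas (mean bias, RMSE, normalized mean bias, Pearson correlation); nothing here is specific to MPESA, and the sample values are invented.

```python
import numpy as np

def performance_metrics(obs, mod):
    """Common statistical goodness-of-fit metrics comparing model
    predictions against observations."""
    obs, mod = np.asarray(obs, float), np.asarray(mod, float)
    err = mod - obs
    return {
        "mean_bias": err.mean(),                       # average over/under-prediction
        "rmse": np.sqrt((err ** 2).mean()),            # root-mean-square error
        "nmb_percent": 100.0 * err.sum() / obs.sum(),  # normalized mean bias
        "r": np.corrcoef(obs, mod)[0, 1],              # Pearson correlation
    }

obs = [10.0, 12.0, 9.0, 14.0]     # hypothetical observed values
mod = [11.0, 11.5, 10.0, 15.0]    # hypothetical model predictions
m = performance_metrics(obs, mod)
print(m)
```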

  13. Improving smoothing efficiency of rigid conformal polishing tool using time-dependent smoothing evaluation model

    NASA Astrophysics Data System (ADS)

    Song, Chi; Zhang, Xuejun; Zhang, Xin; Hu, Haifei; Zeng, Xuefeng

    2017-06-01

    A rigid conformal (RC) lap can smooth mid-spatial-frequency (MSF) errors, which are naturally smaller than the tool size, while still removing large-scale errors in a short time. However, the RC-lap smoothing efficiency performance is poorer than expected, and existing smoothing models cannot explicitly specify the methods to improve this efficiency. We presented an explicit time-dependent smoothing evaluation model that contained specific smoothing parameters directly derived from the parametric smoothing model and the Preston equation. Based on the time-dependent model, we proposed a strategy to improve the RC-lap smoothing efficiency, which incorporated the theoretical model, tool optimization, and efficiency limit determination. Two sets of smoothing experiments were performed to demonstrate the smoothing efficiency achieved using the time-dependent smoothing model. A high, theory-like tool influence function and a limiting tool speed of 300 RPM were o

  14. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Management District, and other non-Federal sponsors shall rely on the best available science including models..., and assessment of projects. The selection of models and analytical tools shall be done in consultation... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and...

  15. Validation of a Tool Evaluating Educational Apps for Smart Education

    ERIC Educational Resources Information Center

    Lee, Jeong-Sook; Kim, Sung-Wan

    2015-01-01

    The purpose of this study is to develop and validate an evaluation tool of educational apps for smart education. Based on literature reviews, a potential model for evaluating educational apps was suggested. An evaluation tool consisting of 57 survey items was delivered to 156 students in middle and high schools. An exploratory factor analysis was…

  16. An Evaluation Tool for CONUS-Scale Estimates of Components of the Water Balance

    NASA Astrophysics Data System (ADS)

    Saxe, S.; Hay, L.; Farmer, W. H.; Markstrom, S. L.; Kiang, J. E.

    2016-12-01

    Numerous research groups are independently developing data products to represent various components of the water balance (e.g. runoff, evapotranspiration, recharge, snow water equivalent, soil moisture, and climate) at the scale of the conterminous United States. These data products are derived from a range of sources, including direct measurement, remotely-sensed measurement, and statistical and deterministic model simulations. An evaluation tool is needed to compare these data products and the components of the water balance they contain in order to identify the gaps in the understanding and representation of continental-scale hydrologic processes. An ideal tool will be an objective, universally agreed-upon framework to address questions related to closing the water balance. This type of generic, model-agnostic evaluation tool would facilitate collaboration amongst different hydrologic research groups and improve modeling capabilities with respect to continental-scale water resources. By adopting a comprehensive framework to consider hydrologic modeling in the context of a complete water balance, it is possible to identify weaknesses in process modeling, data product representation, and regional hydrologic variation. As part of its National Water Census initiative, the U.S. Geological Survey is facilitating this dialogue to develop prototype evaluation tools.

  17. Modeling Electrostatic Fields Generated by Internal Charging of Materials in Space Radiation Environments

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.

    2011-01-01

    Internal charging is a risk to spacecraft in energetic electron environments. The DICTAT and NUMIT computational codes are the most widely used engineering tools for evaluating internal charging of insulator materials exposed to these environments. Engineering tools are designed for rapid evaluation of ESD threats, but there is a need for more physics-based models for investigating the science of materials interactions with energetic electron environments. Current tools are limited by the physics included in the models and by ease of user implementation; additional development work is needed to improve the models.

  18. Presenting an Evaluation Model for the Cancer Registry Software.

    PubMed

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

    As cancer incidence continues to grow, the cancer registry is of great importance as the core of cancer control programs, and many different software programs have been designed for this purpose. Establishing a comprehensive evaluation model is therefore essential to evaluate and compare a wide range of such software. In this study, the criteria for cancer registry software were determined by studying the documentation and two functional software programs in this field. The evaluation tool was a checklist; to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the validation results, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, once the model was approved, the final version of the evaluation model for cancer registry software was presented. The evaluation model of this study comprises both a tool and a method of evaluation. The evaluation tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. The evaluation method, based on the findings, is a criteria-based evaluation method. The model encompasses various dimensions of cancer registry software and a proper method for evaluating it. The strong point of this evaluation model is the separation between general and specific criteria while striving for comprehensive coverage. Since the model has been validated, it can be used as a standard for evaluating cancer registry software.

  19. Logic Models: A Tool for Designing and Monitoring Program Evaluations. REL 2014-007

    ERIC Educational Resources Information Center

    Lawton, Brian; Brandon, Paul R.; Cicchinelli, Louis; Kekahio, Wendy

    2014-01-01

    This introduction to logic models as a tool for designing program evaluations defines the major components of education programs--resources, activities, outputs, and short-, mid-, and long-term outcomes--and uses an example to demonstrate the relationships among them. This quick…

  20. Data-Driven Modeling and Rendering of Force Responses from Elastic Tool Deformation

    PubMed Central

    Rakhmatov, Ruslan; Ogay, Tatyana; Jeon, Seokhee

    2018-01-01

    This article presents a new data-driven model design for rendering force responses from elastic tool deformation. The new design incorporates a six-dimensional input describing the initial position of the contact, as well as the state of the tool deformation. The input-output relationship of the model was represented by a radial basis functions network, which was optimized based on training data collected from real tool-surface contact. Since the input space of the model is represented in the local coordinate system of a tool, the model is independent of recording and rendering devices and can be easily deployed to an existing simulator. The model also supports complex interactions, such as self and multi-contact collisions. In order to assess the proposed data-driven model, we built a custom data acquisition setup and developed a proof-of-concept rendering simulator. The simulator was evaluated through numerical and psychophysical experiments with four different real tools. The numerical evaluation demonstrated the perceptual soundness of the proposed model, while the user study revealed the force feedback of the proposed simulator to be realistic. PMID:29342964
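
    The core of such a model, a radial basis function network interpolating measured forces, can be sketched generically. This is a one-dimensional Gaussian-RBF toy with invented sample values, not the authors' six-dimensional formulation or their optimization procedure.

```python
import numpy as np

def fit_rbf(centers, values, sigma=1.0):
    """Solve for the weights of a Gaussian RBF interpolant that passes
    exactly through (centers, values)."""
    d2 = ((centers[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    phi = np.exp(-d2 / (2 * sigma ** 2))       # kernel (Gram) matrix
    return np.linalg.solve(phi, values)

def predict_rbf(x, centers, weights, sigma=1.0):
    """Evaluate the fitted RBF network at a single input point x."""
    d2 = ((x[None, :] - centers) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2)) @ weights

# training data: force response sampled at a few deformation states
centers = np.array([[0.0], [0.5], [1.0], [1.5]])    # deformation inputs
forces = np.array([0.0, 0.8, 2.1, 4.0])             # hypothetical forces
w = fit_rbf(centers, forces)
f_pred = predict_rbf(np.array([1.0]), centers, w)
print(f_pred)
```

    Because the network is a smooth interpolant of recorded contact data, rendering reduces to evaluating `predict_rbf` at the current contact state, which is what makes the data-driven approach fast enough for haptic update rates.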

  1. Development and Evaluation of Computer-Based Laboratory Practical Learning Tool

    ERIC Educational Resources Information Center

    Gandole, Y. B.

    2006-01-01

    Effective evaluation of educational software is a key issue for the successful introduction of advanced tools in the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning in science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…

  2. ECO-DRIVING MODELING ENVIRONMENT

    DOT National Transportation Integrated Search

    2015-11-01

    This research project aims to examine the eco-driving modeling capabilities of different traffic modeling tools available and to develop a driver-simulator-based eco-driving modeling tool to evaluate driver behavior and to reliably estimate or measur...

  3. The effective integration of analysis, modeling, and simulation tools.

    DOT National Transportation Integrated Search

    2013-08-01

    The need for model integration arises from the recognition that both transportation decisionmaking and the tools supporting it continue to increase in complexity. Many strategies that agencies evaluate require using tools that are sensitive to supply...

  4. Evaluating Uncertainty in Integrated Environmental Models: A Review of Concepts and Tools

    EPA Science Inventory

    This paper reviews concepts for evaluating integrated environmental models and discusses a list of relevant software-based tools. A simplified taxonomy for sources of uncertainty and a glossary of key terms with standard definitions are provided in the context of integrated appro...

  5. Comparison of four modeling tools for the prediction of potential distribution for non-indigenous weeds in the United States

    USGS Publications Warehouse

    Magarey, Roger; Newton, Leslie; Hong, Seung C.; Takeuchi, Yu; Christie, Dave; Jarnevich, Catherine S.; Kohl, Lisa; Damus, Martin; Higgins, Steven I.; Miller, Leah; Castro, Karen; West, Amanda; Hastings, John; Cook, Gericke; Kartesz, John; Koop, Anthony

    2018-01-01

    This study compares four models for predicting the potential distribution of non-indigenous weed species in the conterminous U.S. The comparison focused on evaluating modeling tools and protocols as currently used for weed risk assessment or for predicting the potential distribution of invasive weeds. We used six weed species (three highly invasive and three less invasive non-indigenous species) that have been established in the U.S. for more than 75 years. The experiment involved providing non-U.S. location data to users familiar with one of the four evaluated techniques, who then developed predictive models that were applied to the United States without knowing the identity of the species or its U.S. distribution. We compared a simple GIS climate matching technique known as Proto3, a simple climate matching tool CLIMEX Match Climates, the correlative model MaxEnt, and a process model known as the Thornley Transport Resistance (TTR) model. Two experienced users ran each modeling tool except TTR, which had one user. Models were trained with global species distribution data excluding any U.S. data, and then were evaluated using the current known U.S. distribution. The influence of weed species identity and modeling tool on prevalence and sensitivity effects was compared using a generalized linear mixed model. Each modeling tool itself had a low statistical significance, while weed species alone accounted for 69.1 and 48.5% of the variance for prevalence and sensitivity, respectively. These results suggest that simple modeling tools might perform as well as complex ones in the case of predicting potential distribution for a weed not yet present in the United States. Considerations of model accuracy should also be balanced with those of reproducibility and ease of use. More important than the choice of modeling tool is the construction of robust protocols and testing both new and experienced users under blind test conditions that approximate operational conditions.

  6. Updates to the CMAQ Post Processing and Evaluation Tools for 2016

    EPA Science Inventory

    In the spring of 2016, the evaluation tools distributed with the CMAQ model code were updated and new tools were added to the existing set of tools. Observation data files, compatible with the AMET software, were also made available on the CMAS website for the first time with the...

  7. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life... empirical tools should be used to support the life safety equivalency evaluation? 102-80.120 Section 102-80...

  8. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life... empirical tools should be used to support the life safety equivalency evaluation? 102-80.120 Section 102-80...

  9. The development of a plant risk evaluation (PRE) tool for assessing the invasive potential of ornamental plants.

    PubMed

    Conser, Christiana; Seebacher, Lizbeth; Fujino, David W; Reichard, Sarah; DiTomaso, Joseph M

    2015-01-01

    Weed Risk Assessment (WRA) methods for evaluating invasiveness in plants have evolved rapidly in the last two decades. Many WRA tools exist, but none were specifically designed to screen ornamental plants prior to being released into the environment. To be accepted as a tool to evaluate ornamental plants for the nursery industry, it is critical that a WRA tool accurately predicts non-invasiveness without falsely categorizing non-invasive plants as invasive. We developed a new Plant Risk Evaluation (PRE) tool for ornamental plants. The 19 questions in the final PRE tool were narrowed down from 56 original questions from existing WRA tools. We evaluated the 56 WRA questions by screening 21 known invasive and 14 known non-invasive ornamental plants. After statistically comparing the predictability of each question and the frequency the question could be answered for both invasive and non-invasive species, we eliminated questions that provided no predictive power, were irrelevant in our current model, or could not be answered reliably at a high enough percentage. We also combined many similar questions. The final 19 remaining PRE questions were further tested for accuracy using 56 additional known invasive plants and 36 known non-invasive ornamental species. The resulting evaluation demonstrated that when "needs further evaluation" classifications were not included, the accuracy of the model was 100% for both predicting invasiveness and non-invasiveness. When "needs further evaluation" classifications were included as either false positive or false negative, the model was still 93% accurate in predicting invasiveness and 97% accurate in predicting non-invasiveness, with an overall accuracy of 95%. We conclude that the PRE tool should not only provide growers with a method to accurately screen their current stock and potential new introductions, but also increase the probability of the tool being accepted for use by the industry as the basis for a nursery certification program.

  10. Rule based design of conceptual models for formative evaluation

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.; Chang, Kai; Hale, Joseph P.; Bester, Terri; Rix, Thomas; Wang, Yaowen

    1994-01-01

    A Human-Computer Interface (HCI) Prototyping Environment with embedded evaluation capability has been investigated. This environment will be valuable in developing and refining HCI standards and evaluating program/project interface development, especially Space Station Freedom on-board displays for payload operations. This environment, which allows for rapid prototyping and evaluation of graphical interfaces, includes the following four components: (1) a HCI development tool; (2) a low fidelity simulator development tool; (3) a dynamic, interactive interface between the HCI and the simulator; and (4) an embedded evaluator that evaluates the adequacy of a HCI based on a user's performance. The embedded evaluation tool collects data while the user is interacting with the system and evaluates the adequacy of an interface based on a user's performance. This paper describes the design of conceptual models for the embedded evaluation system using a rule-based approach.

  12. Stochastic Simulation Tool for Aerospace Structural Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly identifying the design input variables whose variability most influences the response output parameters.
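
    The Monte Carlo scatter study described can be sketched generically: sample toleranced inputs, evaluate a response function, and rank inputs by correlation with the output. The response function, nominal values, and tolerances below are invented for illustration and are not MSC.Robust Design's workflow or the Shuttle panel model.

```python
import numpy as np

def monte_carlo_scatter(n=20000, seed=1):
    """Propagate scatter in design inputs through a toy response function
    and rank each input by its correlation with the response output."""
    rng = np.random.default_rng(seed)
    # toleranced design inputs (made-up nominals and standard deviations)
    t = rng.normal(2.0, 0.05, n)        # thickness, tight tolerance
    E = rng.normal(70e3, 3.5e3, n)      # modulus, 5% scatter
    P = rng.normal(1.0, 0.10, n)        # load, 10% scatter
    # toy response: deflection-like quantity, w ~ P / (E * t^3)
    w = P / (E * t ** 3)
    # absolute correlation of each input with the output
    return {name: abs(np.corrcoef(x, w)[0, 1])
            for name, x in [("t", t), ("E", E), ("P", P)]}

ranks = monte_carlo_scatter()
print(ranks)
```

    Here the load's 10% scatter dominates the output variability (thickness enters cubed, so its 2.5% scatter contributes 7.5% relative), which is exactly the cause-and-effect ranking such a tool surfaces to the engineer.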

  13. A Hybrid Evaluation System Framework (Shell & Web) with Standardized Access to Climate Model Data and Verification Tools for a Clear Climate Science Infrastructure on Big Data High Performance Computers

    NASA Astrophysics Data System (ADS)

    Kadow, C.; Illing, S.; Kunst, O.; Cubasch, U.

    2014-12-01

The project 'Integrated Data and Evaluation System for Decadal Scale Prediction' (INTEGRATION), part of the German decadal prediction project MiKlip, develops a central evaluation system. The fully operational hybrid system features HPC shell access and a user-friendly web interface, combining in one common system a variety of verification tools and validation data from projects inside and outside of MiKlip. The evaluation system is located at the German Climate Computing Centre (DKRZ) and has direct access to the bulk of its ESGF node, including millions of climate model data sets, e.g. from CMIP5 and CORDEX. The database is organized according to the international CMOR standard, using the meta-information of the self-describing model, reanalysis, and observational data sets. Apache Solr indexes the different data projects into one common search environment. This metadata system, with its advanced but easy-to-use search tool, helps users, developers, and their tools retrieve the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools to the evaluation system independently of the programming language used; users of the evaluation techniques benefit from the common interface without needing to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a MySQL database. Configurations and results of the tools can be shared among scientists via the shell or the web system, so plugged-in tools automatically gain transparency and reproducibility. Furthermore, when configurations match at the start of an evaluation tool run, the system suggests reusing results already produced by other users, saving CPU time, I/O, and disk space. This study presents the techniques and advantages of such a hybrid evaluation system making use of a Big Data HPC in climate science. Website: www-miklip.dkrz.de (visitor login: guest, password: miklip)
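The configuration-matching result reuse described above amounts to a content-addressed cache keyed on a canonical hash of the tool name and its configuration. A minimal sketch, assuming nothing about the actual FREVA/CES internals (class, method, and tool names are illustrative):

```python
import hashlib
import json

class ResultCache:
    """Illustrative sketch: reuse a stored result when a tool is started
    with a configuration that has already been run."""

    def __init__(self):
        self._store = {}  # config hash -> stored result

    def _key(self, tool, config):
        # Canonical JSON so that key order in the config dict
        # does not change the hash.
        payload = json.dumps({"tool": tool, "config": config}, sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

    def run(self, tool, config, compute):
        """Return (result, reused): reused is True when an earlier
        matching run was found and the computation was skipped."""
        key = self._key(tool, config)
        if key in self._store:
            return self._store[key], True
        result = compute(config)
        self._store[key] = result
        return result, False
```

The second call with an equivalent configuration (even with dict keys in a different order) hits the cache, which is the CPU-, I/O-, and disk-saving behaviour the abstract describes.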

  14. A Hybrid Evaluation System Framework (Shell & Web) with Standardized Access to Climate Model Data and Verification Tools for a Clear Climate Science Infrastructure on Big Data High Performance Computers

    NASA Astrophysics Data System (ADS)

    Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Ulbrich, Uwe; Cubasch, Ulrich

    2015-04-01

The project 'Integrated Data and Evaluation System for Decadal Scale Prediction' (INTEGRATION), part of the German decadal prediction project MiKlip, develops a central evaluation system. The fully operational hybrid system features HPC shell access and a user-friendly web interface, combining in one common system a variety of verification tools and validation data from projects inside and outside of MiKlip. The evaluation system is located at the German Climate Computing Centre (DKRZ) and has direct access to the bulk of its ESGF node, including millions of climate model data sets, e.g. from CMIP5 and CORDEX. The database is organized according to the international CMOR standard, using the meta-information of the self-describing model, reanalysis, and observational data sets. Apache Solr indexes the different data projects into one common search environment. This metadata system, with its advanced but easy-to-use search tool, helps users, developers, and their tools retrieve the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools to the evaluation system independently of the programming language used; users of the evaluation techniques benefit from the common interface without needing to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a MySQL database. Configurations and results of the tools can be shared among scientists via the shell or the web system, so plugged-in tools automatically gain transparency and reproducibility. Furthermore, when configurations match at the start of an evaluation tool run, the system suggests reusing results already produced by other users, saving CPU time, I/O, and disk space. This study presents the techniques and advantages of such a hybrid evaluation system making use of a Big Data HPC in climate science. Website: www-miklip.dkrz.de (visitor login: click on "Guest")

  15. Evaluating the State of Water Management in the Rio Grande/Bravo Basin

    NASA Astrophysics Data System (ADS)

    Ortiz Partida, Jose Pablo; Sandoval-Solis, Samuel; Diaz Gomez, Romina

    2017-04-01

Water resource modeling tools have been developed for many different regions and sub-basins of the Rio Grande/Bravo (RGB). Each of these tools has specific objectives, whether exploring drought mitigation alternatives, conflict resolution, climate change evaluation, tradeoffs and economic synergies, water allocation, reservoir operations, or collaborative planning. However, there has not been an effort to integrate the available tools, or to link models developed for specific reaches into a more holistic watershed decision-support tool. This project outlines promising next steps toward the long-term goal of improved decision-support tools and modeling. We identify, describe, and synthesize water resources management practices in the RGB basin, along with the available water resources models and decision-support tools that represent the RGB and the distribution of water for human and environmental uses. The extant body of water resources modeling is examined from the perspective of environmental water needs and water resources management, allowing subsequent prioritization of future research and monitoring needs for the development of river system modeling tools. This work communicates the state of RGB science to diverse stakeholders, researchers, and decision-makers. The products of this project represent a planning tool to support an integrated water resources management framework that maximizes economic and social welfare without compromising vital ecosystems.

  16. The Air Quality Model Evaluation International Initiative ...

    EPA Pesticide Factsheets

    This presentation provides an overview of the Air Quality Model Evaluation International Initiative (AQMEII). It contains a synopsis of the three phases of AQMEII, including objectives, logistics, and timelines. It also provides a number of examples of analyses conducted through AQMEII with a particular focus on past and future analyses of deposition. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  17. Prediction Of Abrasive And Diffusive Tool Wear Mechanisms In Machining

    NASA Astrophysics Data System (ADS)

    Rizzuti, S.; Umbrello, D.

    2011-01-01

Tool wear prediction is regarded as a very important task for maximizing tool performance, minimizing cutting costs, and improving workpiece quality in cutting. In this research work, an experimental campaign was carried out under varying cutting conditions with the aim of measuring both crater and flank tool wear during machining of AISI 1045 steel with an uncoated P40 carbide tool. In parallel, a FEM-based analysis was developed to study the tool wear mechanisms, taking into account the influence of the cutting conditions and the temperatures reached on the tool surfaces. The results show that, when the temperature of the tool rake surface is lower than the activation temperature of the diffusive phenomenon, the wear rate can be estimated by applying an abrasive model. In contrast, in the tool area where the temperature is higher than the diffusive activation temperature, the wear rate can be evaluated by applying a diffusive model. Finally, for temperatures between these two values, a combined abrasive-diffusive wear model made it possible to correctly evaluate the tool wear phenomena.
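The regime-switching idea above can be sketched as follows. This is a generic illustration, not the paper's calibrated model: the Archard-type abrasive term, the Arrhenius-type diffusive term, the activation temperature, the width of the transition band, and every constant are placeholders.

```python
import math

def wear_rate(temp_K, pressure, sliding_velocity,
              T_act=1000.0, k_abr=1.0e-12, C_diff=100.0, Q_over_R=1.0e4):
    """Illustrative regime-switching wear model (all constants are placeholders).

    Well below the activation temperature an Archard-type abrasive term is
    used; well above it an Arrhenius-type diffusive term; in a narrow band
    around the activation temperature the two are blended linearly.
    """
    abrasive = k_abr * pressure * sliding_velocity          # Archard-type term
    diffusive = C_diff * math.exp(-Q_over_R / temp_K)       # Arrhenius-type term
    if temp_K < T_act - 50.0:
        return abrasive
    if temp_K > T_act + 50.0:
        return diffusive
    w = (temp_K - (T_act - 50.0)) / 100.0   # linear blend across the band
    return (1.0 - w) * abrasive + w * diffusive
```

With these placeholder constants the abrasive term dominates below the band and the thermally activated diffusive term dominates above it, mirroring the qualitative behaviour the abstract reports.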

  18. Using Optimization to Improve Test Planning

    DTIC Science & Technology

    2017-09-01

With modifications to make the input more user-friendly and to display the output differently, the test and evaluation test schedule optimization model would be a good tool for test and evaluation schedulers. Subject terms: schedule optimization, test planning.

  19. Evaluating the Usability of a Professional Modeling Tool Repurposed for Middle School Learning

    ERIC Educational Resources Information Center

    Peters, Vanessa L.; Songer, Nancy Butler

    2013-01-01

    This paper reports the results of a three-stage usability test of a modeling tool designed to support learners' deep understanding of the impacts of climate change on ecosystems. The design process involved repurposing an existing modeling technology used by professional scientists into a learning tool specifically designed for middle school…

  20. New Automotive Air Conditioning System Simulation Tool Developed in MATLAB/Simulink

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiss, T.; Chaney, L.; Meyer, J.

Further improvements in vehicle fuel efficiency require accurate evaluation of the vehicle's transient total power requirement. When operated, the air conditioning (A/C) system is the largest auxiliary load on a vehicle; therefore, accurate evaluation of the load it places on the vehicle's engine and/or energy storage system is especially important. Vehicle simulation software, such as 'Autonomie,' has been used by OEMs to evaluate vehicles' energy performance. A transient A/C simulation tool incorporated into vehicle simulation models would also provide a means of developing more efficient A/C systems through thorough consideration of transient A/C system performance. The dynamic system simulation software MATLAB/Simulink was used to develop new and more efficient vehicle energy system controls. The various modeling methods used for the new simulation tool are described in detail, and comparison with measured data is provided to demonstrate the validity of the model.

  1. Evaluation of a computerized aid for creating human behavioral representations of human-computer interaction.

    PubMed

    Williams, Kent E; Voigt, Jeffrey R

    2004-01-01

    The research reported herein presents the results of an empirical evaluation that focused on the accuracy and reliability of cognitive models created using a computerized tool: the cognitive analysis tool for human-computer interaction (CAT-HCI). A sample of participants, expert in interacting with a newly developed tactical display for the U.S. Army's Bradley Fighting Vehicle, individually modeled their knowledge of 4 specific tasks employing the CAT-HCI tool. Measures of the accuracy and consistency of task models created by these task domain experts using the tool were compared with task models created by a double expert. The findings indicated a high degree of consistency and accuracy between the different "single experts" in the task domain in terms of the resultant models generated using the tool. Actual or potential applications of this research include assessing human-computer interaction complexity, determining the productivity of human-computer interfaces, and analyzing an interface design to determine whether methods can be automated.

  2. What's new in the Atmospheric Model Evaluation Tool (AMET) version 1.3

    EPA Science Inventory

A new version of the Atmospheric Model Evaluation Tool (AMET) has been released. The new version of AMET, version 1.3 (AMETv1.3), contains a number of updates and changes from the previous version of AMET (v1.2), released in 2012. First, the Perl scripts used in the previous ve...

  3. Farm simulation: a tool for evaluating the mitigation of greenhouse gas emissions and the adaptation of dairy production to climate change

    USDA-ARS?s Scientific Manuscript database

    Process-level modeling at the farm scale provides a tool for evaluating both strategies for mitigating greenhouse gas emissions and strategies for adapting to climate change. The Integrated Farm System Model (IFSM) simulates representative crop, beef or dairy farms over many years of weather to pred...

  4. Model Performance Evaluation and Scenario Analysis (MPESA)

    EPA Pesticide Factsheets

Model Performance Evaluation and Scenario Analysis (MPESA) assesses how well models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).
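Time-series model performance of the kind MPESA assesses is commonly summarized with goodness-of-fit statistics such as the Nash-Sutcliffe efficiency and percent bias. The abstract does not specify which statistics MPESA computes, so the two below are assumed examples, sketched without dependencies:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; values at or below 0
    mean the model predicts no better than the mean of the observations."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

def percent_bias(observed, simulated):
    """Average tendency of the simulation; positive values indicate
    underprediction relative to the observations."""
    return 100.0 * sum(o - s for o, s in zip(observed, simulated)) / sum(observed)
```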

  5. Evaluation of Seeds of Science/Roots of Reading: Effective Tools for Developing Literacy through Science in the Early Grades-Light Energy Unit. CRESST Report 781

    ERIC Educational Resources Information Center

    Goldschmidt, Pete; Jung, Hyekyung

    2011-01-01

    This evaluation focuses on the Seeds of Science/Roots of Reading: Effective Tools for Developing Literacy through Science in the Early Grades ("Seeds/Roots") model of science-literacy integration. The evaluation is based on a cluster randomized design of 100 teachers, half of which were in the treatment group. Multi-level models are employed to…

  6. Overview of EPA tools for supporting local-, state- and regional-level decision makers addressing energy and environmental issues: NYC MARKAL Energy Systems Model and Municipal Solid Waste Decision Support Tool

    EPA Science Inventory

    A workshop will be conducted to demonstrate and focus on two decision support tools developed at EPA/ORD: 1. Community-scale MARKAL model: an energy-water technology evaluation tool and 2. Municipal Solid Waste Decision Support Tool (MSW DST). The Workshop will be part of Southea...

  7. A GIS-based approach for the screening assessment of noise and vibration impacts from transit projects.

    PubMed

    Hamed, Maged; Effat, Waleed

    2007-08-01

Urban transportation projects are essential for increasing the efficiency of moving people and goods within and between cities. Environmental impacts from such projects must be evaluated and mitigated as applicable. Spatial modeling is a valuable tool for quantifying the potential level of environmental consequences within the context of an environmental impact assessment (EIA) study. This paper presents a GIS-based tool for the assessment of airborne noise and ground-borne vibration from public transit systems, and its application to an actual project. The tool is based on the US Federal Transit Administration's (FTA) approach and incorporates spatial information, satellite imaging, geostatistical modeling, and software programming. It is applied in a case study of the initial environmental evaluation of a light rail transit project in an urban city in the Middle East to evaluate alternative layouts. The tool readily allowed the alternative evaluation, and the results were used as input to a multi-criteria analytic framework.
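A screening-level noise assessment of this kind rests on distance attenuation from a reference level. The sketch below shows geometric spreading only (roughly 3 dB per doubling of distance for a line source such as a rail corridor, 6 dB for a point source); the reference level, distances, and criterion are illustrative, and the actual FTA procedure adds source reference tables, shielding, and ground effects on top of this.

```python
import math

def screened_level(ref_level_dBA, ref_distance_m, distance_m, line_source=True):
    """Screening estimate of the noise level at a receiver, from a reference
    measurement, using geometric spreading only (illustrative simplification)."""
    factor = 10.0 if line_source else 20.0   # ~3 vs ~6 dB per distance doubling
    return ref_level_dBA - factor * math.log10(distance_m / ref_distance_m)

def exceeds_criterion(level_dBA, criterion_dBA):
    """Flag a receiver for detailed (non-screening) analysis."""
    return level_dBA > criterion_dBA
```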

  8. The Use of AMET and Automated Scripts for Model Evaluation

    EPA Science Inventory

    The Atmospheric Model Evaluation Tool (AMET) is a suite of software designed to facilitate the analysis and evaluation of meteorological and air quality models. AMET matches the model output for particular locations to the corresponding observed values from one or more networks ...

  9. The Visible Signature Modelling and Evaluation ToolBox

    DTIC Science & Technology

    2008-12-01

DSTO-TR-2212 abstract: A new software suite, the Visible Signature ToolBox (VST), has been developed to model and evaluate the visible signatures of maritime platforms. The VST is a collection of commercial, off-the-shelf software and DSTO-developed programs and procedures. It can be utilised to model and assess the visible signatures of maritime platforms, and a number of examples are presented to demonstrate the suite.

  10. The Development of a Plant Risk Evaluation (PRE) Tool for Assessing the Invasive Potential of Ornamental Plants

    PubMed Central

    Conser, Christiana; Seebacher, Lizbeth; Fujino, David W.; Reichard, Sarah; DiTomaso, Joseph M.

    2015-01-01

Weed Risk Assessment (WRA) methods for evaluating invasiveness in plants have evolved rapidly in the last two decades. Many WRA tools exist, but none were specifically designed to screen ornamental plants prior to their release into the environment. To be accepted as a tool for evaluating ornamental plants for the nursery industry, it is critical that a WRA tool accurately predicts non-invasiveness without falsely categorizing non-invasive plants as invasive. We developed a new Plant Risk Evaluation (PRE) tool for ornamental plants. The 19 questions in the final PRE tool were narrowed down from 56 original questions drawn from existing WRA tools. We evaluated the 56 WRA questions by screening 21 known invasive and 14 known non-invasive ornamental plants. After statistically comparing the predictive power of each question and the frequency with which it could be answered for both invasive and non-invasive species, we eliminated questions that provided no predictive power, were irrelevant in our current model, or could not be answered reliably at a high enough percentage; we also combined many similar questions. The 19 remaining PRE questions were further tested for accuracy using 56 additional known invasive plants and 36 known non-invasive ornamental species. The resulting evaluation demonstrated that when "needs further evaluation" classifications were excluded, the accuracy of the model was 100% for predicting both invasiveness and non-invasiveness. When "needs further evaluation" classifications were included as either false positives or false negatives, the model was still 93% accurate in predicting invasiveness and 97% accurate in predicting non-invasiveness, with an overall accuracy of 95%. We conclude that the PRE tool should not only provide growers with a method to accurately screen their current stock and potential new introductions, but also increase the probability of the tool being accepted for use by the industry as the basis for a nursery certification program.
PMID:25803830
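The two accuracy figures above differ only in how "needs further evaluation" outcomes are counted: excluded from the denominator, or counted as misses. A minimal sketch of that bookkeeping (the record layout is an assumption for illustration, not the PRE tool's actual data format):

```python
def pre_accuracy(evaluations, count_further_eval_as_error=False):
    """Accuracy in percent.

    evaluations: list of (predicted, actual) pairs, where predicted is one of
    "invasive", "non-invasive", or "needs further evaluation".
    """
    correct = total = 0
    for predicted, actual in evaluations:
        if predicted == "needs further evaluation":
            if not count_further_eval_as_error:
                continue          # excluded from the denominator
            total += 1            # counted as a miss
        else:
            total += 1
            if predicted == actual:
                correct += 1
    return 100.0 * correct / total
```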

  11. EVALUATION TECHNIQUES AND TOOL DEVELOPMENT FOR FY 08 CMAQ RELEASE

    EPA Science Inventory

    In this task, research efforts are outlined that relate to the AMD Model Evaluation Program element and support CMAQ releases within the FY05-FY08 time period. Model evaluation serves dual purposes; evaluation is necessary to characterize the accuracy of model predictions, and e...

  12. Development and implementation of an independence rating scale and evaluation process for nursing orientation of new graduates.

    PubMed

    Durkin, Gregory J

    2010-01-01

A wide variety of evaluation formats are available for new graduate nurses, but most are single-point evaluation tools that do not provide a clear picture of progress for the orientee or the educator. This article describes the development of a Web-based evaluation tool that combines learning taxonomies with the Synergy model into a rating scale based on independent performance. The evaluation tool and process provide open 24/7 access to evaluation documentation for members of the orientation team, demystifying the process and clarifying expectations. The implementation of the tool has proven transformative in perceptions of evaluation and the performance expectations of new graduates. The tool has been successful at monitoring progress, altering education, and opening dialogue about performance for over 125 new graduate nurses since its inception.

  13. Improved Aerodynamic Analysis for Hybrid Wing Body Conceptual Design Optimization

    NASA Technical Reports Server (NTRS)

    Gern, Frank H.

    2012-01-01

This paper provides an overview of ongoing efforts to develop, evaluate, and validate different tools for improved aerodynamic modeling and systems analysis of Hybrid Wing Body (HWB) aircraft configurations. Results are being presented for the evaluation of different aerodynamic tools including panel methods, enhanced panel methods with viscous drag prediction, and computational fluid dynamics. Emphasis is placed on proper prediction of aerodynamic loads for structural sizing as well as viscous drag prediction to develop drag polars for HWB conceptual design optimization. Data from transonic wind tunnel tests at the Arnold Engineering Development Center's 16-Foot Transonic Tunnel was used as a reference data set in order to evaluate the accuracy of the aerodynamic tools. Triangularized surface data and Vehicle Sketch Pad (VSP) models of an X-48B 2% scale wind tunnel model were used to generate input and model files for the different analysis tools. In support of ongoing HWB scaling studies within the NASA Environmentally Responsible Aviation (ERA) program, an improved finite element based structural analysis and weight estimation tool for HWB center bodies is currently under development. Aerodynamic results from these analyses are used to provide additional aerodynamic validation data.

  14. Evaluating the integration of cultural competence skills into health and physical assessment tools: a survey of Canadian schools of nursing.

    PubMed

    Chircop, Andrea; Edgecombe, Nancy; Hayward, Kathryn; Ducey-Gilbert, Cherie; Sheppard-Lemoine, Debbie

    2013-04-01

Audiovisual (AV) teaching tools currently used to teach health and physical assessment reflect a Eurocentric bias rooted in the biomedical model. The purpose of our study was to (a) identify AV teaching tools commonly used by Canadian schools of nursing and (b) evaluate the identified tools. A two-part descriptive quantitative design was used. First, we surveyed schools of nursing across Canada. Second, the identified AV teaching tools were evaluated for content and modeling of cultural competence. The majority of the schools (67%) used publisher-produced videos associated with a physical assessment textbook. Major findings included minimal demonstration of negotiation with a client around cultural aspects of the interview, including the need for an interpreter, modesty, and the inclusion of support persons. Identification of culturally specific examples in the videos was superficial and did not provide students with a comprehensive understanding of the necessary culturally competent skills.

  15. The Use of AMET & Automated Scripts for Model Evaluation

    EPA Science Inventory

Brief overview of EPA's new CMAQ website, to be launched publicly in June 2017. Details on the upcoming release of the Atmospheric Model Evaluation Tool (AMET) and the creation of automated scripts for post-processing and evaluating air quality model data.

  16. ATAMM enhancement and multiprocessor performance evaluation

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.; Som, Sukhamoy; Obando, Rodrigo; Malekpour, Mahyar R.; Jones, Robert L., III; Mandala, Brij Mohan V.

    1991-01-01

    ATAMM (Algorithm To Architecture Mapping Model) enhancement and multiprocessor performance evaluation is discussed. The following topics are included: the ATAMM model; ATAMM enhancement; ADM (Advanced Development Model) implementation of ATAMM; and ATAMM support tools.

  17. Comparison of in silico models for prediction of mutagenicity.

    PubMed

    Bakhtyari, Nazanin G; Raitano, Giuseppa; Benfenati, Emilio; Martin, Todd; Young, Douglas

    2013-01-01

Using a dataset of more than 6000 compounds, the performance of eight quantitative structure-activity relationship (QSAR) tools was evaluated: ACD/Tox Suite; Absorption, Distribution, Metabolism, Elimination, and Toxicity of chemical substances (ADMET) Predictor; Derek; Toxicity Estimation Software Tool (T.E.S.T.); TOxicity Prediction by Komputer Assisted Technology (TOPKAT); Toxtree; CAESAR; and SARpy (SAR in python). In general, the results showed a high level of performance. To obtain a realistic estimate of predictive ability, the results for chemicals inside and outside the training set of each model were considered. The effect of applicability-domain tools (when available) on prediction accuracy was also evaluated. The predictive tools included QSAR models, knowledge-based systems, and combinations of both methods. Models based on statistical QSAR methods gave better results.
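Separating accuracy for chemicals inside versus outside each model's training set, as done above, is simple bookkeeping once each prediction record carries a membership flag. A minimal sketch (the record layout is an assumption for illustration):

```python
def split_accuracy(records):
    """records: list of (in_training_set: bool, predicted: int, actual: int),
    with 1 = mutagenic, 0 = non-mutagenic.

    Returns (accuracy_inside, accuracy_outside) as fractions; a group with
    no records yields NaN.
    """
    stats = {True: [0, 0], False: [0, 0]}   # group -> [correct, total]
    for in_train, predicted, actual in records:
        stats[in_train][1] += 1
        if predicted == actual:
            stats[in_train][0] += 1
    return tuple(
        stats[group][0] / stats[group][1] if stats[group][1] else float("nan")
        for group in (True, False)
    )
```

Reporting the two numbers separately is what keeps the estimate realistic: accuracy on training-set chemicals overstates how a model will do on genuinely new structures.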

  18. ThinTool: a spreadsheet model to evaluate fuel reduction thinning cost, net energy output, and nutrient impacts

    Treesearch

    Sang-Kyun Han; Han-Sup Han; William J. Elliot; Edward M. Bilek

    2017-01-01

    We developed a spreadsheet-based model, named ThinTool, to evaluate the cost of mechanical fuel reduction thinning including biomass removal, to predict net energy output, and to assess nutrient impacts from thinning treatments in northern California and southern Oregon. A combination of literature reviews, field-based studies, and contractor surveys was used to...

  19. Energy evaluation of protection effectiveness of anti-vibration gloves.

    PubMed

    Hermann, Tomasz; Dobry, Marian Witalis

    2017-09-01

This article describes an energy method for assessing the protection effectiveness of anti-vibration gloves for the human dynamic structure. The study uses the dynamic models of the human and the glove specified in Standard No. ISO 10068:2012. Physical models of human-tool systems were developed by combining the human physical models with a power tool model; the combined human-tool models were then transformed into mathematical models from which energy models were finally derived. Comparative energy analysis was conducted in the domain of rms powers. The energy models of the human-tool systems were solved by numerical simulation in the MATLAB/Simulink environment. The simulation demonstrated the effectiveness of the anti-vibration glove in protecting human operators of hand-held power tools against vibration: the desired effect is achieved by lowering the flow of energy in the human-tool system when the glove is employed.

  20. An interdisciplinary framework for participatory modeling design and evaluation—What makes models effective participatory decision tools?

    NASA Astrophysics Data System (ADS)

    Falconi, Stefanie M.; Palmer, Richard N.

    2017-02-01

Increased requirements for public involvement in water resources management (WRM) over the past century have stimulated the development of more collaborative decision-making methods. Participatory modeling (PM) uses computer models to inform and engage stakeholders in the planning process in order to influence collaborative decisions in WRM. Past evaluations of participatory models focused on process and final outcomes, yet were hindered by the diversity of purposes and inconsistent documentation. This paper presents a two-stage framework for evaluating PM based on the mechanisms by which models become effective participatory tools. In stage 1, five dimensions characterize the "who, when, how, and why" of each participatory effort; in stage 2, models are evaluated as "boundary objects," a concept used to describe tools that bridge understanding and translate different bodies of knowledge to improve credibility, salience, and legitimacy. This evaluation framework is applied to five existing case studies from the literature. Though the goals of participation can be diverse, the novel contribution of the proposed two-stage framework is its flexibility to evaluate a wide range of cases that differ in scope, modeling approach, and participatory context. The evaluation criteria also provide a structured vocabulary, based on clear mechanisms, that extends beyond previous process-based and outcome-based evaluations. Effective models are those that exploit mechanisms that facilitate dialogue and resolution and improve the accessibility and applicability of technical knowledge. Furthermore, the framework can help build more complete records and systematic documentation of evidence, helping to standardize the field of PM.

  1. Evaluation and Verification of Decadal Predictions using the MiKlip Central Evaluation System - a Case Study using the MiKlip Prototype Model Data

    NASA Astrophysics Data System (ADS)

    Illing, Sebastian; Schuster, Mareike; Kadow, Christopher; Kröner, Igor; Richling, Andy; Grieger, Jens; Kruschke, Tim; Lang, Benjamin; Redl, Robert; Schartner, Thomas; Cubasch, Ulrich

    2016-04-01

MiKlip is a project for medium-term climate prediction funded by the German Federal Ministry of Education and Research (BMBF); it aims to create a model system able to provide reliable decadal climate forecasts. During the first project phase of MiKlip, the sub-project INTEGRATION, located at Freie Universität Berlin, developed a framework for scientific infrastructures (FREVA). More information about FREVA can be found in EGU2016-13060. An instance of this framework is used as the Central Evaluation System (CES) during the MiKlip project. Throughout the first project phase, various sub-projects developed over 25 analysis tools, so-called plugins, for the CES. The main focus of these plugins is the evaluation and verification of decadal climate prediction data, but most plugins are not limited to this scope and target a wide range of scientific questions, ranging from preprocessing tools like the "LeadtimeSelector", which creates lead-time-dependent time series from decadal hindcast sets, through tracking tools like the "Zykpak" plugin, which can objectively locate and track mid-latitude cyclones, to plugins like "MurCSS" or "SPECS", which calculate deterministic and probabilistic skill metrics. We also integrated some analyses from the Model Evaluation Tools (MET) developed at NCAR. We will show the theoretical background, the technical implementation strategies, and selected results from the evaluation of the MiKlip Prototype decadal prediction system for this set of tools.
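Deterministic skill metrics of the kind the skill-metric plugins compute are typically mean-squared-error skill scores of a forecast against a reference such as climatology. The sketch below is the generic MSESS definition, not any plugin's actual implementation:

```python
def msess(forecast, reference, observed):
    """Mean squared error skill score of a forecast against a reference
    forecast (e.g., climatology): 1 = perfect, 0 = no better than the
    reference, negative = worse than the reference."""
    def mse(predicted):
        return sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(observed)
    return 1.0 - mse(forecast) / mse(reference)
```

For example, scoring a decadal hindcast time series against a constant climatological mean shows immediately whether the initialized prediction adds skill over simply predicting the long-term average.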

  2. Challenges in Reproducibility, Replicability, and Comparability of Computational Models and Tools for Neuronal and Glial Networks, Cells, and Subcellular Structures.

    PubMed

    Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena

    2018-01-01

The ability to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, thousands of models are available, yet it is rarely possible to reimplement a model from the information in the original publication, let alone rerun it, because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a versatile choice of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. The replicability and reproducibility issues are considered for computational models that are equally diverse, including models of intracellular signal transduction in neurons and glial cells, single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another, to assess whether the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need to develop recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tools used to simulate and develop computational models. Constant improvement in experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, whether constructing multiscale models or extending models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, and enabling the progressive building of advanced computational models and tools, can be achieved only by adopting publishing standards that underline the replicability and reproducibility of research results.

  3. Challenges in Reproducibility, Replicability, and Comparability of Computational Models and Tools for Neuronal and Glial Networks, Cells, and Subcellular Structures

    PubMed Central

    Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena

    2018-01-01

    The possibility to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, thousands of models are available, yet it is rarely possible to reimplement a model from the information in the original publication, let alone rerun it, simply because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a versatile choice of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. The replicability and reproducibility issues are considered for computational models that are equally diverse, including models for intracellular signal transduction of neurons and glial cells, single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another, to assess whether the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need to develop recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tools used to simulate and develop computational models. Constant improvement in experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms in neural systems, which poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, whether for constructing multiscale models or for extending models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, and enabling the progressive building of advanced computational models and tools, can be achieved only by adopting publishing standards that emphasize the replicability and reproducibility of research results. PMID:29765315

  4. Formal implementation of a performance evaluation model for the face recognition system.

    PubMed

    Shin, Yong-Nyuo; Kim, Jason; Lee, Yong-Jun; Shin, Woochang; Choi, Jin-Young

    2008-01-01

    Due to its usability, practical applications, and lack of intrusiveness, face recognition technology, based on information derived from individuals' facial features, has recently been attracting considerable attention. Reported recognition rates of commercialized face recognition systems cannot be accepted as official recognition rates, as they are based on assumptions that are favorable to the specific system and face database. Therefore, performance evaluation methods and tools are necessary to objectively measure the accuracy and performance of any face recognition system. In this paper, we propose and formalize a performance evaluation model for biometric recognition systems, and implement an evaluation tool for face recognition systems based on the proposed model. Furthermore, we performed objective evaluations by providing guidelines for the design and implementation of a performance evaluation system and by formalizing the performance test process.
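    The threshold-based error rates such evaluation tools typically report can be sketched as follows; the match scores and threshold here are invented for illustration and are not from the paper:

```python
def far_frr(genuine_scores, impostor_scores, threshold):
    """FRR: fraction of genuine pairs rejected (score below threshold).
       FAR: fraction of impostor pairs accepted (score at/above threshold)."""
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return far, frr

genuine = [0.91, 0.88, 0.75, 0.95, 0.60]   # same-person comparisons
impostor = [0.20, 0.35, 0.55, 0.10, 0.72]  # different-person comparisons
far, frr = far_frr(genuine, impostor, threshold=0.7)
print(far, frr)  # -> 0.2 0.2
```

    Sweeping the threshold over a range of values traces the trade-off between the two error rates, which is one way to compare systems independently of a vendor-chosen operating point.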

  5. Current State and Model for Development of Technology-Based Care for Attention Deficit Hyperactivity Disorder.

    PubMed

    Benyakorn, Songpoom; Riley, Steven J; Calub, Catrina A; Schweitzer, Julie B

    2016-09-01

    Care (i.e., evaluation and intervention) delivered through technology is used in many areas of mental health services, including for persons with attention deficit hyperactivity disorder (ADHD). Technology can facilitate care for individuals with ADHD, their parents, and their care providers. The adoption of technological tools for ADHD care requires evidence-based studies to support the transition from development to integration into use in the home, school, or work for persons with the disorder. The initial phase, which is development of technological tools, has begun in earnest; however, the evidence base for many of these tools is lacking. In some instances, the uptake of a piece of technology into home use or clinical practice may be further along than the research to support its use. In this study, we review the current evidence regarding technology for ADHD and also propose a model to evaluate the support for other tools that have yet to be tested. We propose using the Research Domain Criteria as a framework for evaluating the tools' relationships to dimensions related to ADHD. This article concludes with recommendations for testing new tools that may have promise in improving the evaluation or treatment of persons with ADHD.

  6. Scratch as a Computational Modelling Tool for Teaching Physics

    ERIC Educational Resources Information Center

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  7. A simulation study to quantify the impacts of exposure ...

    EPA Pesticide Factsheets

    A simulation study to quantify the impacts of exposure measurement error on air pollution health risk estimates in copollutant time-series models. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
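    The central phenomenon such a simulation study quantifies can be sketched in a few lines: classical measurement error in an exposure attenuates the estimated health-effect coefficient toward the null. This toy simulation (invented parameters, not the study's design) shows the effect for a single-pollutant linear model:

```python
import random

random.seed(1)
n = 20000
beta_true = 0.5
x = [random.gauss(0, 1) for _ in range(n)]               # true exposure
w = [xi + random.gauss(0, 1) for xi in x]                # mismeasured exposure
y = [beta_true * xi + random.gauss(0, 0.2) for xi in x]  # health outcome

def ols_slope(pred, resp):
    mp, mr = sum(pred) / len(pred), sum(resp) / len(resp)
    cov = sum((p - mp) * (r - mr) for p, r in zip(pred, resp))
    return cov / sum((p - mp) ** 2 for p in pred)

# With equal exposure and error variances, the slope on the mismeasured
# exposure is attenuated toward the null by roughly a factor of two.
print(ols_slope(x, y), ols_slope(w, y))
```

    In copollutant models the picture is more complex, since error in one exposure can also distort the coefficient of a correlated copollutant, which is precisely what a simulation study of this kind probes.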

  8. THE ATMOSPHERIC MODEL EVALUATION TOOL (AMET): METEOROLOGY MODULE

    EPA Science Inventory

    An Atmospheric Model Evaluation Tool (AMET), composed of meteorological and air quality components, is being developed to examine the error and uncertainty in the model simulations. AMET matches observations with the corresponding model-estimated values in space and time, and the...
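    The matching step described here can be sketched as pairing observed and model-estimated values keyed by site and time, then computing summary statistics; the site IDs, timestamps, and values below are invented:

```python
import math

# Observed and model-estimated values keyed by (site, hour)
obs   = {("site1", "2006-07-01T00"): 24.1, ("site1", "2006-07-01T01"): 25.0,
         ("site2", "2006-07-01T00"): 19.8}
model = {("site1", "2006-07-01T00"): 26.0, ("site1", "2006-07-01T01"): 24.2,
         ("site2", "2006-07-01T00"): 21.1, ("site2", "2006-07-01T01"): 20.0}

# Keep only pairs matched in both space and time
pairs = [(model[k], obs[k]) for k in obs if k in model]
bias = sum(m - o for m, o in pairs) / len(pairs)
rmse = math.sqrt(sum((m - o) ** 2 for m, o in pairs) / len(pairs))
print(round(bias, 3), round(rmse, 3))  # -> 0.8 1.407
```

    Unmatched model hours (like site2 at hour 01 above) are simply dropped, so statistics are always computed over coincident observation-model pairs.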

  9. A Tool for the Automated Design and Evaluation of Habitat Interior Layouts

    NASA Technical Reports Server (NTRS)

    Simon, Matthew A.; Wilhite, Alan W.

    2013-01-01

    The objective of space habitat design is to minimize mass and system size while providing adequate space for all necessary equipment and a functional layout that supports crew health and productivity. Unfortunately, development and evaluation of interior layouts is often ignored during conceptual design because of the subjectivity and long times required using current evaluation methods (e.g., human-in-the-loop mockup tests and in-depth CAD evaluations). Early, more objective assessment could prevent expensive design changes that may increase vehicle mass and compromise functionality. This paper describes a new interior design evaluation method to enable early, structured consideration of habitat interior layouts. This interior layout evaluation method features a comprehensive list of quantifiable habitat layout evaluation criteria, automatic methods to measure these criteria from a geometry model, and application of systems engineering tools and numerical methods to construct a multi-objective value function measuring the overall habitat layout performance. In addition to a detailed description of this method, a C++/OpenGL software tool which has been developed to implement this method is also discussed. This tool leverages geometry modeling coupled with collision detection techniques to identify favorable layouts subject to multiple constraints and objectives (e.g., minimize mass, maximize contiguous habitable volume, maximize task performance, and minimize crew safety risks). Finally, a few habitat layout evaluation examples are described to demonstrate the effectiveness of this method and tool to influence habitat design.
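    A multi-objective value function of the kind described can be sketched as a weighted sum over normalized criteria; the criterion names, weights, and scores below are invented for illustration and are not the paper's actual criteria list:

```python
def layout_value(scores, weights):
    """Weighted-sum value of a layout; scores are normalized to [0, 1]."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1
    return sum(weights[c] * scores[c] for c in weights)

weights = {"mass": 0.3, "habitable_volume": 0.3,
           "task_performance": 0.25, "crew_safety": 0.15}
layout_a = {"mass": 0.8, "habitable_volume": 0.6,
            "task_performance": 0.7, "crew_safety": 0.9}
layout_b = {"mass": 0.5, "habitable_volume": 0.9,
            "task_performance": 0.6, "crew_safety": 0.8}

print(layout_value(layout_a, weights), layout_value(layout_b, weights))
```

    Scoring candidate layouts against a common value function makes the otherwise subjective comparison repeatable, which is the point of replacing ad hoc mockup judgments with a structured evaluation.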

  10. High Throughput PBPK: Evaluating EPA's Open-Source Data and Tools for Dosimetry and Exposure Reconstruction (SOT)

    EPA Science Inventory

    To address this need, new tools have been created for characterizing, simulating, and evaluating chemical biokinetics. Physiologically-based pharmacokinetic (PBPK) models provide estimates of chemical exposures that produce potentially hazardous tissue concentrations, while tissu...
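    The dosimetry and reverse-dosimetry idea can be sketched with a deliberately simplified one-compartment model (not EPA's actual PBPK implementations): a constant dose rate maps to a steady-state concentration, and inverting that relationship estimates the exposure that would produce a hazardous tissue or plasma level. The clearance value is hypothetical:

```python
def css(dose_rate_mg_per_day, clearance_l_per_day):
    """Steady-state concentration (mg/L) for a one-compartment model."""
    return dose_rate_mg_per_day / clearance_l_per_day

def dose_for_css(target_css_mg_per_l, clearance_l_per_day):
    """Reverse dosimetry: dose rate that produces the target concentration."""
    return target_css_mg_per_l * clearance_l_per_day

cl = 50.0                      # hypothetical total clearance, L/day
print(css(10.0, cl))           # -> 0.2 (mg/L at 10 mg/day)
print(dose_for_css(1.0, cl))   # -> 50.0 (mg/day to reach 1 mg/L)
```

    Full PBPK models replace the single clearance constant with physiologically parameterized tissue compartments, but the forward/inverse structure is the same.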

  11. Application of digital human modeling and simulation for vision analysis of pilots in a jet aircraft: a case study.

    PubMed

    Karmakar, Sougata; Pal, Madhu Sudan; Majumdar, Deepti; Majumdar, Dhurjati

    2012-01-01

    Ergonomic evaluation of visual demands becomes crucial when operators must make rapid decisions under extreme time constraints, as in the navigation task of a jet aircraft. The research reported here comprises an ergonomic evaluation of pilots' vision in a jet aircraft in a virtual environment, to demonstrate how the vision analysis tools of digital human modeling software can be used effectively for such a study. Three dynamic digital pilot models, representative of the smallest, average and largest Indian pilot population, were generated from an anthropometric database and interfaced with a digital prototype of the cockpit in Jack software for analysis of vision within and outside the cockpit. Vision analysis tools such as view cones, eye view windows, blind spot areas, obscuration zones and reflection zones were employed during evaluation of the visual fields. The vision analysis tool was also used to study kinematic changes of the pilot's body joints during a simulated gazing activity. From the present study, it can be concluded that the vision analysis tools of digital human modeling software are very effective for evaluating the position and alignment of different displays and controls in the workstation, based on their priorities within the visual fields and the anthropometry of the targeted users, long before the development of a physical prototype.
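    The geometric core of a view-cone test can be sketched as checking whether the angle between the gaze direction and the line from the eye to a target falls within a cone half-angle; the eye point, gaze vector, and display positions below are invented, not taken from the cockpit study:

```python
import math

def in_view_cone(eye, gaze_dir, target, half_angle_deg):
    """True if target lies within the cone around gaze_dir opened at eye."""
    v = [t - e for t, e in zip(target, eye)]
    norm_v = math.sqrt(sum(c * c for c in v))
    norm_g = math.sqrt(sum(c * c for c in gaze_dir))
    cos_a = sum(a * b for a, b in zip(v, gaze_dir)) / (norm_v * norm_g)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    return angle <= half_angle_deg

eye = (0.0, 0.0, 1.2)            # eye point, metres
gaze = (1.0, 0.0, 0.0)           # looking straight ahead
display = (0.8, 0.15, 1.25)      # a display slightly right of and above center

print(in_view_cone(eye, gaze, display, half_angle_deg=30))        # -> True
print(in_view_cone(eye, gaze, (0.1, 0.9, 1.2), half_angle_deg=30))  # -> False
```

    Digital human modeling packages layer percentile-specific eye locations and head/eye mobility on top of this basic test, but the inside-the-cone check is the building block.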

  12. On-line Tools for Assessing Petroleum Releases

    EPA Science Inventory

    The Internet tools described in this report provide methods and models for evaluation of contaminated sites. Two problems are addressed by models. The first is the placement of wells for correct delineation of contaminant plumes. Because aquifer recharge can displace plumes dow...

  13. TEMPORALLY-RESOLVED AMMONIA EMISSION INVENTORIES: CURRENT ESTIMATES, EVALUATION TOOLS, AND MEASUREMENT NEEDS

    EPA Science Inventory

    In this study, we evaluate the suitability of a three-dimensional chemical transport model (CTM) as a tool for assessing ammonia emission inventories, calculate the improvement in CTM performance owing to recent advances in temporally-varying ammonia emission estimates, and ident...
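    The temporal-allocation step behind a temporally resolved inventory can be sketched as distributing an annual emission total across months with a normalized profile. The profile values below are invented, loosely mimicking fertilizer-driven spring and fall peaks, and are not the inventory's actual factors:

```python
annual_nh3_tons = 1200.0
# Relative monthly activity, Jan..Dec (hypothetical profile)
profile = [0.4, 0.5, 1.2, 1.8, 1.5, 1.0, 0.8, 0.8, 1.0, 1.4, 0.9, 0.7]

weights = [p / sum(profile) for p in profile]       # normalize to sum to 1
monthly = [annual_nh3_tons * w for w in weights]

print(round(sum(monthly), 6))   # -> 1200.0 (allocation conserves the total)
print(round(monthly[3], 2))     # -> 180.0 (April gets the largest share)
```

    The same pattern extends to weekly and diurnal profiles; a CTM evaluation like the one described then tests whether the resulting time-varying emissions improve agreement with ambient measurements.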

  14. Reliability, Validity, and Factor Structure of the Current Assessment Practice Evaluation-Revised (CAPER) in a National Sample.

    PubMed

    Lyon, Aaron R; Pullmann, Michael D; Dorsey, Shannon; Martin, Prerna; Grigore, Alexandra A; Becker, Emily M; Jensen-Doss, Amanda

    2018-05-11

    Measurement-based care (MBC) is an increasingly popular, evidence-based practice, but there are no tools with established psychometrics to evaluate clinician use of MBC practices in mental health service delivery. The current study evaluated the reliability, validity, and factor structure of scores generated from a brief, standardized tool to measure MBC practices, the Current Assessment Practice Evaluation-Revised (CAPER). Survey data from a national sample of 479 mental health clinicians were used to conduct exploratory and confirmatory factor analyses, as well as reliability and validity analyses (e.g., relationships between CAPER subscales and clinician MBC attitudes). Analyses revealed competing two- and three-factor models. Regardless of the model used, scores from CAPER subscales demonstrated good reliability and convergent and divergent validity with MBC attitudes in the expected directions. The CAPER appears to be a psychometrically sound tool for assessing clinician MBC practices. Future directions for development and application of the tool are discussed.

  15. Evaluation of reliability modeling tools for advanced fault tolerant systems

    NASA Technical Reports Server (NTRS)

    Baker, Robert; Scheper, Charlotte

    1986-01-01

    The Computer Aided Reliability Estimation (CARE III) and Automated Reliability Interactive Estimation System (ARIES 82) reliability tools were evaluated for application to advanced fault-tolerant aerospace systems. To determine reliability modeling requirements, the evaluation focused on the Draper Laboratories' Advanced Information Processing System (AIPS) architecture as an example architecture for fault-tolerant aerospace systems. Advantages and limitations were identified for each reliability evaluation tool. The CARE III program was designed primarily for analyzing ultrareliable flight control systems. The ARIES 82 program's primary use was to support university research and teaching. Neither CARE III nor ARIES 82 was suited to determining the reliability of the complex nodal networks of the type used to interconnect processing sites in the AIPS architecture. It was concluded that ARIES was not suitable for modeling advanced fault-tolerant systems. It was further concluded that, subject to some limitations (difficulty in modeling systems with unpowered spare modules, systems where equipment maintenance must be considered, systems where failure depends on the sequence in which faults occurred, and systems where multiple faults beyond double near-coincident faults must be considered), CARE III is best suited for evaluating the reliability of advanced fault-tolerant systems for air transport.
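    The kind of arithmetic such tools automate can be illustrated with a textbook fragment (not CARE III's or ARIES 82's actual models): an exponentially reliable module versus a triple-modular-redundancy (TMR) voting arrangement, whose reliability is R_tmr = 3R^2 - 2R^3:

```python
import math

def module_reliability(failure_rate_per_hour, hours):
    """Reliability of one module with a constant failure rate."""
    return math.exp(-failure_rate_per_hour * hours)

def tmr_reliability(r):
    """Triple-modular redundancy with a perfect voter: 3R^2 - 2R^3."""
    return 3 * r**2 - 2 * r**3

r = module_reliability(1e-4, 10)   # a 10-hour mission, hypothetical rate
print(r, tmr_reliability(r))
assert tmr_reliability(r) > r      # TMR helps whenever R > 0.5
```

    Real fault-tolerant architectures like AIPS add imperfect coverage, repair, and interconnection networks, which is exactly where closed-form expressions give way to tools like CARE III.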

  16. A Data-Driven Framework for Incorporating New Tools for ...

    EPA Pesticide Factsheets

    This talk was given during the “Exposure-Based Toxicity Testing” session at the annual meeting of the International Society for Exposure Science. It provided an update on the state of the science and tools that may be employed in risk-based prioritization efforts. It outlined knowledge gained from the data provided using these high-throughput tools to assess chemical bioactivity and to predict chemical exposures and also identified future needs. It provided an opportunity to showcase ongoing research efforts within the National Exposure Research Laboratory and the National Center for Computational Toxicology within the Office of Research and Development to an international audience.

  17. California Geriatric Education Center Logic Model: An Evaluation and Communication Tool

    ERIC Educational Resources Information Center

    Price, Rachel M.; Alkema, Gretchen E.; Frank, Janet C.

    2009-01-01

    A logic model is a communications tool that graphically represents a program's resources, activities, priority target audiences for change, and the anticipated outcomes. This article describes the logic model development process undertaken by the California Geriatric Education Center in spring 2008. The CGEC is one of 48 Geriatric Education…

  18. Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model: A Web-based program designed to evaluate the cost-effectiveness of disease management programs in heart failure.

    PubMed

    Reed, Shelby D; Neilson, Matthew P; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H; Polsky, Daniel E; Graham, Felicia L; Bowers, Margaret T; Paul, Sara C; Granger, Bradi B; Schulman, Kevin A; Whellan, David J; Riegel, Barbara; Levy, Wayne C

    2015-11-01

    Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics; use of evidence-based medications; and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model. Projections of resource use and quality of life are modeled using relationships with time-varying Seattle Heart Failure Model scores. The model can be used to evaluate parallel-group and single-cohort study designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. The Tools for Economic Analysis of Patient Management Interventions in Heart Failure Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. Copyright © 2015 Elsevier Inc. All rights reserved.
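    The model's final output, an incremental cost-effectiveness ratio (ICER), is the difference in mean costs divided by the difference in mean quality-adjusted life-years across the paired cohorts. A minimal sketch with invented cohort means (not outputs of the actual tool):

```python
def icer(cost_new, qaly_new, cost_std, qaly_std):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_std) / (qaly_new - qaly_std)

# Hypothetical means over simulated patient cohorts
dm_cost, dm_qalys = 52000.0, 4.10   # disease management program
uc_cost, uc_qalys = 47000.0, 3.85   # usual care

ratio = icer(dm_cost, dm_qalys, uc_cost, uc_qalys)
print(round(ratio))  # -> 20000 (dollars per QALY gained)
```

    Running this over many simulated cohort pairs, as the Web-based tool does, yields a distribution of ICERs rather than a single point estimate, which supports uncertainty analysis.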

  19. Beyond the building-centric approach: A vision for an integrated evaluation of sustainable buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conte, Emilia, E-mail: conte@poliba.it; Monno, Valeria, E-mail: vmonno@poliba.it

    2012-04-15

    The available sustainable building evaluation systems have produced a new environmental design paradigm. However, there is an increasing need to overcome the building-centric approach of these systems, in order to further exploit their innovative potential for sustainable building practices. The paper takes up this challenge by developing a cross-scale evaluation approach focusing on the reliability of sustainable building design solutions for the context in which the building is situated. An integrated building-urban evaluation model is proposed based on the urban matrix, which is a conceptualisation of the built environment as a social-ecological system. The model aims at evaluating the sustainability of a building by considering it as an active entity contributing to the resilience of the urban matrix. A few holistic performance indicators are used for evaluating this contribution, thereby expressing the building's reliability. The discussion of the efficacy of the model shows that it works as a heuristic tool, supporting the acquisition of a better insight into the complexity that characterises the relationships between the building and the sustainability of the built environment. Shedding new light on the meaning of sustainable buildings, the model can play a positive role in innovating sustainable building design practices, thus complementing current evaluation systems. Highlights: We model an integrated building-urban evaluation approach. The urban matrix represents the social-ecological functioning of the urban context. We introduce the concept of reliability to evaluate sustainable buildings. Holistic indicators express the building reliability. The evaluation model works as a heuristic tool and complements other tools.

  20. Development of a Design Tool for Planning Aqueous Amendment Injection Systems

    DTIC Science & Technology

    2012-08-01

    [Report front matter; the extracted table of contents lists sections including: Chemical Oxidation with Permanganate (MnO4-); Implementation Issues; SS Design Tool Development and Evaluation; Numerical Modeling of Permanganate Distribution; CDISCO Development and Evaluation.]

  1. Comparison of software tools for kinetic evaluation of chemical degradation data.

    PubMed

    Ranke, Johannes; Wöltjen, Janina; Meinecke, Stefan

    2018-01-01

    For evaluating the fate of xenobiotics in the environment, a variety of degradation or environmental metabolism experiments are routinely conducted. The data generated in such experiments are evaluated by optimizing the parameters of kinetic models in a way that the model simulation fits the data. No comparison of the main software tools currently in use has been published to date. This article shows a comparison of numerical results as well as an overall, somewhat subjective comparison based on a scoring system using a set of criteria. The scoring was separately performed for two types of uses. Uses of type I are routine evaluations involving standard kinetic models and up to three metabolites in a single compartment. Evaluations involving non-standard model components, more than three metabolites or more than a single compartment belong to use type II. For use type I, usability is most important, while the flexibility of the model definition is most important for use type II. Test datasets were assembled that can be used to compare the numerical results for different software tools. These datasets can also be used to ensure that no unintended or erroneous behaviour is introduced in newer versions. In the comparison of numerical results, good agreement between the parameter estimates was observed for datasets with up to three metabolites. For the now unmaintained reference software DegKinManager/ModelMaker, and for OpenModel which is still under development, user options were identified that should be taken care of in order to obtain results that are as reliable as possible. Based on the scoring system mentioned above, the software tools gmkin, KinGUII and CAKE received the best scores for use type I. Out of the 15 software packages compared with respect to use type II, again gmkin and KinGUII were the first two, followed by the script based tool mkin, which is the technical basis for gmkin, and by OpenModel. 
Based on the evaluation using the system of criteria mentioned above and the comparison of numerical results for the suite of test datasets, the software tools gmkin, KinGUII and CAKE are recommended for use type I, and gmkin and KinGUII for use type II. For users that prefer to work with scripts instead of graphical user interfaces, mkin is recommended. For future software evaluations, it is recommended to include a measure for the total time that a typical user needs for a kinetic evaluation into the scoring scheme. It is the hope of the authors that the publication of test data, source code and overall rankings foster the evolution of useful and reliable software in the field.
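    The single first-order (SFO) model, C(t) = C0 * exp(-k t), is the standard kinetic model these tools fit. A minimal sketch with synthetic, noise-free data, estimated by a log-linear least-squares fit (the tools themselves use nonlinear optimization, which also handles metabolites and non-SFO models):

```python
import math

times = [0, 3, 7, 14, 28, 56]                          # days
conc = [100.0 * math.exp(-0.05 * t) for t in times]    # synthetic SFO data

# Log-linear fit: ln C = ln C0 - k t, so -k is the regression slope
logs = [math.log(c) for c in conc]
n = len(times)
mt, ml = sum(times) / n, sum(logs) / n
k = -sum((t - mt) * (l - ml) for t, l in zip(times, logs)) / \
    sum((t - mt) ** 2 for t in times)
dt50 = math.log(2) / k                                 # half-life
print(round(k, 4), round(dt50, 2))  # -> 0.05 13.86
```

    Regulatory endpoints such as DT50 follow directly from the fitted rate constant, which is why the numerical agreement between tools on k matters.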

  2. Relating MBSE to Spacecraft Development: A NASA Pathfinder

    NASA Technical Reports Server (NTRS)

    Othon, Bill

    2016-01-01

    The NASA Engineering and Safety Center (NESC) has sponsored a Pathfinder Study to investigate how Model Based Systems Engineering (MBSE) and Model Based Engineering (MBE) techniques can be applied by NASA spacecraft development projects. The objectives of this Pathfinder Study included analyzing both the products of the modeling activity, as well as the process and tool chain through which the spacecraft design activities are executed. Several aspects of MBSE methodology and process were explored. Adoption and consistent use of the MBSE methodology within an existing development environment can be difficult. The Pathfinder Team evaluated the possibility that an "MBSE Template" could be developed as both a teaching tool as well as a baseline from which future NASA projects could leverage. Elements of this template include spacecraft system component libraries, data dictionaries and ontology specifications, as well as software services that do work on the models themselves. The Pathfinder Study also evaluated the tool chain aspects of development. Two chains were considered: 1. The Development tool chain, through which SysML model development was performed and controlled, and 2. The Analysis tool chain, through which both static and dynamic system analysis is performed. Of particular interest was the ability to exchange data between SysML and other engineering tools such as CAD and Dynamic Simulation tools. For this study, the team selected a Mars Lander vehicle as the element to be designed. The paper will discuss what system models were developed, how data was captured and exchanged, and what analyses were conducted.

  3. Evaluating the compatibility of multi-functional and intensive urban land uses

    NASA Astrophysics Data System (ADS)

    Taleai, M.; Sharifi, A.; Sliuzas, R.; Mesgari, M.

    2007-12-01

    This research is aimed at developing a model for assessing land use compatibility in densely built-up urban areas. A new model was developed by combining a suite of existing methods and tools: geographical information systems, Delphi methods, and spatial decision support tools, namely multi-criteria evaluation analysis, the analytical hierarchy process and the ordered weighted average method. The developed model can calculate land use compatibility in both horizontal and vertical directions. Furthermore, the compatibility between the use of each floor in a building and its neighboring land uses can be evaluated. The method was tested in a built-up urban area located in Tehran, the capital city of Iran. The results show that the model is robust in clarifying different levels of physical compatibility between neighboring land uses. This paper describes the various steps and processes of developing the proposed land use compatibility evaluation model (CEM).
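    One building block named in this record, the analytical hierarchy process (AHP), derives criterion weights from a pairwise comparison matrix; a common approximation uses row geometric means. The criteria and judgment values below are invented, not the study's:

```python
import math

def ahp_weights(matrix):
    """Approximate AHP priority weights via normalized row geometric means."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Pairwise judgments for three hypothetical compatibility criteria
# (e.g. noise, traffic load, pollution); matrix[i][j] = importance of i over j
pairwise = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
w = ahp_weights(pairwise)
print([round(x, 3) for x in w])  # -> [0.637, 0.258, 0.105]
```

    The resulting weights then feed a multi-criteria evaluation or ordered weighted average aggregation over neighboring land uses.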

  4. New tools for linking human and earth system models: The Toolbox for Human-Earth System Interaction & Scaling (THESIS)

    NASA Astrophysics Data System (ADS)

    O'Neill, B. C.; Kauffman, B.; Lawrence, P.

    2016-12-01

    Integrated analysis of questions regarding land, water, and energy resources often requires integration of models of different types. One type of integration is between human and earth system models, since both societal and physical processes influence these resources. For example, human processes such as changes in population, economic conditions, and policies govern the demand for land, water and energy, while the interactions of these resources with physical systems determine their availability and environmental consequences. We have begun to develop and use a toolkit for linking human and earth system models called the Toolbox for Human-Earth System Integration and Scaling (THESIS). THESIS consists of models and software tools to translate, scale, and synthesize information from and between human system models and earth system models (ESMs), with initial application to linking the NCAR integrated assessment model, iPETS, with the NCAR earth system model, CESM. Initial development is focused on urban areas and agriculture, sectors that are both explicitly represented in both CESM and iPETS. Tools are being made available to the community as they are completed (see https://www2.cgd.ucar.edu/sections/tss/iam/THESIS_tools). We discuss four general types of functions that THESIS tools serve (Spatial Distribution, Spatial Properties, Consistency, and Outcome Evaluation). Tools are designed to be modular and can be combined in order to carry out more complex analyses. We illustrate their application to both the exposure of population to climate extremes and to the evaluation of climate impacts on the agriculture sector. For example, projecting exposure to climate extremes involves use of THESIS tools for spatial population, spatial urban land cover, the characteristics of both, and a tool to bring urban climate information together with spatial population information. Development of THESIS tools is continuing and open to the research community.

  5. Validation and Use of a Predictive Modeling Tool: Employing Scientific Findings to Improve Responsible Conduct of Research Education.

    PubMed

    Mulhearn, Tyler J; Watts, Logan L; Todd, E Michelle; Medeiros, Kelsey E; Connelly, Shane; Mumford, Michael D

    2017-01-01

    Although recent evidence suggests ethics education can be effective, the nature of specific training programs, and their effectiveness, varies considerably. Building on a recent path modeling effort, the present study developed and validated a predictive modeling tool for responsible conduct of research education. The predictive modeling tool allows users to enter ratings in relation to a given ethics training program and receive instantaneous evaluative information for course refinement. Validation work suggests the tool's predicted outcomes correlate strongly (r = 0.46) with objective course outcomes. Implications for training program development and refinement are discussed.

  6. A Management Tool for Assessing Aquaculture Environmental Impacts in Chilean Patagonian Fjords: Integrating Hydrodynamic and Pellets Dispersion Models

    NASA Astrophysics Data System (ADS)

    Tironi, Antonio; Marin, Víctor H.; Campuzano, Francisco J.

    2010-05-01

    This article introduces a management tool for salmon farming, focused on the local sustainability of the salmon aquaculture of the Aysen Fjord, Chilean Patagonia. Based on Integrated Coastal Zone Management (ICZM) principles, the tool combines a large 3-level nested hydrodynamic model, a particle tracking module and a GIS application into an assessment framework for the particulate waste dispersal of salmon farming activities. The model offers an open-source alternative for particulate waste modeling and evaluation, contributing valuable information for local decision makers in the process of locating new facilities and monitoring stations.
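    The particle-tracking idea can be reduced to a back-of-the-envelope sketch (invented parameters, far simpler than the nested hydrodynamic model described): a pellet released at the surface drifts with the current while it settles, so its deposition point is roughly u * (depth / w_s) down-current of the cage:

```python
def deposition_distance(current_u_m_s, depth_m, settling_w_m_s):
    """Horizontal drift of a sinking pellet before it reaches the bottom."""
    time_to_bottom = depth_m / settling_w_m_s
    return current_u_m_s * time_to_bottom

# A 60 m deep site, 0.1 m/s along-fjord current, 0.08 m/s pellet settling speed
print(round(deposition_distance(0.1, 60.0, 0.08), 2))  # -> 75.0 (metres)
```

    A full particle-tracking module integrates thousands of such trajectories under time-varying 3-D currents, producing the deposition footprints that the GIS layer then maps for siting decisions.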

  7. Surface roughness model based on force sensors for the prediction of the tool wear.

    PubMed

    de Agustina, Beatriz; Rubio, Eva María; Sebastián, Miguel Ángel

    2014-04-04

    In this study, a methodology has been developed to evaluate the surface roughness obtained during turning processes by measuring the signals detected by a force sensor under the same cutting conditions. The surface quality achieved along the process is correlated with several components of the cutting forces (thrust, feed, and cutting forces), so the effect of tool wear on surface roughness can be evaluated. As a first step, the best cutting conditions (cutting parameters and tool radius) for a given surface quality requirement were found for pieces of UNS A97075. With this selection, a surface roughness model based on the cutting forces was then developed for different states of wear that simulate the behaviour of the tool throughout its life. Validation reveals that the model was effective for approximately 70% of the surface roughness values obtained.
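    The abstract does not give the model's functional form; a common choice for this kind of force-based roughness model is multiple linear regression, sketched below with hypothetical force and roughness values (not data from the study):

    ```python
    import numpy as np

    # Hypothetical force measurements (N) and measured roughness Ra (um);
    # the study's actual data and model form are not given in the abstract.
    forces = np.array([
        # thrust, feed, cutting
        [120.0,  80.0, 210.0],
        [125.0,  85.0, 220.0],
        [133.0,  92.0, 235.0],
        [140.0, 101.0, 250.0],
        [152.0, 110.0, 268.0],
    ])
    ra = np.array([0.82, 0.85, 0.93, 1.01, 1.12])

    # Fit Ra = b0 + b1*thrust + b2*feed + b3*cutting by least squares.
    X = np.column_stack([np.ones(len(forces)), forces])
    coef, *_ = np.linalg.lstsq(X, ra, rcond=None)
    predicted = X @ coef
    ```

    As tool wear grows, the force components drift, so a model of this kind can flag degrading surface quality from force signals alone.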

  8. Current State and Model for Development of Technology-Based Care for Attention Deficit Hyperactivity Disorder

    PubMed Central

    Riley, Steven J.; Calub, Catrina A.; Schweitzer, Julie B.

    2016-01-01

    Abstract Introduction: Care (i.e., evaluation and intervention) delivered through technology is used in many areas of mental health services, including for persons with attention deficit hyperactivity disorder (ADHD). Technology can facilitate care for individuals with ADHD, their parents, and their care providers. The adoption of technological tools for ADHD care requires evidence-based studies to support the transition from development to integration into use in the home, school, or work for persons with the disorder. The initial phase, which is development of technological tools, has begun in earnest; however, the evidence base for many of these tools is lacking. In some instances, the uptake of a piece of technology into home use or clinical practice may be further along than the research to support its use. Methods: In this study, we review the current evidence regarding technology for ADHD and also propose a model to evaluate the support for other tools that have yet to be tested. Results: We propose using the Research Domain Criteria as a framework for evaluating the tools' relationships to dimensions related to ADHD. Conclusion: This article concludes with recommendations for testing new tools that may have promise in improving the evaluation or treatment of persons with ADHD. PMID:26985703

  9. Evaluation of Phosphorus Site Assessment Tools: Lessons from the USA.

    PubMed

    Sharpley, Andrew; Kleinman, Peter; Baffaut, Claire; Beegle, Doug; Bolster, Carl; Collick, Amy; Easton, Zachary; Lory, John; Nelson, Nathan; Osmond, Deanna; Radcliffe, David; Veith, Tamie; Weld, Jennifer

    2017-11-01

    Critical source area identification through phosphorus (P) site assessment is a fundamental part of modern nutrient management planning in the United States, yet there has been only sparse testing of the many versions of the P Index that now exist. Each P site assessment tool was developed to be applicable across a range of field conditions found in a given geographic area, making evaluation extremely difficult. In general, evaluation with in-field monitoring data has been limited, focusing primarily on corroborating manure and fertilizer "source" factors. Thus, a multiregional effort (Chesapeake Bay, Heartland, and Southern States) was undertaken to evaluate P Indices using a combination of limited field data, as well as output from simulation models (i.e., Agricultural Policy Environmental eXtender, Annual P Loss Estimator, Soil and Water Assessment Tool [SWAT], and Texas Best Management Practice Evaluation Tool [TBET]) to compare against P Index ratings. These comparisons show promise for advancing the weighting and formulation of qualitative P Index components but require careful vetting of the simulation models. Differences among regional conclusions highlight model strengths and weaknesses. For example, the Southern States region found that, although models could simulate the effects of nutrient management on P runoff, they often more accurately predicted hydrology than total P loads. Furthermore, SWAT and TBET overpredicted particulate P and underpredicted dissolved P, resulting in correct total P predictions but for the wrong reasons. Experience in the United States supports expanded regional approaches to P site assessment, assuming closely coordinated efforts that engage science, policy, and implementation communities, but limited scientific validity exists for uniform national P site assessment tools at the present time. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.

  10. Cockpit System Situational Awareness Modeling Tool

    NASA Technical Reports Server (NTRS)

    Keller, John; Lebiere, Christian; Shay, Rick; Latorella, Kara

    2004-01-01

    This project explored the possibility of predicting pilot situational awareness (SA) using human performance modeling techniques for the purpose of evaluating developing cockpit systems. The Improved Performance Research Integration Tool (IMPRINT) was combined with the Adaptive Control of Thought-Rational (ACT-R) cognitive modeling architecture to produce a tool that can model both the discrete tasks of pilots and the cognitive processes associated with SA. The techniques for using this tool to predict SA were demonstrated using the newly developed Aviation Weather Information (AWIN) system. By providing an SA prediction tool to cockpit system designers, cockpit concepts can be assessed early in the design process while providing a cost-effective complement to the traditional pilot-in-the-loop experiments and data collection techniques.

  11. System Engineering Concept Demonstration, Effort Summary. Volume 1

    DTIC Science & Technology

    1992-12-01

    This effort involves the system software, user frameworks, and user tools (User Tools, Catalyst, external computer systems). The paper discusses the analysis, synthesis, optimization, and conceptual design of Catalyst, along with its definition, design, test, and evaluation, and its operational concept. The conceptual requirements for the Process Model are intended to allow system engineering practitioners to recognize and tailor the model.

  12. A tool for urban soundscape evaluation applying Support Vector Machines for developing a soundscape classification model.

    PubMed

    Torija, Antonio J; Ruiz, Diego P; Ramos-Ridao, Angel F

    2014-06-01

    To ensure appropriate soundscape management in urban environments, urban-planning authorities need a range of tools for the task. An essential step in managing urban areas from a sound standpoint is the evaluation of the area's soundscape. It has been widely acknowledged that a subjective and acoustical categorization of a soundscape is the first step in evaluating it, providing a basis for designing or adapting it to match people's expectations. This work therefore proposes a model for the automatic classification of urban soundscapes based on underlying acoustical and perceptual criteria, intended as a tool for comprehensive urban soundscape evaluation. Because of the great complexity of the problem, two machine learning techniques, Support Vector Machines (SVM) and Support Vector Machines trained with Sequential Minimal Optimization (SMO), are implemented in developing the classification model. The results indicate that the SMO model outperforms the SVM model in the specific task of soundscape classification. With the implementation of the SMO algorithm, the classification model achieves an outstanding performance (91.3% of instances correctly classified). © 2013 Elsevier B.V. All rights reserved.
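    As an illustration of the SVM classification approach (not the study's implementation), the sketch below trains a linear SVM with the Pegasos sub-gradient method on hypothetical two-class acoustical features; the feature names, values, classes, and training method are all assumptions for the example:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical acoustical features (e.g. Leq, spectral centroid, sharpness)
    # for two illustrative soundscape classes.
    quiet = rng.normal([45.0, 800.0, 4.0], [3.0, 100.0, 0.5], size=(50, 3))
    traffic = rng.normal([70.0, 1500.0, 12.0], [3.0, 150.0, 1.0], size=(50, 3))
    X = np.vstack([quiet, traffic])
    y = np.array([-1.0] * 50 + [1.0] * 50)

    # Standardize features, then train a linear SVM with the Pegasos
    # sub-gradient method (a simple stand-in for SMO-style training).
    X = (X - X.mean(axis=0)) / X.std(axis=0)
    w, lam = np.zeros(3), 0.01
    for t in range(1, 2001):
        i = rng.integers(len(X))
        eta = 1.0 / (lam * t)
        margin = y[i] * (X[i] @ w)
        w *= 1.0 - eta * lam       # shrink step from the L2 regularizer
        if margin < 1:              # hinge-loss sub-gradient step
            w += eta * y[i] * X[i]

    accuracy = np.mean(np.sign(X @ w) == y)  # training accuracy
    ```

    Standardizing the features first matters here: the raw acoustical quantities live on very different scales, and an unscaled margin would be dominated by the largest one.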

  13. System analysis tools for an ELT at ESO

    NASA Astrophysics Data System (ADS)

    Mueller, Michael; Koch, Franz

    2006-06-01

    Engineering of complex, large scale systems like the ELT designs currently investigated and developed in Europe and North America requires powerful and sophisticated tools within specific technical disciplines such as mechanics, optics and control engineering. However, even analyzing a single component of the telescope, such as the telescope structure, necessitates a system approach to evaluate the structural effects on the optical performance. This paper presents several software tools developed by the European Southern Observatory (ESO) that focus on the system approach in the analyses: Using modal results of a finite element analysis, the SMI-toolbox allows easy generation of structural models with different sizes and levels of accuracy for control design and closed-loop simulations. The optical modeling code BeamWarrior was developed by ESO and Astrium GmbH (Germany) especially for integrated modeling and interfacing with a structural model. Within BeamWarrior, displacements and deformations can be applied in an arbitrary coordinate system, and hence also in the global coordinates of the FE model, avoiding error-prone transformations. In addition, a sparse state space model object was developed for Matlab to gain computational efficiency and reduce memory requirements due to the sparsity pattern of both the structural models and the control architecture. Together these tools allow building an integrated model in order to reliably simulate interactions, cross-coupling effects and system responses, and to evaluate global performance. To evaluate disturbance effects on the optical performance in open loop more efficiently, an optical evaluation toolbox was built in the FE software ANSYS which performs Zernike decomposition and best-fit computation of the deformations directly in the FE analysis.

  14. A YEAR-LONG MM5 EVALUATION USING A MODEL EVALUATION TOOLKIT

    EPA Science Inventory

    Air quality modeling has expanded in both sophistication and application over the past decade. Meteorological and air quality modeling tools are being used for research, forecasting, and regulatory related emission control strategies. Results from air quality simulations have far...

  15. WRF/CMAQ AQMEII3 Simulations of US Regional-Scale ...

    EPA Pesticide Factsheets

    Chemical boundary conditions are a key input to regional-scale photochemical models. In this study, performed during the third phase of the Air Quality Model Evaluation International Initiative (AQMEII3), we perform annual simulations over North America with chemical boundary conditions prepared from four different global models. Results indicate that the impacts of different boundary conditions are significant for ozone throughout the year and most pronounced outside the summer season. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  16. Ottawa Model of Implementation Leadership and Implementation Leadership Scale: mapping concepts for developing and evaluating theory-based leadership interventions.

    PubMed

    Gifford, Wendy; Graham, Ian D; Ehrhart, Mark G; Davies, Barbara L; Aarons, Gregory A

    2017-01-01

    Leadership in health care is instrumental to creating a supportive organizational environment and positive staff attitudes for implementing evidence-based practices to improve patient care and outcomes. The purpose of this study is to demonstrate the alignment of the Ottawa Model of Implementation Leadership (O-MILe), a theoretical model for developing implementation leadership, with the Implementation Leadership Scale (ILS), an empirically validated tool for measuring implementation leadership. A secondary objective is to describe the methodological process for aligning concepts of a theoretical model with an independently established measurement tool for evaluating theory-based interventions. Modified template analysis was conducted to deductively map items of the ILS onto concepts of the O-MILe. An iterative process was used in which the model and scale developers (n=5) appraised the relevance, conceptual clarity, and fit of each ILS item with the O-MILe concepts through individual feedback and group discussions until consensus was reached. All 12 items of the ILS correspond to at least one O-MILe concept, demonstrating compatibility of the ILS as a measurement tool for the O-MILe theoretical constructs. The O-MILe provides a theoretical basis for developing implementation leadership, and the ILS is a compatible tool for measuring leadership based on the O-MILe. Used together, the O-MILe and ILS provide an evidence- and theory-based approach for developing and measuring leadership for implementing evidence-based practices in health care. Template analysis offers a convenient approach for determining the compatibility of independently developed evaluation tools to test theoretical models.

  17. Ottawa Model of Implementation Leadership and Implementation Leadership Scale: mapping concepts for developing and evaluating theory-based leadership interventions

    PubMed Central

    Gifford, Wendy; Graham, Ian D; Ehrhart, Mark G; Davies, Barbara L; Aarons, Gregory A

    2017-01-01

    Purpose Leadership in health care is instrumental to creating a supportive organizational environment and positive staff attitudes for implementing evidence-based practices to improve patient care and outcomes. The purpose of this study is to demonstrate the alignment of the Ottawa Model of Implementation Leadership (O-MILe), a theoretical model for developing implementation leadership, with the Implementation Leadership Scale (ILS), an empirically validated tool for measuring implementation leadership. A secondary objective is to describe the methodological process for aligning concepts of a theoretical model with an independently established measurement tool for evaluating theory-based interventions. Methods Modified template analysis was conducted to deductively map items of the ILS onto concepts of the O-MILe. An iterative process was used in which the model and scale developers (n=5) appraised the relevance, conceptual clarity, and fit of each ILS item with the O-MILe concepts through individual feedback and group discussions until consensus was reached. Results All 12 items of the ILS correspond to at least one O-MILe concept, demonstrating compatibility of the ILS as a measurement tool for the O-MILe theoretical constructs. Conclusion The O-MILe provides a theoretical basis for developing implementation leadership, and the ILS is a compatible tool for measuring leadership based on the O-MILe. Used together, the O-MILe and ILS provide an evidence- and theory-based approach for developing and measuring leadership for implementing evidence-based practices in health care. Template analysis offers a convenient approach for determining the compatibility of independently developed evaluation tools to test theoretical models. PMID:29355212

  18. Quantitative Evaluation of Heavy Duty Machine Tools Remanufacturing Based on Modified Catastrophe Progression Method

    NASA Astrophysics Data System (ADS)

    shunhe, Li; jianhua, Rao; lin, Gui; weimin, Zhang; degang, Liu

    2017-11-01

    The result of remanufacturing evaluation is the basis for judging whether a heavy duty machine tool can be remanufactured at the end-of-life (EOL) stage of its lifecycle. The objectivity and accuracy of the evaluation are key to the evaluation method. In this paper, the catastrophe progression method is introduced into the quantitative evaluation of heavy duty machine tool remanufacturing, and the results are modified by the comprehensive adjustment method, which brings the evaluation results into line with conventional human judgment. The catastrophe progression method is used to establish a quantitative evaluation model for heavy duty machine tools and to evaluate the remanufacturing of a retired TK6916 CNC floor milling-boring machine. The evaluation process is simple and highly quantitative, and the result is objective.

  19. Use of instruments to evaluate leadership in nursing and health services.

    PubMed

    Carrara, Gisleangela Lima Rodrigues; Bernardes, Andrea; Balsanelli, Alexandre Pazetto; Camelo, Silvia Helena Henriques; Gabriel, Carmen Silvia; Zanetti, Ariane Cristina Barboza

    2018-03-12

    To identify the available scientific evidence on the use of instruments for the evaluation of leadership in health and nursing services, and to examine the use of leadership styles/models/theories in the construction of these tools. Integrative literature review of studies indexed in the LILACS, PUBMED, CINAHL and EMBASE databases from 2006 to 2016. Thirty-eight articles were analyzed, identifying 19 leadership evaluation tools; the most used were the Multifactor Leadership Questionnaire, the Global Transformational Leadership Scale, the Leadership Practices Inventory, the Servant Leadership Questionnaire, the Servant Leadership Survey and the Authentic Leadership Questionnaire. The literature search made it possible to identify the main theories/styles/models of contemporary leadership and to analyze their use in the design of leadership evaluation tools, with the transformational, situational, servant and authentic leadership categories standing out as the most prominent. To a lesser extent, the quantum, charismatic and clinical leadership types were evidenced.

  20. Watershed Management Optimization Support Tool (WMOST) v2: Theoretical Documentation

    EPA Science Inventory

    The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that evaluates the relative cost-effectiveness of management practices at the local or watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed c...

  1. Modelling in Evaluating a Working Life Project in Higher Education

    ERIC Educational Resources Information Center

    Sarja, Anneli; Janhonen, Sirpa; Havukainen, Pirjo; Vesterinen, Anne

    2012-01-01

    This article describes an evaluation method based on collaboration between higher education, a care home, and a university in an R&D project. The aim of the project was to elaborate modelling as a tool of developmental evaluation for innovation and competence in project cooperation. The approach was based on activity theory. Modelling enabled a…

  2. Engine System Model Development for Nuclear Thermal Propulsion

    NASA Technical Reports Server (NTRS)

    Nelson, Karl W.; Simpson, Steven P.

    2006-01-01

    In order to design, analyze, and evaluate conceptual Nuclear Thermal Propulsion (NTP) engine systems, an improved NTP design and analysis tool has been developed. The NTP tool utilizes the Rocket Engine Transient Simulation (ROCETS) system tool and many of the routines from the Enabler reactor model found in Nuclear Engine System Simulation (NESS). Improved non-nuclear component models and an external shield model were added to the tool. With the addition of a nearly complete system reliability model, the tool will provide performance, sizing, and reliability data for NERVA-Derived NTP engine systems. A new detailed reactor model is also being developed and will replace Enabler. The new model will allow more flexibility in reactor geometry and include detailed thermal hydraulics and neutronics models. A description of the reactor, component, and reliability models is provided. Another key feature of the modeling process is the use of comprehensive spreadsheets for each engine case. The spreadsheets include individual worksheets for each subsystem with data, plots, and scaled figures, making the output very useful to each engineering discipline. Sample performance and sizing results with the Enabler reactor model are provided including sensitivities. Before selecting an engine design, all figures of merit must be considered including the overall impacts on the vehicle and mission. Evaluations based on key figures of merit of these results and results with the new reactor model will be performed. The impacts of clustering and external shielding will also be addressed. Over time, the reactor model will be upgraded to design and analyze other NTP concepts with CERMET and carbide fuel cores.

  3. Evaluation of the Langmuir model in the Soil and Water Assessment Tool for high soil phosphorus condition

    USDA-ARS?s Scientific Manuscript database

    Phosphorus adsorption by a water treatment residual was tested through Langmuir and linear sorption isotherms and applied in the Soil and Water Assessment Tool (SWAT). The objective of this study was to use laboratory and greenhouse experimental phosphorus data to evaluate the performance of a modi...
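    For reference, the two isotherm forms being compared can be written down directly; the parameter values below are hypothetical, not those of the study:

    ```python
    import numpy as np

    def langmuir(c, q_max, k):
        # Langmuir isotherm: sorbed P saturates toward q_max as
        # dissolved concentration c grows.
        return q_max * k * c / (1.0 + k * c)

    def linear(c, k_d):
        # Linear isotherm: sorbed P proportional to concentration.
        return k_d * c

    # Hypothetical parameters for a water treatment residual.
    c = np.linspace(0.0, 50.0, 101)          # dissolved P concentration
    q_lang = langmuir(c, q_max=5.0, k=0.2)   # saturating Langmuir curve
    q_lin = linear(c, k_d=0.5)               # unbounded linear curve
    ```

    The practical difference is exactly the high-phosphorus regime the record is about: the linear model keeps adsorbing without limit, while the Langmuir form caps sorption at q_max once the sorbent approaches saturation.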

  4. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  5. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  6. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  7. Automatic Assessment of 3D Modeling Exams

    ERIC Educational Resources Information Center

    Sanna, A.; Lamberti, F.; Paravati, G.; Demartini, C.

    2012-01-01

    Computer-based assessment of exams provides teachers and students with two main benefits: fairness and effectiveness in the evaluation process. This paper proposes a fully automatic evaluation tool for the Graphic and Virtual Design (GVD) curriculum at the First School of Architecture of the Politecnico di Torino, Italy. In particular, the tool is…

  8. A Quality Model to Select Patients in Cupping Therapy Clinics: A New Tool for Ensuring Safety in Clinical Practice.

    PubMed

    Aboushanab, Tamer; AlSanad, Saud

    2018-06-08

    Cupping therapy is a popular treatment in various countries and regions, including Saudi Arabia. Cupping therapy is regulated in Saudi Arabia by the National Center for Complementary and Alternative Medicine (NCCAM), Ministry of Health. The authors recommend that this quality model for selecting patients in cupping clinics - first version (QMSPCC-1) - be used routinely as part of clinical practice and quality management in cupping clinics. The aim of the quality model is to ensure the safety of patients and to introduce and facilitate quality and auditing processes in cupping therapy clinics. Clinical evaluation of this tool is recommended. Continued development, re-evaluation and reassessment of this tool are important. Copyright © 2018. Published by Elsevier B.V.

  9. Scenario Evaluator for Electrical Resistivity survey pre-modeling tool

    USGS Publications Warehouse

    Terry, Neil; Day-Lewis, Frederick D.; Robinson, Judith L.; Slater, Lee D.; Halford, Keith J.; Binley, Andrew; Lane, John W.; Werkema, Dale D.

    2017-01-01

    Geophysical tools have much to offer users in environmental, water resource, and geotechnical fields; however, techniques such as electrical resistivity imaging (ERI) are often oversold and/or overinterpreted due to a lack of understanding of the limitations of the techniques, such as the appropriate depth intervals or resolution of the methods. The relationship between ERI data and resistivity is nonlinear; therefore, these limitations depend on site conditions and survey design and are best assessed through forward and inverse modeling exercises prior to field investigations. In this approach, proposed field surveys are first numerically simulated given the expected electrical properties of the site, and the resulting hypothetical data are then analyzed using inverse models. Performing ERI forward/inverse modeling, however, requires substantial expertise and can take many hours to implement. We present a new spreadsheet-based tool, the Scenario Evaluator for Electrical Resistivity (SEER), which features a graphical user interface that allows users to manipulate a resistivity model and instantly view how that model would likely be interpreted by an ERI survey. The SEER tool is intended for use by those who wish to determine the value of including ERI to achieve project goals, and is designed to have broad utility in industry, teaching, and research.

  10. An open-source Java-based Toolbox for environmental model evaluation: The MOUSE Software Application

    USDA-ARS?s Scientific Manuscript database

    A consequence of environmental model complexity is that the task of understanding how environmental models work and identifying their sensitivities/uncertainties, etc. becomes progressively more difficult. Comprehensive numerical and visual evaluation tools have been developed such as the Monte Carl...

  11. Toward improved simulation of river operations through integration with a hydrologic model

    USGS Publications Warehouse

    Morway, Eric D.; Niswonger, Richard G.; Triana, Enrique

    2016-01-01

    Advanced modeling tools are needed for informed water resources planning and management. Two classes of modeling tools are often used to this end: (1) distributed-parameter hydrologic models for quantifying supply and (2) river-operation models for sorting out demands under rule-based systems such as the prior-appropriation doctrine. Within each of these two broad classes of models, there are many software tools that excel at simulating the processes specific to each discipline, but they have historically over-simplified, or at worst completely neglected, aspects of the other. As a result, water managers reliant on river-operation models for administering water resources need improved tools for representing spatially and temporally varying groundwater resources in conjunctive-use systems. A new tool is described that improves the representation of groundwater/surface-water (GW-SW) interaction within a river-operations modeling context and, in so doing, advances evaluation of system-wide hydrologic consequences of new or altered management regimes.

  12. Freva - Freie Univ Evaluation System Framework for Scientific Infrastructures in Earth System Modeling

    NASA Astrophysics Data System (ADS)

    Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Schartner, Thomas; Kirchner, Ingo; Rust, Henning W.; Cubasch, Ulrich; Ulbrich, Uwe

    2016-04-01

    The Freie Univ Evaluation System Framework (Freva - freva.met.fu-berlin.de) is a software infrastructure for standardized data and tool solutions in Earth system science. Freva runs on high performance computers to handle customizable evaluation systems of research projects, institutes or universities. It combines different software technologies into one common hybrid infrastructure, including all features present in the shell and web environment. The database interface satisfies the international standards provided by the Earth System Grid Federation (ESGF). Freva indexes different data projects into one common search environment by storing the metadata of the self-describing model, reanalysis and observational data sets in a database. This metadata system, with its advanced but easy-to-handle search tool, supports users, developers and their plugins in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. The integrated web-shell (shellinabox) adds a degree of freedom in the choice of the working environment and can be used as a gate to the research project's HPC. Plugins can integrate their post-processed results into the user's database, which allows post-processing plugins to feed statistical analysis plugins and fosters an active exchange between plugin developers of a research project. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a database.
    Configurations and results of the tools can be shared among scientists via the shell or web system, so plugged-in tools benefit from transparency and reproducibility. Furthermore, if configurations match when starting an evaluation plugin, the system suggests reusing results already produced by other users - saving CPU hours, I/O, disk space and time. The efficient interaction between different technologies improves the Earth system modeling science framed by Freva.

  13. Modeling and Simulation Roadmap to Enhance Electrical Energy Security of U.S. Naval Bases

    DTIC Science & Technology

    2012-03-01

    A well validated and consistent process for evaluating power system architectures and component technologies is needed to support the development and implementation of these new technologies; such a process can become a valuable tool for the implementation of the described plan for Navy bases.

  14. Modeling and Tool Wear in Routing of CFRP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iliescu, D.; Fernandez, A.; Gutierrez-Orrantia, M. E.

    2011-01-17

    This paper presents the prediction and evaluation of feed force in routing of carbon composite material. In order to extend tool life and improve the quality of the machined surface, a better understanding of uncoated and coated tool behaviors is required. This work describes (1) the optimization of the geometry of multiple teeth tools to minimize tool wear and feed force, (2) the optimization of the tool coating and (3) the development of a phenomenological model relating the feed force, the routing parameters and the tool wear. The experimental results indicate that the feed rate, the cutting speed and the tool wear are the most significant factors affecting the feed force. In the case of multiple teeth tools, a particular geometry with 14 teeth right helix right cut and 11 teeth left helix right cut gives the best results. A thick AlTiN coating or a diamond coating can dramatically improve the tool life while minimizing the axial force, roughness and delamination. A wear model has then been developed based on the abrasive behavior of the tool. The model links the feed force to the tool geometry parameters (tool diameter), to the process parameters (feed rate, cutting speed and depth of cut) and to the wear. The model has been verified by experimental tests.

  15. Industrial Waste Management Evaluation Model Version 3.1

    EPA Pesticide Factsheets

    IWEM is a screening level ground water model designed to simulate contaminant fate and transport. IWEM v3.1 is the latest version of the IWEM software, which includes additional tools to evaluate the beneficial use of industrial materials

  16. Logic Modeling as a Tool to Prepare to Evaluate Disaster and Emergency Preparedness, Response, and Recovery in Schools

    ERIC Educational Resources Information Center

    Zantal-Wiener, Kathy; Horwood, Thomas J.

    2010-01-01

    The authors propose a comprehensive evaluation framework to prepare for evaluating school emergency management programs. This framework involves a logic model that incorporates Government Performance and Results Act (GPRA) measures as a foundation for comprehensive evaluation that complements performance monitoring used by the U.S. Department of…

  17. Automation of Ocean Product Metrics

    DTIC Science & Technology

    2008-09-30

    Presented in: Ocean Sciences 2008 Conf., 5 Mar 2008. Shriver, J., J. D. Dykes, and J. Fabre: Automation of Operational Ocean Product Metrics. Presented in 2008 EGU General Assembly, 14 April 2008. …processing (multiple data cuts per day) and multiple-nested models. Routines for generating automated evaluations of model forecast statistics will be developed, and pre-existing tools will be collected to create a generalized tool set, which will include user-interface tools to the metrics data

  18. The DigiLit Framework

    ERIC Educational Resources Information Center

    Baxa, Julie; Christ, Tanya

    2018-01-01

    Selecting and integrating the use of digital texts/tools in literacy lessons are complex tasks. The DigiLit framework provides a succinct model to guide planning, reflection, coaching, and formative evaluation of teachers' successful digital text/tool selection and integration for literacy lessons. For digital text/tool selection, teachers need to…

  19. Watershed Management Optimization Support Tool (WMOST) v2: User Manual and Case Studies

    EPA Science Inventory

    The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that evaluates the relative cost-effectiveness of management practices at the local or watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed c...

  20. WHAT ARE THE BEST MEANS TO ASSESS SITES AND MOVE TOWARD CLOSURE, USING APPROPRIATE SITE SPECIFIC RISK EVALUATIONS?

    EPA Science Inventory

    To facilitate evaluation of existing site characterization data, ORD has developed on-line tools and models that integrate data and models into innovative applications. Forty calculators have been developed in four groups: parameter estimators, models, scientific demos and unit ...

  1. Evaluation of a watershed model for estimating daily flow using limited flow measurements

    USDA-ARS?s Scientific Manuscript database

    The Soil and Water Assessment Tool (SWAT) model was evaluated for estimation of continuous daily flow based on limited flow measurements in the Upper Oyster Creek (UOC) watershed. SWAT was calibrated against limited measured flow data and then validated. The Nash-Sutcliffe efficiency (NSE) and...
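
    The Nash-Sutcliffe efficiency used in evaluations like the one above has a simple closed form, NSE = 1 - Σ(Oi - Si)² / Σ(Oi - Ō)², where Oi are observations and Si simulated values. As an illustrative sketch (not code from the SWAT study), it can be computed directly:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the model
    is no better than predicting the mean of the observations."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# A simulation that always predicts the observed mean scores exactly 0.
print(nash_sutcliffe([1.0, 2.0, 3.0], [2.0, 2.0, 2.0]))  # -> 0.0
```

    Values below 0 indicate the simulation is worse than simply using the observed mean, which is why NSE is a common acceptance threshold in watershed model calibration.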

  2. Modeling fate and transport of "Contaminants of Emerging Concern" (CECs): is the Soil Water Assessment Tool (SWAT) the appropriate model?

    USDA-ARS?s Scientific Manuscript database

    Background/Question/Methods As the scientific and regulatory communities realize the significant environmental impacts and ubiquity of “contaminants of emerging concern” (CECs), it is increasingly imperative to develop quantitative assessment tools to evaluate and predict the fate and transport of...

  3. Rasch Model Based Analysis of the Force Concept Inventory

    ERIC Educational Resources Information Center

    Planinic, Maja; Ivanjek, Lana; Susac, Ana

    2010-01-01

    The Force Concept Inventory (FCI) is an important diagnostic instrument which is widely used in the field of physics education research. It is therefore very important to evaluate and monitor its functioning using different tools for statistical analysis. One of such tools is the stochastic Rasch model, which enables construction of linear…
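
    Under the dichotomous Rasch model referenced above, the probability of a correct response depends only on the difference between person ability θ and item difficulty b on a shared logit scale. A minimal sketch (illustrative only, not the analysis software used in the study):

```python
import math

def rasch_probability(theta, b):
    """P(correct) for person ability `theta` and item difficulty `b`
    under the dichotomous Rasch model (one-parameter logistic)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals difficulty, the probability is exactly 0.5.
print(rasch_probability(0.0, 0.0))  # -> 0.5
```

    The linearity noted in the abstract comes from this logit form: differences in ability and difficulty are expressed on the same interval scale.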

  4. Fast analysis of radionuclide decay chain migration

    NASA Astrophysics Data System (ADS)

    Chen, J. S.; Liang, C. P.; Liu, C. W.; Li, L.

    2014-12-01

    A novel tool for rapidly predicting the long-term plume behavior of a radionuclide decay chain of arbitrary length is presented in this study. The tool is based on generalized analytical solutions, derived in compact form, for a set of two-dimensional advection-dispersion equations coupled with sequential first-order decay reactions in a groundwater system. The performance of the developed tool is evaluated against a numerical model that uses a Laplace transform finite difference scheme. The results of this performance evaluation indicate that the developed model is robust and accurate. The developed model is then used to rapidly examine the transport behavior of a four-member radionuclide decay chain. Results show that the plume extents and concentration levels of any target radionuclide are very sensitive to the longitudinal and transverse dispersion, the decay rate constant, and the retardation factor. The developed model is a useful tool for rapidly assessing the ecological and environmental impact of accidental radionuclide releases such as the Fukushima nuclear disaster, in which multiple radionuclides leaked from the reactor and subsequently contaminated the local groundwater and ocean seawater in the vicinity of the nuclear plant.
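
    For a straight chain with pairwise-distinct decay constants, the temporal (no-transport) building block of such solutions is the classical Bateman system. A hedged sketch of that simpler case only (not the authors' two-dimensional advection-dispersion solution):

```python
import math

def bateman(n0, lambdas, t):
    """Amount of each member of a sequential first-order decay chain at
    time t, starting from n0 of the parent and zero daughters.
    Decay constants in `lambdas` must be pairwise distinct."""
    amounts = []
    for i in range(1, len(lambdas) + 1):
        coeff = n0 * math.prod(lambdas[: i - 1])
        total = sum(
            math.exp(-lambdas[j] * t)
            / math.prod(lambdas[k] - lambdas[j] for k in range(i) if k != j)
            for j in range(i)
        )
        amounts.append(coeff * total)
    return amounts

# The parent member simply decays exponentially: n0 * exp(-lambda1 * t).
parent, daughter = bateman(1.0, [0.5, 0.1], 2.0)
```

    Coupling each of these terms to an advection-dispersion operator is what the analytical solutions in the study accomplish for the two-dimensional plume.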

  5. Evaluation of Radiation Belt Space Weather Forecasts for Internal Charging Analyses

    NASA Technical Reports Server (NTRS)

    Minow, Joseph I.; Coffey, Victoria N.; Jun, Insoo; Garrett, Henry B.

    2007-01-01

    A variety of static electron radiation belt models, space weather prediction tools, and energetic electron datasets are used by spacecraft designers and operations support personnel as internal charging code inputs to evaluate electrostatic discharge risks in space systems due to exposure to relativistic electron environments. Evaluating the environment inputs is often accomplished by comparing how reliably the data set or forecast tool predicts the measured electron flux (or fluence) over some chosen period. While this technique is useful as a model metric, it does not provide the information necessary to evaluate whether short-term deviations of the predicted flux are important in the charging evaluations. In this paper, we use a 1-D internal charging model to compute electric fields generated in insulating materials as a function of time when exposed to relativistic electrons in the Earth's magnetosphere. The resulting fields are assumed to represent the "true" electric fields and are compared with electric field values computed from relativistic electron environments derived from a variety of space environment and forecast tools. Deviations of the predicted fields from the "true" fields, which depend on the insulator charging time constants, are evaluated as a potential metric for determining the importance of predicted and measured relativistic electron flux deviations over a range of time scales.
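
    The role of the insulator charging time constant can be illustrated with a lumped balance between charge deposition and ohmic leakage, dE/dt = J/ε − E/τ, which relaxes toward the steady state E∞ = Jτ/ε. A minimal forward-Euler sketch with invented parameter values (not the authors' charging code):

```python
EPS = 2.0e-11  # assumed insulator permittivity, F/m (illustrative value)
TAU = 1.0e4    # assumed charge-leakage time constant, s (illustrative value)

def charge_field(current_density, dt, steps, e0=0.0):
    """Forward-Euler integration of dE/dt = J/eps - E/tau: electric
    field buildup in an irradiated insulator with ohmic leakage."""
    e = e0
    history = []
    for _ in range(steps):
        e += dt * (current_density / EPS - e / TAU)
        history.append(e)
    return history

# The field relaxes toward the steady state J * tau / eps (= 500 V/m here).
fields = charge_field(1.0e-12, dt=100.0, steps=2000)
```

    Flux deviations shorter than τ barely move E, while deviations lasting longer than τ shift the field toward a new equilibrium, which is why time constants matter when judging forecast errors.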

  6. Surface Roughness Model Based on Force Sensors for the Prediction of the Tool Wear

    PubMed Central

    de Agustina, Beatriz; Rubio, Eva María; Sebastián, Miguel Ángel

    2014-01-01

    In this study, a methodology has been developed with the objective of evaluating the surface roughness obtained during turning processes by measuring the signals detected by a force sensor under the same cutting conditions. In this way, the surface quality achieved along the process is correlated to several parameters of the cutting forces (thrust forces, feed forces and cutting forces), so the effect that the tool wear causes on the surface roughness is evaluated. In a first step, the best cutting conditions (cutting parameters and radius of tool) for a certain quality surface requirement were found for pieces of UNS A97075. Next, with this selection a model of surface roughness based on the cutting forces was developed for different states of wear that simulate the behaviour of the tool throughout its life. The validation of this model reveals that it was effective for approximately 70% of the surface roughness values obtained. PMID:24714391

  7. Cost analysis of objective resident cataract surgery assessments.

    PubMed

    Nandigam, Kiran; Soh, Jonathan; Gensheimer, William G; Ghazi, Ahmed; Khalifa, Yousuf M

    2015-05-01

    To compare 8 ophthalmology resident surgical training tools to determine which is most cost effective. University of Rochester Medical Center, Rochester, New York, USA. Retrospective evaluation of technology. A cost-analysis model was created to compile all relevant costs of running each tool in a medium-sized ophthalmology program. Quantitative cost estimates were obtained based on the cost of the tools, the cost of time spent in evaluations, and supply and maintenance costs. For wet laboratory simulation, Eyesi was the least expensive cataract surgery simulation method; however, it is only capable of evaluating simulated cataract surgery rehearsal and requires supplementation with other evaluative methods for operating room performance and for noncataract wet lab training and evaluation. The most expensive training tool was the Eye Surgical Skills Assessment Test (ESSAT). The 2 most affordable methods for resident evaluation in operating room performance were the Objective Assessment of Skills in Intraocular Surgery (OASIS) and Global Rating Assessment of Skills in Intraocular Surgery (GRASIS). Cost-based analysis of ophthalmology resident surgical training tools is needed so residency programs can implement tools that are valid, reliable, objective, and cost effective. There is no perfect training system at this time. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  8. An Evaluation of Internet-Based CAD Collaboration Tools

    ERIC Educational Resources Information Center

    Smith, Shana Shiang-Fong

    2004-01-01

    Due to the now widespread use of the Internet, most companies now require computer aided design (CAD) tools that support distributed collaborative design on the Internet. Such CAD tools should enable designers to share product models, as well as related data, from geographically distant locations. However, integrated collaborative design…

  9. The School Personnel Management System. Manual 1--Tools. Manual 2--Models. Manual 3--Results.

    ERIC Educational Resources Information Center

    National School Boards Association, Washington, DC.

    The School Personnel Management System offers a correlated set of job descriptions, evaluative instruments, policies, tools, forms, and publications intended to aid local school officials in enhancing their personnel management programs. The materials are contained in two looseleaf binders entitled "Manual 1--Tools," and "Manual…

  10. A simulation model for risk assessment of turbine wheels

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Hage, Richard T.

    1991-01-01

    A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.
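
    The trade-off the abstract describes (inspection interval versus rejection crack size versus reliability) can be illustrated with a toy Monte Carlo simulation. Every parameter below is invented for illustration and bears no relation to actual APU data:

```python
import random

CRITICAL = 1.0    # crack size at which the wheel fails (made up)
REJECT = 0.5      # rejection crack size at inspection (made up)
LIFE_LIMIT = 200  # missions before mandatory retirement (made up)

def wheel_fails(inspection_interval, rng):
    """One wheel's life: the crack grows by a random increment each
    mission; an inspection retires the wheel if the crack exceeds
    REJECT; the wheel fails if the crack reaches CRITICAL in service."""
    crack = 0.0
    for mission in range(1, LIFE_LIMIT + 1):
        crack += rng.uniform(0.0, 0.01)
        if crack >= CRITICAL:
            return True   # in-service failure
        if mission % inspection_interval == 0 and crack >= REJECT:
            return False  # rejected at inspection
    return False          # retired at life limit

rng = random.Random(42)
trials = 10_000
p_fail = sum(wheel_fails(100, rng) for _ in range(trials)) / trials
print(f"estimated P(failure) with 100-mission inspections: {p_fail:.3f}")
```

    Rerunning with a shorter inspection interval drives the estimated failure probability down at the cost of more inspections and earlier rejections, which is the kind of sensitivity the APU simulation model explores with realistic inputs.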

  11. A simulation model for risk assessment of turbine wheels

    NASA Astrophysics Data System (ADS)

    Safie, Fayssal M.; Hage, Richard T.

    A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.

  12. AgMIP Training in Multiple Crop Models and Tools

    NASA Technical Reports Server (NTRS)

    Boote, Kenneth J.; Porter, Cheryl H.; Hargreaves, John; Hoogenboom, Gerrit; Thornburn, Peter; Mutter, Carolyn

    2015-01-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) has the goal of using multiple crop models to evaluate climate impacts on agricultural production and food security in developed and developing countries. There are several major limitations that must be overcome to achieve this goal, including the need to train AgMIP regional research team (RRT) crop modelers to use models other than the ones they are currently familiar with, plus the need to harmonize and interconvert the disparate input file formats used for the various models. Two activities were followed to address these shortcomings among AgMIP RRTs to enable them to use multiple models to evaluate climate impacts on crop production and food security. We designed and conducted courses in which participants trained on two different sets of crop models, with emphasis on the model of least experience. In a second activity, the AgMIP IT group created templates for inputting data on soils, management, weather, and crops into AgMIP harmonized databases, and developed translation tools for converting the harmonized data into files that are ready for multiple crop model simulations. The strategies for creating and conducting the multi-model course and developing entry and translation tools are reviewed in this chapter.

  13. Development and application of air quality models at the US ...

    EPA Pesticide Factsheets

    Overview of the development and application of air quality models at the U.S. EPA, particularly focused on the development and application of the Community Multiscale Air Quality (CMAQ) model developed within the Computational Exposure Division (CED) of the National Exposure Research Laboratory (NERL). This presentation will provide a simple overview of air quality model development and application geared toward a non-technical student audience. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  14. How to Assess the External Validity and Model Validity of Therapeutic Trials: A Conceptual Approach to Systematic Review Methodology

    PubMed Central

    2014-01-01

    Background. Evidence rankings do not consider equally internal (IV), external (EV), and model validity (MV) for clinical studies including complementary and alternative medicine/integrative medicine (CAM/IM) research. This paper describes this model and offers an EV assessment tool (EVAT©) for weighing studies according to EV and MV in addition to IV. Methods. An abbreviated systematic review methodology was employed to search, assemble, and evaluate the literature that has been published on EV/MV criteria. Standard databases were searched for keywords relating to EV, MV, and bias-scoring from inception to Jan 2013. Tools identified and concepts described were pooled to assemble a robust tool for evaluating these quality criteria. Results. This study assembled a streamlined, objective tool for evaluating the quality of EV/MV research that is more sensitive to CAM/IM research. Conclusion. Improved reporting on EV can provide information that will help guide policy makers, public health researchers, and other scientists in the selection, development, and improvement of research-tested interventions. Overall, clinical studies with high EV have the potential to provide the most useful information about “real-world” consequences of health interventions. It is hoped that this novel tool, which considers IV, EV, and MV on equal footing, will better guide clinical decision making. PMID:24734111

  15. Program Model Checking: A Practitioner's Guide

    NASA Technical Reports Server (NTRS)

    Pressburger, Thomas T.; Mansouri-Samani, Masoud; Mehlitz, Peter C.; Pasareanu, Corina S.; Markosian, Lawrence Z.; Penix, John J.; Brat, Guillaume P.; Visser, Willem C.

    2008-01-01

    Program model checking is a verification technology that uses state-space exploration to evaluate large numbers of potential program executions. Program model checking provides improved coverage over testing by systematically evaluating all possible test inputs and all possible interleavings of threads in a multithreaded system. Model-checking algorithms use several classes of optimizations to reduce the time and memory requirements for analysis, as well as heuristics for meaningful analysis of partial areas of the state space. Our goal in this guidebook is to assemble, distill, and demonstrate emerging best practices for applying program model checking. We offer it as a starting point and introduction for those who want to apply model checking to software verification and validation. The guidebook will not discuss any specific tool in great detail, but we provide references for specific tools.

  16. Impacts of Lateral Boundary Conditions on US Ozone ...

    EPA Pesticide Factsheets

    Chemical boundary conditions are a key input to regional-scale photochemical models. In this study, we perform annual simulations over North America with chemical boundary conditions prepared from two global models (GEOS-CHEM and Hemispheric CMAQ). Results indicate that the impacts of different boundary conditions on ozone can be significant throughout the year. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  17. Simbol-X Formation Flight and Image Reconstruction

    NASA Astrophysics Data System (ADS)

    Civitani, M.; Djalal, S.; Le Duigou, J. M.; La Marle, O.; Chipaux, R.

    2009-05-01

    Simbol-X is the first operational mission relying on two satellites flying in formation. The dynamics of the telescope, due to the formation flight concept, raises a variety of problems, such as image reconstruction, that can be better evaluated via simulation tools. We present here the first results obtained with Simulos, a simulation tool aimed at studying the relative spacecraft navigation and the weight of the different parameters in image reconstruction and telescope performance evaluation. The simulation relies on models of the attitude and formation flight sensors, the formation flight dynamics and control, the mirror, and the focal plane, while the image reconstruction is based on the Line of Sight (LOS) concept.

  18. Development of a self-assessment tool for measuring competences of obstetric nurses in rooming-in wards in China

    PubMed Central

    Zhang, Ju; Ye, Wenqin; Fan, Fan

    2015-01-01

    Introduction: To provide high-quality nursing care, a reliable and feasible competency assessment tool is critical. Although several questionnaire-based competency assessment tools have been reported, a tool specific for obstetric nurses in rooming-in wards is lacking. Therefore, the purpose of this research is to develop a competency assessment tool for obstetric rooming-in ward nurses. Methods: A literature review was conducted to inform individual intensive interviews with 14 nurse managers, educators, and primary nurses in rooming-in wards. Expert reviews (n = 15) were conducted to identify emergent themes in a Delphi fashion. A competency assessment questionnaire was then developed and tested with 246 rooming-in ward nurses in local hospitals. Results: We constructed a three-factor linear model for obstetric rooming-in nurse competency assessment. Further refinement resulted in a self-assessment questionnaire containing three first-tier, 12 second-tier, and 43 third-tier items for easy implementation. The questionnaire was reliable, contained satisfactory content, and had construct validity. Discussion: Our competency assessment tool provides a systematic, easy, and operational subjective evaluation model for nursing managers and administrators to evaluate obstetric rooming-in ward primary nurses. The application of this tool will facilitate various human resources functions, such as nurse training/education effect evaluation, and will eventually promote high-quality nursing care delivery. PMID:26770468

  19. METAPHOR: Programmer's guide, Version 1

    NASA Technical Reports Server (NTRS)

    Furchtgott, D. G.

    1979-01-01

    The internal structure of the Michigan Evaluation Aid for Performability (METAPHOR), an interactive software package that facilitates performability modeling and evaluation, is described. Revised and supplemented guides are prepared in order to maintain up-to-date documentation of the system. Programmed tools that facilitate each step of performability model construction and model solution are given.

  20. Simulating forage crop production in a northern climate with the Integrated Farm System Model

    USDA-ARS?s Scientific Manuscript database

    Whole-farm simulation models are useful tools for evaluating the effect of management practices and climate variability on the agro-environmental and economic performance of farms. A few process-based farm-scale models have been developed, but none have been evaluated in a northern region with a sho...

  1. FIA BioSum: a tool to evaluate financial costs, opportunities and effectiveness of fuel treatments.

    Treesearch

    Jeremy Fried; Glenn Christensen

    2004-01-01

    FIA BioSum, a tool developed by the USDA Forest Service's Forest Inventory and Analysis (FIA) Program, generates reliable cost estimates, identifies opportunities, and evaluates the effectiveness of fuel treatments in forested landscapes. BioSum is an analytic framework that integrates a suite of widely used computer models with a foundation of attribute-rich,...

  2. Designing tools for oil exploration using nuclear modeling

    NASA Astrophysics Data System (ADS)

    Mauborgne, Marie-Laure; Allioli, Françoise; Manclossi, Mauro; Nicoletti, Luisa; Stoller, Chris; Evans, Mike

    2017-09-01

    When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time-consuming and costly design iterations. During tool response characterization, modeling is benchmarked with experimental data and then used to complement and expand the database to make it more detailed and inclusive of more measurement environments that are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross-section databases, focusing on the response to a few elements found in the tool, borehole, and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  3. Development of a new generation of high-resolution anatomical models for medical device evaluation: the Virtual Population 3.0

    NASA Astrophysics Data System (ADS)

    Gosselin, Marie-Christine; Neufeld, Esra; Moser, Heidi; Huber, Eveline; Farcito, Silvia; Gerber, Livia; Jedensjö, Maria; Hilber, Isabel; Di Gennaro, Fabienne; Lloyd, Bryn; Cherubini, Emilio; Szczerba, Dominik; Kainz, Wolfgang; Kuster, Niels

    2014-09-01

    The Virtual Family computational whole-body anatomical human models were originally developed for electromagnetic (EM) exposure evaluations, in particular to study how absorption of radiofrequency radiation from external sources depends on anatomy. However, the models immediately garnered much broader interest and are now applied by over 300 research groups, many from medical applications research fields. In a first step, the Virtual Family was expanded to the Virtual Population to provide considerably broader population coverage with the inclusion of models of both sexes ranging in age from 5 to 84 years old. Although these models have proven to be invaluable for EM dosimetry, it became evident that significantly enhanced models are needed for reliable effectiveness and safety evaluations of diagnostic and therapeutic applications, including medical implants safety. This paper describes the research and development performed to obtain anatomical models that meet the requirements necessary for medical implant safety assessment applications. These include implementation of quality control procedures, re-segmentation at higher resolution, more-consistent tissue assignments, enhanced surface processing and numerous anatomical refinements. Several tools were developed to enhance the functionality of the models, including discretization tools, posing tools to expand the posture space covered, and multiple morphing tools, e.g., to develop pathological models or variations of existing ones. A comprehensive tissue properties database was compiled to complement the library of models. The results are a set of anatomically independent, accurate, and detailed models with smooth, yet feature-rich and topologically conforming surfaces. The models are therefore suited for the creation of unstructured meshes, and the possible applications of the models are extended to a wider range of solvers and physics. 
The impact of these improvements is shown for the MRI exposure of an adult woman with an orthopedic spinal implant. Future developments include the functionalization of the models for specific physical and physiological modeling tasks.

  4. Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    Doyle, Monica; ONeil, Daniel A.; Christensen, Carissa B.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS) is a decision support tool designed to aid program managers and strategic planners in determining how to invest technology research and development dollars. It is an Excel-based modeling package that allows a user to build complex space architectures and evaluate the impact of various technology choices. ATLAS contains system models, cost and operations models, a campaign timeline and a centralized technology database. Technology data for all system models is drawn from a common database, the ATLAS Technology Tool Box (TTB). The TTB provides a comprehensive, architecture-independent technology database that is keyed to current and future timeframes.

  5. Evaluating the Usability of a Professional Modeling Tool Repurposed for Middle School Learning

    NASA Astrophysics Data System (ADS)

    Peters, Vanessa L.; Songer, Nancy Butler

    2013-10-01

    This paper reports the results of a three-stage usability test of a modeling tool designed to support learners' deep understanding of the impacts of climate change on ecosystems. The design process involved repurposing an existing modeling technology used by professional scientists into a learning tool specifically designed for middle school students. To evaluate usability, we analyzed students' task performance and task completion time as they worked on an activity with the repurposed modeling technology. In stage 1, we conducted remote testing of an early modeling prototype with urban middle school students (n = 84). In stages 2 and 3, we used screencasting software to record students' mouse and keyboard movements during collaborative think-alouds (n = 22) and conducted a qualitative analysis of their peer discussions. Taken together, the study findings revealed two kinds of usability issues that interfered with students' productive use of the tool: issues related to the use of data and information, and issues related to the use of the modeling technology. The study findings resulted in design improvements that led to stronger usability outcomes and higher task performance among students. In this paper, we describe our methods for usability testing, our research findings, and our design solutions for supporting students' use of the modeling technology and use of data. The paper concludes with implications for the design and study of modeling technologies for science learning.

  6. 2D Hydrodynamic Based Logic Modeling Tool for River Restoration Decision Analysis: A Quantitative Approach to Project Prioritization

    NASA Astrophysics Data System (ADS)

    Bandrowski, D.; Lai, Y.; Bradley, N.; Gaeuman, D. A.; Murauskas, J.; Som, N. A.; Martin, A.; Goodman, D.; Alvarez, J.

    2014-12-01

    In the field of river restoration sciences there is a growing need for analytical modeling tools and quantitative processes to help identify and prioritize project sites. 2D hydraulic models have become more common in recent years and with the availability of robust data sets and computing technology, it is now possible to evaluate large river systems at the reach scale. The Trinity River Restoration Program is now analyzing a 40 mile segment of the Trinity River to determine priority and implementation sequencing for its Phase II rehabilitation projects. A comprehensive approach and quantitative tool has recently been developed to analyze this complex river system referred to as: 2D-Hydrodynamic Based Logic Modeling (2D-HBLM). This tool utilizes various hydraulic output parameters combined with biological, ecological, and physical metrics at user-defined spatial scales. These metrics and their associated algorithms are the underpinnings of the 2D-HBLM habitat module used to evaluate geomorphic characteristics, riverine processes, and habitat complexity. The habitat metrics are further integrated into a comprehensive Logic Model framework to perform statistical analyses to assess project prioritization. The Logic Model will analyze various potential project sites by evaluating connectivity using principal component methods. The 2D-HBLM tool will help inform management and decision makers by using a quantitative process to optimize desired response variables with balancing important limiting factors in determining the highest priority locations within the river corridor to implement restoration projects. Effective river restoration prioritization starts with well-crafted goals that identify the biological objectives, address underlying causes of habitat change, and recognizes that social, economic, and land use limiting factors may constrain restoration options (Bechie et. al. 2008). 
Applying natural resource management actions, like restoration prioritization, is essential for successful project implementation (Conroy and Peterson, 2013). Evaluating tradeoffs and examining alternatives to improve fish habitat through optimization modeling is not just a trend but rather a scientific strategy that management should embrace and apply in its decision framework.

  7. Data Visualization Saliency Model: A Tool for Evaluating Abstract Data Visualizations

    DOE PAGES

    Matzen, Laura E.; Haass, Michael J.; Divis, Kristin M.; ...

    2017-08-29

Evaluating the effectiveness of data visualizations is a challenging undertaking and often relies on one-off studies that test a visualization in the context of one specific task. Researchers across the fields of data science, visualization, and human-computer interaction are calling for foundational tools and principles that could be applied to assessing the effectiveness of data visualizations in a more rapid and generalizable manner. One possibility for such a tool is a model of visual saliency for data visualizations. Visual saliency models are typically based on the properties of the human visual cortex and predict which areas of a scene have visual features (e.g., color, luminance, edges) that are likely to draw a viewer's attention. While these models can accurately predict where viewers will look in a natural scene, they typically do not perform well for abstract data visualizations. In this paper, we discuss the reasons for the poor performance of existing saliency models when applied to data visualizations. We introduce the Data Visualization Saliency (DVS) model, a saliency model tailored to address some of these weaknesses, and we test the performance of the DVS model and existing saliency models by comparing the saliency maps produced by the models to eye tracking data obtained from human viewers. In conclusion, we describe how modified saliency models could be used as general tools for assessing the effectiveness of visualizations, including the strengths and weaknesses of this approach.
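The comparison the abstract describes, scoring a model's saliency map against eye-tracking data, is commonly quantified with the correlation coefficient (CC) metric. A minimal sketch of that metric is below; the maps are synthetic stand-ins, since neither the DVS model nor the eye-tracking data is reproduced here:

```python
import numpy as np

def saliency_cc(saliency_map, fixation_map):
    """Pearson correlation (the CC metric) between a model's saliency map and
    an eye-tracking fixation density map of the same shape."""
    s = (saliency_map - saliency_map.mean()) / saliency_map.std()
    f = (fixation_map - fixation_map.mean()) / fixation_map.std()
    return float((s * f).mean())

# Synthetic stand-ins: a map identical to the fixation data scores ~1.0,
# while an unrelated noise map scores near 0.0.
rng = np.random.default_rng(0)
fixations = rng.random((32, 32))
perfect = fixations.copy()
unrelated = rng.random((32, 32))
```

In practice the CC score is usually reported alongside rank-based metrics such as AUC, since each captures a different aspect of map agreement.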

  8. Data Visualization Saliency Model: A Tool for Evaluating Abstract Data Visualizations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzen, Laura E.; Haass, Michael J.; Divis, Kristin M.

Evaluating the effectiveness of data visualizations is a challenging undertaking and often relies on one-off studies that test a visualization in the context of one specific task. Researchers across the fields of data science, visualization, and human-computer interaction are calling for foundational tools and principles that could be applied to assessing the effectiveness of data visualizations in a more rapid and generalizable manner. One possibility for such a tool is a model of visual saliency for data visualizations. Visual saliency models are typically based on the properties of the human visual cortex and predict which areas of a scene have visual features (e.g., color, luminance, edges) that are likely to draw a viewer's attention. While these models can accurately predict where viewers will look in a natural scene, they typically do not perform well for abstract data visualizations. In this paper, we discuss the reasons for the poor performance of existing saliency models when applied to data visualizations. We introduce the Data Visualization Saliency (DVS) model, a saliency model tailored to address some of these weaknesses, and we test the performance of the DVS model and existing saliency models by comparing the saliency maps produced by the models to eye tracking data obtained from human viewers. In conclusion, we describe how modified saliency models could be used as general tools for assessing the effectiveness of visualizations, including the strengths and weaknesses of this approach.

  9. An Introduction to Solar Decision-Making Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mow, Benjamin

    2017-09-12

    The National Renewable Energy Laboratory (NREL) offers a variety of models and analysis tools to help decision makers evaluate and make informed decisions about solar projects, policies, and programs. This fact sheet aims to help decision makers determine which NREL tool to use for a given solar project or policy question, depending on its scope.

  10. Chapter 13: Tools for analysis

    Treesearch

    William Elliot; Kevin Hyde; Lee MacDonald; James McKean

    2007-01-01

    This chapter presents a synthesis of current computer modeling tools that are, or could be, adopted for use in evaluating the cumulative watershed effects of fuel management. The chapter focuses on runoff, soil erosion and slope stability predictive tools. Readers should refer to chapters on soil erosion and stability for more detailed information on the physical...

  11. Assessment of tools for modeling aircraft noise in national parks

    DOT National Transportation Integrated Search

    2005-03-18

The first objective of this study was to evaluate the series of model enhancements that were included in INM as a result of the recommendations from the GCNP MVS. Specifically, there was a desire to evaluate the performance of the latest versions...

  12. Comparative evaluation of spectroscopic models using different multivariate statistical tools in a multicancer scenario

    NASA Astrophysics Data System (ADS)

    Ghanate, A. D.; Kothiwale, S.; Singh, S. P.; Bertrand, Dominique; Krishna, C. Murali

    2011-02-01

Cancer is now recognized as one of the major causes of morbidity and mortality. Histopathological diagnosis, the gold standard, is shown to be subjective, time consuming, prone to interobserver disagreement, and often fails to predict prognosis. Optical spectroscopic methods are being contemplated as adjuncts or alternatives to conventional cancer diagnostics. The most important aspect of these approaches is their objectivity, and multivariate statistical tools play a major role in realizing it. However, rigorous evaluation of the robustness of spectral models is a prerequisite. The utility of Raman spectroscopy in the diagnosis of cancers has been well established. Until now, the specificity and applicability of spectral models have been evaluated for specific cancer types. In this study, we have evaluated the utility of spectroscopic models representing normal and malignant tissues of the breast, cervix, colon, larynx, and oral cavity in a broader perspective, using different multivariate tests. The limit test, which was used in our earlier study, gave high sensitivity but suffered from poor specificity. The performance of other methods such as factorial discriminant analysis and partial least square discriminant analysis is on par with more complex nonlinear methods such as decision trees, but they provide very little information about the classification model. This comparative study thus demonstrates not just the efficacy of Raman spectroscopic models but also the applicability and limitations of different multivariate tools for discrimination under complex conditions such as the multicancer scenario.
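At the simplest end of the multivariate toolbox the study compares sits the nearest-centroid classifier, a stripped-down relative of factorial discriminant analysis. The sketch below illustrates the idea on synthetic stand-in "spectra"; the band count, class offset, and noise level are invented for illustration and have nothing to do with the paper's Raman data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for Raman spectra (20 bands); the malignant class
# differs from normal by a small additive offset plus noise.
def spectra(n, offset):
    base = np.sin(np.linspace(0, np.pi, 20))
    return base + offset + 0.05 * rng.standard_normal((n, 20))

train = {"normal": spectra(40, 0.0), "malignant": spectra(40, 0.3)}
centroids = {k: v.mean(axis=0) for k, v in train.items()}

def classify(spectrum):
    # Assign the class whose mean training spectrum is nearest (Euclidean).
    return min(centroids, key=lambda k: np.linalg.norm(spectrum - centroids[k]))

test_malig = spectra(25, 0.3)
accuracy = float(np.mean([classify(s) == "malignant" for s in test_malig]))
```

Unlike a one-class limit test, which only asks "is this spectrum within a threshold of normal?", a discriminant of this kind models both classes, which is one reason the paper finds the discriminant family better balanced between sensitivity and specificity.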

  13. A Microsoft Project-Based Planning, Tracking, and Management Tool for the National Transonic Facility's Model Changeover Process

    NASA Technical Reports Server (NTRS)

    Vairo, Daniel M.

    1998-01-01

The removal and installation of sting-mounted wind tunnel models in the National Transonic Facility (NTF) is a multi-task process having a large impact on the annual throughput of the facility. Approximately ten model removal and installation cycles occur annually at the NTF, with each cycle requiring slightly over five days to complete. The various tasks of the model changeover process were modeled in Microsoft Project as a template to provide a planning, tracking, and management tool. The template can also be used as a tool to evaluate improvements to this process. This document describes the development of the template and provides step-by-step instructions on its use as a planning and tracking tool. A secondary role of this document is to provide an overview of the model changeover process and briefly describe the tasks associated with it.

  14. Space Weather Products at the Community Coordinated Modeling Center

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Kuznetsova, M.; Pulkkinen, A.; Maddox, M.; Rastaetter, L.; Berrios, D.; MacNeice, P.

    2010-01-01

    The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aiming at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides to researchers the use of space science models, even if they are not model owners themselves. The second CCMC activity is to support Space Weather forecasting at national Space Weather Forecasting Centers. This second activity involves model evaluations, model transitions to operations, and the development of space weather forecasting tools. Owing to the pace of development in the science community, new model capabilities emerge frequently. Consequently, space weather products and tools involve not only increased validity, but often entirely new capabilities. This presentation will review the present state of space weather tools as well as point out emerging future capabilities.

  15. Application of structured analysis to a telerobotic system

    NASA Technical Reports Server (NTRS)

    Dashman, Eric; Mclin, David; Harrison, F. W.; Soloway, Donald; Young, Steven

    1990-01-01

    The analysis and evaluation of a multiple arm telerobotic research and demonstration system developed by the NASA Intelligent Systems Research Laboratory (ISRL) is described. Structured analysis techniques were used to develop a detailed requirements model of an existing telerobotic testbed. Performance models generated during this process were used to further evaluate the total system. A commercial CASE tool called Teamwork was used to carry out the structured analysis and development of the functional requirements model. A structured analysis and design process using the ISRL telerobotic system as a model is described. Evaluation of this system focused on the identification of bottlenecks in this implementation. The results demonstrate that the use of structured methods and analysis tools can give useful performance information early in a design cycle. This information can be used to ensure that the proposed system meets its design requirements before it is built.

  16. Using Coupled Energy, Airflow and IAQ Software (TRNSYS/CONTAM) to Evaluate Building Ventilation Strategies.

    PubMed

    Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J

    2016-03-01

    Building energy analysis tools are available in many forms that provide the ability to address a broad spectrum of energy-related issues in various combinations. Often these tools operate in isolation from one another, making it difficult to evaluate the interactions between related phenomena and interacting systems, forcing oversimplified assumptions to be made about various phenomena that could otherwise be addressed directly with another tool. One example of such interdependence is the interaction between heat transfer, inter-zone airflow and indoor contaminant transport. In order to better address these interdependencies, the National Institute of Standards and Technology (NIST) has developed an updated version of the multi-zone airflow and contaminant transport modelling tool, CONTAM, along with a set of utilities to enable coupling of the full CONTAM model with the TRNSYS simulation tool in a more seamless manner and with additional capabilities that were previously not available. This paper provides an overview of these new capabilities and applies them to simulating a medium-size office building. These simulations address the interaction between whole-building energy, airflow and contaminant transport in evaluating various ventilation strategies including natural and demand-controlled ventilation. CONTAM has been in practical use for many years allowing building designers, as well as IAQ and ventilation system analysts, to simulate the complex interactions between building physical layout and HVAC system configuration in determining building airflow and contaminant transport. It has been widely used to design and analyse smoke management systems and evaluate building performance in response to chemical, biological and radiological events. 
While CONTAM has been used to address design and performance of buildings implementing energy-conserving ventilation systems, e.g., natural and hybrid, this new coupled simulation capability will enable users to couple CONTAM with existing energy analysis software to address the interaction between indoor air quality considerations and energy conservation measures in building design and analysis. This paper presents two practical case studies using the coupled modelling tool to evaluate IAQ performance of a CO2-based demand-controlled ventilation system under different levels of building envelope airtightness and the design and analysis of a natural ventilation system.

  17. Using Coupled Energy, Airflow and IAQ Software (TRNSYS/CONTAM) to Evaluate Building Ventilation Strategies

    PubMed Central

    Dols, W. Stuart.; Emmerich, Steven J.; Polidoro, Brian J.

    2016-01-01

Building energy analysis tools are available in many forms that provide the ability to address a broad spectrum of energy-related issues in various combinations. Often these tools operate in isolation from one another, making it difficult to evaluate the interactions between related phenomena and interacting systems, forcing oversimplified assumptions to be made about various phenomena that could otherwise be addressed directly with another tool. One example of such interdependence is the interaction between heat transfer, inter-zone airflow and indoor contaminant transport. In order to better address these interdependencies, the National Institute of Standards and Technology (NIST) has developed an updated version of the multi-zone airflow and contaminant transport modelling tool, CONTAM, along with a set of utilities to enable coupling of the full CONTAM model with the TRNSYS simulation tool in a more seamless manner and with additional capabilities that were previously not available. This paper provides an overview of these new capabilities and applies them to simulating a medium-size office building. These simulations address the interaction between whole-building energy, airflow and contaminant transport in evaluating various ventilation strategies including natural and demand-controlled ventilation. Practical Application: CONTAM has been in practical use for many years allowing building designers, as well as IAQ and ventilation system analysts, to simulate the complex interactions between building physical layout and HVAC system configuration in determining building airflow and contaminant transport. It has been widely used to design and analyse smoke management systems and evaluate building performance in response to chemical, biological and radiological events.
While CONTAM has been used to address design and performance of buildings implementing energy-conserving ventilation systems, e.g., natural and hybrid, this new coupled simulation capability will enable users to couple CONTAM with existing energy analysis software to address the interaction between indoor air quality considerations and energy conservation measures in building design and analysis. This paper presents two practical case studies using the coupled modelling tool to evaluate IAQ performance of a CO2-based demand-controlled ventilation system under different levels of building envelope airtightness and the design and analysis of a natural ventilation system. PMID:27099405
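The CO2-based demand-controlled ventilation case study rests, at its core, on a CO2 mass balance that a multizone tool like CONTAM generalizes across many zones and flow paths. A single-zone sketch of that balance is below; the zone volume, generation rate, setpoint, and bang-bang control law are all hypothetical illustration values, not taken from the paper:

```python
def co2_step(c, c_out, gen, q, vol, dt):
    """One explicit-Euler step of a single-zone CO2 mass balance:
    V dC/dt = Q*(C_out - C) + G
    (C in mg/m^3, Q in m^3/h, G in mg/h, V in m^3, dt in h)."""
    return c + dt * (q * (c_out - c) + gen) / vol

# Toy demand-controlled ventilation: boost outdoor airflow when indoor
# CO2 exceeds a setpoint (hypothetical numbers throughout).
vol, c_out, gen = 250.0, 700.0, 9.0e5   # zone volume, outdoor CO2, occupant source
c, q = c_out, 100.0                     # start at outdoor concentration, low airflow
for _ in range(200):                    # 200 steps of 0.05 h (10 h total)
    q = 600.0 if c > 1800.0 else 100.0  # hypothetical DCV control law
    c = co2_step(c, c_out, gen, q, vol, 0.05)
```

The boosted airflow drives the zone toward its high-ventilation steady state, C_out + G/Q, illustrating why the CO2 setpoint and the boosted flow rate have to be chosen together.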

  18. Climate Model Diagnostic Analyzer Web Service System

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2013-12-01

    The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables the physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation and (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, and (4) the calculation of difference between two variables. 
A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA is planned to be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. The requirements of the educational tool are defined in collaboration with the school organizers, and CMDA is customized to meet the requirements accordingly. The tool needs to be production quality for 30+ simultaneous users. The summer school will thus serve as a valuable testbed for the tool development, preparing CMDA to serve the Earth-science modeling and model-analysis community at the end of the project. This work was funded by the NASA Earth Science Program called Computational Modeling Algorithms and Cyberinfrastructure (CMAC).
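Of CMDA's four analysis capabilities, the annual-mean calculation and the two-variable correlation are simple to sketch. The implementation below is an illustrative stand-in (not CMDA's actual service code), operating on synthetic data rather than Obs4MIPs or CMIP5 output:

```python
import numpy as np

def annual_mean(monthly):
    """Annual means from a flat monthly series of whole years,
    in the spirit of CMDA analysis capability (1)."""
    return np.asarray(monthly, dtype=float).reshape(-1, 12).mean(axis=1)

def pattern_correlation(a, b):
    """Correlation between two variables over matching points,
    in the spirit of CMDA analysis capability (3)."""
    return float(np.corrcoef(np.ravel(a), np.ravel(b))[0, 1])

# Toy check on two years of synthetic monthly data.
series = np.concatenate([np.full(12, 1.0), np.full(12, 3.0)])
means = annual_mean(series)                           # one mean per year
r = pattern_correlation([1, 2, 3, 4], [2, 4, 6, 8])   # perfectly linear pair
```

Wrapping functions like these behind a web framework (the abstract names Flask, Gunicorn, and Tornado) is what turns the analysis code into the CMDA web service.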

  19. Pieces of the Puzzle: Tracking the Chemical Component of the ...

    EPA Pesticide Factsheets

    This presentation provides an overview of the risk assessment conducted at the U.S. EPA, as well as some research examples related to the exposome concept. This presentation also provides the recommendation of using two organizational and predictive frameworks for tracking chemical components in the exposome. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  20. Screening tool to evaluate the vulnerability of down-gradient receptors to groundwater contaminants from uncapped landfills

    USGS Publications Warehouse

    Baker, Ronald J.; Reilly, Timothy J.; Lopez, Anthony R.; Romanok, Kristin M.; Wengrowski, Edward W

    2015-01-01

A screening tool for quantifying the levels of concern posed to down-gradient receptors (streams, wetlands, and residential lots) by contaminants detected in monitoring wells on or near landfills was developed and evaluated. The tool uses Quick Domenico Multi-scenario (QDM), a spreadsheet implementation of Domenico-based solute transport, to estimate concentrations of contaminants reaching receptors under steady-state conditions from a constant-strength source. Unlike most other available Domenico-based model applications, QDM calculates the time for down-gradient contaminant concentrations to approach steady state and appropriate dispersivity values, and allows for up to fifty simulations on a single spreadsheet. Sensitivity of QDM solutions to critical model parameters was quantified. The screening tool uses QDM results to categorize landfills as having high, moderate, and low levels of concern, based on contaminant concentrations reaching receptors relative to regulatory concentrations. The application of this tool was demonstrated by assessing levels of concern (as defined by the New Jersey Pinelands Commission) for thirty closed, uncapped landfills in the New Jersey Pinelands National Reserve, using historic water-quality data from monitoring wells on and near landfills and hydraulic parameters from regional flow models. Twelve of these landfills are categorized as having high levels of concern, indicating a need for further assessment. This tool is not a replacement for conventional numerically based transport models or other available Domenico-based applications, but is suitable for quickly assessing the level of concern posed by a landfill or other contaminant point source before expensive and lengthy monitoring or remediation measures are taken. In addition to quantifying the level of concern using historic groundwater-monitoring data, the tool allows for archiving model scenarios and adding refinements as new data become available.
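QDM is described as a spreadsheet implementation of Domenico-based solute transport. One common statement of the Domenico (1987) steady-state centerline solution for a constant-strength planar source is sketched below; QDM's exact formulation may differ, and the landfill parameter values here are invented for illustration:

```python
from math import erf, exp, sqrt

def domenico_centerline(c0, x, v, ax, ay, az, lam, Y, Z):
    """Steady-state centerline concentration at distance x down-gradient of a
    constant-strength planar source (one common form of the Domenico solution).
    c0: source concentration; v: seepage velocity [m/yr]; ax/ay/az: longitudinal,
    transverse, vertical dispersivities [m]; lam: first-order decay rate [1/yr];
    Y, Z: source width and depth [m]."""
    decay = exp((x / (2.0 * ax)) * (1.0 - sqrt(1.0 + 4.0 * lam * ax / v)))
    lateral = erf(Y / (4.0 * sqrt(ay * x)))
    vertical = erf(Z / (2.0 * sqrt(az * x)))
    return c0 * decay * lateral * vertical

# Hypothetical landfill scenario (illustrative numbers, not from the paper):
c_near = domenico_centerline(c0=10.0, x=100.0, v=30.0, ax=10.0,
                             ay=1.0, az=0.1, lam=0.05, Y=30.0, Z=5.0)
c_far = domenico_centerline(c0=10.0, x=200.0, v=30.0, ax=10.0,
                            ay=1.0, az=0.1, lam=0.05, Y=30.0, Z=5.0)
```

A screening tool of the kind described would compare concentrations like `c_near` against regulatory thresholds at each receptor distance to assign the high/moderate/low level of concern.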

  1. Evolution of Natural Attenuation Evaluation Protocols

    EPA Science Inventory

    Traditionally the evaluation of the efficacy of natural attenuation was based on changes in contaminant concentrations and mass reduction. Statistical tools and models such as Bioscreen provided evaluation protocols which now are being approached via other vehicles including m...

  2. Validation of Tendril TrueHome Using Software-to-Software Comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maguire, Jeffrey B; Horowitz, Scott G; Moore, Nathan

    This study performed comparative evaluation of EnergyPlus version 8.6 and Tendril TrueHome, two physics-based home energy simulation models, to identify differences in energy consumption predictions between the two programs and resolve discrepancies between them. EnergyPlus is considered a benchmark, best-in-class software tool for building energy simulation. This exercise sought to improve both software tools through additional evaluation/scrutiny.

  3. OPTIMIZING BMP PLACEMENT AT WATERSHED-SCALE USING SUSTAIN

    EPA Science Inventory

    Watershed and stormwater managers need modeling tools to evaluate alternative plans for environmental quality restoration and protection needs in urban and developing areas. A watershed-scale decision-support system, based on cost optimization, provides an essential tool to suppo...

  4. U.S. EPA's Watershed Management Research Activities

    EPA Science Inventory

    Watershed and stormwater managers need modeling tools to evaluate alternative plans for environmental quality restoration and protection needs in urban and developing areas. A watershed-scale decision-support system, based on cost optimization, provides an essential tool to suppo...

  5. The Virtual Learning Commons (VLC): Enabling Co-Innovation Across Disciplines

    NASA Astrophysics Data System (ADS)

    Pennington, D. D.; Gandara, A.; Del Rio, N.

    2014-12-01

A key challenge for scientists addressing grand-challenge problems is identifying, understanding, and integrating potentially relevant methods, models, and tools that are rapidly evolving in the informatics community. Such tools are essential for effectively integrating data and models in complex research projects, yet it is often difficult to know what tools are available and it is not easy to understand or evaluate how they might be used in a given research context. The goal of the National Science Foundation-funded Virtual Learning Commons (VLC) is to improve awareness and understanding of emerging methodologies and technologies, facilitate individual and group evaluation of these, and trace the impact of innovations within and across teams, disciplines, and communities. The VLC is a Web-based social bookmarking site designed specifically to support knowledge exchange in research communities. It is founded on well-developed models of technology adoption, diffusion of innovation, and experiential learning. The VLC makes use of Web 2.0 (Social Web) and Web 3.0 (Semantic Web) approaches. Semantic Web approaches enable discovery of potentially relevant methods, models, and tools, while Social Web approaches enable collaborative learning about their function. The VLC is under development and the first release is expected Fall 2014.

  6. Investigating System Dependability Modeling Using AADL

    NASA Technical Reports Server (NTRS)

    Hall, Brendan; Driscoll, Kevin R.; Madl, Gabor

    2013-01-01

This report describes Architecture Analysis & Design Language (AADL) models for a diverse set of fault-tolerant, embedded data networks and describes the methods and tools used to create these models. It also includes error models per the AADL Error Annex. Some networks were modeled using Error Detection Isolation Containment Types (EDICT). This report gives a brief description for each of the networks, a description of its modeling, the model itself, and evaluations of the tools used for creating the models. The methodology includes a naming convention that supports a systematic way to enumerate all of the potential failure modes.

  7. Evaluation of Intersection Traffic Control Measures through Simulation

    NASA Astrophysics Data System (ADS)

    Asaithambi, Gowri; Sivanandan, R.

    2015-12-01

Traffic flow is stochastic in nature due to randomness in variables such as vehicle arrivals and speeds. Because of this, and because of complex vehicular interactions and manoeuvres, it is extremely difficult to model traffic flow through analytical methods. To study this type of complex traffic system and vehicle interactions, simulation is considered an effective tool. Application of homogeneous traffic models to heterogeneous traffic may not be able to capture the complex manoeuvres and interactions in such flows. Hence, a microscopic simulation model for heterogeneous traffic is developed using object-oriented concepts. This simulation model acts as a tool for evaluating various control measures at signalized intersections. The present study focuses on the evaluation of Right Turn Lane (RTL) and Channelised Left Turn Lane (CLTL). A sensitivity analysis was performed to evaluate RTL and CLTL by varying the approach volumes, turn proportions and turn lane lengths. RTL is found to be advantageous only up to certain approach volumes and right-turn proportions, beyond which it is counter-productive. CLTL is found to be advantageous for lower approach volumes for all turn proportions, signifying the benefits of CLTL. It is counter-productive for higher approach volumes and lower turn proportions. This study pinpoints the break-even points for various scenarios. The developed simulation model can be used as an appropriate intersection lane control tool for enhancing the efficiency of flow at intersections. This model can also be employed for scenario analysis and can be valuable to field traffic engineers in implementing vehicle-type-based and lane-based traffic control measures.
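The stochastic signal-queue behavior the abstract describes can be caricatured in a few lines: Bernoulli arrivals each second, a fixed-time signal, and deterministic discharge during green. The toy below is an illustration of the turn-lane comparison only, not the authors' object-oriented heterogeneous-traffic simulator; all rates and timings are assumed values:

```python
import random

def avg_queue(arrival_prob, cycle=60, green=30, headway=2, horizon=3600, seed=7):
    """Average queue length [veh] at a fixed-time signal: one Bernoulli arrival
    trial per second, one departure every `headway` seconds while green."""
    rng = random.Random(seed)
    queue, total = 0, 0
    for t in range(horizon):
        if rng.random() < arrival_prob:
            queue += 1
        if (t % cycle) < green and t % headway == 0 and queue:
            queue -= 1
        total += queue
    return total / horizon

# A 1000 veh/h approach with 30% right-turners: one shared lane versus a
# through lane relieved by an exclusive right-turn lane (RTL).
shared = avg_queue(1000 / 3600)           # all traffic in one lane
through_only = avg_queue(0.7 * 1000 / 3600)  # RTL removes the right-turners
```

With a 30 s green out of a 60 s cycle and a 2 s discharge headway, the lane capacity is 900 veh/h, so the shared lane is oversaturated while the relieved through lane is not; sweeping the volume and turn proportion in such a model is how break-even points like those in the study emerge.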

  8. Design of Friction Stir Spot Welding Tools by Using a Novel Thermal-Mechanical Approach

    PubMed Central

    Su, Zheng-Ming; Qiu, Qi-Hong; Lin, Pai-Chen

    2016-01-01

    A simple thermal-mechanical model for friction stir spot welding (FSSW) was developed to obtain similar weld performance for different weld tools. Use of the thermal-mechanical model and a combined approach enabled the design of weld tools for various sizes but similar qualities. Three weld tools for weld radii of 4, 5, and 6 mm were made to join 6061-T6 aluminum sheets. Performance evaluations of the three weld tools compared fracture behavior, microstructure, micro-hardness distribution, and welding temperature of welds in lap-shear specimens. For welds made by the three weld tools under identical processing conditions, failure loads were approximately proportional to tool size. Failure modes, microstructures, and micro-hardness distributions were similar. Welding temperatures correlated with frictional heat generation rate densities. Because the three weld tools sufficiently met all design objectives, the proposed approach is considered a simple and feasible guideline for preliminary tool design. PMID:28773800

  9. Design of Friction Stir Spot Welding Tools by Using a Novel Thermal-Mechanical Approach.

    PubMed

    Su, Zheng-Ming; Qiu, Qi-Hong; Lin, Pai-Chen

    2016-08-09

    A simple thermal-mechanical model for friction stir spot welding (FSSW) was developed to obtain similar weld performance for different weld tools. Use of the thermal-mechanical model and a combined approach enabled the design of weld tools for various sizes but similar qualities. Three weld tools for weld radii of 4, 5, and 6 mm were made to join 6061-T6 aluminum sheets. Performance evaluations of the three weld tools compared fracture behavior, microstructure, micro-hardness distribution, and welding temperature of welds in lap-shear specimens. For welds made by the three weld tools under identical processing conditions, failure loads were approximately proportional to tool size. Failure modes, microstructures, and micro-hardness distributions were similar. Welding temperatures correlated with frictional heat generation rate densities. Because the three weld tools sufficiently met all design objectives, the proposed approach is considered a simple and feasible guideline for preliminary tool design.
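The abstract notes that welding temperatures correlated with frictional heat generation rate densities. A textbook sliding-friction estimate for a flat circular tool shoulder (not necessarily the thermal-mechanical model the paper actually uses) makes the cubic dependence on tool radius explicit; the friction coefficient, contact pressure, and spindle speed below are hypothetical:

```python
from math import pi

def shoulder_heat_rate(mu, pressure, omega, radius):
    """Frictional heat generation rate [W] for a flat circular shoulder under
    the fully sliding condition: Q = (2/3) * pi * mu * p * omega * R^3
    (mu: friction coefficient, p: contact pressure [Pa],
    omega: rotational speed [rad/s], R: shoulder radius [m])."""
    return (2.0 / 3.0) * pi * mu * pressure * omega * radius ** 3

# Heat input grows as R^3, so tools sized for 4, 5, and 6 mm weld radii need
# rebalanced process parameters to reach similar weld quality (toy numbers):
rates = [shoulder_heat_rate(0.4, 50e6, 157.0, r * 1e-3) for r in (4, 5, 6)]
```

The R^3 scaling is the reason a tool-design approach that only matched geometry, without adjusting pressure or rotation speed, would overheat the larger welds.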

  10. Comparison of Data Development Tools for Populating Cognitive Models in Social Simulation

    DTIC Science & Technology

    2011-09-01

world surveys. STANLEY was evaluated by scoring sentiment in a document corpus and attempting to correlate those scores to a real-world issue. Results of the study indicate that the survey data tool generated case files of...

  11. A universal Model-R Coupler to facilitate the use of R functions for model calibration and analysis

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shuguang; Yan, Wende

    2014-01-01

Mathematical models are useful in various fields of science and engineering. However, it is a challenge for a model to take advantage of the open and growing set of functions (e.g., for model inversion) available on the R platform, because doing so normally requires accessing and revising the model's source code. To overcome this barrier, we developed a universal tool that aims to convert a model developed in any computer language into an R function, using the template and instruction concept of the Parameter ESTimation program (PEST) and the operational structure of the R-Soil and Water Assessment Tool (R-SWAT). The developed tool (Model-R Coupler) is promising because users of any model can connect an external algorithm (written in R) with their model to implement various model behavior analyses (e.g., parameter optimization, sensitivity and uncertainty analysis, performance evaluation, and visualization) without accessing or modifying the model's source code.
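The PEST template-and-instruction concept that the Model-R Coupler borrows can be sketched with standard-library string templating: calibration parameters are substituted into a model input file, the (unmodified) model runs, and results are scraped back out of its output. The marker syntax and field names below are simplified assumptions; real PEST templates and instruction files use their own delimiters:

```python
from string import Template

def write_input(template_text, params):
    """Render a model input file from a template: placeholders such as
    $porosity are replaced by calibration-parameter values (simplified
    marker syntax; PEST uses its own delimiter convention)."""
    return Template(template_text).substitute(params)

def read_output(output_text, key):
    """Instruction-file analogue: scan model output for a 'key = value' line."""
    for line in output_text.splitlines():
        if line.strip().startswith(key):
            return float(line.split("=")[1])
    raise KeyError(key)

# Hypothetical round trip: parameters in, a simulated quantity back out.
tpl = "porosity = $porosity\nconductivity = $conductivity\n"
rendered = write_input(tpl, {"porosity": 0.35, "conductivity": 1.2e-4})
sim = read_output("iter 5\nrecharge = 0.002\n", "recharge")
```

Because the coupling happens entirely through files, the same wrapper works for a model written in any language, which is the point of the Coupler's design.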

  12. A Regional Climate Model Evaluation System based on Satellite and other Observations

    NASA Astrophysics Data System (ADS)

    Lean, P.; Kim, J.; Waliser, D. E.; Hall, A. D.; Mattmann, C. A.; Granger, S. L.; Case, K.; Goodale, C.; Hart, A.; Zimdars, P.; Guan, B.; Molotch, N. P.; Kaki, S.

    2010-12-01

    Regional climate models are a fundamental tool for downscaling global climate simulations and projections, such as those contributing to the Coupled Model Intercomparison Projects (CMIPs) that form the basis of the IPCC Assessment Reports. The regional modeling process provides the means to accommodate higher resolution and a greater complexity of Earth System processes. Evaluation of both the global and regional climate models against observations is essential to identify model weaknesses and to direct future model development efforts focused on reducing the uncertainty associated with climate projections. However, the lack of reliable observational data and the lack of formal tools are among the serious limitations to addressing these objectives. Recent satellite observations are particularly useful as they provide a wealth of information on many different aspects of the climate system, but due to their large volume and the difficulties associated with accessing and using the data, these datasets have been generally underutilized in model evaluation studies. Recognizing this problem, NASA JPL / UCLA is developing a model evaluation system to help make satellite observations, in conjunction with in-situ, assimilated, and reanalysis datasets, more readily accessible to the modeling community. The system includes a central database to store multiple datasets in a common format and codes for calculating predefined statistical metrics to assess model performance. This allows the time taken to compare model simulations with satellite observations to be reduced from weeks to days. Early results from the use of this new model evaluation system for evaluating regional climate simulations over the California/western US region will be presented.
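
    The "predefined statistical metrics" such a system computes typically include simple aggregate comparisons of model output against observations. A stdlib-only sketch with made-up values; the metric set here is assumed for illustration, not taken from the JPL/UCLA system:

```python
import math

def bias(model, obs):
    """Mean model-minus-observation difference."""
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def rmse(model, obs):
    """Root-mean-square error of model against observations."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

# Toy co-located time series (e.g., monthly-mean temperature at one grid cell).
model = [10.2, 11.5, 9.8, 12.1]
obs   = [10.0, 11.0, 10.5, 12.0]
print(bias(model, obs), rmse(model, obs))
```

    In practice the same pattern is applied per variable, region, and season, with the database supplying the co-located model and satellite fields.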

  13. Evaluating the mitigation of greenhouse gas emissions and adaptation in dairy production.

    USDA-ARS?s Scientific Manuscript database

    Process-level modeling at the farm scale provides a tool for evaluating strategies for both mitigating greenhouse gas emissions and adapting to climate change. The Integrated Farm System Model (IFSM) simulates representative crop, beef or dairy farms over many years of weather to predict performance...

  14. Bayesian Posterior Odds Ratios: Statistical Tools for Collaborative Evaluations

    ERIC Educational Resources Information Center

    Hicks, Tyler; Rodríguez-Campos, Liliana; Choi, Jeong Hoon

    2018-01-01

    To begin statistical analysis, Bayesians quantify their confidence in modeling hypotheses with priors. A prior describes the probability of a certain modeling hypothesis apart from the data. Bayesians should be able to defend their choice of prior to a skeptical audience. Collaboration between evaluators and stakeholders could make their choices…

  15. Models, Measurements, and Local Decisions: Assessing and ...

    EPA Pesticide Factsheets

    This presentation includes a combination of modeling and measurement results to characterize near-source air quality in Newark, New Jersey, with consideration of how this information could be used to inform decision making to reduce the risk of health impacts. Decisions could include either exposure or emissions reduction and could involve a host of stakeholders, including residents, academics, NGOs, and local and federal agencies. This presentation includes results from the C-PORT modeling system and from a citizen science project in the local area. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  16. SUSTAIN - A BMP PROCESS AND PLACEMENT TOOL FOR URBAN WATERSHEDS

    EPA Science Inventory

    Watershed and stormwater managers need modeling tools to evaluate how best to address environmental quality restoration and protection needs in urban and developing areas. Significant investments are needed to protect and restore water quality, address total maximum daily loads (...

  18. Evaluation of work zone enhancement software programs.

    DOT National Transportation Integrated Search

    2009-09-01

    The Missouri Department of Transportation (MoDOT) is looking for software tools that can assist in developing effective plans to manage and communicate work zone activities. QuickZone, CA4PRS, VISSIM, and spreadsheet models are the tools that MoD...

  19. Planning Tool for Strategic Evaluation of Facility Plans - 13570

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Magoulas, Virginia; Cercy, Michael; Hall, Irin

    2013-07-01

    Savannah River National Laboratory (SRNL) has developed a strategic planning tool for evaluating the utilization of its unique resources for the processing and research and development of nuclear materials. The Planning Tool is a strategic-level tool for assessing multiple missions that could be conducted utilizing the SRNL facilities and for showcasing the plan. Traditional approaches using standard scheduling tools and laying out a strategy on paper tended to be labor intensive and offered either a limited or a cluttered view for visualizing and communicating results. A tool that can assess the process throughput, duration, and utilization of the facility was needed. SRNL teamed with Newport News Shipbuilding (NNS), a division of Huntington Ingalls Industries, to create the next-generation Planning Tool. The goal of this collaboration was to create a simulation-based tool that allows for quick evaluation of strategies with respect to new or changing missions, and that clearly communicates results to the decision makers. This tool has been built upon mature modeling and simulation software previously developed by NNS. The Planning Tool provides a forum for capturing dependencies, constraints, activity flows, and variable factors. It is also a platform for quickly evaluating multiple mission scenarios, dynamically adding/updating scenarios, generating multiple views for evaluating/communicating results, and understanding where there are areas of risk and opportunity with respect to capacity. The Planning Tool is useful in that it presents a clear visual plan for the missions at the Savannah River Site (SRS). It not only assists in communicating the plans to SRS corporate management, but also gives area stakeholders a visual look at the future plans for SRS. The design of this tool makes it easily deployable to other facility and mission planning endeavors. (authors)

  20. Predicting tool life in turning operations using neural networks and image processing

    NASA Astrophysics Data System (ADS)

    Mikołajczyk, T.; Nowicki, K.; Bustillo, A.; Yu Pimenov, D.

    2018-05-01

    A two-step method is presented for the automatic prediction of tool life in turning operations. First, experimental data are collected for three cutting edges under the same constant processing conditions. In these experiments, the tool wear parameter, VB, is measured with conventional methods, and the same parameter is estimated using Neural Wear, a customized software package that combines flank wear image recognition and Artificial Neural Networks (ANNs). Second, an ANN model of tool life is trained with the data collected from the first two cutting edges, and the resulting model is evaluated on two different subsets for the third cutting edge: the first subset is obtained from the direct measurement of tool wear and the second from the Neural Wear software, which estimates tool wear from edge images. Although the completely automated solution (Neural Wear software for tool wear recognition plus the ANN model of tool life prediction) presented a slightly higher error than the direct measurements, it was within the same range and could meet industrial requirements. These results confirm that the combination of image recognition software and ANN modelling could potentially be developed into a useful industrial tool for low-cost estimation of tool life in turning operations.

  1. Climate Model Diagnostic Analyzer

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; Pan, Lei; Zhai, Chengxing; Tang, Benyang; Kubar, Terry; Zhang, Zia; Wang, Wei

    2015-01-01

    The comprehensive and innovative evaluation of climate models with newly available global observations is critically needed for the improvement of climate model current-state representation and future-state predictability. A climate model diagnostic evaluation process requires physics-based multi-variable analyses that typically involve large-volume and heterogeneous datasets, making them both computation- and data-intensive. With an exploratory nature of climate data analyses and an explosive growth of datasets and service tools, scientists are struggling to keep track of their datasets, tools, and execution/study history, let alone sharing them with others. In response, we have developed a cloud-enabled, provenance-supported, web-service system called Climate Model Diagnostic Analyzer (CMDA). CMDA enables the physics-based, multivariable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. At the same time, CMDA provides a crowd-sourcing space where scientists can organize their work efficiently and share their work with others. CMDA is empowered by many current state-of-the-art software packages in web service, provenance, and semantic search.

  2. A flexible tool for hydraulic and water quality performance analysis of green infrastructure

    NASA Astrophysics Data System (ADS)

    Massoudieh, A.; Alikhani, J.

    2017-12-01

    Models that allow for design considerations of green infrastructure (GI) practices to control stormwater runoff and associated contaminants have received considerable attention in recent years. To be used to evaluate the effect of design configurations on the long-term performance of GIs, models should represent the processes within GIs with good fidelity. In this presentation, a sophisticated yet flexible tool for hydraulic and water quality assessment of GIs will be introduced. The tool can be used by design engineers and researchers to capture and explore the effect of design factors and of the properties of the media employed on the performance of GI systems at a relatively small scale. We deemed it essential to have a flexible GI modeling tool capable of accurately simulating GI system components and the specific biogeochemical processes affecting contaminants, such as evapotranspiration, plant uptake, reactions, and particle-associated transport, while maintaining a high degree of flexibility to account for the myriad of GI alternatives. The mathematical framework for a stand-alone GI performance assessment tool has been developed and will be demonstrated. The process-based model framework developed here can be used to model a diverse range of GI practices, such as stormwater ponds, green roofs, retention ponds, bioretention systems, infiltration trenches, permeable pavement, and other custom-designed combinatory systems. An example application of the system to evaluate the performance of a rain-garden system will be demonstrated.

  3. New tools for sculpting cranial implants in a shared haptic augmented reality environment.

    PubMed

    Ai, Zhuming; Evenhouse, Ray; Leigh, Jason; Charbel, Fady; Rasmussen, Mary

    2006-01-01

    New volumetric tools were developed for the design and fabrication of high quality cranial implants from patient CT data. These virtual tools replace time consuming physical sculpting, mold making and casting steps. The implant is designed by medical professionals in tele-immersive collaboration. Virtual clay is added in the virtual defect area on the CT data using the adding tool. With force feedback the modeler can feel the edge of the defect and fill only the space where no bone is present. A carving tool and a smoothing tool are then used to sculpt and refine the implant. To make a physical evaluation, the skull with simulated defect and the implant are fabricated via stereolithography to allow neurosurgeons to evaluate the quality of the implant. Initial tests demonstrate a very high quality fit. These new haptic volumetric sculpting tools are a critical component of a comprehensive tele-immersive system.

  4. The Magic Bullet: A Tool for Assessing and Evaluating Learning Potential in Games

    ERIC Educational Resources Information Center

    Becker, Katrin

    2011-01-01

    This paper outlines a simple and effective model that can be used to evaluate and design educational digital games. It also facilitates the formulation of strategies for using existing games in learning contexts. The model categorizes game goals and learning objectives into one or more of four possible categories. An overview of the model is…

  5. Dynamic Evaluation of Two Decades of CMAQ Simulations ...

    EPA Pesticide Factsheets

    This presentation focuses on the dynamic evaluation of the CMAQ model over the continental United States using multi-decadal simulations for the period from 1990 to 2010 to examine how well the changes in observed ozone air quality induced by variations in meteorology and/or emissions are simulated by the model. We applied spectral decomposition of the ozone time-series using the KZ filter to assess the variations in the strengths of synoptic (weather-induced variations) and baseline (long-term variation) forcings, embedded in the simulated and observed concentrations. The results reveal that CMAQ captured the year-to-year variability (more so in the later years than the earlier years) and the synoptic forcing in accordance with what the observations are showing. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
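
    The KZ (Kolmogorov-Zurbenko) filter used for this kind of spectral decomposition is simply an iterated centered moving average: KZ(m, k) applies a window of length m, k times, and the residual from the filtered baseline carries the shorter-term (synoptic) component. A toy stdlib sketch with illustrative data; the window and iteration counts here are small for demonstration (air-quality studies commonly use longer windows on daily ozone):

```python
def moving_average(x, m):
    """Centered moving average of window length m, shrunk at the edges."""
    h = m // 2
    return [sum(x[max(0, i - h):i + h + 1]) / len(x[max(0, i - h):i + h + 1])
            for i in range(len(x))]

def kz_filter(x, m, k):
    """KZ(m, k): apply the moving average k times."""
    for _ in range(k):
        x = moving_average(x, m)
    return x

series = [1, 5, 2, 8, 3, 9, 4, 10, 5, 11]       # toy daily values
baseline = kz_filter(series, m=3, k=3)           # long-term component
synoptic = [s - b for s, b in zip(series, baseline)]  # short-term residual
```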

  6. Modeling strength data for CREW CHIEF

    NASA Technical Reports Server (NTRS)

    Mcdaniel, Joe W.

    1990-01-01

    The Air Force has developed CREW CHIEF, a computer-aided design (CAD) tool for simulating and evaluating aircraft maintenance to determine if the required activities are feasible. CREW CHIEF gives the designer the ability to simulate maintenance activities with respect to reach, accessibility, strength, hand tool operation, and materials handling. While developing the CREW CHIEF, extensive research was performed to describe workers strength capabilities for using hand tools and manual handling of objects. More than 100,000 strength measures were collected and modeled for CREW CHIEF. These measures involved both male and female subjects in the 12 maintenance postures included in CREW CHIEF. The data collection and modeling effort are described.

  7. Development of task network models of human performance in microgravity

    NASA Technical Reports Server (NTRS)

    Diaz, Manuel F.; Adam, Susan

    1992-01-01

    This paper discusses the utility of task-network modeling for quantifying human performance variability in microgravity. The data are gathered for: (1) improving current methodologies for assessing human performance and workload in the operational space environment; (2) developing tools for assessing alternative system designs; and (3) developing an integrated set of methodologies for the evaluation of performance degradation during extended duration spaceflight. The evaluation entailed an analysis of the Remote Manipulator System payload-grapple task performed on many shuttle missions. Task-network modeling can be used as a tool for assessing and enhancing human performance in man-machine systems, particularly for modeling long-duration manned spaceflight. Task-network modeling can be directed toward improving system efficiency by increasing the understanding of basic capabilities of the human component in the system and the factors that influence these capabilities.

  8. Health impact assessment – A survey on quantifying tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fehr, Rainer, E-mail: rainer.fehr@uni-bielefeld.de; Mekel, Odile C.L., E-mail: odile.mekel@lzg.nrw.de; Fintan Hurley, J., E-mail: fintan.hurley@iom-world.org

    Integrating human health into prospective impact assessments is known to be challenging. This is true for both approaches: dedicated health impact assessments (HIA) as well as the inclusion of health in more general impact assessments. Acknowledging the full range of participatory, qualitative, and quantitative approaches, this study focuses on the latter, especially on computational tools for quantitative health modelling. We conducted a survey among tool developers concerning the status quo of the development and availability of such tools, experiences with model usage in real-life situations, and priorities for further development. Responding toolmaker groups described 17 such tools, most of them being maintained and reported as ready for use, covering a wide range of topics including risk and protective factors, exposures, policies, and health outcomes. In recent years, existing models have been improved and applied in new ways, and completely new models have emerged. There was high agreement among respondents on the need to further develop methods for the assessment of inequalities and uncertainty. The contribution of quantitative modeling to health foresight would benefit from building joint strategies for further tool development, improving the visibility of quantitative tools and methods, and engaging continuously with actual and potential users. - Highlights: • A survey investigated computational tools for health impact quantification. • Formal evaluation of such tools has been rare. • Handling inequalities and uncertainties are priority areas for further development. • Health foresight would benefit from tool developers and users forming a community. • Joint development strategies across computational tools are needed.

  9. A Novel Immunocompetent Mouse Model of Pancreatic Cancer with Robust Stroma: a Valuable Tool for Preclinical Evaluation of New Therapies

    PubMed Central

    Majumder, Kaustav; Arora, Nivedita; Modi, Shrey; Chugh, Rohit; Nomura, Alice; Giri, Bhuwan; Dawra, Rajinder; Ramakrishnan, Sundaram; Banerjee, Sulagna; Saluja, Ashok; Dudeja, Vikas

    2017-01-01

    A valid preclinical tumor model should recapitulate the tumor microenvironment. Immune and stromal components are absent in immunodeficient models of pancreatic cancer. While these components are present in genetically engineered models such as KrasG12D; Trp53R172H; Pdx-1Cre (KPC), immense variability in development of invasive disease makes them unsuitable for evaluation of novel therapies. We have generated a novel mouse model of pancreatic cancer by implanting tumor fragments from KPC mice into the pancreas of wild type mice. Three-millimeter tumor pieces from KPC mice were implanted into the pancreas of C57BL/6J mice. Four to eight weeks later, tumors were harvested, and stromal and immune components were evaluated. The efficacy of Minnelide, a novel compound which has been shown to be effective against pancreatic cancer in a number of preclinical murine models, was evaluated. In our model, consistent tumor growth and metastases were observed. Tumors demonstrated intense desmoplasia and leukocytic infiltration which was comparable to that in the genetically engineered KPC model and significantly more than that observed in KPC tumor-derived cell line implantation model. Minnelide treatment resulted in a significant decrease in the tumor weight and volume. This novel model demonstrates a consistent growth rate and tumor-associated mortality and recapitulates the tumor microenvironment. This convenient model is a valuable tool to evaluate novel therapies. PMID:26582596

  10. Bearing tester data compilation, analysis, and reporting and bearing math modeling

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The Shaberth bearing analysis computer program was developed for the analysis of jet engine shaft/bearing systems operating above room temperature with normal hydrocarbon lubricants. It is also possible to use this tool to evaluate shaft/bearing systems operating in cryogenics. Effects such as fluid drag, radial temperature gradients, outer race misalignments, and clearance changes were simulated and evaluated. In addition, the speed and preload effects on bearing radial stiffness were evaluated. The Shaberth program was also used to provide contact stresses from which contact geometry was calculated to support other analyses, such as the determination of cryogenic fluid film thickness in the contacts and the evaluation of surface and subsurface stresses necessary for bearing failure evaluation. This program was a vital tool for the thermal analysis of the bearing in that it provides the heat generation rates at the rolling element/race contacts for input into a thermal model of the bearing/shaft assembly.

  11. Scenario Evaluator for Electrical Resistivity Survey Pre-modeling Tool

    EPA Science Inventory

    Geophysical tools have much to offer users in environmental, water resource, and geotechnical fields; however, techniques such as electrical resistivity imaging (ERI) are often oversold and/or overinterpreted due to a lack of understanding of the limitations of the techniques, su...

  12. SUSTAIN - A USEPA BMP PROCESS AND PLACEMENT TOOL FOR URBAN WATERSHEDS

    EPA Science Inventory

    Watershed and stormwater managers need modeling tools to evaluate how best to address environmental quality restoration and protection needs in urban and developing areas. Significant investments are needed to protect and restore water quality, address total maximum daily loads (...

  13. MysiRNA: improving siRNA efficacy prediction using a machine-learning model combining multi-tools and whole stacking energy (ΔG).

    PubMed

    Mysara, Mohamed; Elhefnawi, Mahmoud; Garibaldi, Jonathan M

    2012-06-01

    The investigation of small interfering RNA (siRNA) and its posttranscriptional gene regulation has become an extremely important research topic, both for fundamental reasons and for potential longer-term therapeutic benefits. Several factors affect the functionality of siRNA, including positional preferences, target accessibility and other thermodynamic features. State-of-the-art tools aim to optimize the selection of target siRNAs by identifying those that may have high experimental inhibition. Such tools implement artificial neural network models, such as Biopredsi and ThermoComposition21, and linear regression models, such as DSIR, i-Score and Scales, among others. However, all these models have limitations in performance. In this work, a new neural-network-trained siRNA scoring/efficacy prediction model was developed by combining two existing scoring algorithms (ThermoComposition21 and i-Score) with the whole stacking energy (ΔG) in a multi-layer artificial neural network. These three parameters were chosen after a comparative combinatorial study of five well-known tools. Our model, 'MysiRNA', was trained on 2431 siRNA records and tested using three further datasets. MysiRNA was compared with 11 existing scoring tools in an evaluation study assessing predicted and experimental siRNA efficiency, where it achieved the highest performance both in terms of correlation coefficient (R(2)=0.600) and receiver operating characteristic analysis (AUC=0.808), improving prediction accuracy by up to 18% with respect to the sensitivity and specificity of the best available tools. MysiRNA is a novel, freely accessible model capable of predicting siRNA inhibition efficiency with improved specificity and sensitivity. This multiclassifier approach could help improve prediction performance in several bioinformatics areas.
MysiRNA model, part of MysiRNA-Designer package [1], is expected to play a key role in siRNA selection and evaluation. Copyright © 2012 Elsevier Inc. All rights reserved.
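
    The ROC analysis used to rank such scoring tools can be computed directly from predicted scores and binary efficacy labels via the rank-sum (Mann-Whitney) identity. A stdlib sketch on made-up data, not the MysiRNA datasets:

```python
def roc_auc(scores, labels):
    """AUC = P(score of a random positive > score of a random negative),
    counting ties as half a win."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy predictions: 1 = experimentally effective siRNA, 0 = ineffective.
scores = [0.9, 0.8, 0.4, 0.3, 0.2]
labels = [1,   1,   0,   1,   0]
print(roc_auc(scores, labels))
```

    The pairwise form is O(n²) but adequate for small evaluation sets; rank-based formulas give the same value in O(n log n).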

  14. Decision models in the evaluation of psychotropic drugs : useful tool or useless toy?

    PubMed

    Barbui, Corrado; Lintas, Camilla

    2006-09-01

    A current contribution in the European Journal of Health Economics employs a decision model to compare health care costs of olanzapine and risperidone treatment for schizophrenia. The model suggests that a treatment strategy of first-line olanzapine is cost-saving over a 1-year period, with additional clinical benefits in the form of avoided relapses in the long-term. From a clinical perspective this finding is indubitably relevant, but can physicians and policy makers believe it? The study is presented in a balanced way, assumptions are based on data extracted from clinical trials published in major psychiatric journals, and the theoretical underpinnings of the model are reasonable. Despite these positive aspects, we believe that the methodology used in this study-the decision model approach-is an unsuitable and potentially misleading tool for evaluating psychotropic drugs. In this commentary, taking the olanzapine vs. risperidone model as an example, arguments are provided to support this statement.

  15. Evaluating Opportunities to Improve Material and Energy Impacts in Commodity Supply Chains.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanes, Rebecca J.; Carpenter, Alberta

    When evaluated at the process level, next-generation technologies may be more energy and emissions intensive than current technology. However, many advanced technologies have the potential to reduce material and energy consumption in upstream or downstream processing stages. In order to fully understand the benefits and consequences of technology deployment, next-generation technologies should be evaluated in context, as part of a supply chain. This work presents the Material Flows through Industry (MFI) scenario modeling tool. The MFI tool is a cradle-to-gate linear network model of the U.S. industrial sector that can model a wide range of manufacturing scenarios, including changes in production technology, increases in industrial energy efficiency, and substitution between functionally equivalent materials. The MFI tool was developed to perform supply chain scale analyses in order to quantify the impacts and benefits of next-generation technologies and materials at that scale. For the analysis presented in this paper, the MFI tool is utilized to explore a case study comparing a steel supply chain to the supply chains of several functionally equivalent materials. Several of the alternatives to the baseline steel supply chain include next-generation production technologies and materials. Results of the case study show that aluminum production scenarios can out-perform the steel supply chain by using either an advanced smelting technology or an increased aluminum recycling rate. The next-generation material supply chains do not perform as well as either aluminum or steel, but may offer additional use phase reductions in energy and emissions that are outside the scope of the MFI tool. Future work will combine results from the MFI tool with a use phase analysis.

  16. Evaluation of a web based informatics system with data mining tools for predicting outcomes with quantitative imaging features in stroke rehabilitation clinical trials

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Kim, Bokkyu; Park, Ji Hoon; Wang, Erik; Forsyth, Sydney; Lim, Cody; Ravi, Ragini; Karibyan, Sarkis; Sanchez, Alexander; Liu, Brent

    2017-03-01

    Quantitative imaging biomarkers are used widely in clinical trials for tracking and evaluation of medical interventions. Previously, we have presented a web based informatics system utilizing quantitative imaging features for predicting outcomes in stroke rehabilitation clinical trials. The system integrates imaging features extraction tools and a web-based statistical analysis tool. The tools include a generalized linear mixed model (GLMM) that can investigate potential significance and correlation based on features extracted from clinical data and quantitative biomarkers. The imaging features extraction tools allow the user to collect imaging features and the GLMM module allows the user to select clinical data and imaging features such as stroke lesion characteristics from the database as regressors and regressands. This paper discusses the application scenario and evaluation results of the system in a stroke rehabilitation clinical trial. The system was utilized to manage clinical data and extract imaging biomarkers including stroke lesion volume, location and ventricle/brain ratio. The GLMM module was validated and the efficiency of data analysis was also evaluated.

  17. A risk assessment tool applied to the study of shale gas resources.

    PubMed

    Veiguela, Miguel; Hurtado, Antonio; Eguilior, Sonsoles; Recreo, Fernando; Roqueñi, Nieves; Loredo, Jorge

    2016-11-15

    The implementation of a risk assessment tool with the capacity to evaluate the risks to health, safety and the environment (HSE) from the extraction of non-conventional fossil fuel resources by hydraulic fracturing (fracking) can be useful for boosting the development and progress of the technology and for winning public trust and acceptance of it. At the early project stages, the lack of data related to the selection of non-conventional gas deposits makes it difficult to use existing approaches to the risk assessment of fluids injected into geologic formations. The qualitative risk assessment tool developed in this work is based on the premise that shale gas exploitation risk depends on both the geologic site and the technological aspects. It follows Oldenburg's 'Screening and Ranking Framework (SRF)', developed to evaluate potential geologic carbon dioxide (CO2) storage sites. Two global characteristics, (1) those centered on the natural aspects of the site and (2) those centered on the technological aspects of the project, have been evaluated through user input of Property values, which define Attributes, which in turn define the Characteristics. To allow an individual evaluation of each characteristic and element of the model, the tool has been implemented in a spreadsheet. The proposed model has been applied to a site with potential for the exploitation of shale gas in Asturias (northwestern Spain) with three different technological options to test the approach. Copyright © 2016 Elsevier B.V. All rights reserved.
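
    A hypothetical sketch of the Property → Attribute → Characteristic aggregation the abstract describes: user-supplied Property values roll up into Attribute scores, which roll up into Characteristic scores. The structure, names, values, and equal-weight averaging below are illustrative assumptions, not taken from the actual tool.

```python
# Each Characteristic holds Attributes; each Attribute holds {property: value}.
site = {
    "geology": {
        "seismicity":    {"low": 0.2},
        "fault_density": {"moderate": 0.5},
    },
    "technology": {
        "well_integrity": {"good": 0.3},
    },
}

def attribute_score(properties):
    """Average the user-input property values of one attribute."""
    return sum(properties.values()) / len(properties)

def characteristic_score(attributes):
    """Average the attribute scores of one characteristic."""
    return sum(attribute_score(p) for p in attributes.values()) / len(attributes)

scores = {c: characteristic_score(a) for c, a in site.items()}
print(scores)
```

    A spreadsheet implementation performs exactly this kind of roll-up, with one column per level of the hierarchy; weighted rather than equal averaging is an obvious variation.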

  18. Development and evaluation of the microbial fate and transport module for the Agricultural Policy/Environmental eXtender (APEX) model

    USDA-ARS?s Scientific Manuscript database

    Microbial contamination of waters in agricultural watersheds is a critical public health issue. Watershed-scale models have proven to be candidate tools for predicting microbial water quality and evaluating management practices. The Agricultural Policy/Environmental eXtender (APEX...

  19. DeltaSA tool for source apportionment benchmarking, description and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Pernigotti, D.; Belis, C. A.

    2018-05-01

    DeltaSA is an R package and a Java on-line tool developed at the EC Joint Research Centre to assist and benchmark source apportionment applications. Its key functionalities support two critical tasks in this kind of study: the assignment of a factor to a source in factor analytical models (source identification) and the evaluation of model performance. Source identification is based on the similarity between a given factor and source chemical profiles from public databases. Model performance evaluation is based on statistical indicators used to compare model output with reference values generated in intercomparison exercises. The reference values are calculated as the ensemble average of the results reported by participants that have passed a set of testing criteria based on chemical profile and time series similarity. In this study, a sensitivity analysis of the model performance criteria is carried out using the results of a synthetic dataset for which "a priori" references are available. The consensus-modulated standard deviation p_unc is found to be the best choice for model performance evaluation when a conservative approach is adopted.
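    The source-identification step, matching a factor profile against candidate source profiles, can be illustrated roughly as follows. The species shares and source names are invented, and Pearson correlation stands in for whichever similarity metrics DeltaSA actually applies.

```python
import math

# Illustrative sketch of source identification by profile similarity:
# compare a factor's chemical profile against candidate source profiles
# and keep the best match. All profiles below are invented.
def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

factor       = [0.50, 0.30, 0.15, 0.05]   # e.g. OC, EC, K, levoglucosan
wood_burning = [0.48, 0.32, 0.14, 0.06]
traffic      = [0.20, 0.60, 0.15, 0.05]
scores = {"wood burning": pearson(factor, wood_burning),
          "traffic": pearson(factor, traffic)}
best = max(scores, key=scores.get)
print(best)
```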

  20. Challenges of NDE Simulation Tool

    NASA Technical Reports Server (NTRS)

    Leckey, Cara A. C.; Juarez, Peter D.; Seebo, Jeffrey P.; Frank, Ashley L.

    2015-01-01

    Realistic nondestructive evaluation (NDE) simulation tools enable inspection optimization and predictions of inspectability for new aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of advanced aerospace components, potentially shortening the time from material development to implementation by industry and government. Furthermore, modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation cannot rapidly simulate damage detection techniques for large scale, complex geometry composite components/vehicles with realistic damage types. This paper discusses some of the challenges of model development and validation for composites, such as the level of realism and scale of simulation needed for NASA's applications. Ongoing model development work is described along with examples of model validation studies. The paper also discusses examples of the use of simulation tools at NASA to develop new damage characterization methods, and associated challenges of validating those methods.

  1. Peer Review Documents Related to the Evaluation of ...

    EPA Pesticide Factsheets

    BMDS is one of the Agency's premier tools for risk assessment; the validity and reliability of its statistical models are therefore of paramount importance. This page provides links to peer reviews and expert summaries of the BMDS application and its models as they were developed and eventually released, documenting the rigorous review process taken to provide the best science tools available for statistical modeling.

  2. Cardiac catheterization laboratory inpatient forecast tool: a prospective evaluation

    PubMed Central

    Flanagan, Eleni; Siddiqui, Sauleh; Appelbaum, Jeff; Kasper, Edward K; Levin, Scott

    2016-01-01

    Objective To develop and prospectively evaluate a web-based tool that forecasts the daily bed need for admissions from the cardiac catheterization laboratory using routinely available clinical data within electronic medical records (EMRs). Methods The forecast model was derived using a 13-month retrospective cohort of 6384 catheterization patients. Predictor variables such as demographics, scheduled procedures, and clinical indicators mined from free-text notes were input to a multivariable logistic regression model that predicted the probability of inpatient admission. The model was embedded into a web-based application connected to the local EMR system and used to support bed management decisions. After implementation, the tool was prospectively evaluated for accuracy on a 13-month test cohort of 7029 catheterization patients. Results The forecast model predicted admission with an area under the receiver operating characteristic curve of 0.722. Daily aggregate forecasts were accurate to within one bed for 70.3% of days and within three beds for 97.5% of days during the prospective evaluation period. The web-based application housing the forecast model was used by cardiology providers in practice to estimate daily admissions from the catheterization laboratory. Discussion The forecast model identified older age, male gender, invasive procedures, coronary artery bypass grafts, and a history of congestive heart failure as qualities indicating a patient was at increased risk for admission. Diagnostic procedures and less acute clinical indicators decreased patients’ risk of admission. Despite the site-specific limitations of the model, these findings were supported by the literature. Conclusion Data-driven predictive analytics may be used to accurately forecast daily demand for inpatient beds for cardiac catheterization patients. Connecting these analytics to EMR data sources has the potential to provide advanced operational decision support. PMID:26342217
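    The forecasting idea, turning per-patient admission probabilities from a logistic model into an expected daily bed count, can be sketched as below. The coefficients and predictors are invented for illustration and are not the published model.

```python
import math

# Simplified sketch: a fitted logistic model turns each scheduled patient's
# covariates into an admission probability, and the expected daily bed need
# is the sum of those probabilities. Coefficients are invented placeholders.
def admit_probability(age, invasive_procedure, chf_history,
                      coef=(-3.0, 0.03, 1.2, 0.8)):
    b0, b_age, b_inv, b_chf = coef
    z = b0 + b_age * age + b_inv * invasive_procedure + b_chf * chf_history
    return 1 / (1 + math.exp(-z))       # logistic link

schedule = [(72, 1, 1), (55, 0, 0), (64, 1, 0)]   # today's patients
expected_beds = sum(admit_probability(*p) for p in schedule)
print(round(expected_beds, 2))
```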

  3. ED leadership competency matrix: an administrative management tool.

    PubMed

    Propp, Douglas A; Glickman, Seth; Uehara, Dennis T

    2003-10-01

    A successful ED relies on its leaders to master and demonstrate core competencies to be effective in the many arenas in which they interact and for which they are responsible. A unique matrix model for the assessment of an ED leadership's key administrative skill sets is presented. The model incorporates capabilities related to the individual's cognitive aptitude, experience, acquired technical skills, and behavioral characteristics, as well as the ability to manage relationships effectively. Based on the personnel inventory using the matrix, focused evaluation, development, and recruitment of key ED leaders occurs. This dynamic tool has provided a unique perspective for the evaluation and enhancement of overall ED leadership performance. It is hoped that incorporation of such a model will similarly improve the accomplishments of EDs at other institutions.

  4. A GIS-BASED MODAL MODEL OF AUTOMOBILE EXHAUST EMISSIONS

    EPA Science Inventory

    The report presents progress toward the development of a computer tool called MEASURE, the Mobile Emission Assessment System for Urban and Regional Evaluation. The tool works toward a goal of providing researchers and planners with a way to assess new mobile emission mitigation s...

  5. Evaluation of a Digital Library by Means of Quality Function Deployment (QFD) and the Kano Model

    ERIC Educational Resources Information Center

    Garibay, Cecilia; Gutierrez, Humberto; Figueroa, Arturo

    2010-01-01

    This paper proposes utilizing a combination of the Quality Function Deployment (QFD)-Kano model as a useful tool to evaluate service quality. The digital library of the University of Guadalajara (Mexico) is presented as a case study. Data to feed the QFD-Kano model was gathered by an online questionnaire that was made available to users on the…

  6. Comparative study of coated and uncoated tool inserts with dry machining of EN47 steel using Taguchi L9 optimization technique

    NASA Astrophysics Data System (ADS)

    Vasu, M.; Shivananda, Nayaka H.

    2018-04-01

    EN47 steel samples are machined on a self-centered lathe using chemical vapor deposition (CVD) coated TiCN/Al2O3/TiN and uncoated tungsten carbide tool inserts, with nose radius 0.8 mm. Results are compared with each other and optimized using statistical tools. The input cutting parameters considered in this work are feed rate (f), cutting speed (Vc), and depth of cut (ap); the optimization is based on the Taguchi L9 orthogonal array. The ANOVA method is adopted to evaluate the statistical significance and the percentage contribution of each factor. Multiple response characteristics, namely cutting force (Fz), tool tip temperature (T), and surface roughness (Ra), are evaluated. The results revealed that the coated tool insert (TiCN/Al2O3/TiN) performs 1.27 and 1.29 times better than the uncoated insert for tool tip temperature and surface roughness, respectively. A slight increase in cutting force was observed for coated tools.
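    One ingredient of a Taguchi analysis like this is the signal-to-noise ratio for smaller-the-better responses (cutting force, tool tip temperature, surface roughness): S/N = -10·log10(mean of squared values). A minimal sketch with invented replicate values:

```python
import math

# Smaller-the-better signal-to-noise ratio used in Taguchi analysis.
# The replicate measurements below are invented for illustration.
def sn_smaller_the_better(values):
    return -10 * math.log10(sum(v * v for v in values) / len(values))

trial_Ra = [1.2, 1.3, 1.25]   # surface roughness replicates (microns)
print(round(sn_smaller_the_better(trial_Ra), 2))
```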

  7. Direct Numerical Simulations of Diffusive Staircases in the Arctic

    DTIC Science & Technology

    2009-03-01

    modeling is the simplest and most obvious tool for evaluating the mixing characteristics in the Arctic Ocean, and it will be extensively used in our...and Kinglear, in addition to Department of Defense (DoD) supercomputer clusters, Babbage, Davinci, and Midnight. Low resolution model runs were...Krishfield, R., Toole, J., Proshutinsky, A., & Timmermans, M.-L. (2008). Automated Ice Tethered Profilers for seawater observations under pack ice in

  8. Statistical colour models: an automated digital image analysis method for quantification of histological biomarkers.

    PubMed

    Shu, Jie; Dolman, G E; Duan, Jiang; Qiu, Guoping; Ilyas, Mohammad

    2016-04-27

    Colour is the most important feature used in quantitative immunohistochemistry (IHC) image analysis; IHC is used to provide information relating to aetiology and to confirm malignancy. Statistical modelling is a technique widely used for colour detection in computer vision. We have developed a statistical model of colour detection applicable to the detection of stain colour in digital IHC images. The model was first trained on a large set of colour pixels collected semi-automatically. To speed up the training and detection processes, we removed the luminance (Y) channel of the YCbCr colour space and chose 128 histogram bins, found to be the optimal number. A maximum likelihood classifier is used to automatically classify pixels in digital slides as positively or negatively stained. The model-based tool was developed within ImageJ to quantify targets identified using IHC and histochemistry. The purpose of the evaluation was to compare the computer model with human evaluation. Several large datasets were prepared from human oesophageal cancer, colon cancer and liver cirrhosis with different colour stains. Experimental results have demonstrated that the model-based tool achieves more accurate results than colour deconvolution and the CMYK model in the detection of brown colour, and is comparable to colour deconvolution in the detection of pink colour. We have also demonstrated that the proposed model has little inter-dataset variation. A robust and effective statistical model is introduced in this paper. The model-based interactive tool in ImageJ, which can create a visual representation of the statistical model and detect a specified colour automatically, is easy to use and available freely at http://rsb.info.nih.gov/ij/plugins/ihc-toolbox/index.html . Testing of the tool by different users showed only minor inter-observer variations in results.
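    The histogram-based maximum likelihood classification can be miniaturized as follows. The real tool trains 128-bin histograms over the chrominance (Cb, Cr) channels; this toy version uses a 1-D, 4-bin histogram and invented pixel values purely to show the decision rule.

```python
# Toy sketch of a maximum likelihood stain classifier: class-conditional
# histograms over chrominance values, with each pixel assigned to the class
# of higher likelihood. All training and test values are invented.
BINS = 4  # value range 0..255 mapped to 4 bins (real tool: 128 bins)

def histogram(samples):
    counts = [0] * BINS
    for s in samples:
        counts[s * BINS // 256] += 1
    total = sum(counts)
    return [c / total for c in counts]

positive = histogram([200, 210, 220, 190])   # "stained" training pixels
negative = histogram([60, 70, 80, 65])       # background training pixels

def classify(pixel):
    b = pixel * BINS // 256
    return "positive" if positive[b] > negative[b] else "negative"

print(classify(205), classify(75))
```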

  9. Linear modeling of human hand-arm dynamics relevant to right-angle torque tool interaction.

    PubMed

    Ay, Haluk; Sommerich, Carolyn M; Luscher, Anthony F

    2013-10-01

    A new protocol was evaluated for identification of stiffness, mass, and damping parameters employing a linear model for human hand-arm dynamics relevant to right-angle torque tool use. Powered torque tools are widely used to tighten fasteners in manufacturing industries. While these tools increase accuracy and efficiency of tightening processes, operators are repetitively exposed to impulsive forces, posing risk of upper extremity musculoskeletal injury. A novel testing apparatus was developed that closely mimics biomechanical exposure in torque tool operation. Forty experienced torque tool operators were tested with the apparatus to determine model parameters and validate the protocol for physical capacity assessment. A second-order hand-arm model with parameters extracted in the time domain met model accuracy criterion of 5% for time-to-peak displacement error in 93% of trials (vs. 75% for frequency domain). Average time-to-peak handle displacement and relative peak handle force errors were 0.69 ms and 0.21%, respectively. Model parameters were significantly affected by gender and working posture. Protocol and numerical calculation procedures provide an alternative method for assessing mechanical parameters relevant to right-angle torque tool use. The protocol more closely resembles tool use, and calculation procedures demonstrate better performance of parameter extraction using time domain system identification methods versus frequency domain. Potential future applications include parameter identification for in situ torque tool operation and equipment development for human hand-arm dynamics simulation under impulsive forces that could be used for assessing torque tools based on factors relevant to operator health (handle dynamics and hand-arm reaction force).
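    A second-order hand-arm model of the kind identified here is a mass-spring-damper, m·x'' + c·x' + k·x = F(t). The sketch below integrates such a model through a short force pulse to locate the peak handle displacement; all parameter values are assumed, not taken from the study.

```python
# Illustrative simulation of a second-order hand-arm model driven by a
# torque-pulse force. Parameters (mass, damping, stiffness, pulse) are
# invented placeholders; semi-implicit Euler integration is used.
m, c, k = 2.0, 40.0, 4000.0       # kg, N*s/m, N/m (assumed)
dt, T = 1e-4, 0.2                 # time step and horizon (s)
x, v, t = 0.0, 0.0, 0.0
peak_x, t_peak = 0.0, 0.0
while t < T:
    F = 50.0 if t < 0.05 else 0.0     # 50 N pulse lasting 50 ms
    a = (F - c * v - k * x) / m
    v += a * dt
    x += v * dt
    if x > peak_x:
        peak_x, t_peak = x, t
    t += dt
print(round(peak_x * 1000, 2), "mm at", round(t_peak * 1000, 1), "ms")
```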

  10. Designer's unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.

    1992-01-01

    A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress are presented for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  11. RHydro - Hydrological models and tools to represent and analyze hydrological data in R

    NASA Astrophysics Data System (ADS)

    Reusser, Dominik; Buytaert, Wouter

    2010-05-01

    In hydrology, basic equations and procedures keep being implemented from scratch by scientists, with the potential for errors and inefficiency. The use of libraries can overcome these problems. Other scientific disciplines such as mathematics and physics have benefited significantly from such an approach, with freely available implementations of many routines. As an example, hydrological libraries could contain: major representations of hydrological processes such as infiltration, sub-surface runoff and routing algorithms; scaling functions, for instance to combine remote sensing precipitation fields with rain gauge data; data consistency checks; and performance measures. Here we present a beginning for such a library, implemented in the high-level data programming language R. Currently, TOPMODEL, data import routines for WaSiM-ETH, as well as basic visualization and evaluation tools are implemented. The design is such that a definition of import scripts for additional models is sufficient to gain access to the full set of evaluation and visualization tools.
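    As an example of the "performance measures" such a library would collect, the Nash-Sutcliffe efficiency compares model error variance with the variance of the observations; NSE = 1 indicates a perfect fit. Flow values below are invented (shown in Python rather than R for consistency with the other sketches).

```python
# Nash-Sutcliffe efficiency (NSE), a standard hydrological performance
# measure: 1 minus the ratio of squared model error to the variance of
# the observations. Observed/simulated flows are invented.
def nse(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - num / den

obs = [1.0, 2.0, 4.0, 3.0, 2.0]
sim = [1.1, 1.9, 3.8, 3.2, 2.1]
print(round(nse(obs, sim), 3))
```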

  12. Evaluation of the probability of arrester failure in a high-voltage transmission line using a Q learning artificial neural network model

    NASA Astrophysics Data System (ADS)

    Ekonomou, L.; Karampelas, P.; Vita, V.; Chatzarakis, G. E.

    2011-04-01

    One of the most popular methods of protecting high voltage transmission lines against lightning strikes and internal overvoltages is the use of arresters. The installation of arresters in high voltage transmission lines can prevent or even reduce the lines' failure rate. Several studies based on simulation tools have been presented in order to estimate the critical currents that exceed the arresters' rated energy stress and to specify the arresters' installation interval. In this work artificial intelligence, and more specifically a Q-learning artificial neural network (ANN) model, is applied to evaluating the arresters' failure probability. The aims of the paper are to describe in detail the developed Q-learning ANN model and to compare the results obtained by its application to operating 150 kV Greek transmission lines with those produced using a simulation tool. The satisfactory and accurate results of the proposed ANN model can make it a valuable tool for designers of electrical power systems seeking more effective lightning protection, reduced operational costs, and better continuity of service.
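    The core of tabular Q-learning, which the paper couples with an ANN, is the update Q(s,a) ← Q(s,a) + α[r + γ·max Q(s',·) − Q(s,a)]. A toy two-state illustration with invented rewards and parameters:

```python
import random

# Minimal tabular Q-learning sketch (the paper uses a Q-learning ANN; this
# toy shows only the core update rule on an invented 2-state task).
random.seed(1)
alpha, gamma = 0.5, 0.9
Q = {(s, a): 0.0 for s in (0, 1) for a in (0, 1)}

def step(s, a):
    # toy dynamics: action 1 in state 0 moves to state 1 with reward 1
    if s == 0 and a == 1:
        return 1, 1.0
    return 0, 0.0

s = 0
for _ in range(200):
    a = random.choice((0, 1))                    # pure exploration
    s2, r = step(s, a)
    best_next = max(Q[(s2, 0)], Q[(s2, 1)])
    Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
    s = s2
print(Q[(0, 1)] > Q[(0, 0)])   # the rewarding action scores higher
```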

  13. Tool wear modeling using abductive networks

    NASA Astrophysics Data System (ADS)

    Masory, Oren

    1992-09-01

    A tool wear model based on Abductive Networks, which consist of a network of `polynomial' nodes, is described. The model relates the cutting parameters, components of the cutting force, and machining time to flank wear. Thus real time measurements of the cutting force can be used to monitor the machining process. The model is obtained by a training process in which the connectivity between the network's nodes and the polynomial coefficients of each node are determined by optimizing a performance criterion. Actual wear measurements of coated and uncoated carbide inserts were used for training and evaluating the established model.

  14. Geospatial Information System Capability Maturity Models

    DOT National Transportation Integrated Search

    2017-06-01

    To explore how State departments of transportation (DOTs) evaluate geospatial tool applications and services within their own agencies, particularly their experiences using capability maturity models (CMMs) such as the Urban and Regional Information ...

  15. Using Computational Modeling to Assess the Impact of Clinical Decision Support on Cancer Screening within Community Health Centers

    PubMed Central

    Carney, Timothy Jay; Morgan, Geoffrey P.; Jones, Josette; McDaniel, Anna M.; Weaver, Michael; Weiner, Bryan; Haggstrom, David A.

    2014-01-01

    Our conceptual model demonstrates our goal to investigate the impact of clinical decision support (CDS) utilization on cancer screening improvement strategies in the community health center (CHC) setting. We employed a dual modeling technique using both statistical and computational modeling to evaluate impact. Our statistical model used the Spearman's Rho test to evaluate the strength of relationship between our proximal outcome measures (CDS utilization) and our distal outcome measure (provider self-reported cancer screening improvement). Our computational model relied on network evolution theory and made use of a tool called Construct-TM to model the use of CDS as measured by the rate of organizational learning. We employed previously collected survey data from the community health centers Cancer Health Disparities Collaborative (HDCC). Our intent is to demonstrate the added value gained by using a computational modeling tool in conjunction with a statistical analysis when evaluating the impact of a health information technology, in the form of CDS, on health care quality process outcomes such as facility-level screening improvement. Significant simulated disparities in organizational learning over time were observed between community health centers beginning the simulation with high and low clinical decision support capability. PMID:24953241
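    The Spearman's Rho step can be sketched in a few lines. Data are invented; in practice scipy.stats.spearmanr (which also handles ties) would be used. The no-ties shortcut formula is rho = 1 - 6·Σd²/(n(n²-1)).

```python
# Spearman rank correlation between a proximal measure (CDS utilization)
# and a distal one (self-reported screening improvement). Invented data;
# this no-ties shortcut assumes all values are distinct.
def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman_rho(x, y):
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

cds_use   = [1, 2, 3, 4, 5]
screening = [10, 20, 25, 40, 38]
print(spearman_rho(cds_use, screening))
```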

  16. Evaluating O, C, and N isotopes in human hair as a forensic tool to reconstruct travel

    NASA Astrophysics Data System (ADS)

    Ehleringer, Jim; Chesson, Lesley; Cerling, Thure; Valenzuela, Luciano

    2014-05-01

    Oxygen isotope ratios in the proteins of human scalp hair have been proposed and modeled as a tool for reconstructing the movements of humans and evaluating the likelihood that an individual is a resident or non-resident of a particular geographic region. Carbon and nitrogen isotope ratios reflect dietary input and complement oxygen isotope data interpretation when it is necessary to distinguish potential location overlap among continents. The combination of a time sequence analysis in hair segments and spatial models that describe predicted geographic variation in hair isotope values represents a potentially powerful tool for forensic investigations. The applications of this technique have thus far been to provide assistance to law enforcement with information on the predicted geographical travel histories of unidentified murder victims. Here we review multiple homicide cases from the USA where stable isotope analysis of hair has been applied and for which we now know the travel histories of the murder victims. We also provide information on the robustness of the original data sets used to test these models by evaluating the travel histories of randomly collected hair discarded in Utah barbershops.

  17. Improved alignment evaluation and optimization : final report.

    DOT National Transportation Integrated Search

    2007-09-11

    This report outlines the development of an enhanced highway alignment evaluation and optimization : model. A GIS-based software tool is prepared for alignment optimization that uses genetic algorithms for : optimal search. The software is capable of ...

  18. Irena : tool suite for modeling and analysis of small-angle scattering.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilavsky, J.; Jemian, P.

    2009-04-01

    Irena, a tool suite for analysis of both X-ray and neutron small-angle scattering (SAS) data within the commercial Igor Pro application, brings together a comprehensive suite of tools useful for investigations in materials science, physics, chemistry, polymer science and other fields. In addition to Guinier and Porod fits, the suite combines a variety of advanced SAS data evaluation tools for the modeling of size distribution in the dilute limit using maximum entropy and other methods, dilute limit small-angle scattering from multiple non-interacting populations of scatterers, the pair-distance distribution function, a unified fit, the Debye-Bueche model, the reflectivity (X-ray and neutron) using Parratt's formalism, and small-angle diffraction. There are also a number of support tools, such as a data import/export tool supporting a broad sampling of common data formats, a data modification tool, a presentation-quality graphics tool optimized for small-angle scattering data, and a neutron and X-ray scattering contrast calculator. These tools are brought together into one suite with consistent interfaces and functionality. The suite allows robust automated note recording and saving of parameters during export.
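    As an example of the kind of fit Irena automates, a Guinier analysis extracts the radius of gyration Rg from the low-q behavior ln I(q) = ln I0 - (Rg²/3)·q², i.e. a straight-line fit of ln I against q². A sketch on synthetic, noise-free data (all values assumed):

```python
import math

# Guinier fit sketch: generate synthetic SAS intensities in the Guinier
# regime, then recover Rg from the slope of ln(I) vs q^2. Rg and I0 are
# invented; units of q are inverse angstroms.
Rg_true, I0 = 25.0, 1000.0
qs = [0.005 + 0.002 * i for i in range(10)]
I = [I0 * math.exp(-(q * Rg_true) ** 2 / 3) for q in qs]

x = [q * q for q in qs]                 # regress ln(I) on q^2
y = [math.log(v) for v in I]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
         / sum((a - mx) ** 2 for a in x))
Rg_fit = math.sqrt(-3 * slope)          # slope = -Rg^2 / 3
print(round(Rg_fit, 1))
```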

  19. Overcoming redundancies in bedside nursing assessments by validating a parsimonious meta-tool: findings from a methodological exercise study.

    PubMed

    Palese, Alvisa; Marini, Eva; Guarnier, Annamaria; Barelli, Paolo; Zambiasi, Paola; Allegrini, Elisabetta; Bazoli, Letizia; Casson, Paola; Marin, Meri; Padovan, Marisa; Picogna, Michele; Taddia, Patrizia; Chiari, Paolo; Salmaso, Daniele; Marognolli, Oliva; Canzan, Federica; Ambrosi, Elisa; Saiani, Luisa; Grassetti, Luca

    2016-10-01

    There is growing interest in validating tools aimed at supporting the clinical decision-making process and research. However, an increased bureaucratization of clinical practice and redundancies in the measures collected have been reported by clinicians. Redundancies in clinical assessments affect both patients and nurses negatively. The aim was to validate a meta-tool measuring the risks/problems currently estimated by multiple tools used in daily practice. A secondary analysis of a database was performed, using cross-validation and longitudinal study designs. In total, 1464 patients admitted to 12 medical units in 2012 were assessed at admission with the Brass, Barthel, Conley and Braden tools. Pertinent outcomes, such as the occurrence of post-discharge need for resources and functional decline at discharge, as well as falls and pressure sores, were measured. Explorative factor analysis of each tool, inter-tool correlations and a conceptual evaluation of the redundant/similar items across tools were performed. The validation of the meta-tool was then performed through explorative factor analysis, confirmatory factor analysis and structural equation modeling to establish the ability of the meta-tool to predict the outcomes estimated by the original tools. High correlations between the tools emerged (r from 0.428 to 0.867) with a common variance from 18.3% to 75.1%. Through a conceptual evaluation and explorative factor analysis, the items were reduced from 42 to 20, and the three factors that emerged were confirmed by confirmatory factor analysis. According to the structural equation model results, two of the three emerged factors predicted the outcomes. From the initial 42 items, the meta-tool is composed of 20 items capable of predicting the outcomes as with the original tools. © 2016 John Wiley & Sons, Ltd.

  20. Innovative Stormwater Quality Tools by SARA for Holistic Watershed Master Planning

    NASA Astrophysics Data System (ADS)

    Thomas, S. M.; Su, Y. C.; Hummel, P. R.

    2016-12-01

    Stormwater management strategies such as Best Management Practices (BMP) and Low-Impact Development (LID) have increasingly gained attention in urban runoff control, becoming vital to holistic watershed master plans. These strategies can help address existing water quality impairments and support regulatory compliance, as well as guide planning and management of future development when substantial population growth and urbanization is projected to occur. However, past efforts have been limited to qualitative planning due to the lack of suitable tools to conduct quantitative assessment. The San Antonio River Authority (SARA), with the assistance of Lockwood, Andrews & Newnam, Inc. (LAN) and AQUA TERRA Consultants (a division of RESPEC), developed comprehensive hydrodynamic and water quality models using the Hydrological Simulation Program-FORTRAN (HSPF) for several urban watersheds in the San Antonio River Basin. These models enabled watershed managers to look at water quality issues on a more refined temporal and spatial scale than the limited monitoring data allowed. They also provided a means to locate and quantify potential water quality impairments and evaluate the effects of mitigation measures. To support the models, a suite of software tools was developed, including: 1) the SARA Timeseries Utility Tool for managing and processing large model timeseries files, 2) the SARA Load Reduction Tool for determining the load reductions needed to achieve screening levels for each modeled constituent on a sub-basin basis, and 3) the SARA Enhanced BMP Tool for determining the optimal combination of BMP types and units needed to achieve the required load reductions. Using these SARA models and tools, water quality agencies and stormwater professionals can determine the optimal combinations of BMP/LID to accomplish their goals and save substantial stormwater infrastructure and management costs. The tools can also help regulators and permittees evaluate the feasibility of achieving compliance using BMP/LID. The project has gained national attention, being showcased in multiple newsletters, professional magazines, and conference presentations. The project also won the Texas American Council of Engineering Companies (ACEC) Gold Medal Award and the ACEC National Recognition Award in 2016.

  1. Methodology For Evaluation Of Technology Impacts In Space Electric Power Systems

    NASA Technical Reports Server (NTRS)

    Holda, Julie

    2004-01-01

    The Analysis and Management branch of the Power and Propulsion Office at NASA Glenn Research Center is responsible for performing complex analyses of the space power and in-space propulsion products developed by GRC. This work quantifies the benefits of the advanced technologies to support on-going advocacy efforts. The Power and Propulsion Office is committed to understanding how the advancement of space technologies could benefit future NASA missions. It supports many diverse projects and missions throughout NASA as well as industry and academia. The area of work that we are concentrating on is space technology investment strategies. Our goal is to develop a Monte Carlo-based tool to investigate technology impacts in space electric power systems. The framework being developed at this stage will be used to set up a computer simulation of a space electric power system (EPS). The outcome is expected to be a probabilistic assessment of critical technologies and potential development issues. We are developing methods for integrating existing spreadsheet-based tools into the simulation tool, and work is being done on defining interface protocols to enable rapid integration of future tools. The first task was to select a Monte Carlo-based simulation program for statistical modeling of the EPS model: I learned and evaluated Palisade's @Risk and Risk Optimizer software and utilized their capabilities for the Electric Power System (EPS) model, and also looked at similar software packages (JMP, SPSS, Crystal Ball, VenSim, Analytica) available from other suppliers and evaluated them. The second task was to develop the framework for the tool, in which we had to define technology characteristics using weighting factors and probability distributions, define the simulation space, and add hard and soft constraints to the model. The third task is to incorporate (preliminary) cost factors into the model. A final task is developing a cross-platform solution of this framework.
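    The Monte Carlo framework described, probability distributions on technology characteristics combined with weighting factors, can be sketched as follows. The subsystem names, distributions, and weights are invented placeholders, not the actual EPS model.

```python
import random
import statistics

# Hedged Monte Carlo sketch: sample each (invented) subsystem characteristic
# from an assumed distribution, combine with a weighting factor, and examine
# the spread of outcomes across many runs.
random.seed(42)

def sample_system_mass():
    solar_array = random.gauss(100, 10)      # kg, assumed normal
    battery = random.triangular(40, 80, 55)  # kg, assumed triangular
    pmad = random.uniform(20, 30)            # power management & distribution
    return 0.9 * solar_array + battery + pmad  # weighting factor on array

runs = [sample_system_mass() for _ in range(10000)]
mean = statistics.mean(runs)
p90 = sorted(runs)[int(0.9 * len(runs))]     # 90th percentile outcome
print(round(mean, 1), round(p90, 1))
```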

  2. Evaluating opportunities to improve material and energy impacts in commodity supply chains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanes, Rebecca J.; Carpenter, Alberta

    When evaluated at the scale of individual processes, next-generation technologies may be more energy and emissions intensive than current technology. Furthermore, many advanced technologies have the potential to reduce material and energy consumption in upstream or downstream processing stages. In order to fully understand the benefits and consequences of technology deployment, next-generation technologies should be evaluated in context, as part of a supply chain. This work presents the Materials Flow through Industry (MFI) supply chain modeling tool. The MFI tool is a cradle-to-gate linear network model of the US industrial sector that can model a wide range of manufacturing scenarios, including changes in production technology and increases in industrial energy efficiency. The MFI tool was developed to perform supply chain scale analyses in order to quantify the impacts and benefits of next-generation technologies and materials at that scale. For the analysis presented in this paper, the MFI tool is utilized to explore a case study comparing three lightweight vehicle supply chains to the supply chain of a conventional, standard weight vehicle. Several of the lightweight vehicle supply chains are evaluated under manufacturing scenarios that include next-generation production technologies and next-generation materials. Results indicate that producing lightweight vehicles is more energy and emission intensive than producing the non-lightweight vehicle, but the fuel saved during vehicle use offsets this increase. In this case study, greater reductions in supply chain energy and emissions were achieved through the application of the next-generation technologies than from application of energy efficiency increases.

  3. Evaluating opportunities to improve material and energy impacts in commodity supply chains

    DOE PAGES

    Hanes, Rebecca J.; Carpenter, Alberta

    2017-01-10

    When evaluated at the scale of individual processes, next-generation technologies may be more energy and emissions intensive than current technology. Furthermore, many advanced technologies have the potential to reduce material and energy consumption in upstream or downstream processing stages. In order to fully understand the benefits and consequences of technology deployment, next-generation technologies should be evaluated in context, as part of a supply chain. This work presents the Materials Flow through Industry (MFI) supply chain modeling tool. The MFI tool is a cradle-to-gate linear network model of the US industrial sector that can model a wide range of manufacturing scenarios, including changes in production technology and increases in industrial energy efficiency. The MFI tool was developed to perform supply chain scale analyses in order to quantify the impacts and benefits of next-generation technologies and materials at that scale. For the analysis presented in this paper, the MFI tool is utilized to explore a case study comparing three lightweight vehicle supply chains to the supply chain of a conventional, standard weight vehicle. Several of the lightweight vehicle supply chains are evaluated under manufacturing scenarios that include next-generation production technologies and next-generation materials. Results indicate that producing lightweight vehicles is more energy and emission intensive than producing the non-lightweight vehicle, but the fuel saved during vehicle use offsets this increase. In this case study, greater reductions in supply chain energy and emissions were achieved through the application of the next-generation technologies than from application of energy efficiency increases.

  4. TEAM-HF Cost-Effectiveness Model: A Web-Based Program Designed to Evaluate the Cost-Effectiveness of Disease Management Programs in Heart Failure

    PubMed Central

    Reed, Shelby D.; Neilson, Matthew P.; Gardner, Matthew; Li, Yanhong; Briggs, Andrew H.; Polsky, Daniel E.; Graham, Felicia L.; Bowers, Margaret T.; Paul, Sara C.; Granger, Bradi B.; Schulman, Kevin A.; Whellan, David J.; Riegel, Barbara; Levy, Wayne C.

    2015-01-01

    Background Heart failure disease management programs can influence medical resource use and quality-adjusted survival. Because projecting long-term costs and survival is challenging, a consistent and valid approach to extrapolating short-term outcomes would be valuable. Methods We developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Cost-Effectiveness Model, a Web-based simulation tool designed to integrate data on demographic, clinical, and laboratory characteristics, use of evidence-based medications, and costs to generate predicted outcomes. Survival projections are based on a modified Seattle Heart Failure Model (SHFM). Projections of resource use and quality of life are modeled using relationships with time-varying SHFM scores. The model can be used to evaluate parallel-group and single-cohort designs and hypothetical programs. Simulations consist of 10,000 pairs of virtual cohorts used to generate estimates of resource use, costs, survival, and incremental cost-effectiveness ratios from user inputs. Results The model demonstrated acceptable internal and external validity in replicating resource use, costs, and survival estimates from 3 clinical trials. Simulations to evaluate the cost-effectiveness of heart failure disease management programs across 3 scenarios demonstrate how the model can be used to design a program in which short-term improvements in functioning and use of evidence-based treatments are sufficient to demonstrate good long-term value to the health care system. Conclusion The TEAM-HF Cost-Effectiveness Model provides researchers and providers with a tool for conducting long-term cost-effectiveness analyses of disease management programs in heart failure. PMID:26542504
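    The paired-cohort structure described can be sketched as below. The survival, cost, and utility inputs are placeholders, not the SHFM-based equations the TEAM-HF model actually uses; the point is the shape of the calculation, discounted costs and QALYs for paired virtual cohorts reduced to an incremental cost-effectiveness ratio.

```python
import random

def simulate_cohort(rng, annual_cost, annual_mortality, utility, horizon=10):
    """One virtual cohort: crude per-patient discounted costs and QALYs.
    All parameter values are illustrative, not SHFM outputs."""
    disc = 0.03
    alive, costs, qalys = 1.0, 0.0, 0.0
    for year in range(horizon):
        df = 1.0 / (1.0 + disc) ** year
        costs += alive * annual_cost * df
        qalys += alive * utility * df
        # random draw perturbs mortality to mimic sampling uncertainty
        alive *= 1.0 - min(1.0, max(0.0, rng.gauss(annual_mortality, 0.01)))
    return costs, qalys

def icer(n_pairs=1000, seed=7):
    """Incremental cost per QALY over paired usual-care / program cohorts."""
    rng = random.Random(seed)
    d_cost = d_qaly = 0.0
    for _ in range(n_pairs):
        c0, q0 = simulate_cohort(rng, 12_000, 0.12, 0.70)  # usual care
        c1, q1 = simulate_cohort(rng, 13_500, 0.10, 0.74)  # DM program
        d_cost += c1 - c0
        d_qaly += q1 - q0
    return d_cost / d_qaly  # $ per QALY gained
```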

  5. Toward improving hurricane forecasts using the JPL Tropical Cyclone Information System (TCIS): A framework to address the issues of Big Data

    NASA Astrophysics Data System (ADS)

    Hristova-Veleva, S. M.; Boothe, M.; Gopalakrishnan, S.; Haddad, Z. S.; Knosp, B.; Lambrigtsen, B.; Li, P.; montgomery, M. T.; Niamsuwan, N.; Tallapragada, V. S.; Tanelli, S.; Turk, J.; Vukicevic, T.

    2013-12-01

    Accurate forecasting of extreme weather requires the use of both regional models and global General Circulation Models (GCMs). The regional models have higher resolution and more accurate physics - two critical components needed for properly representing the key convective processes. GCMs, on the other hand, have better depiction of the large-scale environment and, thus, are necessary for properly capturing the important scale interactions. But how can we evaluate the models, understand their shortcomings, and improve them? Satellite observations can provide invaluable information. And this is where the issues of Big Data come in: satellite observations are very complex and varied, while model forecasts are very voluminous. We are developing a system - TCIS - that addresses the issues of model evaluation and process understanding with the goal of improving the accuracy of hurricane forecasts. This NASA/ESTO/AIST-funded project aims at bringing satellite/airborne observations and model forecasts into a common system and developing on-line tools for joint analysis. To properly evaluate the models we go beyond the comparison of the geophysical fields. We input the model fields into instrument simulators (NEOS3, CRTM, etc.) and compute synthetic observations for a more direct comparison to the observed parameters. In this presentation we will start by describing the scientific questions. We will then outline our current framework to provide fusion of models and observations. Next, we will illustrate how the system can be used to evaluate several models (HWRF, GFS, ECMWF) by applying a couple of our analysis tools to several hurricanes observed during the 2013 season. Finally, we will outline our future plans. Our goal is to go beyond the image comparison and point-by-point statistics, by focusing instead on understanding multi-parameter correlations and providing robust statistics.
By developing on-line analysis tools, our framework will allow for consistent model evaluation, providing results that are much more robust than those produced by case studies - the current paradigm imposed by the Big Data issues (voluminous data and incompatible analysis tools). We believe that this collaborative approach, with contributions of models, observations and analysis approaches used by the research and operational communities, will help untangle the complex interactions that lead to hurricane genesis and rapid intensity changes - two processes that still pose many unanswered questions. The developed framework for evaluation of the global models will also have implications for the improvement of the climate models, which output only a limited amount of information making it difficult to evaluate them. Our TCIS will help by investigating the GCMs under current weather scenarios and with much more detailed model output, making it possible to compare the models to multiple observed parameters to help narrow down the uncertainty in their performance. This knowledge could then be transferred to the climate models to lower the uncertainty in their predictions. The work described here was performed at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.

  6. Bio-AIMS Collection of Chemoinformatics Web Tools based on Molecular Graph Information and Artificial Intelligence Models.

    PubMed

    Munteanu, Cristian R; Gonzalez-Diaz, Humberto; Garcia, Rafael; Loza, Mabel; Pazos, Alejandro

    2015-01-01

    Encoding molecular information into molecular descriptors is the first step in in silico chemoinformatics methods in drug design. Machine learning methods provide a means of finding prediction models for specific biological properties of molecules. These models connect molecular structure information, such as atom connectivity (molecular graphs) or physical-chemical properties of an atom or group of atoms, to the molecular activity (Quantitative Structure-Activity Relationship, QSAR). Due to the complexity of proteins, predicting their activity is a complicated task and interpreting the models is more difficult. The current review presents a series of 11 prediction models for proteins, implemented as free Web tools on an Artificial Intelligence Model Server in Biosciences, Bio-AIMS (http://bio-aims.udc.es/TargetPred.php). Six tools predict protein activity, two models evaluate drug-protein target interactions, and the other three calculate protein-protein interactions. The input information is based on the protein 3D structure for nine models, the 1D peptide amino acid sequence for three tools, and drug SMILES formulas for two servers. The molecular graph descriptor-based machine learning models could be useful tools for in silico screening of new peptides/proteins as future drug targets for specific treatments.
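    The first step named above, turning a molecular graph into a descriptor, can be illustrated with the classic Randic connectivity index feeding a toy linear QSAR model. The coefficients and the example molecule are invented for illustration; these are not the Bio-AIMS models.

```python
import math

def randic_index(adjacency):
    """Randic connectivity index: sum over edges of 1/sqrt(deg(u)*deg(v))."""
    deg = [sum(row) for row in adjacency]
    total = 0.0
    n = len(adjacency)
    for u in range(n):
        for v in range(u + 1, n):
            if adjacency[u][v]:
                total += 1.0 / math.sqrt(deg[u] * deg[v])
    return total

def predict_activity(adjacency, intercept=-1.0, slope=0.8):
    """Toy linear QSAR: activity estimated from a single graph descriptor.
    Intercept and slope are hypothetical, not fitted values."""
    return intercept + slope * randic_index(adjacency)

# Butane as a hydrogen-suppressed path graph C1-C2-C3-C4
BUTANE = [
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
]
```

    Real QSAR models use many such descriptors and a trained regressor, but the graph-to-number step is the same.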

  7. What Lies Beneath? An Evaluation of Rapid Assessment Tools for Management of Hull Fouling

    NASA Astrophysics Data System (ADS)

    Clarke Murray, Cathryn; Therriault, Thomas W.; Pakhomov, Evgeny

    2013-08-01

    Despite an increased understanding of marine invasions, non-indigenous species (NIS) continue to be redistributed at both global and regional scales. Since prevention is an important element of NIS programs, monitoring vectors responsible for NIS introductions and spread, such as hull fouling, has become a priority, and methods should be selected carefully to balance accuracy, time, and cost. Two common fouling assessment tools for the marine recreational boating vector were evaluated for accuracy using a traditional underwater SCUBA survey in coastal British Columbia: a dockside level-of-fouling assessment and a behavioral questionnaire model. Results showed that, although rapid, dockside assessments did not provide an accurate assessment of the fouling present below the surface, at least not in this region. In contrast, a questionnaire-based model using four easily obtained variables (boat type, age of antifouling paint, storage type, and occurrence of long distance trips) reliably identified boats carrying macrofouling species, a proxy for risk of NIS transport. Once validated, this fouling model tool could be applied in border inspection or quarantine situations where decisions must be made quickly. Further development and refinement of rapid assessment tools would improve our ability to prevent new introductions and manage spread of existing invasive species.
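    A questionnaire-based risk model of this kind is commonly a logistic regression over the four variables named. The sketch below uses invented coefficients; the study's fitted values are not reproduced here.

```python
import math

# Hypothetical coefficients for the four questionnaire variables described
# in the abstract; NOT the values fitted in the study.
COEF = {
    "intercept": -2.0,
    "is_sailboat": 0.8,           # boat type
    "paint_age_years": 0.6,       # age of antifouling paint
    "stored_in_water": 1.2,       # storage type
    "long_distance_trips": 0.9,   # occurrence of long distance trips
}

def macrofouling_probability(boat):
    """Logistic model: P(macrofouling) from four easily obtained variables."""
    z = COEF["intercept"]
    z += COEF["is_sailboat"] * boat["is_sailboat"]
    z += COEF["paint_age_years"] * boat["paint_age_years"]
    z += COEF["stored_in_water"] * boat["stored_in_water"]
    z += COEF["long_distance_trips"] * boat["long_distance_trips"]
    return 1.0 / (1.0 + math.exp(-z))

def flag_for_inspection(boat, threshold=0.5):
    """Border-inspection style decision rule on the predicted probability."""
    return macrofouling_probability(boat) >= threshold
```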

  8. What lies beneath? An evaluation of rapid assessment tools for management of hull fouling.

    PubMed

    Clarke Murray, Cathryn; Therriault, Thomas W; Pakhomov, Evgeny

    2013-08-01

    Despite an increased understanding of marine invasions, non-indigenous species (NIS) continue to be redistributed at both global and regional scales. Since prevention is an important element of NIS programs, monitoring vectors responsible for NIS introductions and spread, such as hull fouling, has become a priority, and methods should be selected carefully to balance accuracy, time, and cost. Two common fouling assessment tools for the marine recreational boating vector were evaluated for accuracy using a traditional underwater SCUBA survey in coastal British Columbia: a dockside level-of-fouling assessment and a behavioral questionnaire model. Results showed that, although rapid, dockside assessments did not provide an accurate assessment of the fouling present below the surface, at least not in this region. In contrast, a questionnaire-based model using four easily obtained variables (boat type, age of antifouling paint, storage type, and occurrence of long distance trips) reliably identified boats carrying macrofouling species, a proxy for risk of NIS transport. Once validated, this fouling model tool could be applied in border inspection or quarantine situations where decisions must be made quickly. Further development and refinement of rapid assessment tools would improve our ability to prevent new introductions and manage spread of existing invasive species.

  9. Water flow algorithm decision support tool for travelling salesman problem

    NASA Astrophysics Data System (ADS)

    Kamarudin, Anis Aklima; Othman, Zulaiha Ali; Sarim, Hafiz Mohd

    2016-08-01

    This paper discusses the role of a Decision Support Tool (DST) for the Travelling Salesman Problem (TSP) in helping researchers working in the same area obtain better results from a proposed algorithm. A study was conducted using the Rapid Application Development (RAD) model as the methodology, which includes requirement planning, user design, construction, and cutover. A Water Flow Algorithm (WFA) with an improved initialization technique is used as the proposed algorithm in this study, and its effectiveness is evaluated against TSP cases. The DST evaluation consists of usability testing covering system use, quality of information, quality of interface, and overall satisfaction. The evaluation is needed to determine whether the tool can assist users in making decisions to solve TSP problems with the proposed algorithm. Statistical results show the tool's ability to help researchers conduct experiments on the WFA with improved TSP initialization.
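    The TSP evaluation loop such a tool supports can be sketched with a tour-length objective and a greedy nearest-neighbour construction, one common initialization technique. The WFA itself is not implemented here; this only shows the objective and an initializer a metaheuristic would start from.

```python
import math

def tour_length(cities, tour):
    """Total length of a closed tour over 2-D city coordinates."""
    total = 0.0
    for i in range(len(tour)):
        x1, y1 = cities[tour[i]]
        x2, y2 = cities[tour[(i + 1) % len(tour)]]
        total += math.hypot(x2 - x1, y2 - y1)
    return total

def nearest_neighbour_tour(cities, start=0):
    """Greedy nearest-neighbour construction, a common TSP initialization."""
    unvisited = set(range(len(cities))) - {start}
    tour = [start]
    while unvisited:
        last = cities[tour[-1]]
        nxt = min(unvisited,
                  key=lambda c: math.hypot(cities[c][0] - last[0],
                                           cities[c][1] - last[1]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

# Four cities on a unit square: the optimal closed tour has length 4.0
SQUARE = [(0, 0), (1, 0), (1, 1), (0, 1)]
```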

  10. Information quality-control model

    NASA Technical Reports Server (NTRS)

    Vincent, D. A.

    1971-01-01

    Model serves as graphic tool for estimating complete product objectives from limited input information, and is applied to cost estimations, product-quality evaluations, and effectiveness measurements for manpower resources allocation. Six product quality levels are defined.

  11. AUTOMATIC CALIBRATION OF A STOCHASTIC-LAGRANGIAN TRANSPORT MODEL (SLAM)

    EPA Science Inventory

    Numerical models are a useful tool in evaluating and designing NAPL remediation systems. Traditional constitutive finite difference and finite element models are complex and expensive to apply. For this reason, this paper presents the application of a simplified stochastic-Lagran...

  12. Travel demand modeling for the small and medium sized MPOs in Illinois.

    DOT National Transportation Integrated Search

    2011-09-01

    Travel demand modeling is an important tool in the transportation planning community. It helps forecast travel characteristics into the future at various planning levels such as state, region and corridor. Using travel demand modeling to evaluate...

  13. A Simulation Tool for Dynamic Contrast Enhanced MRI

    PubMed Central

    Mauconduit, Franck; Christen, Thomas; Barbier, Emmanuel Luc

    2013-01-01

    The quantification of bolus-tracking MRI techniques remains challenging. The acquisition usually relies on one contrast and the analysis on a simplified model of the various phenomena that arise within a voxel, leading to inaccurate perfusion estimates. To evaluate how simplifications in the interstitial model impact perfusion estimates, we propose a numerical tool to simulate the MR signal provided by a dynamic contrast enhanced (DCE) MRI experiment. Our model encompasses the intrinsic T1 and T2 relaxations, the magnetic field perturbations induced by susceptibility interfaces (vessels and cells), the diffusion of the water protons, the blood flow, the permeability of the vessel wall to the contrast agent (CA) and the constrained diffusion of the CA within the voxel. The blood compartment is modeled as a uniform compartment. The different blocks of the simulation are validated and compared to classical models. The impact of the CA diffusivity on the permeability and blood volume estimates is evaluated. Simulations demonstrate that the CA diffusivity slightly impacts the permeability estimates (for classical blood flow and CA diffusion values). The effect of long echo times is investigated. Simulations show that DCE-MRI performed with a long echo time may already lead to significant underestimation of the blood volume (up to 30% lower for brain tumor permeability values). The potential and the versatility of the proposed implementation are evaluated by running the simulation with realistic vascular geometry obtained from two-photon microscopy and with impermeable cells in the extravascular environment. In conclusion, the proposed simulation tool describes DCE-MRI experiments and may be used to evaluate and optimize acquisition and processing strategies. PMID:23516414
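    For orientation, the vessel-wall permeability block of such a simulator is often compared against the standard Tofts model, in which tissue CA concentration is the arterial input convolved with an exponential leakage kernel. The crude fixed-step sketch below uses an invented arterial input function and illustrative Ktrans/ve values; it is not the authors' simulator.

```python
import math

def aif(t):
    """Toy arterial input function: a single decaying bolus (illustrative)."""
    return 0.0 if t < 0 else 5.0 * t * math.exp(-t / 0.5)

def tofts_concentration(t, ktrans=0.25, ve=0.2, dt=0.01):
    """Standard Tofts model, discretized with a fixed step:
    Ct(t) = Ktrans * integral_0^t aif(u) * exp(-(Ktrans/ve) * (t - u)) du."""
    kep = ktrans / ve
    total, u = 0.0, 0.0
    while u < t:
        total += aif(u) * math.exp(-kep * (t - u)) * dt
        u += dt
    return ktrans * total
```

    The abstract's point is precisely that such simplified interstitial models can bias perfusion estimates, which is what the full voxel-scale simulation is built to quantify.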

  14. Latest Community Coordinated Modeling Center (CCMC) services and innovative tools supporting the space weather research and operational communities.

    NASA Astrophysics Data System (ADS)

    Mendoza, A. M. M.; Rastaetter, L.; Kuznetsova, M. M.; Mays, M. L.; Chulaki, A.; Shim, J. S.; MacNeice, P. J.; Taktakishvili, A.; Collado-Vega, Y. M.; Weigand, C.; Zheng, Y.; Mullinix, R.; Patel, K.; Pembroke, A. D.; Pulkkinen, A. A.; Boblitt, J. M.; Bakshi, S. S.; Tsui, T.

    2017-12-01

    The Community Coordinated Modeling Center (CCMC), with the fundamental goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research, has been serving as an integral hub for over 15 years, providing invaluable resources to both the space weather scientific and operational communities. CCMC has developed and provided the scientific community with innovative web-based point-of-access tools, including: the Runs-On-Request System, providing unprecedented global access to the largest collection of state-of-the-art solar and space physics models; Integrated Space Weather Analysis (iSWA), a powerful dissemination system for space weather information; advanced online visualization and analysis tools for more accurate interpretation of model results; standard data formats for simulation data downloads; and mobile apps to view space weather data anywhere. In addition to supporting research and performing model evaluations, CCMC also supports space science education by hosting summer students through local universities. In this poster, we will showcase CCMC's latest innovative tools and services, including those that have revolutionized the way we do research and improved our operational space weather capabilities. CCMC's free tools and resources are all publicly available online (http://ccmc.gsfc.nasa.gov).

  15. An Intelligent Crop Planning Tool for Controlled Ecological Life Support Systems

    NASA Technical Reports Server (NTRS)

    Whitaker, Laura O.; Leon, Jorge

    1996-01-01

    This paper describes a crop planning tool developed for the Controlled Ecological Life Support Systems (CELSS) project which is in the research phases at various NASA facilities. The Crop Planning Tool was developed to assist in the understanding of the long term applications of a CELSS environment. The tool consists of a crop schedule generator as well as a crop schedule simulator. The importance of crop planning tools such as the one developed is discussed. The simulator is outlined in detail while the schedule generator is touched upon briefly. The simulator consists of data inputs, plant and human models, and various other CELSS activity models such as food consumption and waste regeneration. The program inputs such as crew data and crop states are discussed. References are included for all nominal parameters used. Activities including harvesting, planting, plant respiration, and human respiration are discussed using mathematical models. Plans provided to the simulator by the plan generator are evaluated for their 'fitness' to the CELSS environment with an objective function based upon daily reservoir levels. Sample runs of the Crop Planning Tool and future needs for the tool are detailed.
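    The reservoir-based objective function described above can be sketched as a penalty on daily excursions outside target bands. The reservoir names, bands, and quadratic penalty form are illustrative assumptions, not the actual CELSS simulator's objective.

```python
# Hypothetical target bands for two CELSS reservoirs (illustrative only).
TARGETS = {"food_kg": (50.0, 150.0), "o2_kg": (80.0, 120.0)}

def plan_fitness(daily_levels):
    """Score a candidate crop schedule: summed squared excursion of each
    day's reservoir levels outside its target band. Lower is better.

    daily_levels: list of {reservoir_name: level} dicts, one per simulated day.
    """
    penalty = 0.0
    for day in daily_levels:
        for name, (lo, hi) in TARGETS.items():
            level = day[name]
            if level < lo:
                penalty += (lo - level) ** 2
            elif level > hi:
                penalty += (level - hi) ** 2
    return penalty
```

    A schedule generator would propose plans, the simulator would produce the daily levels, and a score like this would rank the plans.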

  16. Cost effectiveness of pediatric pneumococcal conjugate vaccines: a comparative assessment of decision-making tools.

    PubMed

    Chaiyakunapruk, Nathorn; Somkrua, Ratchadaporn; Hutubessy, Raymond; Henao, Ana Maria; Hombach, Joachim; Melegaro, Alessia; Edmunds, John W; Beutels, Philippe

    2011-05-12

    Several decision support tools have been developed to aid policymaking regarding the adoption of pneumococcal conjugate vaccine (PCV) into national pediatric immunization programs. The lack of critical appraisal of these tools makes it difficult for decision makers to understand and choose between them. With the aim of guiding policymakers on their optimal use, we compared publicly available decision-making tools in relation to their methods, influential parameters and results. The World Health Organization (WHO) requested access to several publicly available cost-effectiveness (CE) tools for PCV from both public and private provenance. All tools were critically assessed according to the WHO's guide for economic evaluations of immunization programs. Key attributes and characteristics were compared and a series of sensitivity analyses was performed to determine the main drivers of the results. The results were compared based on a standardized set of input parameters and assumptions. Three cost-effectiveness modeling tools were provided, including two cohort-based models (the Pan-American Health Organization (PAHO) ProVac Initiative's TriVac, and PneumoADIP) and one population-based model (GlaxoSmithKline's SUPREMES). They all compared the introduction of PCV into a national pediatric immunization program with no PCV use. The models differed in terms of model attributes, structure, and data requirements, but captured a similar range of diseases. Herd effects were estimated using different approaches in each model. The main driving parameters were vaccine efficacy against pneumococcal pneumonia, vaccine price, vaccine coverage, serotype coverage and disease burden. With a standardized set of input parameters developed for cohort modeling, TriVac and PneumoADIP produced similar incremental costs, health outcomes, and incremental cost-effectiveness ratios.
Vaccine cost (dose price and number of doses), vaccine efficacy and the epidemiology of critical endpoints (for example, the incidence of pneumonia and the distribution of serotypes causing pneumonia) were influential parameters in the models we compared. Understanding the differences and similarities of such CE tools through regular comparisons could render decision-making processes in different countries more efficient and provide guiding information for further clinical and epidemiological research. A tool comparison exercise using standardized data sets can help model developers be more transparent about their model structure and assumptions, and provide analysts and decision makers with a more in-depth view of the disease dynamics. Adherence to the WHO guide of economic evaluations of immunization programs may also facilitate this process. Please see related article: http://www.biomedcentral.com/1741-7007/9/55.
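    The core cohort calculation these tools share, an incremental cost per DALY averted driven by vaccine price, coverage, efficacy, serotype coverage, and disease burden, can be sketched with placeholder inputs. These values are not from TriVac, PneumoADIP, or SUPREMES, and the sketch omits herd effects and non-pneumonia endpoints.

```python
def pcv_icer(cohort=100_000, price_per_dose=7.0, doses=3, coverage=0.9,
             efficacy=0.6, serotype_coverage=0.75, inc_pneumonia=0.05,
             cost_per_case=150.0, dalys_per_case=0.08):
    """Back-of-envelope cohort model: net program cost per DALY averted.
    All default inputs are placeholders, not a standardized WHO parameter set."""
    vaccinated = cohort * coverage
    program_cost = vaccinated * price_per_dose * doses
    cases_base = cohort * inc_pneumonia
    cases_averted = cases_base * coverage * efficacy * serotype_coverage
    net_cost = program_cost - cases_averted * cost_per_case  # minus treatment savings
    dalys_averted = cases_averted * dalys_per_case
    return net_cost / dalys_averted  # cost per DALY averted
```

    Re-running the function with a different dose price is the kind of one-way sensitivity analysis the comparison exercise relied on.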

  17. Integrated modeling tool for performance engineering of complex computer systems

    NASA Technical Reports Server (NTRS)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  18. FACET: Future ATM Concepts Evaluation Tool

    NASA Technical Reports Server (NTRS)

    Bilmoria, Karl D.; Banavar, Sridhar; Chatterji, Gano B.; Sheth, Kapil S.; Grabbe, Shon

    2000-01-01

    FACET (Future ATM Concepts Evaluation Tool) is an Air Traffic Management research tool being developed at the NASA Ames Research Center. This paper describes the design, architecture and functionalities of FACET. The purpose of FACET is to provide a simulation environment for exploration, development and evaluation of advanced ATM concepts. Examples of these concepts include new ATM paradigms such as Distributed Air-Ground Traffic Management, airspace redesign and new Decision Support Tools (DSTs) for controllers working within the operational procedures of the existing air traffic control system. FACET is currently capable of modeling system-wide en route airspace operations over the contiguous United States. Airspace models (e.g., Center/sector boundaries, airways, locations of navigation aids and airports) are available from databases. A core capability of FACET is the modeling of aircraft trajectories. Using round-earth kinematic equations, aircraft can be flown along flight plan routes or great circle routes as they climb, cruise and descend according to their individual aircraft-type performance models. Performance parameters (e.g., climb/descent rates and speeds, cruise speeds) are obtained from data table lookups. Heading, airspeed and altitude-rate dynamics are also modeled. Additional functionalities will be added as necessary for specific applications. FACET software is written in Java and C programming languages. It is platform-independent, and can be run on a variety of computers. FACET has been designed with a modular software architecture to enable rapid integration of research prototype implementations of new ATM concepts. There are several advanced ATM concepts that are currently being implemented in FACET: airborne separation assurance, dynamic density predictions, airspace redesign (re-sectorization), benefits of a controller DST for direct-routing, and the integration of commercial space transportation system operations into the U.S.
National Airspace System (NAS).
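    The round-earth great-circle propagation described can be sketched as a single trajectory step using a generic spherical destination-point formula. This is not FACET's actual Java/C implementation; angles are in degrees, with course measured clockwise from north.

```python
import math

R_EARTH_NM = 3440.065  # mean Earth radius in nautical miles

def step_great_circle(lat, lon, course, groundspeed_kts, dt_hours):
    """Advance an aircraft along a great circle by one time step
    (simplified round-earth kinematics; no climb/descent modeled)."""
    d = groundspeed_kts * dt_hours / R_EARTH_NM  # angular distance, radians
    lat1, lon1, crs = map(math.radians, (lat, lon, course))
    lat2 = math.asin(math.sin(lat1) * math.cos(d) +
                     math.cos(lat1) * math.sin(d) * math.cos(crs))
    lon2 = lon1 + math.atan2(math.sin(crs) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)
```

    A full trajectory is produced by iterating this step while updating speed and altitude from the aircraft-type performance tables.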

  19. Next Generation Electromagnetic Pump Analysis Tools (PLM DOC-0005-2188). Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stregy, Seth; Dasilva, Ana; Yilmaz, Serkan

    2015-10-29

    This report provides the broad historical review of EM Pump development and details of MATRIX development under this project. This report summarizes the efforts made to modernize the legacy performance models used in previous EM Pump designs and the improvements made to the analysis tools. This report provides information on Tasks 1, 3, and 4 of the entire project. The research for Task 4 builds upon Task 1: Update EM Pump Databank and Task 3: Modernize the Existing EM Pump Analysis Model, which are summarized within this report. Where research for Task 2: Insulation Materials Development and Evaluation identified parameters applicable to the analysis model with Task 4, the analysis code was updated, and analyses were made for additional materials. The important design variables for the manufacture and operation of an EM Pump that the model improvement can evaluate are: space constraints; voltage capability of insulation system; maximum flux density through iron; flow rate and outlet pressure; efficiency and manufacturability. The development of the next-generation EM Pump analysis tools during this two-year program provides information in three broad areas: status of analysis model development; improvements made to older simulations; and comparison to experimental data.

  20. What Did the Teachers Think? Teachers' Responses to the Use of Value-Added Modeling as a Tool for Evaluating Teacher Effectiveness

    ERIC Educational Resources Information Center

    Lee, Linda

    2011-01-01

    The policy discourse on improving student achievement has shifted from student outcomes to evaluating teacher effectiveness using standardized test scores. A major urban newspaper released a public database that ranked teachers' effectiveness using Value-Added Modeling. Teachers, who are generally marginalized, were given the…

  1. Evaluation of CMIP5 Ability to Reproduce 20th Century Regional Trends in Surface Air Temperature and Precipitation over CONUS

    NASA Astrophysics Data System (ADS)

    Lee, J.; Waliser, D. E.; Lee, H.; Loikith, P. C.; Kunkel, K.

    2017-12-01

    Monitoring temporal changes in key climate variables, such as surface air temperature and precipitation, is an integral part of the ongoing efforts of the United States National Climate Assessment (NCA). Climate models participating in CMIP5 provide future trends for four different emissions scenarios. In order to have confidence in the future projections of surface air temperature and precipitation, it is crucial to evaluate the ability of CMIP5 models to reproduce observed trends for three different time periods (1895-1939, 1940-1979, and 1980-2005). Towards this goal, trends in surface air temperature and precipitation obtained from the NOAA nClimGrid 5 km gridded station observation-based product are compared during all three time periods to the 206 CMIP5 historical simulations from 48 unique GCMs and their multi-model ensemble (MME) for NCA-defined climate regions during summer (JJA) and winter (DJF). This evaluation quantitatively examines the biases of simulated trends of the spatially averaged temperature and precipitation in the NCA climate regions. The CMIP5 MME reproduces historical surface air temperature trends for JJA for all time periods and all regions, except the Northern Great Plains from 1895-1939 and the Southeast during 1980-2005. Likewise, for DJF, the MME reproduces historical surface air temperature trends across all time periods over all regions except the Southeast from 1895-1939 and the Midwest during 1940-1979. The Regional Climate Model Evaluation System (RCMES), an analysis tool which supports the NCA by providing access to data and tools for regional climate model validation, facilitates the comparisons between the models and observations. The RCMES Toolkit is designed to assist in the analysis of climate variables and in the evaluation of climate projection models to support decision-making processes.
This tool is used in conjunction with the above analysis and results will be presented to demonstrate its capability to access observation and model datasets, calculate evaluation metrics, and visualize the results. Several other examples of the RCMES capabilities can be found at https://rcmes.jpl.nasa.gov.
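    The trend-bias metric described above reduces to fitting a least-squares slope to each spatially averaged series and differencing the model and observed slopes. A sketch with synthetic series (real analyses would use the nClimGrid and CMIP5 data):

```python
def linear_trend(values):
    """OLS slope per time step for an evenly spaced series."""
    n = len(values)
    t_mean = (n - 1) / 2.0
    v_mean = sum(values) / n
    num = sum((t - t_mean) * (v - v_mean) for t, v in enumerate(values))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

def trend_bias(model_series, obs_series):
    """Model-minus-observed trend, e.g. in K per decade if steps are decades."""
    return linear_trend(model_series) - linear_trend(obs_series)
```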

  2. Software Tools to Support Research on Airport Departure Planning

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul

    2003-01-01

    A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.

  3. Web-based applications for building, managing and analysing kinetic models of biological systems.

    PubMed

    Lee, Dong-Yup; Saha, Rajib; Yusufi, Faraaz Noor Khan; Park, Wonjun; Karimi, Iftekhar A

    2009-01-01

    Mathematical modelling and computational analysis play an essential role in improving our capability to elucidate the functions and characteristics of complex biological systems such as metabolic, regulatory and cell signalling pathways. The modelling and concomitant simulation render it possible to predict the cellular behaviour of systems under various genetically and/or environmentally perturbed conditions. This motivates systems biologists/bioengineers/bioinformaticians to develop new tools and applications, allowing non-experts to easily conduct such modelling and analysis. However, among a multitude of systems biology tools developed to date, only a handful of projects have adopted a web-based approach to kinetic modelling. In this report, we evaluate the capabilities and characteristics of current web-based tools in systems biology and identify desirable features, limitations and bottlenecks for further improvements in terms of usability and functionality. A short discussion on software architecture issues involved in web-based applications and the approaches taken by existing tools is included for those interested in developing their own simulation applications.

  4. Systematic review of fall risk screening tools for older patients in acute hospitals.

    PubMed

    Matarese, Maria; Ivziku, Dhurata; Bartolozzi, Francesco; Piredda, Michela; De Marinis, Maria Grazia

    2015-06-01

    To determine the most accurate fall risk screening tools for predicting falls among patients aged 65 years or older admitted to acute care hospitals. Falls represent a serious problem in older inpatients due to the potential physical, social, psychological and economic consequences. Older inpatients present with risk factors associated with age-related physiological and psychological changes as well as multiple morbidities. Thus, fall risk screening tools for older adults should include these specific risk factors. There are no published recommendations addressing which tools are appropriate for older hospitalized adults. Systematic review. The MEDLINE, CINAHL and Cochrane electronic databases were searched for studies published between January 1981 and April 2013. Only prospective validation studies reporting sensitivity and specificity values were included. The recommendations of the Cochrane Handbook of Diagnostic Test Accuracy Reviews were followed. Three fall risk assessment tools were evaluated in seven articles. Due to the limited number of studies, meta-analysis was carried out only for the STRATIFY and the Hendrich Fall Risk Model II. In the combined analysis, the Hendrich Fall Risk Model II demonstrated higher sensitivity than the STRATIFY, while the STRATIFY showed higher specificity. For both tools, the Youden index indicated low prognostic accuracy. The identified tools do not demonstrate predictive values as high as needed for identifying older inpatients at risk for falls. For this reason, no tool can be recommended for fall detection. More research is needed to evaluate fall risk screening tools for older inpatients. © 2014 John Wiley & Sons Ltd.
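The Youden index used in the combined analysis is simply J = sensitivity + specificity − 1, ranging from −1 to 1, with values near 0 indicating a tool barely better than chance. A minimal illustration with hypothetical sensitivity/specificity figures (not the review's pooled estimates):

```python
# Youden index: the summary statistic the review uses to judge prognostic
# accuracy. The input values below are illustrative, not the pooled results.

def youden_index(sensitivity, specificity):
    return sensitivity + specificity - 1.0

j_stratify = youden_index(0.67, 0.65)   # hypothetical: higher specificity
j_hendrich = youden_index(0.78, 0.54)   # hypothetical: higher sensitivity
print(round(j_stratify, 2), round(j_hendrich, 2))  # both low -> poor accuracy
```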

  5. Obs4MIPS: Satellite Observations for Model Evaluation

    NASA Astrophysics Data System (ADS)

    Ferraro, R.; Waliser, D. E.; Gleckler, P. J.

    2017-12-01

    This poster will review the current status of the obs4MIPs project, whose purpose is to provide a limited collection of well-established and documented datasets for comparison with Earth system models (https://www.earthsystemcog.org/projects/obs4mips/). These datasets have been reformatted to correspond with the CMIP5 model output requirements, and include technical documentation specifically targeted for their use in model output evaluation. The project holdings now exceed 120 datasets with observations that directly correspond to CMIP5 model output variables, with new additions in response to the CMIP6 experiments. With the growth in climate model output data volume, it is increasingly difficult to bring the model output and the observations together to do evaluations. The positioning of the obs4MIPs datasets within the Earth System Grid Federation (ESGF) allows for the use of currently available and planned online tools within the ESGF to perform analysis using model output and observational datasets without necessarily downloading everything to a local workstation. This past year, obs4MIPs has updated its submission guidelines to align closely with changes in the CMIP6 experiments, and is implementing additional indicators and ancillary data to allow users to more easily determine the efficacy of an obs4MIPs dataset for specific evaluation purposes. This poster will present the new guidelines and indicators, and update the list of current obs4MIPs holdings and their connection to the ESGF evaluation and analysis tools currently available and being developed for the CMIP6 experiments.

  6. NUMERICAL MODELS AS ENABLING TOOLS FOR TIDAL-STREAM ENERGY EXTRACTION AND ENVIRONMENTAL IMPACT ASSESSMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Zhaoqing; Wang, Taiping

    This paper presents a modeling study conducted to evaluate tidal-stream energy extraction and its associated potential environmental impacts using a three-dimensional unstructured-grid coastal ocean model, which was coupled with a water-quality model and a tidal-turbine module.

  7. State-of-the-Art Resources (SOAR) for Software Vulnerability Detection, Test, and Evaluation

    DTIC Science & Technology

    2014-07-01

    preclude in-depth analysis, and widespread use of a Software-as-a-Service (SaaS) model that limits data availability and application to DoD systems...provide mobile application analysis using a Software-as-a-Service (SaaS) model. In this case, any software to be analyzed must be sent to the...tools are only available through a SaaS model. The widespread use of a Software-as-a-Service (SaaS) model as a sole evaluation model limits data

  8. The Personnel Effectiveness Grid (PEG): A New Tool for Estimating Personnel Department Effectiveness

    ERIC Educational Resources Information Center

    Petersen, Donald J.; Malone, Robert L.

    1975-01-01

    Examines the difficulties inherent in attempting a formal personnel evaluation system, the major formal methods currently used for evaluating personnel department accountabilities, some parameters that should be part of a valid evaluation program, and a model for conducting the evaluation. (Available from Office of Publications, Graduate School of…

  9. The Handbook of Leadership Development Evaluation

    ERIC Educational Resources Information Center

    Hannum, Kelly M., Ed.; Martineau, Jennifer W., Ed.; Reinelt, Claire, Ed.

    2007-01-01

    With the increase in the number of organizational leadership development programs, there is a pressing need for evaluation to answer important questions, improve practice, and inform decisions. The Handbook is a comprehensive resource filled with examples, tools, and the most innovative models and approaches designed to evaluate leadership…

  10. NREL Suite of Tools for PV and Storage Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elgqvist, Emma M; Salasovich, James A

    Many different factors such as the solar resource, technology costs and incentives, utility cost and consumption, space available, and financial parameters impact the technical and economic potential of a PV project. NREL has developed techno-economic modeling tools that can be used to evaluate PV projects at a site.

  11. Evaluation of interactive highway safety design model crash prediction tools for two-lane rural roads on Kansas Department of Transportation projects.

    DOT National Transportation Integrated Search

    2014-01-01

    Historically, project-level decisions for the selection of highway features to promote safety were : based on either engineering judgment or adherence to accepted national guidance. These tools have allowed : highway designers to produce facilities t...

  12. Evaluation of interactive highway safety design model crash prediction tools for two-lane rural roads on Kansas Department of Transportation projects : [technical summary].

    DOT National Transportation Integrated Search

    2014-01-01

    Historically, project-level decisions for the selection of highway features to promote safety were based on either engineering judgment or adherence to accepted national guidance. These tools have allowed highway designers to produce facilities that ...

  13. Marine and Hydrokinetic Research | Water Power | NREL

    Science.gov Websites

    Resource Characterization and Maps: NREL develops measurement systems, simulation tools, and web-based models and tools to evaluate the economic potential of power-generating devices for all technology ... Acceleration: NREL analysts study the potential impacts that developing a robust MHK market could have on ...

  14. PREDICTING LEVELS OF STRESS FROM BIOLOGICAL ASSESSMENT DATA: EMPIRICAL MODELS FROM THE EASTERN CORN BELT PLAINS

    EPA Science Inventory

    Biological assessment is becoming an increasingly popular tool in the evaluation of stream ecosystem integrity. However, little progress has been made to date in developing tools to relate assessment results to specific stressors. This paper continues the investigation of the f...

  15. Toxicity Evaluation of Engineered Nanomaterials: Risk Evaluation Tools (Phase 3 Studies)

    DTIC Science & Technology

    2012-01-01

    report. The second modeling approach was on quantitative structure-activity relationships (QSARs). A manuscript entitled "Connecting the dots: Towards...expands rapidly. We proposed two types of mechanisms of toxic action supported by the nano-QSAR model, which collectively govern the toxicity of the...interpretative nano-QSAR model describing toxicity of 18 nano-metal oxides to a HaCaT cell line as a model for dermal exposure. In result, by the comparison of

  16. The Preliminary Design of a Standardized Spacecraft Bus for Small Tactical Satellites (Volume 2)

    DTIC Science & Technology

    1996-11-01

    this requirement, conditions of the model need to be modified to provide some flexibility to the original solution set. In the business world this...time The mission modules modeled in the Modsat computer model are necessarily "generic" in nature to provide both flexibility in design evaluation and...methods employed during the study, the scope of the problem, the value system used to evaluate alternatives, tradeoff studies performed, modeling tools

  17. Launch Vehicle Propulsion Parameter Design Multiple Selection Criteria

    NASA Technical Reports Server (NTRS)

    Shelton, Joey Dewayne

    2004-01-01

    The optimization tool described herein addresses and emphasizes the use of computer tools to model a system and focuses on a concept development approach for a liquid hydrogen/liquid oxygen single-stage-to-orbit system, but more particularly the development of the optimized system using new techniques. This methodology uses new and innovative tools to run Monte Carlo simulations, genetic algorithm solvers, and statistical models in order to optimize a design concept. The concept launch vehicle and propulsion system were modeled and optimized to determine the best design for weight and cost by varying design and technology parameters. Uncertainty levels were applied using Monte Carlo Simulations and the model output was compared to the National Aeronautics and Space Administration Space Shuttle Main Engine. Several key conclusions are summarized here for the model results. First, the Gross Liftoff Weight and Dry Weight were 67% higher for the design case for minimization of Design, Development, Test and Evaluation cost when compared to the weights determined by the minimization of Gross Liftoff Weight case. In turn, the Design, Development, Test and Evaluation cost was 53% higher for optimized Gross Liftoff Weight case when compared to the cost determined by case for minimization of Design, Development, Test and Evaluation cost. Therefore, a 53% increase in Design, Development, Test and Evaluation cost results in a 67% reduction in Gross Liftoff Weight. Secondly, the tool outputs define the sensitivity of propulsion parameters, technology and cost factors and how these parameters differ when cost and weight are optimized separately. A key finding was that for a Space Shuttle Main Engine thrust level the oxidizer/fuel ratio of 6.6 resulted in the lowest Gross Liftoff Weight rather than at 5.2 for the maximum specific impulse, demonstrating the relationships between specific impulse, engine weight, tank volume and tank weight. 
Lastly, the optimum chamber pressure for Gross Liftoff Weight minimization was 2713 pounds per square inch as compared to 3162 for the Design, Development, Test and Evaluation cost optimization case. This chamber pressure range is close to 3000 pounds per square inch for the Space Shuttle Main Engine.

  18. Evaluation of land use regression models in Detroit, Michigan

    EPA Science Inventory

    Introduction: Land use regression (LUR) models have emerged as a cost-effective tool for characterizing exposure in epidemiologic health studies. However, little critical attention has been focused on validation of these models as a step toward temporal and spatial extension of ...

  19. Web-based tool for visualization of electric field distribution in deep-seated body structures and planning of electroporation-based treatments.

    PubMed

    Marčan, Marija; Pavliha, Denis; Kos, Bor; Forjanič, Tadeja; Miklavčič, Damijan

    2015-01-01

    Treatments based on electroporation are a new and promising approach to treating tumors, especially non-resectable ones. The success of the treatment is, however, heavily dependent on coverage of the entire tumor volume with a sufficiently high electric field. Ensuring complete coverage in the case of deep-seated tumors is not trivial and is best ensured by patient-specific treatment planning. The basis of the treatment planning process consists of two complex tasks: medical image segmentation, and numerical modeling and optimization. In addition to previously developed segmentation algorithms for several tissues (human liver, hepatic vessels, bone tissue and canine brain) and the algorithms for numerical modeling and optimization of treatment parameters, we developed a web-based tool to facilitate the translation of the algorithms and their application in the clinic. The developed web-based tool automatically builds a 3D model of the target tissue from the medical images uploaded by the user and then uses this 3D model to optimize treatment parameters. The tool enables the user to validate the results of the automatic segmentation and make corrections if necessary before delivering the final treatment plan. Evaluation of the tool was performed by five independent experts from four different institutions. During the evaluation, we gathered data concerning user experience and measured performance times for different components of the tool. Both user reports and performance times show a significant reduction in treatment-planning complexity and time consumption, from 1-2 days to a few hours. The presented web-based tool is intended to facilitate the treatment planning process and reduce the time needed for it. It is crucial for facilitating expansion of electroporation-based treatments in the clinic and ensuring reliable treatment for patients. 
The additional value of the tool is the possibility of easy upgrade and integration of modules with new functionalities as they are developed.

  20. Web-based tool for visualization of electric field distribution in deep-seated body structures and planning of electroporation-based treatments

    PubMed Central

    2015-01-01

    Background Treatments based on electroporation are a new and promising approach to treating tumors, especially non-resectable ones. The success of the treatment is, however, heavily dependent on coverage of the entire tumor volume with a sufficiently high electric field. Ensuring complete coverage in the case of deep-seated tumors is not trivial and is best ensured by patient-specific treatment planning. The basis of the treatment planning process consists of two complex tasks: medical image segmentation, and numerical modeling and optimization. Methods In addition to previously developed segmentation algorithms for several tissues (human liver, hepatic vessels, bone tissue and canine brain) and the algorithms for numerical modeling and optimization of treatment parameters, we developed a web-based tool to facilitate the translation of the algorithms and their application in the clinic. The developed web-based tool automatically builds a 3D model of the target tissue from the medical images uploaded by the user and then uses this 3D model to optimize treatment parameters. The tool enables the user to validate the results of the automatic segmentation and make corrections if necessary before delivering the final treatment plan. Results Evaluation of the tool was performed by five independent experts from four different institutions. During the evaluation, we gathered data concerning user experience and measured performance times for different components of the tool. Both user reports and performance times show a significant reduction in treatment-planning complexity and time consumption, from 1-2 days to a few hours. Conclusions The presented web-based tool is intended to facilitate the treatment planning process and reduce the time needed for it. It is crucial for facilitating expansion of electroporation-based treatments in the clinic and ensuring reliable treatment for patients. 
The additional value of the tool is the possibility of easy upgrade and integration of modules with new functionalities as they are developed. PMID:26356007

  1. Model-based surgical planning and simulation of cranial base surgery.

    PubMed

    Abe, M; Tabuchi, K; Goto, M; Uchino, A

    1998-11-01

    Plastic skull models of seven individual patients were fabricated by stereolithography from three-dimensional data based on computed tomography bone images. Skull models were utilized for neurosurgical planning and simulation in the seven patients with cranial base lesions that were difficult to remove. Surgical approaches and areas of craniotomy were evaluated using the fabricated skull models. In preoperative simulations, hand-made models of the tumors, major vessels and nerves were placed in the skull models. Step-by-step simulation of surgical procedures was performed using actual surgical tools. The advantages of using skull models to plan and simulate cranial base surgery include a better understanding of anatomic relationships, preoperative evaluation of the proposed procedure, increased understanding by the patient and family, and improved educational experiences for residents and other medical staff. The disadvantages of using skull models include the time and cost of making the models. The skull models provide a more realistic tool that is easier to handle than computer-graphic images. Surgical simulation using models facilitates difficult cranial base surgery and may help reduce surgical complications.

  2. A Comparative Analysis of Life-Cycle Assessment Tools for ...

    EPA Pesticide Factsheets

    We identified and evaluated five life-cycle assessment tools that community decision makers can use to assess the environmental and economic impacts of end-of-life (EOL) materials management options. The tools evaluated in this report are the waste reduction model (WARM), municipal solid waste-decision support tool (MSW-DST), solid waste optimization life-cycle framework (SWOLF), environmental assessment system for environmental technologies (EASETECH), and waste and resources assessment for the environment (WRATE). WARM, MSW-DST, and SWOLF were developed for US-specific materials management strategies, while WRATE and EASETECH were developed for European-specific conditions. All of the tools (with the exception of WARM) allow specification of a wide variety of parameters (e.g., materials composition and energy mix) to a varying degree, thus allowing users to model specific EOL materials management methods even outside the geographical domain for which they were originally intended. The flexibility to accept user-specified input for a large number of parameters increases the level of complexity and the skill set needed for using these tools. The tools were evaluated and compared based on a series of criteria, including general tool features, the scope of the analysis (e.g., materials and processes included), and the impact categories analyzed (e.g., climate change, acidification). A series of scenarios representing materials management problems currently relevant to c

  3. Measuring infrastructure: A key step in program evaluation and planning.

    PubMed

    Schmitt, Carol L; Glasgow, LaShawn; Lavinghouze, S Rene; Rieker, Patricia P; Fulmer, Erika; McAleer, Kelly; Rogers, Todd

    2016-06-01

    State tobacco prevention and control programs (TCPs) require a fully functioning infrastructure to respond effectively to the Surgeon General's call for accelerating the national reduction in tobacco use. The literature describes common elements of infrastructure; however, a lack of valid and reliable measures has made it difficult for program planners to monitor relevant infrastructure indicators and address observed deficiencies, or for evaluators to determine the association among infrastructure, program efforts, and program outcomes. The Component Model of Infrastructure (CMI) is a comprehensive, evidence-based framework that facilitates TCP program planning efforts to develop and maintain their infrastructure. Measures of CMI components were needed to evaluate the model's utility and predictive capability for assessing infrastructure. This paper describes the development of CMI measures and results of a pilot test with nine state TCP managers. Pilot test findings indicate that the tool has good face validity and is clear and easy to follow. The CMI tool yields data that can enhance public health efforts in a funding-constrained environment and provides insight into program sustainability. Ultimately, the CMI measurement tool could facilitate better evaluation and program planning across public health programs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. TU-C-17A-03: An Integrated Contour Evaluation Software Tool Using Supervised Pattern Recognition for Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, H; Tan, J; Kavanaugh, J

    Purpose: Radiotherapy (RT) contours delineated either manually or semiautomatically require verification before clinical usage. Manual evaluation is very time consuming. A new integrated software tool using supervised pattern contour recognition was thus developed to facilitate this process. Methods: The contouring tool was developed using an object-oriented programming language, C#, and application programming interfaces, e.g. the visualization toolkit (VTK). The C# language served as the tool design basis. The Accord.Net scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while VTK was used to build and render 3-D mesh models of critical RT structures in real-time with 360° visualization. Principal component analysis (PCA) was used for system self-updating of geometry variations of normal structures based on physician-approved RT contours as a training dataset. The in-house supervised PCA-based contour recognition method was used for automatically evaluating contour normality/abnormality. The function for reporting the contour evaluation results was implemented using C# and the Windows Form Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several abilities were demonstrated: automatic assessment of RT contours, file loading/saving of various modality medical images and RT contours, and generation/visualization of 3-D images and anatomical models. Moreover, it supported 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates the supervised learning framework with image processing and graphical visualization modules for RT contour verification. 
This tool has great potential for facilitating treatment planning with the assistance of an automatic contour evaluation module, avoiding unnecessary manual verification by physicians/dosimetrists. In addition, its nature as a compact and stand-alone tool allows for future extensibility to include additional functions for physicians' clinical needs.
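The PCA-based normality check described above can be sketched as: fit principal components on physician-approved contours, then flag a new contour whose reconstruction error from the top components exceeds a tolerance learned from the training set. This is an illustrative reconstruction-error sketch with synthetic shape features, not the Accord.Net-based implementation:

```python
import numpy as np

# Hypothetical setup: 50 approved contours, each summarized by 8 shape features.
rng = np.random.default_rng(0)
train = rng.normal(size=(50, 8))
mean = train.mean(axis=0)

# Principal axes via SVD of the centered training matrix; keep top 3 components.
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
components = vt[:3]

def reconstruction_error(x):
    """Distance between a contour's features and their PCA reconstruction."""
    coeffs = (x - mean) @ components.T
    recon = mean + coeffs @ components
    return float(np.linalg.norm(x - recon))

# Tolerance from the approved training contours (95th percentile of errors).
threshold = np.percentile([reconstruction_error(c) for c in train], 95)

outlier = mean + 10.0                  # a grossly shifted "contour"
print(reconstruction_error(outlier) > threshold)   # flagged for manual review
```

The real tool would extract features from RT structure sets and retrain as new physician-approved contours accumulate; this sketch only shows the flag-if-error-is-large logic.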

  5. A Cost Analysis Model for Army Sponsored Graduate Dental Education Programs.

    DTIC Science & Technology

    1997-04-01

    characteristics of a good measurement tool? Cooper and Emory in their textbook, Business Research Methods, state there are three major criteria for evaluating...a measurement tool: validity, reliability, and practicality (Cooper and Emory 1995). Validity can be compartmentalized into internal and external...tremendous expense? The AEGD-1 year program is used extensively as a recruiting tool to encourage senior dental students to join the Army Dental Corps. The

  6. Sobol Sensitivity Analysis: A Tool to Guide the Development and Evaluation of Systems Pharmacology Models

    PubMed Central

    Trame, MN; Lesko, LJ

    2015-01-01

    A systems pharmacology model typically integrates pharmacokinetic, biochemical network, and systems biology concepts into a unifying approach. It typically consists of a large number of parameters and reaction species that are interlinked based upon the underlying (patho)physiology and the mechanism of drug action. The more complex these models are, the greater the challenge of reliably identifying and estimating respective model parameters. Global sensitivity analysis provides an innovative tool that can meet this challenge. CPT Pharmacometrics Syst. Pharmacol. (2015) 4, 69–79; doi:10.1002/psp4.6; published online 25 February 2015 PMID:27548289
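A first-order Sobol index measures the fraction of output variance attributable to one input alone. A minimal "pick-freeze" estimator sketch (one common formulation of global sensitivity analysis, not the authors' code), applied to a toy model with known analytic indices:

```python
import numpy as np

# Toy model y = 4*x1 + x2 with independent standard-normal inputs.
# Analytically, S1 = 16/17 ~ 0.94 and S2 = 1/17 ~ 0.06.

rng = np.random.default_rng(42)

def model(x1, x2):
    return 4.0 * x1 + x2

n = 200_000
a = rng.normal(size=(n, 2))   # base sample
b = rng.normal(size=(n, 2))   # independent resample

def first_order(i):
    # Pick-freeze: keep column i from sample A, redraw the rest from sample B;
    # S_i is estimated as Cov(Y_A, Y_mixed) / Var(Y_A).
    mixed = b.copy()
    mixed[:, i] = a[:, i]
    ya = model(a[:, 0], a[:, 1])
    yb = model(mixed[:, 0], mixed[:, 1])
    return np.cov(ya, yb)[0, 1] / ya.var(ddof=1)

print(round(first_order(0), 2), round(first_order(1), 2))  # near 0.94 and 0.06
```

For a real systems pharmacology model, `model` would wrap an ODE simulation and the indices would rank dozens of parameters by their variance contribution, guiding which ones must be estimated carefully.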

  7. LitPathExplorer: a confidence-based visual text analytics tool for exploring literature-enriched pathway models.

    PubMed

    Soto, Axel J; Zerva, Chrysoula; Batista-Navarro, Riza; Ananiadou, Sophia

    2018-04-15

    Pathway models are valuable resources that help us understand the various mechanisms underpinning complex biological processes. Their curation is typically carried out through manual inspection of published scientific literature to find information relevant to a model, which is a laborious and knowledge-intensive task. Furthermore, models curated manually cannot be easily updated and maintained with new evidence extracted from the literature without automated support. We have developed LitPathExplorer, a visual text analytics tool that integrates advanced text mining, semi-supervised learning and interactive visualization, to facilitate the exploration and analysis of pathway models using statements (i.e. events) extracted automatically from the literature and organized according to levels of confidence. LitPathExplorer supports pathway modellers and curators alike by: (i) extracting events from the literature that corroborate existing models with evidence; (ii) discovering new events which can update models; and (iii) providing a confidence value for each event that is automatically computed based on linguistic features and article metadata. Our evaluation of event extraction showed a precision of 89% and a recall of 71%. Evaluation of our confidence measure, when used for ranking sampled events, showed an average precision ranging between 61 and 73%, which can be improved to 95% when the user is involved in the semi-supervised learning process. Qualitative evaluation using pair analytics based on the feedback of three domain experts confirmed the utility of our tool within the context of pathway model exploration. LitPathExplorer is available at http://nactem.ac.uk/LitPathExplorer_BI/. sophia.ananiadou@manchester.ac.uk. Supplementary data are available at Bioinformatics online.
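The reported extraction quality (89% precision, 71% recall) follows the standard definitions; the counts below are hypothetical values chosen to reproduce those rates, not figures from the paper:

```python
# Precision = fraction of extracted events that are correct;
# recall = fraction of true events that were extracted.

def precision_recall(tp, fp, fn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall

# Hypothetical counts: 89 correct events, 11 spurious, 36 missed.
p, r = precision_recall(tp=89, fp=11, fn=36)
print(round(p, 2), round(r, 2))   # 0.89 0.71
```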

  8. [The application of two occupation health risk assessment models in a wooden furniture manufacturing industry].

    PubMed

    Wang, A H; Leng, P B; Bian, G L; Li, X H; Mao, G C; Zhang, M B

    2016-10-20

    Objective: To explore the applicability of 2 different models of occupational health risk assessment in the wooden furniture manufacturing industry. Methods: The American EPA inhalation risk model and the ICMM model of occupational health risk assessment were applied to assess occupational health risk in a small wooden furniture enterprise. Results: There were poor protective measures and equipment against occupational disease in the plant. The concentration of wood dust in the air of two workshops was over the occupational exposure limit (OEL), with C-TWA values of 8.9 mg/m³ and 3.6 mg/m³, respectively. According to the EPA model, the workers exposed to benzene in this plant had a high risk (9.7×10⁻⁶-34.3×10⁻⁶) of leukemia, and those exposed to formaldehyde had a high risk (11.4×10⁻⁶) of squamous cell carcinoma. The two ICMM tools, the standard-based matrix and calculated risk rating, gave inconsistent evaluation results. For workers exposed to wood dust, the risk of rhinocarcinoma was very high by the calculated risk rating tool but high by the standard-based matrix tool. For workers exposed to noise, the risk of noise-induced deafness was unacceptable and medium by the two tools, respectively. Conclusion: Both the EPA model and the ICMM model can appropriately predict and assess occupational health risk in wooden furniture manufacturing; the ICMM model, owing to its relatively simple operation, readily obtained evaluation parameters, and comprehensive assessment of occupational-disease-inducing factors, is more suitable for wooden furniture production enterprises.
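The EPA inhalation risk approach applied in the study multiplies a lifetime-average exposure concentration (EC) by an inhalation unit risk (IUR) to obtain an incremental cancer risk. A sketch using the standard EC formula; all input values (air concentration, exposure pattern, unit risk) are illustrative assumptions, not data from the plant:

```python
def exposure_concentration(ca, et, ef, ed, at_hours):
    """EC (ug/m3) = (CA * ET * EF * ED) / AT, the lifetime-averaged exposure."""
    return ca * et * ef * ed / at_hours

ec = exposure_concentration(
    ca=100.0,                 # workplace air concentration, ug/m3 (assumed)
    et=8.0,                   # hours exposed per day
    ef=250.0,                 # working days per year
    ed=25.0,                  # years of occupational exposure
    at_hours=70 * 365 * 24,   # 70-year lifetime averaging time, in hours
)

iur = 7.8e-6                  # illustrative unit risk per ug/m3 (benzene-like)
risk = iur * ec               # incremental lifetime cancer risk
print(f"{risk:.1e}")          # on the order of 1e-5, the "high risk" range above
```

Risks above roughly 1×10⁻⁶ are commonly treated as exceeding the de minimis level, which is why the 10⁻⁵-order values reported in the study are classed as high.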

  9. The System Dynamics Model User Sustainability Explorer (SD-MUSE): a user-friendly tool for interpreting system dynamic models

    EPA Science Inventory

    System Dynamics (SD) models are useful for holistic integration of data to evaluate indirect and cumulative effects and inform decisions. Complex SD models can provide key insights into how decisions affect the three interconnected pillars of sustainability. However, the complexi...

  10. GLIMPSE: An integrated assessment model-based tool for coordinated energy and environmental planning

    EPA Science Inventory

    Dan Loughlin will describe the GCAM-USA integrated assessment model and how that model is being improved and integrated into the GLIMPSE decision support system. He will also demonstrate the application of the model to evaluate the emissions and health implications of hypothetica...

  11. Graphical Means for Inspecting Qualitative Models of System Behaviour

    ERIC Educational Resources Information Center

    Bouwer, Anders; Bredeweg, Bert

    2010-01-01

    This article presents the design and evaluation of a tool for inspecting conceptual models of system behaviour. The basis for this research is the Garp framework for qualitative simulation. This framework includes modelling primitives, such as entities, quantities and causal dependencies, which are combined into model fragments and scenarios.…

  12. Refining prognosis in lung cancer: A report on the quality and relevance of clinical prognostic tools

    PubMed Central

    Mahar, Alyson L.; Compton, Carolyn; McShane, Lisa M.; Halabi, Susan; Asamura, Hisao; Rami-Porta, Ramon; Groome, Patti A.

    2015-01-01

    Introduction Accurate, individualized prognostication for lung cancer patients requires the integration of standard patient and pathologic factors, biologic, genetic, and other molecular characteristics of the tumor. Clinical prognostic tools aim to aggregate information on an individual patient to predict disease outcomes such as overall survival, but little is known about their clinical utility and accuracy in lung cancer. Methods A systematic search of the scientific literature for clinical prognostic tools in lung cancer published Jan 1, 1996-Jan 27, 2015 was performed. In addition, web-based resources were searched. A priori criteria determined by the Molecular Modellers Working Group of the American Joint Committee on Cancer were used to investigate the quality and usefulness of tools. Criteria included clinical presentation, model development approaches, validation strategies, and performance metrics. Results Thirty-two prognostic tools were identified. Patients with metastases were the most frequently considered population in non-small cell lung cancer. All tools for small cell lung cancer covered that entire patient population. Included prognostic factors varied considerably across tools. Internal validity was not formally evaluated for most tools and only eleven were evaluated for external validity. Two key considerations were highlighted for tool development: identification of an explicit purpose related to a relevant clinical population and clear decision-points, and prioritized inclusion of established prognostic factors over emerging factors. Conclusions Prognostic tools will contribute more meaningfully to the practice of personalized medicine if better study design and analysis approaches are used in their development and validation. PMID:26313682

  13. EVALUATING HYDROLOGICAL RESPONSE TO ...

    EPA Pesticide Factsheets

    Studies of future management and policy options based on different assumptions provide a mechanism to examine possible outcomes and especially their likely benefits or consequences. Planning and assessment in land and water resource management are evolving toward complex, spatially explicit regional assessments. These problems have to be addressed with distributed models that can compute runoff and erosion at different spatial and temporal scales. The extensive data requirements and the difficult task of building input parameter files, however, have long been an obstacle to the timely and cost-effective use of such complex models by resource managers. The U.S. EPA Landscape Ecology Branch in collaboration with the USDA-ARS Southwest Watershed Research Center has developed a geographic information system (GIS) tool to facilitate this process. A GIS provides the framework within which spatially distributed data are collected and used to prepare model input files, and model results are evaluated. The Automated Geospatial Watershed Assessment (AGWA) tool uses widely available standardized spatial datasets that can be obtained via the internet at no cost to the user. The data are used to develop input parameter files for KINEROS2 and SWAT, two watershed runoff and erosion simulation models that operate at different spatial and temporal scales. AGWA automates the process of transforming digital data into simulation model results and provides a visualization tool

  14. Conducting systematic reviews of economic evaluations.

    PubMed

    Gomersall, Judith Streak; Jadotte, Yuri Tertilus; Xue, Yifan; Lockwood, Suzi; Riddle, Dru; Preda, Alin

    2015-09-01

    In 2012, a working group was established to review and enhance the Joanna Briggs Institute (JBI) guidance for conducting systematic reviews of evidence from economic evaluations addressing questions about health intervention cost-effectiveness. The objective here is to present the outcomes of the working group. The group conducted three activities to inform the new guidance: a review of the literature on the utility/futility of systematic reviews of economic evaluations, with consideration of its implications for updating the existing methodology; an assessment of the critical appraisal tool in the existing guidance against criteria that promote validity in economic evaluation research and against two other commonly used tools; and a workshop. The debate in the literature on the limitations/value of systematic review of economic evidence cautions that systematic reviews of economic evaluation evidence are unlikely to generate one-size-fits-all answers to questions about the cost-effectiveness of interventions and their comparators. Informed by this finding, the working group adjusted the framing of the objectives definition in the existing JBI methodology. The shift is away from defining the objective as determining one cost-effectiveness measure and toward summarizing study estimates of cost-effectiveness and, informed by consideration of the included study characteristics (patient, setting, intervention component, etc.), identifying conditions conducive to lowering costs and maximizing health benefits. The existing critical appraisal tool was included in the new guidance. The new guidance recommends that a tool designed specifically for appraising model-based studies be used together with the generic appraisal tool for economic evaluations when evaluating model-based studies.
The guidance produced by the group offers reviewers guidance for each step of the systematic review process, which are the same steps followed in JBI reviews of other types of evidence. The updated JBI guidance will be useful for researchers wanting to synthesize evidence about economic questions, either as stand-alone reviews or part of comprehensive or mixed method evidence reviews. Although the updated methodology produced by the work of the working group has improved the JBI guidance for systematic reviews of economic evaluations, there are areas where further work is required. These include adjusting the critical appraisal tool to separate out questions addressing intervention cost and effectiveness measurement; providing more explicit guidance for assessing generalizability of findings; and offering a more robust method for evidence synthesis that facilitates achieving the more ambitious review objectives.

  15. Design of Center-TRACON Automation System

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz; Davis, Thomas J.; Green, Steven

    1993-01-01

    A system for the automated management and control of terminal area traffic, referred to as the Center-TRACON Automation System (CTAS), is being developed at NASA Ames Research Center. In a cooperative program, NASA and the FAA have efforts underway to install and evaluate the system at the Denver area and Dallas/Ft. Worth area air traffic control facilities. This paper reviews the CTAS architecture and automation functions, as well as the integration of CTAS into the existing operational system. CTAS consists of three types of integrated tools that provide computer-generated advisories for both en-route and terminal area controllers to guide them in managing and controlling arrival traffic efficiently. One tool, the Traffic Management Advisor (TMA), generates runway assignments, landing sequences and landing times for all arriving aircraft, including those originating from nearby feeder airports. TMA also assists in runway configuration control and flow management. Another tool, the Descent Advisor (DA), generates clearances for the en-route controllers handling arrival flows to metering gates. The DA's clearances ensure fuel-efficient and conflict-free descents to the metering gates at specified crossing times. In the terminal area, the Final Approach Spacing Tool (FAST) provides heading and speed advisories that help controllers produce an accurately spaced flow of aircraft on the final approach course. Databases consisting of several hundred aircraft performance models, airline-preferred operational procedures, and a three-dimensional wind model support the operation of CTAS. The first component of CTAS, the Traffic Management Advisor, is being evaluated at the Denver TRACON and the Denver Air Route Traffic Control Center. The second component, the Final Approach Spacing Tool, will be evaluated in several stages at the Dallas/Fort Worth Airport beginning in October 1993. 
An initial stage of the Descent Advisor tool is being prepared for testing at the Denver Center in late 1994. Operational evaluations of all three integrated CTAS tools are expected to begin at the two field sites in 1995.

  16. simuwatt - A Tablet Based Electronic Auditing Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macumber, Daniel; Parker, Andrew; Lisell, Lars

    2014-05-08

    'simuwatt Energy Auditor' (TM) is a new tablet-based electronic auditing tool that is designed to dramatically reduce the time and cost to perform investment-grade audits and improve quality and consistency. The tool uses the U.S. Department of Energy's OpenStudio modeling platform and integrated Building Component Library to automate modeling and analysis. simuwatt's software-guided workflow helps users gather required data, and provides the data in a standard electronic format that is automatically converted to a baseline OpenStudio model for energy analysis. The baseline energy model is calibrated against actual monthly energy use to ASHRAE Standard 14 guidelines. Energy conservation measures from the Building Component Library are then evaluated using OpenStudio's parametric analysis capability. Automated reporting creates audit documents that describe recommended packages of energy conservation measures. The development of this tool was partially funded by the U.S. Department of Defense's Environmental Security Technology Certification Program. As part of this program, the tool is being tested at 13 buildings on 5 Department of Defense sites across the United States. Results of the first simuwatt audit tool demonstration are presented in this paper.

  17. Propel: Tools and Methods for Practical Source Code Model Checking

    NASA Technical Reports Server (NTRS)

    Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; OMalley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem

    2003-01-01

    The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset are called Propel. Propel builds upon the Java PathFinder (JPF) tool, an explicit-state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V), an adaptation of existing best design practices that has the desired side-effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, productize JPF, and evaluate the toolset in the context of D4V. Throughout these tasks, we are testing Propel's capabilities on customer applications.

  18. Infrastructure for the design and fabrication of MEMS for RF/microwave and millimeter wave applications

    NASA Astrophysics Data System (ADS)

    Nerguizian, Vahe; Rafaf, Mustapha

    2004-08-01

    This article describes and provides valuable information for companies and universities with strategies to start fabricating MEMS for RF/Microwave and millimeter wave applications. The present work shows the infrastructure developed for RF/Microwave and millimeter wave MEMS platforms, which helps the identification, evaluation and selection of design tools and fabrication foundries taking into account packaging and testing. The selected and implemented simple infrastructure models, based on surface and bulk micromachining, yield inexpensive and innovative approaches for distributed choices of MEMS operating tools. With different educational or industrial institution needs, these models may be modified for specific resource changes using a careful analyzed iteration process. The inputs of the project are evaluation selection criteria and information sources such as financial, technical, availability, accessibility, simplicity, versatility and practical considerations. The outputs of the project are the selection of different MEMS design tools or software (solid modeling, electrostatic/electromagnetic and others, compatible with existing standard RF/Microwave design tools) and different MEMS manufacturing foundries. Typical RF/Microwave and millimeter wave MEMS solutions are introduced on the platform during the evaluation and development phases of the project for the validation of realistic results and operational decision making choices. The encountered challenges during the investigation and the development steps are identified and the dynamic behavior of the infrastructure is emphasized. The inputs (resources) and the outputs (demonstrated solutions) are presented in tables and flow chart mode diagrams.

  19. Calibration and parameterization of a semi-distributed hydrological model to support sub-daily ensemble flood forecasting; a watershed in southeast Brazil

    NASA Astrophysics Data System (ADS)

    de Almeida Bressiani, D.; Srinivasan, R.; Mendiondo, E. M.

    2013-12-01

    The use of distributed and semi-distributed models to represent the processes and dynamics of watersheds has increased in recent years. These models are important tools for predicting and forecasting the hydrological responses of watersheds, and they can support disaster risk management and planning. However, they usually have many parameters which, owing to the spatial and temporal variability of the processes, are not known, especially in developing countries; a robust and sensible calibration is therefore very important. This study conducted a sub-daily calibration and parameterization of the Soil & Water Assessment Tool (SWAT) for a 12,600 km2 watershed in southeast Brazil, and uses ensemble forecasts to evaluate whether the model can be used as a tool for flood forecasting. The Piracicaba Watershed, in São Paulo State, is mainly rural but has a population of about 4 million in highly relevant urban areas, and three cities on the list of critical cities of the National Center for Natural Disasters Monitoring and Alerts. For calibration, the watershed was divided into areas with similar hydrological characteristics, and for each of these areas one gauge station was chosen for calibration. This procedure was performed to evaluate the effectiveness of calibrating at fewer locations, since areas with the same combination of groundwater, soil, land use and slope characteristics should have similar parameters, making calibration a less time-consuming task. The sensitivity analysis and calibration were performed with the software SWAT-CUP using the optimization algorithm Sequential Uncertainty Fitting Version 2 (SUFI-2), which uses a Latin hypercube sampling scheme in an iterative process. 
The performance of the model in calibration and validation was evaluated with the Nash-Sutcliffe efficiency coefficient (NSE), the coefficient of determination (r2), the root mean square error (RMSE), and percent bias (PBIAS), with monthly average values of NSE around 0.70, r2 of 0.9, normalized RMSE of 0.01, and PBIAS of 10. Past events were analysed to evaluate the possibility of using the SWAT model developed for the Piracicaba watershed as a tool for ensemble flood forecasting. For the ensemble evaluation, members from the numerical model Eta were used. Eta is an atmospheric model used for research and operational purposes, with 5 km resolution; it is updated twice a day (00 and 12 UTC) for a ten-day horizon, with precipitation and weather estimates for each hour. The parameterized SWAT model performed well overall for ensemble flood forecasting.
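    The goodness-of-fit statistics named above (NSE, RMSE, PBIAS) have standard definitions. A minimal sketch of how they are computed from paired observed/simulated series; the sample discharge values are invented:

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - sum((o-s)^2) / sum((o-mean(o))^2).

    1.0 is a perfect fit; values <= 0 mean the model is no better
    than predicting the observed mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def rmse(obs, sim):
    """Root mean square error, in the units of the data."""
    return (sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs)) ** 0.5

def pbias(obs, sim):
    """Percent bias; positive values indicate model underestimation."""
    return 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)

obs = [10.0, 12.0, 8.0, 15.0, 11.0]   # observed discharge (made-up)
sim = [9.5, 12.5, 8.5, 14.0, 11.5]    # simulated discharge (made-up)
print(nse(obs, sim), rmse(obs, sim), pbias(obs, sim))
```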

  20. Uses of Agent-Based Modeling for Health Communication: the TELL ME Case Study.

    PubMed

    Barbrook-Johnson, Peter; Badham, Jennifer; Gilbert, Nigel

    2017-08-01

    Government communication is an important management tool during a public health crisis, but understanding its impact is difficult. Strategies may be adjusted in reaction to developments on the ground and it is challenging to evaluate the impact of communication separately from other crisis management activities. Agent-based modeling is a well-established research tool in social science to respond to similar challenges. However, there have been few such models in public health. We use the example of the TELL ME agent-based model to consider ways in which a non-predictive policy model can assist policy makers. This model concerns individuals' protective behaviors in response to an epidemic, and the communication that influences such behavior. Drawing on findings from stakeholder workshops and the results of the model itself, we suggest such a model can be useful: (i) as a teaching tool, (ii) to test theory, and (iii) to inform data collection. We also plot a path for development of similar models that could assist with communication planning for epidemics.
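    The agent-based approach described here can be illustrated with a toy model in which agents adopt protective behavior with a probability that rises with local infection prevalence and a government communication signal. All parameters and rules below are invented for illustration and are unrelated to the actual TELL ME model:

```python
import random

random.seed(42)

class Agent:
    def __init__(self):
        self.infected = random.random() < 0.05   # 5% initially infected
        self.protective = False                   # may adopt behavior later

def step(agents, comm_strength):
    """One time step: adoption of protective behavior, then transmission."""
    prevalence = sum(a.infected for a in agents) / len(agents)
    for a in agents:
        if not a.protective:
            # adoption driven by perceived risk plus communication signal
            if random.random() < prevalence + comm_strength:
                a.protective = True
        # simple transmission: protection halves the base infection risk
        risk = 0.1 * prevalence * (0.5 if a.protective else 1.0)
        if not a.infected and random.random() < risk:
            a.infected = True

agents = [Agent() for _ in range(1000)]
for t in range(20):
    step(agents, comm_strength=0.02)
print(sum(a.protective for a in agents), "protective,",
      sum(a.infected for a in agents), "infected")
```

    Varying `comm_strength` across runs is the kind of what-if experiment such a model supports: comparing epidemic outcomes under different communication strategies without predicting any single real outcome.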

  1. A database and tool for boundary conditions for regional air quality modeling: description and evaluation

    NASA Astrophysics Data System (ADS)

    Henderson, B. H.; Akhtar, F.; Pye, H. O. T.; Napelenok, S. L.; Hutzell, W. T.

    2013-09-01

    Transported air pollutants receive increasing attention as regulations tighten and global concentrations increase. The need to represent international transport in regional air quality assessments requires improved representation of boundary concentrations. Currently available observations are too sparse vertically to provide boundary information, particularly for ozone precursors, but global simulations can be used to generate spatially and temporally varying Lateral Boundary Conditions (LBC). This study presents a public database of global simulations designed and evaluated for use as LBC for air quality models (AQMs). The database covers the contiguous United States (CONUS) for the years 2000-2010 and contains hourly varying concentrations of ozone, aerosols, and their precursors. The database is complemented by a tool for configuring the global results as inputs to regional scale models (e.g., Community Multiscale Air Quality or Comprehensive Air quality Model with extensions). This study also presents an example application based on the CONUS domain, which is evaluated against satellite-retrieved ozone vertical profiles. The results show performance is largely within uncertainty estimates for the Tropospheric Emission Spectrometer (TES), with some exceptions. The major difference is a high bias in the upper troposphere along the southern boundary in January. This publication documents the global simulation database, the tool for conversion to LBC, and the fidelity of concentrations on the boundaries. This documentation is intended to support applications that require representation of long-range transport of air pollutants.

  2. Integration of tools for the Design and Assessment of High-Performance, Highly Reliable Computing Systems (DAHPHRS), phase 1

    NASA Technical Reports Server (NTRS)

    Scheper, C.; Baker, R.; Frank, G.; Yalamanchili, S.; Gray, G.

    1992-01-01

    Systems for Space Defense Initiative (SDI) space applications typically require both high performance and very high reliability. These requirements present the systems engineer evaluating such systems with the extremely difficult problem of conducting performance and reliability trade-offs over large design spaces. A controlled development process supported by appropriate automated tools must be used to assure that the system will meet design objectives. This report describes an investigation of methods, tools, and techniques necessary to support performance and reliability modeling for SDI systems development. Models of the JPL Hypercubes, the Encore Multimax, and the C.S. Draper Lab Fault-Tolerant Parallel Processor (FTPP) parallel-computing architectures using candidate SDI weapons-to-target assignment algorithms as workloads were built and analyzed as a means of identifying the necessary system models, how the models interact, and what experiments and analyses should be performed. As a result of this effort, weaknesses in the existing methods and tools were revealed and capabilities that will be required for both individual tools and an integrated toolset were identified.

  3. Developing a Framework for Effective Network Capacity Planning

    NASA Technical Reports Server (NTRS)

    Yaprak, Ece

    2005-01-01

    As Internet traffic continues to grow exponentially, developing a clearer understanding of, and appropriately measuring, a network's performance is becoming ever more critical. An important challenge faced by the Information Resources Directorate (IRD) at the Johnson Space Center in this context is not only monitoring and maintaining a secure network, but also better understanding the capacity and future growth boundaries of its network. This requires capacity planning, which involves modeling and simulating different network alternatives, and incorporating changes in design as technologies, components, configurations, and applications change, to determine optimal solutions in light of IRD's goals, objectives and strategies. My primary task this summer was to address this need. I evaluated network-modeling tools from OPNET Technologies Inc. and Compuware Corporation. I generated a baseline model for Building 45 using both tools by importing "real" topology/traffic information using IRD's various network management tools. I compared the two tools in terms of the advantages and disadvantages each offers for accomplishing IRD's goals. I also prepared a step-by-step "how to design a baseline model" tutorial for both the OPNET and Compuware products.

  4. Multivariate Strategies in Functional Magnetic Resonance Imaging

    ERIC Educational Resources Information Center

    Hansen, Lars Kai

    2007-01-01

    We discuss aspects of multivariate fMRI modeling, including the statistical evaluation of multivariate models and means for dimensional reduction. In a case study we analyze linear and non-linear dimensional reduction tools in the context of a "mind reading" predictive multivariate fMRI model.

  5. Simulation modelling as a tool for knowledge mobilisation in health policy settings: a case study protocol.

    PubMed

    Freebairn, L; Atkinson, J; Kelly, P; McDonnell, G; Rychetnik, L

    2016-09-21

    Evidence-informed decision-making is essential to ensure that health programs and services are effective and offer value for money; however, barriers to the use of evidence persist. Emerging systems science approaches and advances in technology are providing new methods and tools to facilitate evidence-based decision-making. Simulation modelling offers a unique tool for synthesising and leveraging existing evidence, data and expert local knowledge to examine, in a robust, low risk and low cost way, the likely impact of alternative policy and service provision scenarios. This case study will evaluate participatory simulation modelling to inform the prevention and management of gestational diabetes mellitus (GDM). The risks associated with GDM are well recognised; however, debate remains regarding diagnostic thresholds and whether screening and treatment to reduce maternal glucose levels reduce the associated risks. A diagnosis of GDM may provide a leverage point for multidisciplinary lifestyle modification interventions. This research will apply and evaluate a simulation modelling approach to understand the complex interrelation of factors that drive GDM rates, test options for screening and interventions, and optimise the use of evidence to inform policy and program decision-making. The study design will use mixed methods to achieve the objectives. Policy, clinical practice and research experts will work collaboratively to develop, test and validate a simulation model of GDM in the Australian Capital Territory (ACT). The model will be applied to support evidence-informed policy dialogues with diverse stakeholders for the management of GDM in the ACT. Qualitative methods will be used to evaluate simulation modelling as an evidence synthesis tool to support evidence-based decision-making. 
Interviews and analysis of workshop recordings will focus on the participants' engagement in the modelling process; perceived value of the participatory process, perceived commitment, influence and confidence of stakeholders in implementing policy and program decisions identified in the modelling process; and the impact of the process in terms of policy and program change. The study will generate empirical evidence on the feasibility and potential value of simulation modelling to support knowledge mobilisation and consensus building in health settings.

  6. Using the SAMR Model as a Framework for Evaluating mLearning Activities and Supporting a Transformation of Learning

    ERIC Educational Resources Information Center

    Pfaffe, Linda D.

    2017-01-01

    An examination of the perceptions of mLearning held by secondary teachers as well as mLearning best practices and professional development. Using the SAMR model (Puentedura, 2013) along with Romrell's (2014) definition of mLearning, this study evaluated mLearning activities (tools and applications) currently being used by secondary school…

  7. Freva - Freie Univ Evaluation System Framework for Scientific HPC Infrastructures in Earth System Modeling

    NASA Astrophysics Data System (ADS)

    Kadow, C.; Illing, S.; Schartner, T.; Grieger, J.; Kirchner, I.; Rust, H.; Cubasch, U.; Ulbrich, U.

    2017-12-01

    The Freie Univ Evaluation System Framework (Freva - freva.met.fu-berlin.de) is a software infrastructure for standardized data and tool solutions in Earth system science (e.g. www-miklip.dkrz.de, cmip-eval.dkrz.de). Freva runs on high-performance computers to handle customizable evaluation systems of research projects, institutes or universities. It combines different software technologies into one common hybrid infrastructure, including all features present in the shell and web environment. The database interface satisfies the international standards provided by the Earth System Grid Federation (ESGF). Freva indexes different data projects into one common search environment by storing the metadata of the self-describing model, reanalysis and observational data sets in a database. This metadata system, with its advanced but easy-to-handle search tool, supports users, developers and their plugins in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. The integrated web shell (shellinabox) adds a degree of freedom in the choice of the working environment and can be used as a gate to the research project's HPC. Plugins can integrate their results (e.g. post-processed data) into the user's database. This allows, for example, post-processing plugins to feed statistical analysis plugins, which fosters an active exchange between plugin developers of a research project. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a database. Configurations and results of the tools can be shared among scientists via the shell or web system. 
Furthermore, if configurations match when an evaluation plugin is started, the system suggests reusing results already produced by other users, saving CPU hours, I/O, disk space and time. The efficient interaction between different technologies improves the Earth system modeling science framed by Freva.
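    The configuration-matching idea (reusing results whenever a plugin is started with a configuration that has already been run) amounts to a result cache keyed by a stable hash of the plugin name and its configuration. The sketch below is a generic illustration of that pattern, not Freva's actual implementation:

```python
import hashlib
import json

_result_db = {}  # maps configuration hash -> stored result

def config_key(plugin, config):
    """Stable hash of a plugin name plus its configuration dict."""
    payload = json.dumps({"plugin": plugin, "config": config}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def run_plugin(plugin, config, compute):
    """Return (result, reused): a cached result if this exact
    configuration was already run, otherwise compute and store it."""
    key = config_key(plugin, config)
    if key in _result_db:
        return _result_db[key], True    # reuse saves CPU hours and disk
    result = compute(config)
    _result_db[key] = result
    return result, False

# Hypothetical plugin: second identical call hits the cache.
res1, reused1 = run_plugin("bias_check", {"model": "X", "years": [2000, 2010]},
                           lambda c: sum(c["years"]))
res2, reused2 = run_plugin("bias_check", {"model": "X", "years": [2000, 2010]},
                           lambda c: sum(c["years"]))
print(res1, reused1, res2, reused2)
```

    Serializing the configuration with `sort_keys=True` makes the hash independent of dictionary key order, so logically identical configurations always map to the same cache entry.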

  8. Screening tool to evaluate the vulnerability of down-gradient receptors to groundwater contaminants from uncapped landfills

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Ronald J.; Reilly, Timothy J.; Lopez, Anthony

    2015-09-15

    Highlights: • A spreadsheet-based risk screening tool for groundwater affected by landfills is presented. • Domenico solute transport equations are used to estimate downgradient contaminant concentrations. • Landfills are categorized as presenting high, moderate or low risks. • Analysis of parameter sensitivity and examples of the method’s application are given. • The method has value to regulators and those considering redeveloping closed landfills. - Abstract: A screening tool for quantifying levels of concern posed to down-gradient receptors (streams, wetlands and residential lots) by contaminants detected in monitoring wells on or near landfills was developed and evaluated. The tool uses Quick Domenico Multi-scenario (QDM), a spreadsheet implementation of Domenico-based solute transport, to estimate concentrations of contaminants reaching receptors under steady-state conditions from a constant-strength source. Unlike most other available Domenico-based model applications, QDM calculates the time for down-gradient contaminant concentrations to approach steady state, as well as appropriate dispersivity values, and allows up to fifty simulations on a single spreadsheet. Sensitivity of QDM solutions to critical model parameters was quantified. The screening tool uses QDM results to categorize landfills as having high, moderate and low levels of concern, based on contaminant concentrations reaching receptors relative to regulatory concentrations. The application of this tool was demonstrated by assessing levels of concern (as defined by the New Jersey Pinelands Commission) for thirty closed, uncapped landfills in the New Jersey Pinelands National Reserve, using historic water-quality data from monitoring wells on and near landfills and hydraulic parameters from regional flow models. Twelve of these landfills are categorized as having high levels of concern, indicating a need for further assessment. 
This tool is not a replacement for conventional numerically based transport models or other available Domenico-based applications, but is suitable for quickly assessing the level of concern posed by a landfill or other contaminant point source before expensive and lengthy monitoring or remediation measures are taken. In addition to quantifying the level of concern using historic groundwater-monitoring data, the tool allows for archiving model scenarios and adding refinements as new data become available.
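
The abstract does not reproduce QDM's spreadsheet formulas, but the Domenico-type steady-state centerline solution that such tools implement can be sketched as below; the parameter names and the level-of-concern thresholds are illustrative assumptions, not values taken from QDM or the Pinelands Commission:

```python
from math import erf, exp, sqrt

def domenico_centerline(C0, x, v, ax, ay, az, lam, Sy, Sz):
    """Steady-state centerline concentration at distance x downgradient of a
    constant-strength planar source (Domenico analytical solution).
    C0: source concentration; v: seepage velocity; ax/ay/az: longitudinal,
    transverse, vertical dispersivities; lam: first-order decay rate;
    Sy/Sz: source width and depth."""
    decay = exp((x / (2.0 * ax)) * (1.0 - sqrt(1.0 + 4.0 * lam * ax / v)))
    lateral = erf(Sy / (4.0 * sqrt(ay * x)))
    vertical = erf(Sz / (2.0 * sqrt(az * x)))
    return C0 * decay * lateral * vertical

def level_of_concern(conc_at_receptor, regulatory_limit):
    """Categorize a landfill by the concentration reaching the receptor
    relative to a regulatory concentration (thresholds are illustrative)."""
    ratio = conc_at_receptor / regulatory_limit
    if ratio >= 1.0:
        return "high"
    if ratio >= 0.1:
        return "moderate"
    return "low"
```

With decay disabled (lam = 0) the predicted concentration falls monotonically with distance, as expected for pure dispersion from a constant source.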

  9. Forage resource evaluation system for habitat—deer: an interactive deer habitat model

    Treesearch

    Thomas A. Hanley; Donald E. Spalinger; Kenrick J. Mock; Oran L. Weaver; Grant M. Harris

    2012-01-01

    We describe a food-based system for quantitatively evaluating habitat quality for deer called the Forage Resource Evaluation System for Habitat and provide its rationale and suggestions for use. The system was developed as a tool for wildlife biologists and other natural resource managers and planners interested in evaluating habitat quality and, especially, comparing...

  10. A simplified gis-based model for large wood recruitment and connectivity in mountain basins

    NASA Astrophysics Data System (ADS)

    Franceschi, Silvia; Antonello, Andrea; Vela, Ana Lucia; Cavalli, Marco; Crema, Stefano; Comiti, Francesco; Tonon, Giustino

    2015-04-01

During the last 50 years in the Alps, the decline of the rural and forest economy and the depopulation of mountain areas have caused the progressive abandonment of land in general, and of riparian zones in particular, with a consequent increase in vegetation extent. On one hand, the wood increases the availability of organic matter and has positive effects on mountain river systems. However, during flooding events, large wood that reaches the stream can clog bridges and thereby increase flood hazard. Evaluating the availability of large wood during flooding events is still a challenge. There are models that simulate the propagation of logs downstream, but the evaluation of which trees can reach the stream is still done using simplified GIS procedures. These procedures are the basis for our research, which will include LiDAR-derived information on vegetation to evaluate large wood recruitment during extreme events. Within the last Google Summer of Code (2014) we developed a set of tools to evaluate large wood recruitment and propagation along the channel network, based on a simplified methodology for monitoring and modeling large wood recruitment and transport in mountain basins implemented by Lucía et al. (2014). These tools are integrated in the JGrassTools project as a dedicated section in the Hydro-Geomorphology library. The section LWRecruitment contains 10 simple modules that allow the user to start from very simple information related to geomorphology, flooded areas and vegetation cover and obtain a map of the most probable critical sections on the streams. The tools cover the two main aspects of the interaction of large wood with rivers: the recruitment mechanisms and the propagation downstream. 
While the propagation tool is very simple and does not consider the hydrodynamics of the problem, the recruitment algorithms are more specific and consider the influence of hillslope stability and flood extent. The modules are available for download at www.jgrasstools.org. A simple, easy-to-use graphical interface for running the models is available at https://github.com/moovida/STAGE/releases.

  11. Performance Evaluation of the NASA/KSC Transmission System

    NASA Technical Reports Server (NTRS)

    Christensen, Kenneth J.

    2000-01-01

NASA-KSC currently uses three bridged 100-Mbps FDDI segments as its backbone for data traffic. The FDDI Transmission System (FTXS) connects the KSC industrial area, KSC launch complex 39 area, and the Cape Canaveral Air Force Station. The report presents a performance modeling study of the FTXS and the proposed ATM Transmission System (ATXS). The focus of the study is on performance of MPEG video transmission on these networks. Commercial modeling tools - the CACI Predictor and Comnet tools - were used. In addition, custom software tools were developed to characterize conversation pairs in Sniffer trace (capture) files to use as input to these tools. A baseline study of both non-launch and launch day data traffic on the FTXS is presented. MPEG-1 and MPEG-2 video traffic was characterized and the shaping of it evaluated. It is shown that the characteristics of a video stream have a direct effect on its performance in a network. It is also shown that shaping of video streams is necessary to prevent overflow losses and resulting poor video quality. The developed models can be used to predict when the existing FTXS will 'run out of room' and for optimizing the parameters of ATM links used for transmission of MPEG video. Future work with these models can provide useful input and validation to set-top box projects within the Advanced Networks Development group in NASA-KSC Development Engineering.
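
The shaping conclusion (smoothing video bursts prevents overflow loss) follows the standard token-bucket idea, which can be sketched as below; the function and parameter values are illustrative, not the Predictor/Comnet configuration used in the study:

```python
def token_bucket_shape(frame_bytes, rate, bucket_size, tick=1.0):
    """Shape a sequence of per-tick frame sizes (bytes) with a token bucket.
    rate: tokens (bytes) refilled per tick; bucket_size: burst capacity.
    Returns bytes actually sent each tick; unsent bytes queue for later."""
    tokens = bucket_size
    backlog = 0
    sent = []
    for size in frame_bytes:
        tokens = min(bucket_size, tokens + rate * tick)  # refill, capped
        backlog += size                                  # enqueue new frame
        out = min(backlog, tokens)                       # send what tokens allow
        tokens -= out
        backlog -= out
        sent.append(out)
    return sent
```

A 9000-byte I-frame burst shaped at 2000 bytes/tick with a 4000-byte bucket leaves the shaper as [4000, 2000, 2000, 1000]: the peak offered to the network drops from 9000 to 4000 bytes while all traffic is eventually delivered.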

  12. Evaluation of Full Reynolds Stress Turbulence Models in FUN3D

    NASA Technical Reports Server (NTRS)

    Dudek, Julianne C.; Carlson, Jan-Renee

    2017-01-01

Full seven-equation Reynolds stress turbulence models are a relatively new and promising tool for today's aerospace technology challenges. This paper uses two stress-omega full Reynolds stress models to evaluate challenging flows including shock-wave boundary layer interactions, separation, and mixing layers. The Wilcox and the SSG/LRR full second-moment Reynolds stress models are evaluated for four problems: a transonic two-dimensional diffuser, a supersonic axisymmetric compression corner, a compressible planar shear layer, and a subsonic axisymmetric jet. Simulation results are compared with experimental data and results using the more commonly used Spalart-Allmaras (SA) one-equation and the Menter Shear Stress Transport (SST) two-equation models.

  13. A Perspective on Computational Human Performance Models as Design Tools

    NASA Technical Reports Server (NTRS)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  14. Evaluation of NASA's end-to-end data systems using DSDS+

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Davenport, William; Message, Philip

    1994-01-01

The Data Systems Dynamic Simulator (DSDS+) is a software tool being developed by the authors to evaluate candidate architectures for NASA's end-to-end data systems. Via modeling and simulation, we are able to quickly predict the performance characteristics of each architecture, to evaluate 'what-if' scenarios, and to perform sensitivity analyses. As such, we are using modeling and simulation to help NASA select the optimal system configuration, and to quantify the performance characteristics of this system prior to its delivery. This paper is divided into the following six sections: (1) The role of modeling and simulation in the systems engineering process. In this section, we briefly describe the different types of results obtained by modeling each phase of the systems engineering life cycle, from concept definition through operations and maintenance; (2) Recent applications of DSDS+. In this section, we describe ongoing applications of DSDS+ in support of the Earth Observing System (EOS), and we present some of the simulation results generated for candidate system designs. So far, we have modeled individual EOS subsystems (e.g. the Solid State Recorders used onboard the spacecraft), and we have also developed an integrated model of the EOS end-to-end data processing and data communications systems (from the payloads onboard to the principal investigator facilities on the ground); (3) Overview of DSDS+. In this section we define what a discrete-event model is, and how it works. The discussion is presented relative to the DSDS+ simulation tool that we have developed, including its run-time optimization algorithms, which enable DSDS+ to execute substantially faster than comparable discrete-event simulation tools; (4) Summary. In this section, we summarize our findings and 'lessons learned' during the development and application of DSDS+ to model NASA's data systems; (5) Further Information; and (6) Acknowledgements.
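
The core of any discrete-event simulator like DSDS+ is a time-ordered event queue; a minimal kernel can be sketched as below (an illustrative sketch only; DSDS+'s run-time optimizations are not reproduced here). The demo models a single-server link with a fixed 2.0 s service time and three deterministic arrivals:

```python
import heapq

class EventSim:
    """Minimal discrete-event kernel: pop the earliest event, advance the
    clock, run its action; actions may schedule further events."""
    def __init__(self):
        self.queue = []
        self.now = 0.0
        self._seq = 0  # tie-breaker so same-time events run in FIFO order
    def schedule(self, delay, action):
        heapq.heappush(self.queue, (self.now + delay, self._seq, action))
        self._seq += 1
    def run(self):
        while self.queue:
            self.now, _, action = heapq.heappop(self.queue)
            action()

def demo():
    """Single-server queue: arrivals at t = 0, 1, 2; service time 2.0."""
    sim = EventSim()
    busy_until = [0.0]
    departures = []
    service = 2.0
    def arrival():
        start = max(sim.now, busy_until[0])       # wait if server busy
        busy_until[0] = start + service
        sim.schedule(busy_until[0] - sim.now, lambda: departures.append(sim.now))
    for t in (0.0, 1.0, 2.0):
        sim.schedule(t, arrival)
    sim.run()
    return departures  # departure times: [2.0, 4.0, 6.0]
```

Because the server is saturated, departures are spaced exactly one service time apart, which is the kind of queueing behavior such models predict for candidate architectures.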

  15. Comparative exploration of multidimensional flow cytometry software: a model approach evaluating T cell polyfunctional behavior.

    PubMed

    Spear, Timothy T; Nishimura, Michael I; Simms, Patricia E

    2017-08-01

Advancement in flow cytometry reagents and instrumentation has allowed for simultaneous analysis of large numbers of lineage/functional immune cell markers. Highly complex datasets generated by polychromatic flow cytometry require proper analytical software to answer investigators' questions. A problem among many investigators and flow cytometry Shared Resource Laboratories (SRLs), including our own, is a lack of access to a flow cytometry-knowledgeable bioinformatics team, making it difficult to learn and choose appropriate analysis tool(s). Here, we comparatively assess various multidimensional flow cytometry software packages for their ability to answer a specific biologic question and provide graphical representation output suitable for publication, as well as their ease of use and cost. We assessed polyfunctional potential of TCR-transduced T cells, serving as a model evaluation, using multidimensional flow cytometry to analyze 6 intracellular cytokines and degranulation on a per-cell basis. Analysis of 7 parameters resulted in 128 possible combinations of positivity/negativity, far too complex for basic flow cytometry software to analyze fully. Various software packages were used, the analysis methods used in each are described, and representative output is displayed. Of the tools investigated, automated classification of cellular expression by nonlinear stochastic embedding (ACCENSE) and coupled analysis in Pestle/simplified presentation of incredibly complex evaluations (SPICE) provided the most user-friendly manipulations and readable output, evaluating effects of altered antigen-specific stimulation on T cell polyfunctionality. This detailed approach may serve as a model for other investigators/SRLs in selecting the most appropriate software to analyze complex flow cytometry datasets. Further development and awareness of available tools will help guide proper data analysis to answer difficult biologic questions arising from incredibly complex datasets. 
© Society for Leukocyte Biology.
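
The combinatorics behind the "128 possible combinations" figure is Boolean gating over 7 binary markers (2^7 = 128), which can be sketched as below; the marker names and the toy per-cell records are illustrative assumptions, not the study's panel or data format:

```python
from itertools import product

def boolean_combinations(markers):
    """Enumerate every positivity/negativity combination for n markers,
    as in Boolean (polyfunctionality) gating: n = 7 gives 2**7 = 128."""
    return [dict(zip(markers, combo))
            for combo in product((False, True), repeat=len(markers))]

def count_cells(cells, markers):
    """Count cells per combination; `cells` is a list of dicts mapping
    marker name to per-cell positivity (a toy stand-in for gated data)."""
    counts = {}
    for cell in cells:
        key = tuple(cell[m] for m in markers)
        counts[key] = counts.get(key, 0) + 1
    return counts
```

Tools like Pestle/SPICE essentially compute and visualize these per-combination frequencies across stimulation conditions.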

  16. Transforming data into usable knowledge: the CIRC experience

    NASA Astrophysics Data System (ADS)

    Mote, P.; Lach, D.; Hartmann, H.; Abatzoglou, J. T.; Stevenson, J.

    2017-12-01

NOAA's northwest RISA, the Climate Impacts Research Consortium, emphasizes the transformation of data into usable knowledge. This effort involves physical scientists (e.g., Abatzoglou) building web-based tools with climate and hydrologic data and model output, a team performing data mining to link crop loss claims to droughts, social scientists (e.g., Lach, Hartmann) evaluating the effectiveness of such tools at communicating with end users, and two-way engagement with a wide variety of audiences who are interested in using and improving the tools. Unusual in this effort is the seamless integration across timescales past, present, and future; data mining; and the level of effort in evaluating the tools. We provide examples of agriculturally relevant climate variables (e.g. growing degree days, day of first fall freeze) and describe the iterative process of incorporating user feedback.

  17. Operation Reliability Assessment for Cutting Tools by Applying a Proportional Covariate Model to Condition Monitoring Information

    PubMed Central

    Cai, Gaigai; Chen, Xuefeng; Li, Bing; Chen, Baojia; He, Zhengjia

    2012-01-01

    The reliability of cutting tools is critical to machining precision and production efficiency. The conventional statistic-based reliability assessment method aims at providing a general and overall estimation of reliability for a large population of identical units under given and fixed conditions. However, it has limited effectiveness in depicting the operational characteristics of a cutting tool. To overcome this limitation, this paper proposes an approach to assess the operation reliability of cutting tools. A proportional covariate model is introduced to construct the relationship between operation reliability and condition monitoring information. The wavelet packet transform and an improved distance evaluation technique are used to extract sensitive features from vibration signals, and a covariate function is constructed based on the proportional covariate model. Ultimately, the failure rate function of the cutting tool being assessed is calculated using the baseline covariate function obtained from a small sample of historical data. Experimental results and a comparative study show that the proposed method is effective for assessing the operation reliability of cutting tools. PMID:23201980
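
The proportional covariate idea (a baseline failure rate scaled by a function of condition-monitoring features) can be sketched in a simplified form as below; the exponential covariate function and the RMS feature are illustrative assumptions, not the paper's fitted covariate function or its wavelet-packet features:

```python
import math

def rms(signal):
    """Root-mean-square of a vibration signal: a simple stand-in for the
    wavelet-packet features extracted in the paper."""
    return math.sqrt(sum(s * s for s in signal) / len(signal))

def failure_rate(baseline_rate, covariates, weights):
    """Proportional-covariate-style hazard: baseline failure rate scaled by
    a function of monitored features (illustrative exp(w . z) form)."""
    score = sum(w * z for w, z in zip(weights, covariates))
    return baseline_rate * math.exp(score)
```

With all covariates at zero the assessed rate equals the baseline; rising vibration features scale the rate up, which is the mechanism linking condition monitoring to operation reliability.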

  18. Leveraging Observation Tools for Instructional Improvement: Exploring Variability in Uptake of Ambitious Instructional Practices

    ERIC Educational Resources Information Center

    Cohen, Julie; Schuldt, Lorien Chambers; Brown, Lindsay; Grossman, Pamela

    2016-01-01

Background/Context: Current efforts to build rigorous teacher evaluation systems have increased interest in standardized classroom observation tools as reliable measures for assessing teaching. However, many argue these instruments can also be used to effect change in classroom practice. This study investigates a model of professional development…

  19. Analysis instruments for the performance of Advanced Practice Nursing.

    PubMed

    Sevilla-Guerra, Sonia; Zabalegui, Adelaida

    2017-11-29

Advanced Practice Nursing has been a reality in the international context for several decades, and recently new nursing profiles that follow this model have been developed in Spain as well. The consolidation of these advanced practice roles has also led to the creation of tools that attempt to define and evaluate their functions. This study aims to identify and explore the existing instruments that enable the domains of Advanced Practice Nursing to be defined. A review of existing international questionnaires and instruments was undertaken, including an analysis of the design process, the domains/dimensions defined, the main results and an exploration of clinimetric properties. Seven studies were analysed, but not all proved to be valid, stable or reliable tools. One included tool was able to differentiate between the functions of the general nurse and the advanced practice nurse by the level of activities undertaken within the five domains described. These tools are necessary to evaluate the scope of advanced practice in new nursing roles that correspond to other international models of competencies and practice domains. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.

  20. INNOVATIVE METHODS FOR EMISSION-INVENTORY DEVELOPMENT AND EVALUATION: WORKSHOP SUMMARY

    EPA Science Inventory

    Emission inventories are an essential tool for evaluating, managing, and regulating air pollution. Refinements and innovations in instruments that measure air pollutants, models that calculate emissions as well as techniques for data management and uncertainty assessment are nee...

  1. Comparison and evaluation of model structures for the simulation of pollution fluxes in a tile-drained river basin.

    PubMed

    Hoang, Linh; van Griensven, Ann; van der Keur, Peter; Refsgaard, Jens Christian; Troldborg, Lars; Nilsson, Bertel; Mynett, Arthur

    2014-01-01

The European Union Water Framework Directive requires an integrated pollution prevention plan at the river basin level. Hydrological river basin modeling tools are therefore promising tools to support the quantification of pollution originating from different sources. A limited number of studies have reported on the use of these models to predict pollution fluxes in tile-drained basins. This study focused on evaluating different modeling tools and modeling concepts to quantify the flow and nitrate fluxes in the Odense River basin using DAISY-MIKE SHE (DMS) and the Soil and Water Assessment Tool (SWAT). The results show that SWAT accurately predicted flow for daily and monthly time steps, whereas simulation of nitrate fluxes was more accurate at a monthly time step. In comparison to the DMS model, which takes into account the uncertainty of soil hydraulic and slurry parameters, SWAT results for flow and nitrate fit well within the range of DMS simulated values in high-flow periods but were slightly lower in low-flow periods. Despite the similarities of simulated flow and nitrate fluxes at the basin outlet, the two models predicted very different separations into flow components (overland flow, tile drainage, and groundwater flow) as well as nitrate fluxes from flow components. It was concluded that the assessment of which model provides a better representation of reality in terms of flow paths should not only be based on standard statistical metrics for the entire river basin but also needs to consider additional data, field experiments, and opinions of field experts. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
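
The "standard statistical metrics" the conclusion refers to typically include the Nash-Sutcliffe efficiency, which can be sketched as below (a generic metric; the abstract does not specify which statistics the study used):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency (NSE) for judging hydrologic model fit:
    1 is a perfect match; 0 means the model is no better than predicting
    the observed mean; negative values are worse than the mean."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den
```

Comparing NSE for daily versus monthly aggregates is one way a model can, as here, score well for monthly nitrate fluxes while scoring worse at a daily time step.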

  2. Evaluation of modeling as a tool to determine the potential impacts related to drilling wastes in the Brazilian offshore.

    PubMed

    Pivel, María Alejandra Gómez; Dal Sasso Freitas, Carla Maria

    2010-08-01

Numerical models that predict the fate of drilling discharges at sea constitute a valuable tool for both the oil industry and regulatory agencies. In order to provide reliable estimates, models must be validated through the comparison of predictions with field or laboratory observations. In this paper, we used the Offshore Operators Committee Model to simulate the discharges from two wells drilled at Campos Basin, offshore SE Brazil, and compared the results with field observations obtained 3 months after drilling. The comparison showed that the model provided reasonable predictions, considering that data about currents were reconstructed and theoretical data were used to characterize the classes of solids. The model proved to be a valuable tool to determine the degree of potential impact associated with drilling activities. However, since the accuracy of the model is directly dependent on the quality of input data, different possible scenarios should be considered when used for forecast modeling.

  3. The Processing of Airspace Concept Evaluations Using FASTE-CNS as a Pre- or Post-Simulation CNS Analysis Tool

    NASA Technical Reports Server (NTRS)

    Mainger, Steve

    2004-01-01

As NASA speculates on and explores the future of aviation, the technological and physical aspects of our environment increasingly become hurdles that must be overcome for success. Methods for overcoming some of these hurdles have been proposed as concepts by several NASA research partners. The task of establishing a common evaluation environment was placed on NASA's Virtual Airspace Simulation Technologies (VAST) project (a sub-project of VAMS), which responded with the development of the Airspace Concept Evaluation System (ACES). When one examines the ACES environment from a communication, navigation, or surveillance (CNS) perspective, the simulation parameters are built with assumed perfection in the transactions associated with CNS. To truly evaluate these concepts in a realistic sense, the contributions and effects of CNS must be part of ACES. NASA Glenn Research Center (GRC) has supported the Virtual Airspace Modeling and Simulation (VAMS) project through the continued development of CNS models and analysis capabilities that support the ACES environment. As part of this support, NASA GRC initiated the development of a communications traffic loading analysis tool, called the Future Aeronautical Sub-network Traffic Emulator for Communications, Navigation and Surveillance (FASTE-CNS). This tool allows for forecasting of communications load, with the understanding that there is no single, common source for the loading models used to evaluate existing and planned communications channels, and that consensus and accuracy in the traffic load models are very important inputs to decisions on the acceptability of the communication techniques used to fulfill aeronautical requirements. 
Leveraging the existing capabilities of the FASTE-CNS tool, GRC has called for FASTE-CNS to have the functionality to pre- and post-process the simulation runs of ACES, reporting on instances when traffic density, frequency congestion, or aircraft spacing/distance violations have occurred. The integration of these functions requires that the CNS models used to characterize these avionic systems be of higher fidelity and better consistency than is present in the FASTE-CNS system. This presentation will explore the capabilities of FASTE-CNS, with renewed emphasis on the enhancements being added to perform these processing functions; the fidelity and reliability of CNS models necessary to make the enhancements work; and the benchmarking of FASTE-CNS results to improve confidence in the results of the new processing capabilities.

  4. A corpus of full-text journal articles is a robust evaluation tool for revealing differences in performance of biomedical natural language processing tools

    PubMed Central

    2012-01-01

    Background We introduce the linguistic annotation of a corpus of 97 full-text biomedical publications, known as the Colorado Richly Annotated Full Text (CRAFT) corpus. We further assess the performance of existing tools for performing sentence splitting, tokenization, syntactic parsing, and named entity recognition on this corpus. Results Many biomedical natural language processing systems demonstrated large differences between their previously published results and their performance on the CRAFT corpus when tested with the publicly available models or rule sets. Trainable systems differed widely with respect to their ability to build high-performing models based on this data. Conclusions The finding that some systems were able to train high-performing models based on this corpus is additional evidence, beyond high inter-annotator agreement, that the quality of the CRAFT corpus is high. The overall poor performance of various systems indicates that considerable work needs to be done to enable natural language processing systems to work well when the input is full-text journal articles. The CRAFT corpus provides a valuable resource to the biomedical natural language processing community for evaluation and training of new models for biomedical full text publications. PMID:22901054
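
Scoring a named entity recognizer against a corpus such as CRAFT is conventionally done with exact-span precision, recall, and F1, which can be sketched as below (a standard metric sketch; the span tuples shown are illustrative, not CRAFT annotations):

```python
def span_prf(gold, predicted):
    """Exact-span precision/recall/F1 for named entity recognition.
    Spans are (start, end, type) tuples; a prediction counts as a true
    positive only if all three fields match a gold annotation."""
    gold, predicted = set(gold), set(predicted)
    tp = len(gold & predicted)
    p = tp / len(predicted) if predicted else 0.0
    r = tp / len(gold) if gold else 0.0
    f = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f
```

Large drops between previously published scores and scores computed this way on CRAFT are exactly the kind of difference the paper reports for full-text input.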

  5. Advanced engineering environment collaboration project.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamph, Jane Ann; Pomplun, Alan R.; Kiba, Grant W.

    2008-12-01

The Advanced Engineering Environment (AEE) is a model for an engineering design and communications system that will enhance project collaboration throughout the nuclear weapons complex (NWC). Sandia National Laboratories and Parametric Technology Corporation (PTC) worked together on a prototype project to evaluate the suitability of a portion of PTC's Windchill 9.0 suite of data management, design and collaboration tools as the basis for an AEE. The AEE project team implemented Windchill 9.0 development servers in both classified and unclassified domains and used them to test and evaluate the Windchill tool suite relative to the needs of the NWC using weapons project use cases. A primary deliverable was the development of a new real time collaborative desktop design and engineering process using PDMLink (data management tool), Pro/Engineer (mechanical computer aided design tool) and ProductView Lite (visualization tool). Additional project activities included evaluations of PTC's electrical computer aided design, visualization, and engineering calculations applications. This report documents the AEE project work to share information and lessons learned with other NWC sites. It also provides PTC with recommendations for improving their products for NWC applications.

  6. Reliability model generator specification

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Mccann, Catherine

    1990-01-01

The Reliability Model Generator (RMG) is described: a program that produces reliability models from block diagrams for ASSIST, the interface to the reliability evaluation tool SURE. An account is given of the motivation for RMG, and the implemented algorithms are discussed. The appendices contain the algorithms and two detailed traces of examples.
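
The basic reduction behind reliability block diagrams can be sketched as below; this is the textbook series/parallel algebra, not RMG's actual output, which is an ASSIST model consumed by SURE rather than a closed-form product:

```python
def series(*r):
    """Reliability of components in series: every component must work."""
    out = 1.0
    for x in r:
        out *= x
    return out

def parallel(*r):
    """Reliability of redundant components in parallel: at least one works."""
    out = 1.0
    for x in r:
        out *= (1.0 - x)
    return 1.0 - out

# A block diagram reduces by nesting: two redundant 0.9 units feeding a
# 0.99-reliable bus (values are illustrative).
system = series(parallel(0.9, 0.9), 0.99)
```

Nesting these two operators over the diagram's structure is the manual process a generator like RMG automates for more general (non-series/parallel) models.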

  7. Evaluation of a stepwise, multi-objective, multi-variable parameter optimization method for the APEX model

    USDA-ARS?s Scientific Manuscript database

    Hydrologic models are essential tools for environmental assessment of agricultural non-point source pollution. The automatic calibration of hydrologic models, though efficient, demands significant computational power, which can limit its application. The study objective was to investigate a cost e...

  8. NASA Data for Water Resources Applications

    NASA Technical Reports Server (NTRS)

    Toll, David; Houser, Paul; Arsenault, Kristi; Entin, Jared

    2004-01-01

Water Management Applications is one of twelve elements in the Earth Science Enterprise National Applications Program. NASA Goddard Space Flight Center is supporting the Applications Program by partnering with other organizations to use NASA project results, such as those from satellite instruments and Earth system models, to address the organizations' critical needs. The focus thus far has been: 1) estimating water storage, including snowpack and soil moisture; 2) modeling and predicting water fluxes such as evapotranspiration (ET), precipitation, and river runoff; and 3) remote sensing of water quality, including both point sources (e.g., turbidity and productivity) and non-point sources (e.g., land cover conversion such as forest to agriculture yielding higher nutrient runoff). The objectives of the partnering cover three steps: 1) Evaluation, 2) Verification and Validation, and 3) Benchmark Report. We are working with U.S. federal agencies including the Environmental Protection Agency (EPA), the Bureau of Reclamation (USBR), and the Department of Agriculture (USDA), and are using several of their Decision Support System (DSS) tools, including BASINS used by EPA, Riverware and the AWARDS ET ToolBox used by USBR, and SWAT used by USDA and EPA. Regional application sites using NASA data across the U.S. are currently being evaluated for the DSS tools. The NASA data emphasized thus far are Land Data Assimilation System (LDAS) and MODIS satellite products. We are currently in the first two steps of evaluation and verification and validation.

  9. Designers' unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.

    1992-01-01

    The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).
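
The notion of a cost function relating geometric design features to summed material cost and labor content can be sketched in a toy form as below; the functional form and rates are illustrative assumptions only, not COSTADE's process-physics-based cost functions:

```python
def part_cost(area_m2, complexity, material_rate, labor_rate, hours_per_m2):
    """Toy design-to-cost function: material cost scales with part area;
    labor content scales with area, a base labor intensity, and a
    dimensionless geometric-complexity factor."""
    material = material_rate * area_m2
    labor = labor_rate * hours_per_m2 * area_m2 * complexity
    return material + labor
```

Evaluating such a function inside a preliminary design loop lets the designer trade candidate geometries with cost and weight as joint objectives, which is the role the paper describes for COSTADE.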

  10. A method to evaluate hydraulic fracture using proppant detection.

    PubMed

    Liu, Juntao; Zhang, Feng; Gardner, Robin P; Hou, Guojing; Zhang, Quanying; Li, Hu

    2015-11-01

Accurate determination of proppant placement and propped fracture height is important for evaluating and optimizing stimulation strategies. A technology using non-radioactive proppant and a pulsed neutron gamma energy spectra logging tool to determine the placement and height of propped fractures is proposed. Gd2O3 was incorporated into ceramic proppant, and a Monte Carlo method was utilized to build the logging tool and formation models. Characteristic responses of the recorded information of different logging tools to fracture widths, proppant concentrations and influencing factors were studied. The results show that Gd capture gamma rays can be used to evaluate propped fractures, with higher sensitivity to changes in fracture width and traceable proppant content than the existing non-radioactive proppant evaluation techniques, and only an after-fracture measurement is needed for the new method. Changes in gas saturation and borehole size have a great impact on determining propped fractures when compensated neutron and pulsed neutron capture tools are used. A field example is presented to validate the application of the new technique. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Development and Validation of the Controller Acceptance Rating Scale (CARS): Results of Empirical Research

    NASA Technical Reports Server (NTRS)

    Lee, Katharine K.; Kerns, Karol; Bone, Randall

    2001-01-01

    The measurement of operational acceptability is important for the development, implementation, and evolution of air traffic management decision support tools. The Controller Acceptance Rating Scale (CARS) was developed at NASA Ames Research Center for the development and evaluation of the Passive Final Approach Spacing Tool. CARS was modeled after a well-known pilot evaluation rating instrument, the Cooper-Harper Scale, and has since been used in the evaluation of the User Request Evaluation Tool, developed by MITRE's Center for Advanced Aviation System Development. In this paper, we discuss the development of CARS and analyze the empirical data collected with CARS to examine construct validity. Intraclass correlations indicated statistically significant reliability for the CARS. The subjective workload data collected in conjunction with the CARS suggest that the expected set of workload attributes was correlated with the CARS. As expected, the analysis also showed that CARS was a sensitive indicator of the impact of decision support tools on controller operations. Suggestions for future development and improvement of CARS are also provided.

  12. A database and tool for boundary conditions for regional air quality modeling: description and evaluation

    NASA Astrophysics Data System (ADS)

    Henderson, B. H.; Akhtar, F.; Pye, H. O. T.; Napelenok, S. L.; Hutzell, W. T.

    2014-02-01

    Transported air pollutants receive increasing attention as regulations tighten and global concentrations increase. The need to represent international transport in regional air quality assessments requires improved representation of boundary concentrations. Currently available observations are too sparse vertically to provide boundary information, particularly for ozone precursors, but global simulations can be used to generate spatially and temporally varying lateral boundary conditions (LBC). This study presents a public database of global simulations designed and evaluated for use as LBC for air quality models (AQMs). The database covers the contiguous United States (CONUS) for the years 2001-2010 and contains hourly varying concentrations of ozone, aerosols, and their precursors. The database is complemented by a tool for configuring the global results as inputs to regional-scale models (e.g., the Community Multiscale Air Quality model or the Comprehensive Air quality Model with extensions). This study also presents an example application based on the CONUS domain, which is evaluated against satellite-retrieved ozone and carbon monoxide vertical profiles. The results show performance is largely within uncertainty estimates for ozone from the Ozone Monitoring Instrument and carbon monoxide from Measurements Of Pollution In The Troposphere (MOPITT), but there were some notable biases compared with Tropospheric Emission Spectrometer (TES) ozone. Compared with TES, our ozone predictions are biased high in the upper troposphere, particularly in the south during January. This publication documents the global simulation database, the tool for conversion to LBC, and the evaluation of concentrations on the boundaries. This documentation is intended to support applications that require representation of long-range transport of air pollutants.

  13. An Ecosystem Service Evaluation Tool to Support Ridge-to-Reef Management and Conservation in Hawaii

    NASA Astrophysics Data System (ADS)

    Oleson, K.; Callender, T.; Delevaux, J. M. S.; Falinski, K. A.; Htun, H.; Jin, G.

    2014-12-01

    Faced with increasing anthropogenic stressors and diverse stakeholders, local managers are adopting ridge-to-reef, multi-objective management approaches to restore declining coral reef health. An ecosystem services framework, which integrates ecological indicators and stakeholder values, can foster more applied and integrated research, data collection, and modeling, and thus better inform the decision-making process and realize decision outcomes grounded in stakeholders' values. Here, we describe a research program that (i) leverages remotely sensed and empirical data to build an ecosystem services-based decision-support tool geared towards ridge-to-reef management; and (ii) applies it as part of a structured, value-based decision-making process to inform management in west Maui, a NOAA coral reef conservation priority site. The tool links terrestrial and marine biophysical models in a spatially explicit manner to quantify and map changes in ecosystem services delivery resulting from management actions, projected climate change impacts, and adaptive responses. We couple model outputs with localized valuation studies to translate ecosystem service outcomes into benefits and their associated socio-cultural and/or economic values. Managers can use this tool to run scenarios during their deliberations to evaluate the trade-offs, cost-effectiveness, and equity implications of proposed policies. Ultimately, this research program aims to improve the effectiveness, efficiency, and equity outcomes of ecosystem-based management. This presentation will describe our approach, summarize initial results from the terrestrial modeling and economic valuations for west Maui, and highlight how this decision-support tool benefits managers in west Maui.

  14. Development and process evaluation of the participatory and action-oriented empowerment model facilitated by occupational health nurses for workplace health promotion in small and medium-sized enterprises.

    PubMed

    Nishikido, Noriko; Matsuda, Kazumi; Fukuda, Eiko; Motoki, Chiharu; Tsutaki, Miho; Kawakami, Yuko; Yuasa, Akiko; Iijima, Miyoko; Tanaka, Mika; Hirata, Mamoru; Hojoh, Minoru; Ikeda, Tomoko; Maeda, Kazutoshi; Miyoshi, Yukari; Arai, Sumiko; Mitsuhashi, Hiroyuki

    2007-01-01

    The objective of this study is to develop an available empowerment model for workplace health promotion (WHP) in small and medium-sized enterprises (SMEs) and to evaluate its applicability and feasibility. Semi-structured interviews with employers and workers in SMEs were conducted to assess their actual requirements for support. The structure of our new empowerment model was discussed and established through several rounds of focus group meetings with occupational safety and health researchers and practitioners on the basis of results of our interviews. We developed a new participatory and action-oriented empowerment model based on needs for support of employers and workers in SMEs. This new model consists of three originally developed tools: an action checklist, an information guidebook, and a book of good practices. As the facilitators, occupational health nurses (OHNs) from health insurance associations were trained to empower employers and workers using these tools. Approximately 80 SMEs (with less than 300 employees) were invited to participate in the model project. With these tools and continued empowerment by OHNs, employers and workers were able to smoothly work on WHP. This newly developed participatory and action-oriented empowerment model that was facilitated by trained OHNs appears to be both applicable and feasible for WHP in SMEs in Japan.

  15. Using a logic model to relate the strategic to the tactical in program planning and evaluation: an illustration based on social norms interventions.

    PubMed

    Keller, Adrienne; Bauerle, Jennifer A

    2009-01-01

    Logic models are a ubiquitous tool for specifying the tactics--including implementation and evaluation--of interventions in the public health and social behavior arenas. Similarly, social norms interventions are a common strategy, particularly in college settings, to address hazardous drinking and other dangerous or asocial behaviors. This paper illustrates an extension of logic models to include strategic as well as tactical components, using a specific example developed for social norms interventions. Placing project evaluation within the context of this kind of logic model addresses issues related to the lack of a research design for evaluating effectiveness.

  16. Predicting tree species presence and basal area in Utah: A comparison of stochastic gradient boosting, generalized additive models, and tree-based methods

    USGS Publications Warehouse

    Moisen, Gretchen G.; Freeman, E.A.; Blackard, J.A.; Frescino, T.S.; Zimmermann, N.E.; Edwards, T.C.

    2006-01-01

    Many efforts are underway to produce broad-scale forest attribute maps by modelling forest class and structure variables collected in forest inventories as functions of satellite-based and biophysical information. Typically, variants of classification and regression trees implemented in RuleQuest's See5 and Cubist (for binary and continuous responses, respectively) are the tools of choice in many of these applications. These tools are widely used in large remote sensing applications, but are not easily interpretable, do not have ties with survey estimation methods, and use proprietary unpublished algorithms. Consequently, three alternative modelling techniques were compared for mapping presence and basal area of 13 species located in the mountain ranges of Utah, USA. The modelling techniques compared included the widely used See5/Cubist, generalized additive models (GAMs), and stochastic gradient boosting (SGB). Model performance was evaluated using independent test data sets. Evaluation criteria for mapping species presence included specificity, sensitivity, Kappa, and area under the curve (AUC). Evaluation criteria for the continuous basal area variables included correlation and relative mean squared error. For predicting species presence (setting thresholds to maximize Kappa), SGB had higher values for the majority of the species for specificity and Kappa, while GAMs had higher values for the majority of the species for sensitivity. In evaluating resultant AUC values, GAM and/or SGB models had significantly better results than the See5 models where significant differences could be detected between models. For nine out of 13 species, basal area prediction results for all modelling techniques were poor (correlations less than 0.5 and relative mean squared errors greater than 0.8), but SGB provided the most stable predictions in these instances. SGB and Cubist performed equally well for modelling basal area for three species with moderate prediction success, while all three modelling tools produced comparably good predictions (correlation of 0.68 and relative mean squared error of 0.56) for one species. © 2006 Elsevier B.V. All rights reserved.
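
As an illustration of the presence/absence evaluation described above, here is a minimal sketch of choosing a classification threshold to maximize Cohen's Kappa. The labels and model scores are invented for demonstration and are not the Utah data; the Kappa formula itself is standard.

```python
def cohens_kappa(labels, preds):
    # Observed agreement corrected for chance agreement: (p_o - p_e) / (1 - p_e).
    n = len(labels)
    tp = sum(1 for y, p in zip(labels, preds) if y == 1 and p == 1)
    tn = sum(1 for y, p in zip(labels, preds) if y == 0 and p == 0)
    pos_true = sum(labels) / n
    pos_pred = sum(preds) / n
    p_o = (tp + tn) / n
    p_e = pos_true * pos_pred + (1 - pos_true) * (1 - pos_pred)
    return (p_o - p_e) / (1 - p_e)

def best_kappa_threshold(labels, scores):
    # Scan candidate thresholds (the unique scores) and keep the one
    # maximizing Kappa, mirroring the threshold choice described above.
    best_t, best_k = None, -1.0
    for t in sorted(set(scores)):
        preds = [1 if s >= t else 0 for s in scores]
        k = cohens_kappa(labels, preds)
        if k > best_k:
            best_t, best_k = t, k
    return best_t, best_k

# Hypothetical presence labels and model scores for six plots.
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.6, 0.2, 0.1]
t, k = best_kappa_threshold(labels, scores)
```

In this toy example the scan selects the threshold 0.4, where four of the six plots are classified correctly beyond what chance agreement would predict.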

  17. Computational Molecular Modeling for Evaluating the Toxicity of Environmental Chemicals: Prioritizing Bioassay Requirements

    EPA Science Inventory

    This commentary provides an overview of the challenges that arise from applying molecular modeling tools developed and commonly used for pharmaceutical discovery to the problem of predicting the potential toxicities of environmental chemicals.

  18. Estimating Sobol Sensitivity Indices Using Correlations

    EPA Science Inventory

    Sensitivity analysis is a crucial tool in the development and evaluation of complex mathematical models. Sobol's method is a variance-based global sensitivity analysis technique that has been applied to computational models to assess the relative importance of input parameters on...
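
The variance-based first-order indices the abstract refers to can be estimated with a standard pick-freeze Monte Carlo scheme. The sketch below uses the Ishigami benchmark function (an assumption for illustration, not one of the EPA models) and the classic covariance estimator rather than the correlation-based estimator of the cited work.

```python
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    # Ishigami test function, a standard sensitivity-analysis benchmark
    # with known analytic Sobol indices.
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1])**2 + b * x[:, 2]**4 * np.sin(x[:, 0])

def first_order_sobol(model, dim, n=100_000, seed=0):
    # Pick-freeze estimator: S_i = Cov(Y_A, Y_Ci) / Var(Y), where C_i
    # equals sample matrix B except column i, which is copied from A.
    rng = np.random.default_rng(seed)
    A = rng.uniform(-np.pi, np.pi, size=(n, dim))
    B = rng.uniform(-np.pi, np.pi, size=(n, dim))
    yA = model(A)
    var = yA.var()
    s = np.empty(dim)
    for i in range(dim):
        C = B.copy()
        C[:, i] = A[:, i]
        yC = model(C)
        s[i] = (np.mean(yA * yC) - yA.mean() * yC.mean()) / var
    return s

S = first_order_sobol(ishigami, dim=3)
```

For a=7, b=0.1 the analytic first-order indices are roughly 0.31, 0.44, and 0, so the estimates serve as a quick sanity check of the implementation.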

  19. A comparative assessment of tools for ecosystem services quantification and valuation

    USGS Publications Warehouse

    Bagstad, Kenneth J.; Semmens, Darius; Waage, Sissel; Winthrop, Robert

    2013-01-01

    To enter widespread use, ecosystem service assessments need to be quantifiable, replicable, credible, flexible, and affordable. With recent growth in the field of ecosystem services, a variety of decision-support tools has emerged to support more systematic ecosystem services assessment. Despite the growing complexity of the tool landscape, thorough reviews of tools for identifying, assessing, modeling, and in some cases monetarily valuing ecosystem services have generally been lacking. In this study, we describe 17 ecosystem services tools and rate their performance against eight evaluative criteria that gauge their readiness for widespread application in public- and private-sector decision making. We describe each tool's intended uses, services modeled, analytical approaches, data requirements, and outputs, as well as the time required to run seven of the tools in a first comparative, concurrent application of multiple tools to a common location, the San Pedro River watershed in southeast Arizona, USA, and northern Sonora, Mexico. Based on this work, we offer conclusions about these tools' current 'readiness' for widespread application within both public- and private-sector decision-making processes. Finally, we describe potential pathways forward to reduce the resource requirements for running ecosystem services models, which is essential to facilitate their more widespread use in environmental decision making.

  20. Gear Spline Coupling Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Yi; Errichello, Robert

    2013-08-29

    An analytical model is developed to evaluate the design of a spline coupling. For a given torque and shaft misalignment, the model calculates the number of teeth in contact, tooth loads, stiffnesses, stresses, and safety factors. The analytic model provides essential spline coupling design and modeling information and could be easily integrated into gearbox design and simulation tools.
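
The abstract does not give the model's equations. As a rough, simplified illustration of the kinds of quantities involved (the actual analytic model distributes load unevenly across teeth under misalignment), here is an equal-load-sharing sketch with hypothetical dimensions and allowable stress:

```python
def spline_tooth_check(torque_nm, pitch_radius_m, n_teeth_contact,
                       face_width_m, tooth_height_m, allowable_stress_pa):
    """Equal-load-sharing sketch: tangential load per engaged tooth and a
    bearing-stress safety factor. Real spline couplings load teeth unevenly
    under shaft misalignment, which the full analytic model accounts for."""
    tangential_force = torque_nm / pitch_radius_m       # total force at pitch radius
    load_per_tooth = tangential_force / n_teeth_contact # idealized equal sharing
    bearing_stress = load_per_tooth / (face_width_m * tooth_height_m)
    return load_per_tooth, allowable_stress_pa / bearing_stress

# Hypothetical coupling: 2 kN-m torque, 50 mm pitch radius, 20 teeth in contact.
load, sf = spline_tooth_check(torque_nm=2000, pitch_radius_m=0.05,
                              n_teeth_contact=20, face_width_m=0.03,
                              tooth_height_m=0.004, allowable_stress_pa=150e6)
```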

  1. Evaluating the APEX model for simulating streamflow and water quality on ten agricultural watersheds in the U.S.

    USDA-ARS's Scientific Manuscript database

    Simulation models are increasingly used to assess water quality constituent losses from agricultural systems. Misuse often yields irrelevant or erroneous answers. The Agricultural Policy Environmental Extender (APEX) model is emerging as one of the premier modeling tools for fields, farms, and agr...

  2. Teaching Web 2.0 technologies using Web 2.0 technologies.

    PubMed

    Rethlefsen, Melissa L; Piorun, Mary; Prince, J Dale

    2009-10-01

    The research evaluated participant satisfaction with the content and format of the "Web 2.0 101: Introduction to Second Generation Web Tools" course and measured the impact of the course on participants' self-evaluated knowledge of Web 2.0 tools. The "Web 2.0 101" online course was based loosely on the Learning 2.0 model. Content was provided through a course blog and covered a wide range of Web 2.0 tools. All Medical Library Association members were invited to participate. Participants were asked to complete a post-course survey. Respondents who completed the entire course or who completed part of the course self-evaluated their knowledge of nine social software tools and concepts prior to and after the course using a Likert scale. Additional qualitative information about course strengths and weaknesses was also gathered. Respondents' self-ratings showed a significant change in perceived knowledge for each tool, using a matched pair Wilcoxon signed rank analysis (P<0.0001 for each tool/concept). Overall satisfaction with the course appeared high. Hands-on exercises were the most frequently identified strength of the course; the length and time-consuming nature of the course were considered weaknesses by some. Learning 2.0-style courses, though demanding time and self-motivation from participants, can increase knowledge of Web 2.0 tools.
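
A matched-pair Wilcoxon signed-rank analysis like the one described can be sketched as follows. The pre/post Likert ratings are invented for illustration, and this simplified implementation uses the normal approximation with zero differences dropped and no tie-variance correction (a production analysis would use a statistics library).

```python
import math

def wilcoxon_signed_rank(pre, post):
    """Matched-pair Wilcoxon signed-rank test (normal approximation,
    zero differences dropped, no tie-variance correction)."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    n = len(diffs)
    # Rank the absolute differences, averaging ranks over ties.
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1              # average of ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mean) / sd
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided
    return w_plus, p

# Hypothetical pre/post self-ratings on a 5-point Likert scale.
pre  = [2, 3, 2, 1, 3, 2, 2, 3, 1, 2, 3, 2]
post = [4, 4, 3, 3, 4, 3, 4, 4, 2, 3, 5, 4]
w, p = wilcoxon_signed_rank(pre, post)
```

Because every participant's rating improved in this toy data, the signed-rank statistic is at its maximum and the two-sided p-value is well below 0.01.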

  3. Assessment of uncertainty in the numerical simulation of solar irradiance over inclined PV panels: New algorithms using measurements and modeling tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Yu; Sengupta, Manajit; Dooraghi, Mike

    Development of accurate transposition models to simulate plane-of-array (POA) irradiance from horizontal measurements or simulations is a complex process, mainly because of the anisotropic distribution of diffuse solar radiation in the atmosphere. The limited availability of reliable POA measurements at large temporal and spatial scales leads to difficulties in the comprehensive evaluation of transposition models. This paper proposes new algorithms to assess the uncertainty of transposition models using both surface-based observations and modeling tools. We reviewed the analytical derivation of POA irradiance and the approximation of isotropic diffuse radiation that simplifies the computation. Two transposition models are evaluated against the computation by the rigorous analytical solution. We proposed a new algorithm to evaluate transposition models using clear-sky measurements at the National Renewable Energy Laboratory's (NREL's) Solar Radiation Research Laboratory (SRRL) and a radiative transfer model that integrates diffuse radiances over various sky-viewing angles. We found that the radiative transfer model and a transposition model based on empirical regressions are superior to the isotropic models when compared to measurements. We further compared the radiative transfer model to the transposition models under an extensive range of idealized conditions. Our results suggest that the empirical transposition model has slightly higher cloudy-sky POA irradiance than the radiative transfer model, but performs better than the isotropic models under clear-sky conditions. Significantly smaller POA irradiances computed by the transposition models are observed when the photovoltaic (PV) panel deviates from the azimuthal direction of the sun. The new algorithms developed in the current study open the door to a more comprehensive evaluation of transposition models for various atmospheric conditions and solar and PV orientations.
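
The isotropic-sky transposition the abstract compares against follows the classic Liu-Jordan form: POA irradiance is the sum of a beam term, an isotropic sky-diffuse term scaled by the tilt view factor, and a ground-reflected term. A minimal sketch with hypothetical irradiance values:

```python
import math

def poa_isotropic(dni, dhi, ghi, aoi_deg, tilt_deg, albedo=0.2):
    """Plane-of-array irradiance under the isotropic-sky assumption:
    beam + isotropic sky diffuse + ground-reflected components (W/m^2)."""
    beam = dni * max(math.cos(math.radians(aoi_deg)), 0.0)
    sky = dhi * (1 + math.cos(math.radians(tilt_deg))) / 2
    ground = ghi * albedo * (1 - math.cos(math.radians(tilt_deg))) / 2
    return beam + sky + ground

# Hypothetical clear-sky sample: DNI=800, DHI=100, GHI=600 W/m^2,
# with a 20 degree angle of incidence on a panel tilted 30 degrees.
poa = poa_isotropic(800, 100, 600, aoi_deg=20, tilt_deg=30)
```

The anisotropic and empirical-regression models evaluated in the paper replace the simple sky-diffuse term with circumsolar and horizon-brightening corrections, which is why they track measurements more closely.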

  4. Assessment of uncertainty in the numerical simulation of solar irradiance over inclined PV panels: New algorithms using measurements and modeling tools

    DOE PAGES

    Xie, Yu; Sengupta, Manajit; Dooraghi, Mike

    2018-03-20

    Development of accurate transposition models to simulate plane-of-array (POA) irradiance from horizontal measurements or simulations is a complex process, mainly because of the anisotropic distribution of diffuse solar radiation in the atmosphere. The limited availability of reliable POA measurements at large temporal and spatial scales leads to difficulties in the comprehensive evaluation of transposition models. This paper proposes new algorithms to assess the uncertainty of transposition models using both surface-based observations and modeling tools. We reviewed the analytical derivation of POA irradiance and the approximation of isotropic diffuse radiation that simplifies the computation. Two transposition models are evaluated against the computation by the rigorous analytical solution. We proposed a new algorithm to evaluate transposition models using clear-sky measurements at the National Renewable Energy Laboratory's (NREL's) Solar Radiation Research Laboratory (SRRL) and a radiative transfer model that integrates diffuse radiances over various sky-viewing angles. We found that the radiative transfer model and a transposition model based on empirical regressions are superior to the isotropic models when compared to measurements. We further compared the radiative transfer model to the transposition models under an extensive range of idealized conditions. Our results suggest that the empirical transposition model has slightly higher cloudy-sky POA irradiance than the radiative transfer model, but performs better than the isotropic models under clear-sky conditions. Significantly smaller POA irradiances computed by the transposition models are observed when the photovoltaic (PV) panel deviates from the azimuthal direction of the sun. The new algorithms developed in the current study open the door to a more comprehensive evaluation of transposition models for various atmospheric conditions and solar and PV orientations.

  5. E-DECIDER: Using Earth Science Data and Modeling Tools to Develop Decision Support for Earthquake Disaster Response

    NASA Astrophysics Data System (ADS)

    Glasscoe, Margaret T.; Wang, Jun; Pierce, Marlon E.; Yoder, Mark R.; Parker, Jay W.; Burl, Michael C.; Stough, Timothy M.; Granat, Robert A.; Donnellan, Andrea; Rundle, John B.; Ma, Yu; Bawden, Gerald W.; Yuen, Karen

    2015-08-01

    Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing new capabilities for decision making, utilizing remote sensing data and modeling software to provide decision support for earthquake disaster management and response. E-DECIDER incorporates the earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, allow us to provide both long-term planning information for disaster management decision makers and short-term information following earthquake events (i.e., identifying areas where the greatest deformation and damage have occurred and where emergency services may need to be focused). This in turn is delivered through standards-compliant web services for desktop and hand-held devices.

  6. Rethinking Models of Professional Learning as Tools: A Conceptual Analysis to Inform Research and Practice

    ERIC Educational Resources Information Center

    Boylan, Mark; Coldwell, Mike; Maxwell, Bronwen; Jordan, Julie

    2018-01-01

    One approach to designing, researching or evaluating professional learning experiences is to use models of learning processes. Here we analyse and critique five significant contemporary analytical models: three variations on path models, proposed by Guskey, by Desimone and by Clarke and Hollingsworth; a model using a systemic conceptualisation of…

  7. Evaluation of Supplemental Nutrition Assistance Program Education: Application of Behavioral Theory and Survey Validation

    ERIC Educational Resources Information Center

    Wyker, Brett A.; Jordan, Patricia; Quigley, Danielle L.

    2012-01-01

    Objective: Application of the Transtheoretical Model (TTM) to Supplemental Nutrition Assistance Program Education (SNAP-Ed) evaluation and development and validation of an evaluation tool used to measure TTM constructs is described. Methods: Surveys were collected from parents of children receiving food at Summer Food Service Program sites prior…

  8. Competency-Based Evaluation in Higher Education--Design and Use of Competence Rubrics by University Educators

    ERIC Educational Resources Information Center

    Velasco-Martínez, Leticia-Concepción; Tójar-Hurtado, Juan-Carlos

    2018-01-01

    Competency-based learning requires making changes in the higher education model in response to current socio-educational demands. Rubrics are an innovative educational tool for competence evaluation, for both students and educators. Ever since arriving at the university systems, the application of rubrics in evaluation programs has grown…

  9. Evaluating Learning in the 21st Century: A Digital Age Learning Matrix

    ERIC Educational Resources Information Center

    Starkey, Louise

    2011-01-01

    If the purpose of secondary schooling is to educate the upcoming generation to become active participants in society, evaluation of teaching and learning in the information-rich digital age should be underpinned by relevant theories and models. This article describes an evaluation tool developed using emerging ideas about knowledge creation and…

  10. Teacher Acceptance of Evaluation through Productivity Management.

    ERIC Educational Resources Information Center

    Garrett, A. Warren

    This paper describes the purposes of educational evaluation with particular emphasis on school improvement. The evaluation process used was based upon the analytical tools developed and tested by the Educational Productivity Council (EPC) of the University of Texas. An early form of this model and reporting system was introduced to a group of 68…

  11. Evaluation of in silico tools to predict the skin sensitization potential of chemicals.

    PubMed

    Verheyen, G R; Braeken, E; Van Deun, K; Van Miert, S

    2017-01-01

    Public domain and commercial in silico tools were compared for their performance in predicting the skin sensitization potential of chemicals. The packages were either statistics-based (Vega, CASE Ultra) or rule-based (OECD Toolbox, Toxtree, Derek Nexus). In practice, several of these in silico tools are used in gap filling and read-across, but here their use was limited to making predictions based on the presence or absence of structural features associated with sensitization. The top 400 ranking substances of the ATSDR 2011 Priority List of Hazardous Substances were selected as a starting point. Experimental information was identified for 160 chemically diverse substances (82 positive and 78 negative). The prediction of skin sensitization potential was compared with the experimental data. Rule-based tools perform slightly better, with accuracies ranging from 0.6 (OECD Toolbox) to 0.78 (Derek Nexus), compared with statistics-based tools, whose accuracies ranged from 0.48 (Vega) to 0.73 (CASE Ultra - LLNA weak model). Combining models increased the performance, with positive and negative predictive values up to 80% and 84%, respectively. However, the number of substances that were predicted positive or negative for skin sensitization in both models was low. Adding more substances to the dataset will increase the confidence in the conclusions reached. The insights obtained in this evaluation are incorporated in a web database www.asopus.weebly.com that provides a potential end user context for the scope and performance of different in silico tools with respect to a common dataset of curated skin sensitization data.
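
The reported accuracies and predictive values come from standard 2x2 confusion-table arithmetic. A small sketch with hypothetical counts (chosen to match the 82-positive / 78-negative split of the dataset, but not the actual per-tool results):

```python
def binary_metrics(tp, fp, tn, fn):
    """Standard performance measures for a 2x2 confusion table."""
    total = tp + fp + tn + fn
    return {
        "accuracy": (tp + tn) / total,
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts for one tool over 160 substances (82 pos, 78 neg).
m = binary_metrics(tp=64, fp=17, tn=61, fn=18)
```

Combining tools as the study does (counting only substances on which both models agree) trades coverage for higher predictive values, which is why the number of jointly classified substances was low.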

  12. A Regional Climate Model Evaluation System based on contemporary Satellite and other Observations for Assessing Regional Climate Model Fidelity

    NASA Astrophysics Data System (ADS)

    Waliser, D. E.; Kim, J.; Mattman, C.; Goodale, C.; Hart, A.; Zimdars, P.; Lean, P.

    2011-12-01

    Evaluation of climate models against observations is an essential part of assessing the impact of climate variations and change on regionally important sectors and improving climate models. Regional climate models (RCMs) are of particular concern. RCMs provide the fine-scale climate needed by the assessment community by downscaling global climate model projections such as those contributing to the Coupled Model Intercomparison Project (CMIP) that form one aspect of the quantitative basis of the IPCC Assessment Reports. The lack of reliable fine-resolution observational data and formal tools and metrics has represented a challenge in evaluating RCMs. Recent satellite observations are particularly useful as they provide a wealth of information and constraints on many different processes within the climate system. Due to their large volume and the difficulties associated with accessing and using contemporary observations, however, these datasets have been generally underutilized in model evaluation studies. Recognizing this problem, NASA JPL and UCLA have developed the Regional Climate Model Evaluation System (RCMES) to help make satellite observations, in conjunction with in-situ and reanalysis datasets, more readily accessible to the regional modeling community. The system includes a central database (Regional Climate Model Evaluation Database: RCMED) to store multiple datasets in a common format and codes for calculating and plotting statistical metrics to assess model performance (Regional Climate Model Evaluation Tool: RCMET). This allows the time taken to compare model data with satellite observations to be reduced from weeks to days. RCMES is a component of the recent ExArch project, an international effort for facilitating the archiving of and access to massive amounts of data using cloud-based infrastructure, in this case as applied to the study of climate and climate change.
This presentation will describe RCMES and demonstrate its utility using examples from RCMs applied to the southwest US as well as to Africa based on output from the CORDEX activity. Application of RCMES to the evaluation of multi-RCM hindcast for CORDEX-Africa will be presented in a companion paper in A41.

  13. Evaluation of chiller modeling approaches and their usability for fault detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sreedharan, Priya

    Selecting the model is an important and essential step in model-based fault detection and diagnosis (FDD). Several factors must be considered in model evaluation, including accuracy, training data requirements, calibration effort, generality, and computational requirements. All modeling approaches fall somewhere between pure first-principles models and empirical models. The objective of this study was to evaluate different modeling approaches for their applicability to model-based FDD of vapor compression air conditioning units, which are commonly known as chillers. Three different models were studied: two are based on first principles and the third is empirical in nature. The first-principles models are the Gordon and Ng Universal Chiller model (2nd generation) and a modified version of the ASHRAE Primary Toolkit model. The DOE-2 chiller model as implemented in CoolTools{trademark} was selected for the empirical category. The models were compared in terms of their ability to reproduce the observed performance of an older chiller operating in a commercial building and a newer chiller in a laboratory. The DOE-2 and Gordon-Ng models were calibrated by linear regression, while a direct-search method was used to calibrate the Toolkit model. The CoolTools package contains a library of calibrated DOE-2 curves for a variety of different chillers and was used to calibrate the building chiller to the DOE-2 model. All three models displayed similar levels of accuracy. Of the first-principles models, the Gordon-Ng model has the advantage of being linear in the parameters, which allows more robust parameter estimation methods to be used and facilitates estimation of the uncertainty in the parameter values. The ASHRAE Toolkit model may have advantages when refrigerant temperature measurements are also available. The DOE-2 model can be expected to have advantages when very limited data are available to calibrate the model, as long as one of the previously identified models in the CoolTools library matches the performance of the chiller in question.
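
The noted advantage of a model that is linear in its parameters is that ordinary least squares calibrates it in one step. The sketch below fits a generic quadratic part-load curve to synthetic data; it is an assumed illustrative form, not the actual Gordon-Ng or DOE-2 equations.

```python
import numpy as np

# Hypothetical chiller samples: part-load ratio vs. normalized power input.
plr   = np.array([0.3, 0.45, 0.6, 0.75, 0.9, 1.0])
power = 0.15 + 0.50 * plr + 0.35 * plr**2   # synthetic "measurements"

# A model linear in its parameters (a, b, c) can be calibrated by
# ordinary least squares, as the abstract notes for the Gordon-Ng model.
X = np.column_stack([np.ones_like(plr), plr, plr**2])
coeffs, *_ = np.linalg.lstsq(X, power, rcond=None)
```

Because the synthetic data are noise-free, the least-squares fit recovers the generating coefficients; with real measurements, the same linear structure also yields parameter covariance estimates directly, which is the robustness advantage the abstract highlights.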

  14. Advanced techniques in reliability model representation and solution

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.; Nicol, David M.

    1992-01-01

    The current tendency of flight control system designs is towards increased integration of applications and increased distribution of computational elements. The reliability analysis of such systems is difficult because subsystem interactions are increasingly interdependent. Researchers at NASA Langley Research Center have been working for several years to extend the capability of Markov modeling techniques to address these problems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG) is a software tool that uses as input a graphical object-oriented block diagram of the system. RMG uses a failure-effects algorithm to produce the reliability model from the graphical description. The ASSURE software tool is a parallel processing program that uses the semi-Markov unreliability range evaluator (SURE) solution technique and the abstract semi-Markov specification interface to the SURE tool (ASSIST) modeling language. A failure modes-effects simulation is used by ASSURE. These tools were used to analyze a significant portion of a complex flight control system. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that distributed fault-tolerant system architectures can now be analyzed.

  15. Integrated performance and reliability specification for digital avionics systems

    NASA Technical Reports Server (NTRS)

    Brehm, Eric W.; Goettge, Robert T.

    1995-01-01

    This paper describes an automated tool for performance and reliability assessment of digital avionics systems, called the Automated Design Tool Set (ADTS). ADTS is based on an integrated approach to design assessment that unifies traditional performance and reliability views of system designs, and that addresses interdependencies between performance and reliability behavior via exchange of parameters and results between mathematical models of each type. A multi-layer tool set architecture has been developed for ADTS that separates the concerns of system specification, model generation, and model solution. Performance and reliability models are generated automatically as a function of candidate system designs, and model results are expressed within the system specification. The layered approach helps deal with the inherent complexity of the design assessment process, and preserves long-term flexibility to accommodate a wide range of models and solution techniques within the tool set structure. ADTS research and development to date has focused on development of a language for specification of system designs as a basis for performance and reliability evaluation. A model generation and solution framework has also been developed for ADTS, which will ultimately encompass an integrated set of analytic and simulation-based techniques for performance, reliability, and combined design assessment.

  16. FRED (a Framework for Reconstructing Epidemic Dynamics): an open-source software system for modeling infectious diseases and control strategies using census-based populations.

    PubMed

    Grefenstette, John J; Brown, Shawn T; Rosenfeld, Roni; DePasse, Jay; Stone, Nathan T B; Cooley, Phillip C; Wheaton, William D; Fyshe, Alona; Galloway, David D; Sriram, Anuroop; Guclu, Hasan; Abraham, Thomas; Burke, Donald S

    2013-10-08

    Mathematical and computational models provide valuable tools that help public health planners to evaluate competing health interventions, especially for novel circumstances that cannot be examined through observational or controlled studies, such as pandemic influenza. The spread of diseases like influenza depends on the mixing patterns within the population, and these mixing patterns depend in part on local factors including the spatial distribution and age structure of the population, the distribution of size and composition of households, employment status and commuting patterns of adults, and the size and age structure of schools. Finally, public health planners must take into account the health behavior patterns of the population, patterns that often vary according to socioeconomic factors such as race, household income, and education levels. FRED (a Framework for Reconstructing Epidemic Dynamics) is a freely available open-source agent-based modeling system based closely on models used in previously published studies of pandemic influenza. This version of FRED uses open-access census-based synthetic populations that capture the demographic and geographic heterogeneities of the population, including realistic household, school, and workplace social networks. FRED epidemic models are currently available for every state and county in the United States, and for selected international locations. State and county public health planners can use FRED to explore the effects of possible influenza epidemics in specific geographic regions of interest and to help evaluate the effect of interventions such as vaccination programs and school closure policies. FRED is available under a free open source license in order to contribute to the development of better modeling tools and to encourage open discussion of modeling tools being used to evaluate public health policies. We also welcome participation by other researchers in the further development of FRED.
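    FRED's census-based synthetic populations and household/school/workplace networks are far richer than anything shown here, but the agent-based modeling style it builds on can be sketched with a toy randomly mixing SIR simulation (all parameters below are invented for illustration):

```python
import random

def run_sir(n_agents=500, n_contacts=10, p_transmit=0.05, p_recover=0.1,
            initial_infected=5, days=100, seed=42):
    """Toy agent-based SIR epidemic on a randomly mixing population.

    Each day every infectious agent meets a few random contacts and may
    transmit; infectious agents recover with a fixed daily probability.
    (Illustrative only -- FRED uses realistic synthetic populations and
    structured contact networks, not uniform random mixing.)
    """
    rng = random.Random(seed)
    state = ["S"] * n_agents                    # each agent is S, I, or R
    for i in rng.sample(range(n_agents), initial_infected):
        state[i] = "I"
    history = []
    for _ in range(days):
        infectious = [i for i, s in enumerate(state) if s == "I"]
        for i in infectious:
            for j in rng.choices(range(n_agents), k=n_contacts):
                if state[j] == "S" and rng.random() < p_transmit:
                    state[j] = "I"
            if rng.random() < p_recover:
                state[i] = "R"
        history.append(state.count("I"))        # daily prevalence
    return history

curve = run_sir()
print(max(curve))  # peak daily prevalence
```

    An intervention such as school closure would enter a model like this as a change to the contact structure (here, a smaller `n_contacts`), which is the kind of scenario comparison FRED supports at realistic scale.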

  17. A Hemispheric Version of the Community Multiscale Air Quality (CMAQ) Modeling System

    EPA Science Inventory

    This invited presentation will be given at the 4th Biannual Western Modeling Workshop in the Plenary session on Global model development, evaluation, and new source attribution tools. We describe the development and application of the hemispheric version of the CMAQ to examine th...

  18. Simulation of Climate Change Impacts on Wheat-Fallow Cropping Systems

    USDA-ARS?s Scientific Manuscript database

    Agricultural system simulation models are predictive tools for assessing climate change impacts on crop production. In this study, RZWQM2 that contains the DSSAT 4.0-CERES model was evaluated for simulating climate change impacts on wheat growth. The model was calibrated and validated using data fro...

  19. Impact of APEX parameterization and soil data on runoff, sediment, and nutrients transport assessment

    USDA-ARS?s Scientific Manuscript database

    Hydrological models have become essential tools for environmental assessments. This study’s objective was to evaluate a best professional judgment (BPJ) parameterization of the Agricultural Policy and Environmental eXtender (APEX) model with soil-survey data against the calibrated model with either ...

  20. A robust and flexible Geospatial Modeling Interface (GMI) for deploying and evaluating natural resource models

    USDA-ARS?s Scientific Manuscript database

    Geographical information systems (GIS) software packages have been used for nearly three decades as analytical tools in natural resource management for geospatial data assembly, processing, storage, and visualization of input data and model output. However, with increasing availability and use of fu...

  1. Dynamic Factor Analysis Models with Time-Varying Parameters

    ERIC Educational Resources Information Center

    Chow, Sy-Miin; Zu, Jiyun; Shifren, Kim; Zhang, Guangjian

    2011-01-01

    Dynamic factor analysis models with time-varying parameters offer a valuable tool for evaluating multivariate time series data with time-varying dynamics and/or measurement properties. We use the Dynamic Model of Activation proposed by Zautra and colleagues (Zautra, Potter, & Reich, 1997) as a motivating example to construct a dynamic factor…

  2. INVERSE MODELING TO ESTIMATE NH3 EMISSION SEASONALLY AND THE SENSITIVITY TO UNCERTAINTY REPRESENTATIONS

    EPA Science Inventory

    Inverse modeling has been used extensively on the global scale to produce top-down estimates of emissions for chemicals such as CO and CH4. Regional scale air quality studies could also benefit from inverse modeling as a tool to evaluate current emission inventories; however, ...

  3. Safety in the Chemical Laboratory: Laboratory Air Quality: Part I. A Concentration Model.

    ERIC Educational Resources Information Center

    Butcher, Samuel S.; And Others

    1985-01-01

    Offers a simple model for estimating vapor concentrations in instructional laboratories. Three methods are described for measuring ventilation rates, and the results of measurements in six laboratories are presented. The model should provide a simple screening tool for evaluating worst-case personal exposures. (JN)
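    The paper's exact model is not reproduced in this record, but screening models of this kind typically reduce to the standard well-mixed-room formula C = G/(kQ), sketched below with illustrative numbers:

```python
def steady_state_concentration(emission_mg_per_min, ventilation_m3_per_min,
                               mixing_factor=1.0):
    """Steady-state concentration (mg/m^3) in a well-mixed room:
    C = G / (k * Q), where G is the vapor generation rate, Q the
    ventilation rate, and k a mixing factor (k < 1 for imperfect mixing).
    A standard screening formula, not necessarily the paper's exact model.
    """
    return emission_mg_per_min / (mixing_factor * ventilation_m3_per_min)

# Worst case: 100 mg/min released into 20 m^3/min of fresh air, poor mixing.
print(steady_state_concentration(100.0, 20.0, mixing_factor=0.5))  # 10.0 mg/m^3
```

    Measuring the ventilation rate Q, as the paper describes for six laboratories, is what turns this into a usable worst-case exposure screen.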

  4. "Best Case/Worst Case": Qualitative Evaluation of a Novel Communication Tool for Difficult in-the-Moment Surgical Decisions.

    PubMed

    Kruser, Jacqueline M; Nabozny, Michael J; Steffens, Nicole M; Brasel, Karen J; Campbell, Toby C; Gaines, Martha E; Schwarze, Margaret L

    2015-09-01

    To evaluate a communication tool called "Best Case/Worst Case" (BC/WC) based on an established conceptual model of shared decision-making. Focus group study. Older adults (four focus groups) and surgeons (two focus groups) using modified questions from the Decision Aid Acceptability Scale and the Decisional Conflict Scale to evaluate and revise the communication tool. Individuals aged 60 and older recruited from senior centers (n = 37) and surgeons from academic and private practices in Wisconsin (n = 17). Qualitative content analysis was used to explore themes and concepts that focus group respondents identified. Seniors and surgeons praised the tool for the unambiguous illustration of multiple treatment options and the clarity gained from presentation of an array of treatment outcomes. Participants noted that the tool provides an opportunity for in-the-moment, preference-based deliberation about options and a platform for further discussion with other clinicians and loved ones. Older adults worried that the format of the tool was not universally accessible for people with different educational backgrounds, and surgeons had concerns that the tool was vulnerable to physicians' subjective biases. The BC/WC tool is a novel decision support intervention that may help facilitate difficult decision-making for older adults and their physicians when considering invasive, acute medical treatments such as surgery. © 2015, Copyright the Authors Journal compilation © 2015, The American Geriatrics Society.

  5. Evaluation of Medical Education virtual Program: P3 model.

    PubMed

    Rezaee, Rita; Shokrpour, Nasrin; Boroumand, Maryam

    2016-10-01

    In e-learning, people engage in a process, create content (a product), and make it available to virtual learners. The present study was carried out to evaluate the first virtual master's program in medical education at Shiraz University of Medical Sciences according to the P3 model. This evaluation research study used a posttest single-group design to determine how effective the program was. All 60 students who had participated in the virtual program for more than one year, and 21 experts including teachers and directors, took part in the evaluation. Based on the P3 e-learning model, an evaluation tool with a 5-point Likert rating scale was designed and applied to collect descriptive data. Students reported storyboard and course design as the most desirable element of the learning environment (2.30±0.76) and technical support as the least desirable (1.17±1.23). Using such a framework, in the form of appropriate tools for evaluating e-learning in the country's universities and higher education institutes that offer e-learning curricula, may contribute to efficient and appropriate implementation of present and future e-learning curricula.

  6. Evaluation of Medical Education virtual Program: P3 model

    PubMed Central

    REZAEE, RITA; SHOKRPOUR, NASRIN; BOROUMAND, MARYAM

    2016-01-01

    Introduction: In e-learning, people engage in a process, create content (a product), and make it available to virtual learners. The present study was carried out to evaluate the first virtual master's program in medical education at Shiraz University of Medical Sciences according to the P3 model. Methods: This evaluation research study used a posttest single-group design to determine how effective the program was. All 60 students who had participated in the virtual program for more than one year, and 21 experts including teachers and directors, took part in the evaluation. Based on the P3 e-learning model, an evaluation tool with a 5-point Likert rating scale was designed and applied to collect descriptive data. Results: Students reported storyboard and course design as the most desirable element of the learning environment (2.30±0.76) and technical support as the least desirable (1.17±1.23). Conclusion: Using such a framework, in the form of appropriate tools for evaluating e-learning in the country's universities and higher education institutes that offer e-learning curricula, may contribute to efficient and appropriate implementation of present and future e-learning curricula. PMID:27795971

  7. The AgESGUI geospatial simulation system for environmental model application and evaluation

    USDA-ARS?s Scientific Manuscript database

    Practical decision making in spatially-distributed environmental assessment and management is increasingly being based on environmental process-based models linked to geographical information systems (GIS). Furthermore, powerful computers and Internet-accessible assessment tools are providing much g...

  8. LOUISIANA ENVIRONMENTAL MODELING SYSTEM FOR HYPOXIA RELATED ISSUES

    EPA Science Inventory

    An environmental assessment tool to evaluate the impacts of nonpoint source (NPS) pollutants discharged from Mississippi River basins into the Gulf of Mexico and to assess their effects on receiving water quality will be described. This system (Louisiana Environmental Modeling S...

  9. Evaluation of the BreastSimulator software platform for breast tomography

    NASA Astrophysics Data System (ADS)

    Mettivier, G.; Bliznakova, K.; Sechopoulos, I.; Boone, J. M.; Di Lillo, F.; Sarno, A.; Castriconi, R.; Russo, P.

    2017-08-01

    The aim of this work was to evaluate BreastSimulator, a breast x-ray imaging simulation software package, as a tool for creating 3D uncompressed digital breast models and for simulating and optimizing computed tomography (CT) scanners dedicated to the breast. Eight 3D digital breast phantoms were created with glandular fractions in the range 10%-35%. The models vary in size and include realistic modelled anatomical features. X-ray CT projections were simulated for a dedicated cone-beam CT scanner and reconstructed with the FDK algorithm. X-ray projection images were simulated for 5 mono-energetic beams (27, 32, 35, 43 and 51 keV) and 3 poly-energetic x-ray spectra typically employed in current CT scanners dedicated to the breast (49, 60, or 80 kVp). Clinical CT images acquired from two different clinical breast CT scanners were used for comparison. The quantitative evaluation included calculation of the power-law exponent, β, from simulated and real breast tomograms, based on the power spectrum fitted as a function of the spatial frequency, f, with a curve of the form S(f) = α/f^β. The breast models were validated by comparison against clinical breast CT and published data. The calculated β coefficients were close to those of clinical CT data from a dedicated breast CT scanner and to reported data in the literature. In evaluating BreastSimulator as a generator of breast models for breast CT imaging, we found that the phantoms reproduce the anatomical structure of real breasts, as judged by the β exponent obtained from power spectral analysis of simulated images. As such, this research tool could contribute considerably to the further development, testing and optimisation of breast CT imaging techniques.
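    The power-law fit S(f) = α/f^β used above becomes a straight-line fit in log-log space, so β is simply minus the fitted slope. A minimal sketch on a synthetic spectrum (the frequency range and the α, β values are illustrative, not from the paper):

```python
import numpy as np

def power_law_exponent(freqs, spectrum):
    """Fit S(f) = alpha / f**beta by linear regression in log-log space:
    log S = log alpha - beta * log f, so beta is minus the fitted slope."""
    slope, _intercept = np.polyfit(np.log(freqs), np.log(spectrum), 1)
    return -slope

# Synthetic spectrum with a known exponent beta = 2.8 (anatomical-noise
# exponents reported for breast imaging are typically near 3).
f = np.linspace(0.05, 2.0, 200)            # spatial frequency (cycles/mm)
S = 1.5 / f**2.8
print(round(power_law_exponent(f, S), 3))  # 2.8
```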

  10. A Five- Year CMAQ Model Performance for Wildfires and ...

    EPA Pesticide Factsheets

    Biomass burning has been identified as an important contributor to the degradation of air quality because of its impact on ozone and particulate matter. Two components of the biomass burning inventory, wildfires and prescribed fires, are routinely estimated in the national emissions inventory. However, there is a large amount of uncertainty in the development of these emission inventory sectors. We have completed a five-year set of CMAQ model simulations (2008-2012) in which we simulated regional air quality with and without the wildfire and prescribed fire inventory. We will examine CMAQ model performance over regions with significant PM2.5 and ozone contributions from prescribed fires and wildfires. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  11. Patient-specific dosimetric endpoints based treatment plan quality control in radiotherapy.

    PubMed

    Song, Ting; Staub, David; Chen, Mingli; Lu, Weiguo; Tian, Zhen; Jia, Xun; Li, Yongbao; Zhou, Linghong; Jiang, Steve B; Gu, Xuejun

    2015-11-07

    In intensity modulated radiotherapy (IMRT), the optimal plan for each patient is specific due to unique patient anatomy. To achieve such a plan, patient-specific dosimetric goals reflecting each patient's unique anatomy should be defined and adopted in the treatment planning procedure for plan quality control. This study develops such a personalized treatment plan quality-control tool by predicting patient-specific dosimetric endpoints (DEs). The incorporation of patient-specific DEs is realized by a multi-OAR geometry-dosimetry model, capable of predicting optimal DEs based on the individual patient's geometry. The overall quality of a treatment plan is then judged with a numerical treatment plan quality indicator and characterized as optimal or suboptimal. Taking advantage of clinically available prostate volumetric modulated arc therapy (VMAT) treatment plans, we built and evaluated our proposed plan quality control tool. Using the developed tool, six of twenty evaluated plans were identified as suboptimal. After re-optimization, these suboptimal plans achieved better OAR dose sparing without sacrificing PTV coverage, and the dosimetric endpoints of the re-optimized plans agreed well with the model-predicted values, validating the predictive ability of the proposed tool. In conclusion, the developed tool is able to accurately predict optimally achievable DEs of multiple OARs, identify suboptimal plans, and guide plan optimization. It is a useful tool for achieving patient-specific treatment plan quality control.

  12. Turning Continuous Quality Improvement into Institutional Practice: The Tools and Techniques.

    ERIC Educational Resources Information Center

    Cornesky, Robert A.

    This manual is intended to assist managers of support units at institutions of higher education in the implementation of Continuous Quality Improvement (CQI). The purpose is to describe a cooperative model for CQI which will permit managers to evaluate the quality of their units and institution, and by using the described tools and techniques, to…

  13. Adaptive Comparative Judgment as a Tool for Assessing Open-Ended Design Problems and Model Eliciting Activities

    ERIC Educational Resources Information Center

    Bartholomew, Scott R.; Nadelson, Louis S.; Goodridge, Wade H.; Reeve, Edward M.

    2018-01-01

    We investigated the use of adaptive comparative judgment to evaluate the middle school student learning, engagement, and experience with the design process in an open-ended problem assigned in a technology and engineering education course. Our results indicate that the adaptive comparative judgment tool effectively facilitated the grading of the…

  14. The Risk-Informed Materials Management (RIMM) Tool System for Determining Safe-Levels of Contaminated Materials Managed on the Land

    EPA Science Inventory

    EPA’s Risk-Informed Materials Management (RIMM) tool system is a modeling approach that helps risk assessors evaluate the safety of managing raw, reused, or waste material streams via a variety of common scenarios (e.g., application to farms, use as a component in road cons...

  15. An Examination and Validation of an Adapted Youth Experience Scale for University Sport

    ERIC Educational Resources Information Center

    Rathwell, Scott; Young, Bradley W.

    2016-01-01

    Limited tools assess positive development through university sport. Such a tool was validated in this investigation using two independent samples of Canadian university athletes. In Study 1, 605 athletes completed 99 survey items drawn from the Youth Experience Scale (YES 2.0), and separate a priori measurement models were evaluated (i.e., 99…

  16. ICT Strategies and Tools for the Improvement of Instructional Supervision. The Virtual Supervision

    ERIC Educational Resources Information Center

    Cano, Esteban Vazquez; Garcia, Ma. Luisa Sevillano

    2013-01-01

    This study aims to evaluate and analyze strategies, proposals, and ICT tools to promote a paradigm shift in educational supervision that enhances the schools of this century involved not only in teaching-face learning, but e-learning and blended learning. Traditional models of educational supervision do not guarantee adequate supervision of the…

  17. Evaluation of Curriculum and Student Learning Needs Using 360 Degree Assessment

    ERIC Educational Resources Information Center

    Ladyshewsky, Richard; Taplin, Ross

    2015-01-01

    This research used a 360 degree assessment tool modelled from the competing values framework to assess the curriculum. A total of 100 Master's of Business Administration students and 746 of their work colleagues completed the 360 degree assessment tool. The students were enrolled in a course on leadership and management. The results of the…

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, J; Balter, P; Court, L

    Purpose: To evaluate the performance of commercially available automatic segmentation tools built into treatment planning systems (TPS) in terms of segmentation accuracy and flexibility of customization. Methods: Twelve head-and-neck cancer patients and twelve thoracic cancer patients were retrospectively selected to benchmark the model-based segmentation (MBS) and atlas-based segmentation (ABS) in the RayStation TPS and the Smart Probabilistic Image Contouring Engine (SPICE) in the Pinnacle TPS. A multi-atlas contouring service (MACS), developed in-house as a Pinnacle TPS plug-in, was evaluated as well. Manual contours used in the clinic were reviewed, modified for consistency, and served as ground truth for the evaluation. The head-and-neck evaluation included six regions of interest (ROIs): left and right parotid glands, brainstem, spinal cord, mandible, and submandibular glands. The thoracic evaluation included seven ROIs: left and right lungs, spinal cord, heart, esophagus, and left and right brachial plexus. Auto-segmented contours were compared with the manual contours using the Dice similarity coefficient (DSC) and the mean surface distance (MSD). Results: In the head-and-neck evaluation, only the mandible was segmented with high accuracy by all tools (DSC>85%); SPICE achieved DSC>70% for the parotid glands; MACS achieved this for both the parotid and submandibular glands; and RayStation ABS achieved this for the spinal cord. In the thoracic evaluation, SPICE performed best for lung and heart segmentation, while MACS performed best for all other structures. Structures that are less distinguishable on CT images, such as the brainstem, spinal cord, parotid glands, submandibular glands, esophagus, and brachial plexus, showed great variability across segmentation tools (mostly DSC<70% and MSD>3mm). The template for RayStation ABS can easily be customized by users, while RayStation MBS and SPICE rely on the vendors to provide the templates/models. Conclusion: Great variability was observed across segmentation tools and structures. These commercially available segmentation tools should be carefully evaluated before clinical use.
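    The Dice similarity coefficient used as the benchmark metric above is computed directly from binary masks. A minimal sketch (the toy masks below are illustrative, not patient data):

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks:
    2*|A∩B| / (|A|+|B|), ranging from 0 (disjoint) to 1 (identical)."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Toy auto vs. manual contours: two 6x6 squares offset by one voxel.
auto   = np.zeros((10, 10), int); auto[2:8, 2:8] = 1     # 36 voxels
manual = np.zeros((10, 10), int); manual[3:9, 3:9] = 1   # 36 voxels, shifted
print(dice(auto, manual))  # overlap 5x5 = 25 -> 2*25/72 ≈ 0.694
```

    Note that even this modest one-voxel shift drops the DSC well below the 85% threshold the study uses for "high accuracy", which is why small structures score so poorly.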

  19. Pressure distribution under flexible polishing tools. II - Cylindrical (conical) optics

    NASA Astrophysics Data System (ADS)

    Mehta, Pravin K.

    1990-10-01

    A previously developed eigenvalue model is extended to determine polishing pressure distribution for rectangular tools with unequal stiffness in two directions on cylindrical optics. Tool misfit is divided into two simplified one-dimensional problems and one simplified two-dimensional problem. Tools with nonuniform cross-sections are treated with a new one-dimensional eigenvalue algorithm, permitting evaluation of tool designs in which the edge is more flexible than the interior. This keeps edge pressure variations within acceptable limits. Finite element modeling is employed to establish upper bounds on the pressure changes in the two-dimensional misfit element. Paraboloids and hyperboloids from the NASA AXAF system are analyzed with the AXAFPOD software implementing this method, and the results are verified with NASTRAN finite element analyses. The maximum deviation from the one-dimensional azimuthal pressure variation is predicted to be 10 percent for paraboloids and 20 percent for hyperboloids.

  20. A web tool for STORET/WQX water quality data retrieval and Best Management Practice scenario suggestion.

    PubMed

    Park, Youn Shik; Engel, Bernie A; Kim, Jonggun; Theller, Larry; Chaubey, Indrajeet; Merwade, Venkatesh; Lim, Kyoung Jae

    2015-03-01

    Total Maximum Daily Load (TMDL) is a water quality standard to regulate water quality of streams, rivers and lakes. A wide range of approaches are currently used to develop TMDLs for impaired streams and rivers. Flow and load duration curves (FDC and LDC) have been used in many states to evaluate the relationship between flow and pollutant loading, along with other models and approaches. A web-based LDC Tool was developed to facilitate development of FDCs and LDCs as well as to support other hydrologic analyses. In this study, the FDC and LDC tool was enhanced to allow collection of water quality data via the web and to assist in establishing cost-effective Best Management Practice (BMP) implementations. The enhanced web-based tool provides access to water quality data not only from the US Geological Survey but also from the Water Quality Portal for the U.S. Moreover, the web-based tool identifies the pollutant reductions required to meet standard loads and suggests a BMP scenario based on the ability of BMPs to reduce pollutant loads and on BMP establishment and maintenance costs. In the study, flow and water quality data were collected via web access to develop LDCs and to identify the required reduction. The suggested BMP scenario from the web-based tool was evaluated using the EPA Spreadsheet Tool for the Estimation of Pollutant Load model to attain the required pollutant reduction at least cost. Copyright © 2014 Elsevier Ltd. All rights reserved.
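    The duration curves central to the abstract above rank observed flows against their exceedance probability. A minimal sketch (flow values are invented for illustration; the tool's actual implementation may differ):

```python
import numpy as np

def duration_curve(values):
    """Exceedance probability (%) vs. sorted values, as used for flow
    duration curves: the i-th largest value is exceeded roughly
    i/(n+1) of the time (Weibull plotting position)."""
    v = np.sort(np.asarray(values, float))[::-1]          # descending
    exceed = 100.0 * np.arange(1, len(v) + 1) / (len(v) + 1)
    return exceed, v

# Daily flows (m^3/s). A load duration curve multiplies each flow by the
# standard's allowable concentration to get the allowable daily load.
flows = np.array([1.0, 3.0, 2.0, 8.0, 5.0, 4.0, 2.5, 1.5, 0.8, 6.0])
exceed, sorted_flows = duration_curve(flows)
print(sorted_flows[0], exceed[0])  # largest flow, exceeded on ~9% of days
```

    Plotting observed pollutant loads against the same exceedance axis shows which flow regimes violate the standard, which is what drives the tool's required-reduction and BMP suggestions.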

  1. Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool

    NASA Astrophysics Data System (ADS)

    Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.

    2018-06-01

    Air quality has significantly improved in Europe over the past few decades. Nonetheless, measurements still show high concentrations, mainly in specific regions and cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement existing EU-wide policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to support regional and local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification, and may highlight unexpected relationships between inputs and outputs. Thus, a policy-steered SHERPA module - predicting air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA) to quantify uncertainty in the model output, and (2) sensitivity analysis (SA) to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.
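    The UA/SA workflow described above can be sketched with a simple Monte Carlo experiment on a toy response model. This is not SHERPA's model, and the standardized-regression-coefficient approach shown is just one common SA technique, not necessarily the one used in the paper:

```python
import numpy as np

# Toy emission-reduction response model (illustrative only): the predicted
# concentration decrease depends on three uncertain inputs.
rng = np.random.default_rng(1)
n = 5000
e_nox = rng.uniform(0.0, 0.5, n)     # NOx emission reduction fraction
e_pm  = rng.uniform(0.0, 0.5, n)     # primary PM reduction fraction
alpha = rng.normal(1.0, 0.1, n)      # uncertain model coefficient
y = alpha * (0.3 * e_nox + 0.7 * e_pm)   # modelled concentration decrease

# (1) Uncertainty analysis: the spread of the model output.
print(y.mean(), y.std())

# (2) Sensitivity analysis via standardized regression coefficients (SRC):
# for a near-linear model, SRC^2 approximates each input's share of the
# output variance.
X = np.column_stack([e_nox, e_pm, alpha])
Xs = (X - X.mean(0)) / X.std(0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
print(dict(zip(["e_nox", "e_pm", "alpha"], np.round(src, 2))))
```

    In this toy setup the PM reduction dominates the output variance, illustrating the kind of "where to place effort" conclusion the SHERPA study draws at full scale.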

  2. Development of a simplified urban water balance model (WABILA).

    PubMed

    Henrichs, M; Langner, J; Uhl, M

    2016-01-01

    During the last decade, water sensitive urban design (WSUD) has become more and more accepted. However, no simple tool has been available to evaluate the influence of these measures on the local water balance. To counteract the impact of new settlements, planners focus on mitigating increases in runoff through installation of infiltration systems. This leads to increased non-natural groundwater recharge and decreased evapotranspiration. Simple software tools that evaluate or simulate the effect of WSUD on the local water balance are still needed. The authors developed a tool named WABILA (Wasserbilanz) to support planners in optimal WSUD. WABILA is an easy-to-use planning tool based on simplified regression functions for established measures and land covers. Results show that WSUD has to be site-specific, based on climate conditions and the natural water balance.

  3. Participatory ergonomics simulation of hospital work systems: The influence of simulation media on simulation outcome.

    PubMed

    Andersen, Simone Nyholm; Broberg, Ole

    2015-11-01

    Current application of work system simulation in participatory ergonomics (PE) design includes a variety of simulation media. However, the actual influence of the media attributes on the simulation outcome has received less attention. This study investigates two simulation media: full-scale mock-ups and table-top models. The aim is to compare how the media attributes of fidelity and affordance influence ergonomics identification and evaluation in PE design of hospital work systems. The results illustrate how the full-scale mock-ups' high fidelity of room layout and affordance of tool operation support ergonomics identification and evaluation related to the work system entities space and technologies & tools. The table-top models' high fidelity of function relations and affordance of a helicopter view support ergonomics identification and evaluation related to the entity organization. Furthermore, the study addresses the form of the identified and evaluated conditions, which appear either as identified challenges or as tangible design criteria. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  4. Evaluating conflation methods using uncertainty modeling

    NASA Astrophysics Data System (ADS)

    Doucette, Peter; Dolloff, John; Canavosio-Zuzelski, Roberto; Lenihan, Michael; Motsko, Dennis

    2013-05-01

    The classic problem of computer-assisted conflation involves the matching of individual features (e.g., point, polyline, or polygon vectors) as stored in a geographic information system (GIS), between two different sets (layers) of features. The classical goal of conflation is the transfer of feature metadata (attributes) from one layer to another. The age of free public and open source geospatial feature data has significantly increased the opportunity to conflate such data to create enhanced products. There are currently several spatial conflation tools in the marketplace with varying degrees of automation. An ability to evaluate conflation tool performance quantitatively is of operational value, although manual truthing of matched features is laborious and costly. In this paper, we present a novel methodology that uses spatial uncertainty modeling to simulate realistic feature layers to streamline evaluation of feature matching performance for conflation methods. Performance results are compiled for DCGIS street centerline features.

  5. Advantages and limitations of the Five Domains model for assessing welfare impacts associated with vertebrate pest control.

    PubMed

    Beausoleil, N J; Mellor, D J

    2015-01-01

    Many pest control activities have the potential to impact negatively on the welfare of animals, and animal welfare is an important consideration in the development, implementation and evaluation of ethically defensible vertebrate pest control. Thus, reliable and accurate methods for assessing welfare impacts are required. The Five Domains model provides a systematic method for identifying potential or actual welfare impacts associated with an event or situation in four physical or functional domains (nutrition, environment, health or functional status, behaviour) and one mental domain (overall mental or affective state). Here we evaluate the advantages and limitations of the Five Domains model for this purpose and illustrate them using specific examples from a recent assessment of the welfare impacts of poisons used to lethally control possums in New Zealand. The model has a number of advantages which include the following: the systematic identification of a wide range of impacts associated with a variety of control tools; the production of relative rankings of tools in terms of their welfare impacts; the easy incorporation of new information into assessments; and the highlighting of additional information needed. For example, a recent analysis of sodium fluoroacetate (1080) poisoning in possums revealed the need for more information on the period from the onset of clinical signs to the point at which consciousness is lost, as well as on the level of consciousness during or after the occurrence of muscle spasms and seizures. The model is also valuable because it clearly separates physical or functional and affective impacts, encourages more comprehensive consideration of negative affective experiences than has occurred in the past, and allows development and evaluation of targeted mitigation strategies. 
Caution must be used in interpreting and applying the outputs of the model, most importantly because relative rankings or grades are fundamentally qualitative in nature. Certain domains are more useful for evaluating impacts associated with slower/longer-acting tools than for faster-acting methods, and it may be easier to identify impacts in some domains than others. Overall, we conclude that the Five Domains model advances evaluation of the animal welfare impacts of vertebrate pest control methods, provided users are cognisant of its limitations.

  6. Milestone-specific, Observed data points for evaluating levels of performance (MODEL) assessment strategy for anesthesiology residency programs.

    PubMed

    Nagy, Christopher J; Fitzgerald, Brian M; Kraus, Gregory P

    2014-01-01

    Anesthesiology residency programs will be expected to have Milestones-based evaluation systems in place by July 2014 as part of the Next Accreditation System. The San Antonio Uniformed Services Health Education Consortium (SAUSHEC) anesthesiology residency program developed and implemented a Milestones-based feedback and evaluation system a year ahead of schedule. It has been named the Milestone-specific, Observed Data points for Evaluating Levels of performance (MODEL) assessment strategy. The "MODEL Menu" and the "MODEL Blueprint" are tools that other anesthesiology residency programs can use in developing their own Milestones-based feedback and evaluation systems prior to ACGME-required implementation. Data from our early experience with the streamlined MODEL blueprint assessment strategy showed substantially improved faculty compliance with reporting requirements. The MODEL assessment strategy provides programs with a workable assessment method for residents, and important Milestones data points to programs for ACGME reporting.

  7. The Cryosphere Model Comparison Tool (CmCt): Ice Sheet Model Validation and Comparison Tool for Greenland and Antarctica

    NASA Astrophysics Data System (ADS)

    Simon, E.; Nowicki, S.; Neumann, T.; Tyahla, L.; Saba, J. L.; Guerber, J. R.; Bonin, J. A.; DiMarzio, J. P.

    2017-12-01

    The Cryosphere model Comparison tool (CmCt) is a web-based ice sheet model validation tool that is being developed by NASA to facilitate direct comparison between observational data and various ice sheet models. The CmCt allows the user to take advantage of several decades worth of observations from Greenland and Antarctica. Currently, the CmCt can be used to compare ice sheet models provided by the user with remotely sensed satellite data from ICESat (Ice, Cloud, and land Elevation Satellite) laser altimetry, GRACE (Gravity Recovery and Climate Experiment) satellite, and radar altimetry (ERS-1, ERS-2, and Envisat). One or more models can be uploaded through the CmCt website and compared with observational data, or compared to each other or other models. The CmCt calculates statistics on the differences between the model and observations, along with other quantitative and qualitative metrics, which can be used to evaluate the different model simulations against the observations. The qualitative metrics consist of a range of visual outputs and the quantitative metrics consist of several whole-ice-sheet scalar values that can be used to assign an overall score to a particular simulation. The comparison results from CmCt are useful in quantifying improvements within a specific model (or within a class of models) as a result of differences in model dynamics (e.g., shallow vs. higher-order dynamics approximations), model physics (e.g., representations of ice sheet rheological or basal processes), or model resolution (mesh resolution and/or changes in the spatial resolution of input datasets). The framework and metrics could also be used as a model-to-model intercomparison tool, simply by substituting the outputs of another model for the observational datasets. Future versions of the tool will include comparisons with other datasets that are of interest to the modeling community, such as ice velocity, ice thickness, and surface mass balance.
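    The whole-ice-sheet scalar metrics described above boil down to difference statistics between co-registered model and observed fields. A minimal sketch of that computation (illustrative only — CmCt's actual metrics and data handling are not specified here) might look like:

```python
import math

def difference_stats(model, obs):
    """Scalar difference metrics between gridded model and observed values
    on the same grid; cells with no observation are marked None."""
    diffs = [m - o for m, o in zip(model, obs)
             if m is not None and o is not None]
    n = len(diffs)
    bias = sum(diffs) / n                              # mean model - obs
    rmse = math.sqrt(sum(d * d for d in diffs) / n)    # root mean square error
    return {"n": n, "bias": bias, "rmse": rmse}

# Toy surface elevations (m); the model runs slightly high
obs   = [100.0, 250.0, 400.0, None, 550.0]
model = [101.0, 252.0, 399.0, 480.0, 553.0]

stats = difference_stats(model, obs)
print(stats)   # e.g. bias = 1.25 m, rmse ~ 1.94 m over 4 valid cells
```

Swapping a second model's output in place of `obs` turns the same routine into a model-to-model intercomparison, as the abstract suggests.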

  8. Turbine Engine Research Center (TERC) Data System Enhancement and Test Article Evaluation. Delivery Order 0002: TERC Aeromechanical Characterization

    DTIC Science & Technology

    2005-06-01

    test, the entire turbulence model was changed from standard k- epsilon to Spalart- Allmaras. Using these different tools of turbulence models, a few...this research, leaving only pre-existing finite element models to be used. At some point a NASTRAN model was developed for vibrations analysis but

  9. EVALUATING REGIONAL PREDICTIVE CAPACITY OF A PROCESS-BASED MERCURY EXPOSURE MODEL, REGIONAL-MERCURY CYCLING MODEL (R-MCM), APPLIED TO 91 VERMONT AND NEW HAMPSHIRE LAKES AND PONDS, USA

    EPA Science Inventory

    Regulatory agencies must develop fish consumption advisories for many lakes and rivers with limited resources. Process-based mathematical models are potentially valuable tools for developing regional fish advisories. The Regional Mercury Cycling model (R-MCM) was specifically d...

  10. A mathematical model of salmonid spawning habitat

    Treesearch

    Robert N. Havis; Carlos V. Alonzo; Keith E Woeste; Russell F. Thurow

    1993-01-01

    A simulation model [Salmonid Spawning Analysis Model (SSAM)] was developed as a management tool to evaluate the relative impacts of stream sediment load and water temperature on salmonid egg survival. The model is useful for estimating acceptable sediment loads to spawning habitat that may result from upland development, such as logging and agriculture. Software in...

  11. Methods for assessing fracture risk prediction models: experience with FRAX in a large integrated health care delivery system.

    PubMed

    Pressman, Alice R; Lo, Joan C; Chandra, Malini; Ettinger, Bruce

    2011-01-01

    Area under the receiver operating characteristics (AUROC) curve is often used to evaluate risk models. However, reclassification tests provide an alternative assessment of model performance. We performed both evaluations on results from FRAX (World Health Organization Collaborating Centre for Metabolic Bone Diseases, University of Sheffield, UK), a fracture risk tool, using Kaiser Permanente Northern California women older than 50yr with bone mineral density (BMD) measured during 1997-2003. We compared FRAX performance with and without BMD in the model. Among 94,489 women with mean follow-up of 6.6yr, 1579 (1.7%) sustained a hip fracture. Overall, AUROCs were 0.83 and 0.84 for FRAX without and with BMD, suggesting that BMD did not contribute to model performance. AUROC decreased with increasing age, and BMD contributed significantly to higher AUROC among those aged 70yr and older. Using an 81% sensitivity threshold (optimum level from receiver operating characteristic curve, corresponding to 1.2% cutoff), 35% of those categorized above were reassigned below when BMD was added. In contrast, only 10% of those categorized below were reassigned to the higher risk category when BMD was added. The net reclassification improvement was 5.5% (p<0.01). Two versions of this risk tool have similar AUROCs, but alternative assessments indicate that addition of BMD improves performance. Multiple methods should be used to evaluate risk tool performance with less reliance on AUROC alone. Copyright © 2011 The International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.
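    The net reclassification improvement (NRI) reported above combines upward and downward reclassification separately among cases (fractures) and non-cases when a predictor such as BMD is added. A minimal sketch of the computation, using illustrative counts rather than the study's data:

```python
def net_reclassification_improvement(events, nonevents):
    """events / nonevents: lists of (old_category, new_category) pairs,
    with categories ordered so that a higher value means higher predicted
    risk. NRI rewards upward moves for events, downward moves for nonevents."""
    def up_down(pairs):
        up = sum(new > old for old, new in pairs)
        down = sum(new < old for old, new in pairs)
        return up, down
    up_e, down_e = up_down(events)
    up_n, down_n = up_down(nonevents)
    return (up_e - down_e) / len(events) + (down_n - up_n) / len(nonevents)

# Illustrative data: 0 = below risk threshold, 1 = above; adding BMD
# moves some individuals between categories
events    = [(0, 1)] * 3 + [(1, 0)] * 1 + [(1, 1)] * 6     # 10 fractures
nonevents = [(1, 0)] * 15 + [(0, 1)] * 5 + [(0, 0)] * 80   # 100 no fracture

nri = net_reclassification_improvement(events, nonevents)
print(f"NRI = {nri:.3f}")   # (3-1)/10 + (15-5)/100 = 0.300
```

This illustrates the abstract's point: a predictor can leave the AUROC nearly unchanged yet still produce a positive NRI by moving cases up and non-cases down across the clinical threshold.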

  12. Uncertainty Modeling for Robustness Analysis of Control Upset Prevention and Recovery Systems

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Khong, Thuan H.; Shin, Jong-Yeob; Kwatny, Harry; Chang, Bor-Chin; Balas, Gary J.

    2005-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems (developed for failure detection, identification, and reconfiguration, as well as upset recovery) need to be evaluated over broad regions of the flight envelope and under extreme flight conditions, and should include various sources of uncertainty. However, formulation of linear fractional transformation (LFT) models for representing system uncertainty can be very difficult for complex parameter-dependent systems. This paper describes a preliminary LFT modeling software tool which uses a matrix-based computational approach that can be directly applied to parametric uncertainty problems involving multivariate matrix polynomial dependencies. Several examples are presented (including an F-16 at an extreme flight condition, a missile model, and a generic example with numerous crossproduct terms), and comparisons are given with other LFT modeling tools that are currently available. The LFT modeling method and preliminary software tool presented in this paper are shown to compare favorably with these methods.

  13. OCAM - A CELSS modeling tool: Description and results. [Object-oriented Controlled Ecological Life Support System Analysis and Modeling

    NASA Technical Reports Server (NTRS)

    Drysdale, Alan; Thomas, Mark; Fresa, Mark; Wheeler, Ray

    1992-01-01

    Controlled Ecological Life Support System (CELSS) technology is critical to the Space Exploration Initiative. NASA's Kennedy Space Center has been performing CELSS research for several years, developing data related to CELSS design. We have developed OCAM (Object-oriented CELSS Analysis and Modeling), a CELSS modeling tool, and have used this tool to evaluate CELSS concepts, using this data. In using OCAM, a CELSS is broken down into components, and each component is modeled as a combination of containers, converters, and gates which store, process, and exchange carbon, hydrogen, and oxygen on a daily basis. Multiple crops and plant types can be simulated. Resource recovery options modeled include combustion, leaching, enzyme treatment, aerobic or anaerobic digestion, and mushroom and fish growth. Results include printouts and time-history graphs of total system mass, biomass, carbon dioxide, and oxygen quantities; energy consumption; and manpower requirements. The contributions of mass, energy, and manpower to system cost have been analyzed to compare configurations and determine appropriate research directions.
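    The container/converter decomposition described above maps naturally onto a small object model. The following is a hedged sketch of that idea — the class names, substances, and daily rates are invented for illustration and do not reflect OCAM's actual design or any real crop stoichiometry:

```python
class Container:
    """Stores a quantity of a single substance (kg)."""
    def __init__(self, name, amount=0.0):
        self.name, self.amount = name, amount

class Converter:
    """Moves fixed daily amounts between containers, e.g. a crop component
    that consumes CO2 and produces O2 and biomass each simulated day."""
    def __init__(self, rates):          # {container: kg/day, negative = consumed}
        self.rates = rates
    def step(self):
        for container, rate in self.rates.items():
            container.amount += rate

co2     = Container("CO2", 100.0)
o2      = Container("O2", 20.0)
biomass = Container("biomass", 0.0)

# Hypothetical daily exchange rates for a single crop converter
crop = Converter({co2: -1.5, o2: 1.1, biomass: 0.4})

for day in range(30):                   # simulate 30 days
    crop.step()

print(co2.amount, o2.amount, biomass.amount)
```

Time-history outputs like those the abstract mentions would simply be the container amounts recorded at each daily step; gates limiting flow between converters could be added in the same style.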

  14. A Hyperbolic Ontology Visualization Tool for Model Application Programming Interface Documentation

    NASA Technical Reports Server (NTRS)

    Hyman, Cody

    2011-01-01

    Spacecraft modeling, a critically important portion of validating planned spacecraft activities, is currently carried out using a time-consuming method of mission-to-mission model implementation and integration. A current project in early development, Integrated Spacecraft Analysis (ISCA), aims to remedy this hindrance by providing reusable architectures and reducing time spent integrating models with planning and sequencing tools. The principal objective of this internship was to develop a user interface for an experimental ontology-based structure visualization of navigation and attitude control system modeling software. To satisfy this, a number of tree and graph visualization tools were researched and a Java-based hyperbolic graph viewer was selected for experimental adaptation. Early results show promise in the ability to organize and display large amounts of spacecraft model documentation efficiently and effectively through a web browser. This viewer serves as a conceptual implementation for future development, but trials with both ISCA developers and end users should be performed to truly evaluate the effectiveness of continued development of such visualizations.

  15. Integration of eHealth Tools in the Process of Workplace Health Promotion: Proposal for Design and Implementation

    PubMed Central

    2018-01-01

    Background Electronic health (eHealth) and mobile health (mHealth) tools can support and improve the whole process of workplace health promotion (WHP) projects. However, several challenges and opportunities have to be considered while integrating these tools in WHP projects. Currently, a large number of eHealth tools are developed for changing health behavior, but these tools can support the whole WHP process, including group administration, information flow, assessment, intervention development process, or evaluation. Objective To support a successful implementation of eHealth tools in the whole WHP processes, we introduce a concept of WHP (life cycle model of WHP) with 7 steps and present critical and success factors for the implementation of eHealth tools in each step. Methods We developed a life cycle model of WHP based on the World Health Organization (WHO) model of healthy workplace continual improvement process. We suggest adaptations to the WHO model to demonstrate the large number of possibilities to implement eHealth tools in WHP as well as possible critical points in the implementation process. Results eHealth tools can enhance the efficiency of WHP in each of the 7 steps of the presented life cycle model of WHP. Specifically, eHealth tools can support by offering easier administration, providing an information and communication platform, supporting assessments, presenting and discussing assessment results in a dashboard, and offering interventions to change individual health behavior. Important success factors include the possibility to give automatic feedback about health parameters, create incentive systems, or bring together a large number of health experts in one place. Critical factors such as data security, anonymity, or lack of management involvement have to be addressed carefully to prevent nonparticipation and dropouts. 
Conclusions Using eHealth tools can support WHP, but clear regulations for the usage and implementation of these tools at the workplace are needed to secure quality and reach sustainable results. PMID:29475828

  16. Analysis of key technologies for virtual instruments metrology

    NASA Astrophysics Data System (ADS)

    Liu, Guixiong; Xu, Qingui; Gao, Furong; Guan, Qiuju; Fang, Qiang

    2008-12-01

    Virtual instruments (VIs) require metrological verification when applied as measuring instruments. Owing to their software-centered architecture, metrological evaluation of VIs includes two aspects: measurement functions and software characteristics. The complexity of the software imposes difficulties on metrological testing of VIs. Key approaches and technologies for metrological evaluation of virtual instruments are investigated and analyzed in this paper. The principal issue is evaluation of measurement uncertainty. The nature and regularity of measurement uncertainty caused by software and algorithms can be evaluated by modeling, simulation, analysis, testing, and statistics, with support from the powerful computing capability of the PC. Another concern is evaluation of software features such as the correctness, reliability, stability, security, and real-time performance of VIs. Technologies from the software engineering, software testing, and computer security domains can be used for these purposes. For example, a variety of black-box testing, white-box testing, and modeling approaches can be used to evaluate the reliability of modules, components, applications, and the whole VI software. The security of a VI can be assessed by methods such as vulnerability scanning and penetration analysis. In order to enable metrology institutions to perform metrological verification of VIs efficiently, an automatic metrological tool for the above validation is essential. Based on technologies of numerical simulation, software testing, and system benchmarking, a framework for such an automatic tool is proposed in this paper. A review of existing automatic tools that perform measurement-uncertainty calculation, software testing, and security assessment demonstrates the feasibility of the proposed framework.
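    The simulation-based evaluation of algorithm-induced measurement uncertainty mentioned above is essentially a Monte Carlo exercise: feed the algorithm under test many noisy realizations of a known input and report the spread of its output. A minimal sketch (the RMS algorithm, signal parameters, and noise level are all illustrative assumptions):

```python
import math
import random

random.seed(1)

def rms_amplitude(samples):
    """The VI measurement algorithm under test (here: RMS of a waveform)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def monte_carlo_uncertainty(algorithm, true_signal, noise_sigma, trials=2000):
    """Run the algorithm on many noisy realizations of a known input and
    report the mean output and its standard deviation as a type-A
    uncertainty estimate."""
    results = []
    for _ in range(trials):
        noisy = [s + random.gauss(0, noise_sigma) for s in true_signal]
        results.append(algorithm(noisy))
    mean = sum(results) / trials
    std = math.sqrt(sum((r - mean) ** 2 for r in results) / (trials - 1))
    return mean, std

# 1 kHz sine sampled at 100 kS/s over 10 full cycles; true RMS = 1/sqrt(2)
signal = [math.sin(2 * math.pi * 1000 * n / 100000) for n in range(1000)]
mean, std = monte_carlo_uncertainty(rms_amplitude, signal, noise_sigma=0.01)
print(f"output mean = {mean:.4f}, standard uncertainty = {std:.5f}")
```

In an automatic metrology tool, the same harness could be pointed at each VI algorithm in turn, with the simulated input and noise model drawn from the instrument's specification.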

  17. Modeling Cover Crop Effectiveness on Maryland's Eastern Shore

    USDA-ARS?s Scientific Manuscript database

    The value of watershed-scale, hydrologic/water quality models to ecosystem management is increasingly evident as more programs adopt these tools to evaluate the effectiveness of different management scenarios and their impact on the environment. Quality of precipitation data is critical for appropri...

  18. Watershed Management Tool for Selection and Spatial Allocation of Non-Point Source Pollution Control Practices

    EPA Science Inventory

    Distributed-parameter watershed models are often utilized for evaluating the effectiveness of sediment and nutrient abatement strategies through the traditional {calibrate → validate → predict} approach. The applicability of the method is limited due to modeling approximations. In ...

  19. A standard-enabled workflow for synthetic biology.

    PubMed

    Myers, Chris J; Beal, Jacob; Gorochowski, Thomas E; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce types of design information, including multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in their publications. © 2017 The Author(s); published by Portland Press Limited on behalf of the Biochemical Society.

  20. Cure Models as a Useful Statistical Tool for Analyzing Survival

    PubMed Central

    Othus, Megan; Barlogie, Bart; LeBlanc, Michael L.; Crowley, John J.

    2013-01-01

    Cure models are a popular topic within statistical literature but are not as widely known in the clinical literature. Many patients with cancer can be long-term survivors of their disease, and cure models can be a useful tool to analyze and describe cancer survival data. The goal of this article is to review what a cure model is, explain when cure models can be used, and use cure models to describe multiple myeloma survival trends. Multiple myeloma is generally considered an incurable disease, and this article shows that by using cure models, rather than the standard Cox proportional hazards model, we can evaluate whether there is evidence that therapies at the University of Arkansas for Medical Sciences induce a proportion of patients to be long-term survivors. PMID:22675175
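    The defining feature of a mixture cure model is a survival function that plateaus at the cured fraction rather than decaying to zero as in a standard Cox or exponential model: S(t) = π + (1 − π)·S₀(t), where π is the cure fraction and S₀ the survival of the uncured. A small sketch with illustrative parameters (not fitted to any real myeloma data):

```python
import math

def mixture_cure_survival(t, cure_fraction, hazard):
    """S(t) = pi + (1 - pi) * exp(-lambda * t): a mixture cure model with
    an exponential survival component for the uncured patients."""
    return cure_fraction + (1 - cure_fraction) * math.exp(-hazard * t)

# Illustrative parameters: 30% long-term survivors, hazard 0.25 per year
pi, lam = 0.3, 0.25

for years in (0, 2, 5, 10, 30):
    print(f"S({years:2d}) = {mixture_cure_survival(years, pi, lam):.3f}")
```

At t = 30 years the curve has effectively flattened at S ≈ π = 0.3; evidence of such a plateau in observed survival data is exactly what motivates preferring a cure model over a standard proportional hazards analysis.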

  1. Scheduling algorithm for mission planning and logistics evaluation users' guide

    NASA Technical Reports Server (NTRS)

    Chang, H.; Williams, J. M.

    1976-01-01

    The scheduling algorithm for mission planning and logistics evaluation (SAMPLE) program is a mission planning tool composed of three subsystems: the mission payloads subsystem (MPLS), which generates a list of feasible combinations from a payload model for a given calendar year; GREEDY, which is a heuristic model used to find the best traffic model; and the operations simulation and resources scheduling subsystem (OSARS), which determines traffic model feasibility for available resources. SAMPLE provides the user with options to allow the execution of MPLS, GREEDY, GREEDY-OSARS, or MPLS-GREEDY-OSARS.

  2. Introducing Reflexivity to Evaluation Practice: An In-Depth Case Study

    ERIC Educational Resources Information Center

    van Draanen, Jenna

    2017-01-01

    There is currently a paucity of literature in the field of evaluation regarding the practice of reflection and reflexivity and a lack of available tools to guide this practice--yet using a reflexive model can enhance evaluation practice. This paper focuses on the methods and results of a reflexive inquiry that was conducted during a participatory…

  3. Disease management with ARIMA model in time series.

    PubMed

    Sato, Renato Cesar

    2013-01-01

    The evaluation of infectious and noninfectious disease management can be done through the use of time series analysis. Such analysis can measure the results of healthcare interventions and their effects on the disease over time. Clinical studies have benefited from the use of these techniques, particularly given the wide applicability of the ARIMA model. This study briefly presents the process of using the ARIMA model. This analytical tool offers a great contribution for researchers and healthcare managers in the evaluation of healthcare interventions in specific populations.
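    In practice an ARIMA model would be fitted with a statistics package, but the autoregressive idea at its core can be shown in a few lines. The sketch below fits only the AR(1) special case (no differencing, no moving-average term) by least squares on simulated weekly case counts; all data and parameters are invented for illustration:

```python
import random

random.seed(7)

def fit_ar1(series):
    """Least-squares estimate of phi in x[t] = phi * x[t-1] + e[t],
    the AR(1) special case of the ARIMA family."""
    x, y = series[:-1], series[1:]
    return sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)

# Simulate weekly case counts (as deviations from baseline) with phi = 0.8
phi_true, x, series = 0.8, 0.0, []
for _ in range(5000):
    x = phi_true * x + random.gauss(0, 1)
    series.append(x)

phi_hat = fit_ar1(series)
forecast = phi_hat * series[-1]        # one-step-ahead forecast
print(f"estimated phi = {phi_hat:.3f}, next-week forecast = {forecast:.3f}")
```

For intervention evaluation, the usual approach is to fit such a model to the pre-intervention series and compare post-intervention observations against the model's forecasts.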

  4. Using prediction uncertainty analysis to design hydrologic monitoring networks: Example applications from the Great Lakes water availability pilot project

    USGS Publications Warehouse

    Fienen, Michael N.; Doherty, John E.; Hunt, Randall J.; Reeves, Howard W.

    2010-01-01

    The importance of monitoring networks for resource-management decisions is becoming more recognized, in both theory and application. Quantitative computer models provide a science-based framework to evaluate the efficacy and efficiency of existing and possible future monitoring networks. In the study described herein, two suites of tools were used to evaluate the worth of new data for specific predictions, which in turn can support efficient use of resources needed to construct a monitoring network. The approach evaluates the uncertainty of a model prediction and, by using linear propagation of uncertainty, estimates how much uncertainty could be reduced if the model were calibrated with additional information (increased a priori knowledge of parameter values or new observations). The theoretical underpinnings of the two suites of tools addressing this technique are compared, and their application to a hypothetical model based on a local model inset into the Great Lakes Water Availability Pilot model is described. Results show that meaningful guidance for monitoring network design can be obtained by using the methods explored. The validity of this guidance depends substantially on the parameterization as well; hence, parameterization must be considered not only when designing the parameter-estimation paradigm but also, importantly, when designing the prediction-uncertainty paradigm.
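    The linear data-worth calculation described above has a compact form: the prediction variance is y C yᵀ for prediction sensitivity y and parameter covariance C, and conditioning on a candidate observation with sensitivity h and error variance r updates C via a Kalman-style rank-one correction. A two-parameter sketch (all matrices and sensitivities are invented for illustration, not taken from the study):

```python
def pred_var(y, C):
    """First-order prediction variance y C y^T for sensitivity row y."""
    return sum(y[i] * C[i][j] * y[j] for i in range(2) for j in range(2))

def condition_on_obs(C, h, r):
    """Posterior parameter covariance after adding one observation with
    sensitivity row h and error variance r:
    C' = C - C h^T (h C h^T + r)^(-1) h C."""
    Ch = [sum(C[i][j] * h[j] for j in range(2)) for i in range(2)]
    s = sum(h[i] * Ch[i] for i in range(2)) + r
    return [[C[i][j] - Ch[i] * Ch[j] / s for j in range(2)] for i in range(2)]

# Two uncertain parameters (say, hydraulic conductivity and recharge)
C_prior = [[1.0, 0.0],
           [0.0, 4.0]]
y = [0.5, 1.0]            # sensitivity of the prediction of interest

before = pred_var(y, C_prior)
# Candidate new head observation, mostly informative about parameter 2
after = pred_var(y, condition_on_obs(C_prior, h=[0.1, 0.9], r=0.5))
print(f"prediction variance: {before:.3f} -> {after:.3f}")
```

Ranking many candidate observations by the variance reduction they each produce is precisely how such tools prioritize where to add monitoring wells or gages.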

  5. An Overview of Atmospheric Chemistry and Air Quality Modeling

    NASA Technical Reports Server (NTRS)

    Johnson, Matthew S.

    2017-01-01

    This presentation will share my personal research experience and give the participants of the NASA Student Airborne Research Program (SARP 2017) an overview of atmospheric chemistry and air quality modeling. The presentation will also provide examples of ways to apply airborne observations for chemical transport model (CTM) and air quality (AQ) model evaluation. CTM and AQ models are important tools in understanding tropospheric-stratospheric composition, atmospheric chemistry processes, meteorology, and air quality. This presentation will focus on how NASA scientists currently apply CTM and AQ models to better understand these topics. Finally, the importance of airborne observations in evaluating these topics, and how in situ and remote sensing observations can be used to evaluate and improve CTM and AQ model predictions, will be highlighted.

  6. High Thermal Conductivity Polymer Composites for Low Cost Heat Exchangers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2017-08-01

    This factsheet describes a project that identified and evaluated commercially available and state-of-the-art polymer-based material options for manufacturing industrial and commercial non-metallic heat exchangers. A heat exchanger concept was also developed and its performance evaluated with heat transfer modeling tools.

  7. Extra-Tropical Cyclones at Climate Scales: Comparing Models to Observations

    NASA Astrophysics Data System (ADS)

    Tselioudis, G.; Bauer, M.; Rossow, W.

    2009-04-01

    Climate is often defined as the accumulation of weather, and weather is not the concern of climate models. Justification for this latter sentiment has long been hidden behind coarse model resolutions and blunt validation tools based on climatological maps. The spatial-temporal resolutions of today's climate models and observations are converging onto meteorological scales, however, which means that with the correct tools we can test the largely unproven assumption that climate model weather is correct enough that its accumulation results in a robust climate simulation. Towards this effort we introduce a new tool for extracting detailed cyclone statistics from observations and climate model output. These include the usual cyclone characteristics (centers, tracks), but also adaptive cyclone-centric composites. We have created a novel dataset, the MAP Climatology of Mid-latitude Storminess (MCMS), which provides a detailed 6 hourly assessment of the areas under the influence of mid-latitude cyclones, using a search algorithm that delimits the boundaries of each system from the outer-most closed SLP contour. Using this we then extract composites of cloud, radiation, and precipitation properties from sources such as ISCCP and GPCP to create a large comparative dataset for climate model validation. A demonstration of the potential usefulness of these tools in process-based climate model evaluation studies will be shown.
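    The detection step that precedes the MCMS contour search — locating candidate cyclone centers in a gridded SLP field — can be illustrated in miniature. The sketch below flags local SLP minima below a pressure threshold on a toy grid; the real algorithm described above additionally delimits each system's outermost closed SLP contour, which is not attempted here:

```python
def cyclone_centers(slp, threshold=1000.0):
    """Flag interior grid cells that are strict local SLP minima (against
    all 8 neighbors) and lie below a pressure threshold in hPa."""
    rows, cols = len(slp), len(slp[0])
    centers = []
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            neighbors = [slp[i + di][j + dj]
                         for di in (-1, 0, 1) for dj in (-1, 0, 1)
                         if (di, dj) != (0, 0)]
            if slp[i][j] < threshold and slp[i][j] < min(neighbors):
                centers.append((i, j))
    return centers

# Toy 5x5 SLP field (hPa) with one closed low centered at (2, 2)
slp = [[1012, 1010, 1008, 1010, 1012],
       [1010, 1000,  996, 1000, 1010],
       [1008,  996,  988,  996, 1008],
       [1010, 1000,  996, 1000, 1010],
       [1012, 1010, 1008, 1010, 1012]]

print(cyclone_centers(slp))
```

Running the same detector on model output and on reanalysis fields, then compositing cloud and precipitation observations around each detected center, is the essence of the cyclone-centric evaluation the abstract describes.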

  8. Learning the ShamWow: Creating Infomercials to Teach the AIDA Model

    ERIC Educational Resources Information Center

    Lee, Seung Hwan; Hoffman, K. Douglas

    2015-01-01

    The AIDA Model (Attention-Interest-Desire-Action) is one of the classical promotional theories in marketing. Through active-learning techniques and peer critiques, we use infomercials as an innovative educational tool to instruct the four components of the AIDA model. Student evaluations regarding this active-learning assignment reveal that the…

  9. DO3SE modelling of soil moisture to determine ozone flux to forest trees

    Treesearch

    P. Büker; T. Morrissey; A. Briolat; R. Falk; D. Simpson; J.-P. Tuovinen; R. Alonso; S. Barth; M. Baumgarten; N. Grulke; P.E. Karlsson; J. King; F. Lagergren; R. Matyssek; A. Nunn; R. Ogaya; J. Peñuelas; L. Rhea; M. Schaub; J. Uddling; W. Werner; L.D. Emberson

    2012-01-01

    The DO3SE (Deposition of O3 for Stomatal Exchange) model is an established tool for estimating ozone (O3) deposition, stomatal flux and impacts to a variety of vegetation types across Europe. It has been embedded within the EMEP (European Monitoring and Evaluation Programme) photochemical model to...

  10. Lysimetric evaluation of the APEX Model to simulate daily ET for irrigated crops in the Texas High Plains

    USDA-ARS?s Scientific Manuscript database

    The NTT (Nutrient Tracking Tool) was designed to provide an opportunity for all users, including producers, to simulate the complex models, such as APEX (Agricultural Policy Environmental eXtender) and associated required databases. The APEX model currently nested within NTT provides estimates of th...

  11. Decision Support Tool Evaluation Report for General NOAA Oil Modeling Environment(GNOME) Version 2.0

    NASA Technical Reports Server (NTRS)

    Spruce, Joseph P.; Hall, Callie; Zanoni, Vicki; Blonski, Slawomir; D'Sa, Eurico; Estep, Lee; Holland, Donald; Moore, Roxzana F.; Pagnutti, Mary; Terrie, Gregory

    2004-01-01

    NASA's Earth Science Applications Directorate evaluated the potential of NASA remote sensing data and modeling products to enhance the General NOAA Oil Modeling Environment (GNOME) decision support tool. NOAA's Office of Response and Restoration (OR&R) Hazardous Materials (HAZMAT) Response Division is interested in enhancing GNOME with near-realtime (NRT) NASA remote sensing products on oceanic winds and ocean circulation. The NASA SeaWinds sea surface wind and Jason-1 sea surface height NRT products have potential, as do sea surface temperature and reflectance products from the Moderate Resolution Imaging Spectroradiometer and sea surface reflectance products from Landsat and the Advanced Spaceborne Thermal Emission and Reflectance Radiometer. HAZMAT is also interested in the Advanced Circulation model and the Ocean General Circulation Model. Certain issues must be considered, including lack of data continuity, marginal data redundancy, and data formatting problems. Spatial resolution is an issue for near-shore GNOME applications. Additional work will be needed to incorporate NASA inputs into GNOME, including verification and validation of data products, algorithms, models, and NRT data.

  12. Projected 2050 Model Simulations for the Chesapeake Bay ...

    EPA Pesticide Factsheets

    The Chesapeake Bay Program has been tasked with assessing how changes in climate systems are expected to alter key variables and processes within the Watershed in concurrence with land use changes. EPA’s Office of Research and Development will be conducting historic and future (2050) Weather Research and Forecast (WRF) meteorological and Community Multiscale Air Quality (CMAQ) chemical transport model simulations to provide meteorological and nutrient deposition estimates for inclusion in the Chesapeake Bay Program’s assessment of how climate and land use change may impact water quality and ecosystem health. This presentation will present the timeline and research updates. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  13. Satellite Sounder Data Assimilation for Improving Alaska Region Weather Forecast

    NASA Technical Reports Server (NTRS)

    Zhu, Jiang; Stevens, E.; Zhang, X.; Zavodsky, B. T.; Heinrichs, T.; Broderson, D.

    2014-01-01

    A case study and a monthly statistical analysis using sounder data assimilation to improve the Alaska regional weather forecast model are presented. Weather forecasting in Alaska faces both challenges and opportunities. Alaska covers a large area with multiple types of topography and an extensive coastline, so forecast models must be finely tuned to predict its weather accurately. At the same time, its high-latitude location gives Alaska greater coverage from polar-orbiting satellites than the lower 48 states, providing more data for integration into forecast models. Forecasting marine low stratus clouds is critical to the Alaska aviation and oil industries and is the focus of the current case study. NASA AIRS/CrIS sounder profiles are assimilated into the Alaska regional weather forecast model to improve Arctic marine stratus cloud forecasts. The choice of physics options for the WRF model is discussed, and the preprocessing of AIRS/CrIS sounder data for assimilation is described. Local observations, satellite data, and global data assimilation products are used to verify and evaluate the forecast results with the Model Evaluation Tools (MET).

  14. A model for evaluating the activities of a coalition-based policy action group: the case of Hermosa Vida.

    PubMed

    Hardy, Lisa Jane; Wertheim, Peter; Bohan, Kyle; Quezada, Julio Cesar; Henley, Eric

    2013-07-01

    Scholars and clinicians are increasingly recognizing the complexity of social contexts of health and the need for multifunctioning approaches to health care problems, including community- and policy-level strategies. Barriers to change in health care policy can sometimes be attributed to the actions of advocacy coalitions that operate from a limited view of "policy change." Advocates have a tendency to pressure stakeholders to mandate laws as a final resolution of a movement, often leading to failure or, worse, stigmatization of issues. A more inclusive focus on health policy change as an ongoing process increases the efficacy of advocacy and outcomes measurement. This article presents a tool for policy action that coalition members developed through the implementation of a 3-year grant to improve the safety net for preventing childhood obesity. Scholars and policy makers developed the Policy Coalition Evaluation Tool with the intent to create a model to guide and measure efforts and outcomes of a local community-based policy coalition. The authors suggest using community-based participatory research approaches for developing a coalition-specific Policy Coalition Evaluation Tool to increase the effectiveness of advocacy groups and the documentation of coalition activities over time.

  15. A three-dimensional inverse finite element analysis of the heel pad.

    PubMed

    Chokhandre, Snehal; Halloran, Jason P; van den Bogert, Antonie J; Erdemir, Ahmet

    2012-03-01

    Quantification of plantar tissue behavior of the heel pad is essential in developing computational models for predictive analysis of preventive treatment options, such as footwear, for patients with diabetes. Simulation-based studies in the past have generally adopted heel pad properties from the literature, in effect pairing heel-specific geometry with the material properties of a different heel. In exceptional cases, patient-specific material characterization was performed with simplified two-dimensional models, without further evaluation of the heel-specific response under different loading conditions. The aim of this study was to conduct an inverse finite element analysis of the heel in order to calculate heel-specific material properties in situ. Multidimensional experimental data available from a previous cadaver study by Erdemir et al. ("An Elaborate Data Set Characterizing the Mechanical Response of the Foot," ASME J. Biomech. Eng., 131(9), pp. 094502) were used for model development, optimization, and evaluation of material properties. A specimen-specific three-dimensional finite element representation was developed. Heel pad material properties were determined using inverse finite element analysis by fitting the model behavior to the experimental data. Compression-dominant loading, applied using a spherical indenter, was used for optimization of the material properties. The optimized material properties were evaluated through simulations representative of a combined loading scenario (compression and anterior-posterior shear) with a spherical indenter and of compression-dominant loading applied using an elevated platform. Optimized heel pad material coefficients were 0.001084 MPa (μ) and 9.780 (α), with an effective Poisson's ratio (ν) of 0.475, for a first-order nearly incompressible Ogden material model.
The model-predicted structural response of the heel pad was in good agreement for both the optimization (<1.05% maximum tool force, 0.9% maximum tool displacement) and validation cases (6.5% maximum tool force, 15% maximum tool displacement). The inverse analysis successfully predicted the material properties for the given specimen-specific heel pad using the experimental data for that specimen. The modeling framework and results can be used for accurate predictions of the three-dimensional interaction of the heel pad with its surroundings.
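
    The reported Ogden coefficients can be turned into a predicted stress-stretch response. The sketch below is an illustration, not the study's finite element model: it assumes full incompressibility and uniaxial loading, and uses the convention W = (2μ/α²)(λ₁^α + λ₂^α + λ₃^α − 3); Ogden prefactor conventions vary between codes, so that form is an assumption.

```python
import math

def ogden_uniaxial_nominal_stress(stretch, mu=0.001084, alpha=9.780):
    """Nominal (first Piola-Kirchhoff) stress for uniaxial loading of an
    incompressible first-order Ogden solid, assuming the convention
    W = (2*mu/alpha**2) * (l1**alpha + l2**alpha + l3**alpha - 3).
    Defaults are the optimized heel-pad coefficients from the abstract
    (mu in MPa); full incompressibility is assumed here, whereas the
    study used a nearly incompressible formulation (nu = 0.475)."""
    l = stretch
    # Incompressibility gives l2 = l3 = l**-0.5; nominal stress = dW/dl.
    return (2.0 * mu / alpha) * (l ** (alpha - 1.0) - l ** (-alpha / 2.0 - 1.0))

print(ogden_uniaxial_nominal_stress(1.0))        # 0.0 in the undeformed state
print(ogden_uniaxial_nominal_stress(0.8) < 0.0)  # compression -> negative stress
```

    Evaluating the function over a range of stretches reproduces the strongly stiffening compressive response implied by the large α value.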

  16. An automated process for building reliable and optimal in vitro/in vivo correlation models based on Monte Carlo simulations.

    PubMed

    Sutton, Steven C; Hu, Mingxiu

    2006-05-05

    Many mathematical models have been proposed for establishing an in vitro/in vivo correlation (IVIVC). The traditional IVIVC model building process consists of 5 steps: deconvolution, model fitting, convolution, prediction error evaluation, and cross-validation. This is a time-consuming process, and typically only a few models are tested for any given data set. The objectives of this work were to (1) propose a statistical tool to screen models for further development of an IVIVC, (2) evaluate the performance of each model under different circumstances, and (3) investigate the effectiveness of common statistical model selection criteria for choosing IVIVC models. A computer program was developed to explore which model(s) would be most likely to work well with a random variation from the original formulation. The process used Monte Carlo simulation techniques to build IVIVC models. Data-based model selection criteria (Akaike Information Criterion [AIC], R2) and the probability of passing the Food and Drug Administration "prediction error" requirement were calculated. Several real data sets representing a broad range of release profiles are used to illustrate the process and to demonstrate the advantages of this automated approach over the traditional one. The Hixson-Crowell and Weibull models were often preferred over the linear model. When evaluating whether a Level A IVIVC model was possible, the AIC model selection criterion generally selected the best model. We believe that the proposed approach may serve as a rapid tool to determine which IVIVC model (if any) is the most applicable.
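
    The screening idea, choosing among candidate release models by an information criterion, can be sketched in a few lines. The data, coarse grid search, and model forms below are illustrative stand-ins for the paper's Monte Carlo machinery, not its actual procedure:

```python
import math, random

def aic(rss, n, k):
    # Akaike Information Criterion for a least-squares fit with k parameters.
    return n * math.log(rss / n) + 2 * k

times = [0.5, 1, 2, 4, 6, 8, 12]
random.seed(1)
# Synthetic fraction-released data from a Weibull profile plus small noise
# (invented values, not from the paper's data sets).
data = [1 - math.exp(-(t / 3.0) ** 1.4) + random.gauss(0, 0.01) for t in times]

def rss(model, params):
    return sum((model(t, *params) - y) ** 2 for t, y in zip(times, data))

weibull = lambda t, tau, beta: 1 - math.exp(-(t / tau) ** beta)
hixson  = lambda t, k: 1 - max(0.0, 1 - k * t) ** 3   # cube-root-law release
linear  = lambda t, m: min(1.0, m * t)

def grid_fit(model, grids):
    # Coarse grid search standing in for a proper nonlinear optimizer.
    best = None
    for p in grids:
        r = rss(model, p)
        if best is None or r < best[0]:
            best = (r, p)
    return best

rw, _ = grid_fit(weibull, [(a / 10, b / 10) for a in range(10, 80) for b in range(5, 30)])
rh, _ = grid_fit(hixson, [(k / 200,) for k in range(1, 100)])
rl, _ = grid_fit(linear, [(m / 200,) for m in range(1, 100)])

scores = {"Weibull": aic(rw, len(times), 2),
          "Hixson-Crowell": aic(rh, len(times), 1),
          "linear": aic(rl, len(times), 1)}
print(min(scores, key=scores.get))  # Weibull wins on these Weibull-generated data
```

    Repeating such fits over many noise realizations, as the paper's program does, yields the probability that each candidate model survives the screening.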

  17. PFIM 4.0, an extended R program for design evaluation and optimization in nonlinear mixed-effect models.

    PubMed

    Dumont, Cyrielle; Lestini, Giulia; Le Nagard, Hervé; Mentré, France; Comets, Emmanuelle; Nguyen, Thu Thuy; for the PFIM Group

    2018-03-01

    Nonlinear mixed-effect models (NLMEMs) are increasingly used for the analysis of longitudinal studies during drug development. When designing these studies, the expected Fisher information matrix (FIM) can be used instead of performing time-consuming clinical trial simulations. The function PFIM is the first tool for design evaluation and optimization that has been developed in R. In this article, we present an extended version, PFIM 4.0, which includes several new features. Compared with version 3.0, PFIM 4.0 includes a more complete pharmacokinetic/pharmacodynamic library of models and accommodates models including additional random effects for inter-occasion variability as well as discrete covariates. A new input method has been added to specify user-defined models through an R function. Optimization can be performed assuming some fixed parameters or some fixed sampling times. New outputs have been added regarding the FIM such as eigenvalues, condition numbers, and the option of saving the matrix obtained after evaluation or optimization. Previously obtained results, which are summarized in a FIM, can be taken into account in evaluation or optimization of one-group protocols. This feature enables the use of PFIM for adaptive designs. The Bayesian individual FIM has been implemented, taking into account the a priori distribution of random effects. Designs for maximum a posteriori Bayesian estimation of individual parameters can now be evaluated or optimized and the predicted shrinkage is also reported. It is also possible to visualize the graphs of the model and the sensitivity functions without performing evaluation or optimization. 
The usefulness of these approaches and the simplicity of use of PFIM 4.0 are illustrated by two examples: (i) an example of designing a population pharmacokinetic study accounting for previous results, which highlights the advantage of adaptive designs; (ii) an example of Bayesian individual design optimization for a pharmacodynamic study, showing that the Bayesian individual FIM can be a useful tool in therapeutic drug monitoring, allowing efficient prediction of estimation precision and shrinkage for individual parameters. PFIM 4.0 is a useful tool for design evaluation and optimization of longitudinal studies in pharmacometrics and is freely available at http://www.pfim.biostat.fr. Copyright © 2018 Elsevier B.V. All rights reserved.
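
    The core computation PFIM automates, evaluating a design through the expected FIM rather than through simulation, can be illustrated on a toy fixed-effects model. The exponential model, parameter values, and sampling designs below are invented for illustration; PFIM itself handles full nonlinear mixed-effect models:

```python
import math

def fim_additive(times, theta, sigma):
    """Expected Fisher information matrix for f(t) = A*exp(-k*t) with
    additive N(0, sigma^2) error: FIM = (1/sigma^2) * sum of g g^T,
    where g is the gradient of f with respect to (A, k). A toy stand-in
    for PFIM's population FIM, which also covers random effects."""
    A, k = theta
    F = [[0.0, 0.0], [0.0, 0.0]]
    for t in times:
        g = [math.exp(-k * t), -A * t * math.exp(-k * t)]  # df/dA, df/dk
        for i in range(2):
            for j in range(2):
                F[i][j] += g[i] * g[j] / sigma ** 2
    return F

def d_criterion(F):
    # D-criterion: det(FIM)^(1/p) for p = 2 parameters.
    det = F[0][0] * F[1][1] - F[0][1] * F[1][0]
    return det ** 0.5

theta, sigma = (10.0, 0.5), 0.2
spread  = d_criterion(fim_additive([0.25, 1.0, 2.0, 6.0], theta, sigma))
clumped = d_criterion(fim_additive([0.25, 0.3, 0.35, 0.4], theta, sigma))
print(spread > clumped)  # spreading samples across the decay is more informative
```

    Ranking candidate sampling schedules by such a criterion is exactly what replaces clinical trial simulation in FIM-based design evaluation.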

  18. Software Tools to Support the Assessment of System Health

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.

    2013-01-01

    This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDiMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDiMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDiMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints. It identifies one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements and constraints. 
S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets that optimally meet the performance goals and the constraints. It identifies optimal sensor suite solutions by applying a merit (i.e., cost) function with one of several available optimization approaches. As part of its analysis, S4 can expose fault conditions that are difficult to diagnose due to an incomplete diagnostic philosophy and/or a lack of sensors. S4 was originally developed and applied to liquid rocket engines, and was subsequently used to study the optimized selection of sensors for a simulation-based aircraft engine diagnostic system. The ETA Tool is a software-based analysis tool that augments the testability analysis and reporting capabilities of a commercial-off-the-shelf (COTS) package. An initial diagnostic assessment is performed by the COTS software using a user-developed, qualitative, directed-graph model of the system being analyzed. The ETA Tool accesses system design information captured within the model and the associated testability analysis output to create a series of six reports for various systems engineering needs. These reports are highlighted in the presentation. The ETA Tool was developed by NASA to support the verification of fault management requirements early in the launch vehicle design process. Because of their early development during the design process, the TEAMS-based diagnostic model and the ETA Tool were able to positively influence the system design by highlighting gaps in failure detection, fault isolation, and failure recovery.
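
    The flavor of merit-function-driven sensor selection can be sketched with a toy greedy search. The fault signatures, costs, and merit function below are invented, and the greedy heuristic is only a stand-in for S4's actual optimization approaches:

```python
# Hypothetical fault signature matrix: rows = faults, columns = sensors;
# 1 means the sensor responds to the fault. All names and values invented.
FAULTS = {
    "turbine_wear":       [1, 0, 1, 0],
    "compressor_fouling": [0, 1, 1, 0],
    "sensor_bias":        [0, 0, 0, 1],
}
COST = [1.0, 1.0, 2.5, 0.5]  # relative cost of each candidate sensor

def merit(selected):
    # Merit = faults detected minus a cost penalty; a simple stand-in
    # for a user-defined merit (i.e., cost) function.
    detected = sum(any(sig[s] for s in selected) for sig in FAULTS.values())
    return detected - 0.1 * sum(COST[s] for s in selected)

def greedy_select(n_sensors):
    selected, remaining = [], set(range(n_sensors))
    while remaining:
        best = max(remaining, key=lambda s: merit(selected + [s]))
        if merit(selected + [best]) <= merit(selected):
            break  # no remaining sensor improves the merit function
        selected.append(best)
        remaining.remove(best)
    return sorted(selected)

suite = greedy_select(4)
print(suite)  # -> [2, 3]: broad-coverage sensor plus the cheap bias monitor
```

    A real tool would also weigh detection confidence, redundancy, and isolation requirements, which is why S4 supports multiple user-defined search techniques rather than a single greedy pass.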

  19. An Integrated Approach of Fuzzy Linguistic Preference Based AHP and Fuzzy COPRAS for Machine Tool Evaluation.

    PubMed

    Nguyen, Huu-Tho; Md Dawal, Siti Zawiah; Nukman, Yusoff; Aoyama, Hideki; Case, Keith

    2015-01-01

    Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision based on imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper focuses on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into AHP to handle the imprecise and vague information and to simplify data collection for the pair-wise comparison matrix of the AHP, which determines the weights of attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is demonstrated with a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise and effective multi-attribute decision-making tool for evaluating machine tools in an uncertain environment.
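
    The COPRAS ranking step can be illustrated with crisp (non-fuzzy) numbers. The alternatives, criteria, and weights below are invented, and the paper's fuzzy-AHP weighting is replaced by fixed weights for brevity:

```python
# Crisp COPRAS sketch. Alternatives = candidate machine tools (invented data);
# criteria: [precision (benefit), capacity (benefit), price (cost)].
matrix = {
    "tool_A": [7.0, 5.0, 120.0],
    "tool_B": [9.0, 6.0, 150.0],
    "tool_C": [6.0, 8.0, 100.0],
}
weights = [0.5, 0.3, 0.2]        # e.g. produced by an AHP step
benefit = [True, True, False]    # the last criterion (price) is a cost

# 1. Sum-normalize each criterion column, then apply the weights.
cols = list(zip(*matrix.values()))
sums = [sum(c) for c in cols]
norm = {a: [w * v / s for v, w, s in zip(row, weights, sums)]
        for a, row in matrix.items()}

# 2. Split the weighted values into benefit (S+) and cost (S-) sums.
splus  = {a: sum(v for v, b in zip(r, benefit) if b)     for a, r in norm.items()}
sminus = {a: sum(v for v, b in zip(r, benefit) if not b) for a, r in norm.items()}

# 3. Relative significance: Q_i = S+_i + sum(S-) / (S-_i * sum(1/S-_j)).
tot, inv = sum(sminus.values()), sum(1 / v for v in sminus.values())
q = {a: splus[a] + tot / (sminus[a] * inv) for a in matrix}

ranking = sorted(q, key=q.get, reverse=True)
print(ranking[0])  # -> tool_B for these numbers
```

    The fuzzy variant replaces each crisp entry with a fuzzy number (e.g. triangular) propagated through the same steps before defuzzification.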

  20. Evaluation of clinical information modeling tools.

    PubMed

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak

    2016-11-01

    Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives identified as developing tools for clinical information modeling, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. U.S. Army Workshop on Solid-Propellant Ignition and Combustion Modeling.

    DTIC Science & Technology

    1997-07-01

    …saving tool in the design, development, testing, and evaluation of future gun-propulsion systems, and that, under current funding constraints, research… (Remaining snippet is table-of-contents residue; recoverable section titles: "What systems are currently being addressed…"; "What model systems might be valuable for…")

  2. Model Identification of Integrated ARMA Processes

    ERIC Educational Resources Information Center

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…
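
    A minimal identification check for integrated processes, far simpler than SCAN or ESACF but in the same spirit, is to compare the sample autocorrelation of a series with that of its first difference: an ARIMA process with d = 1 shows a slowly decaying ACF that collapses after differencing. A self-contained sketch on simulated data:

```python
import random

def acf(x, lag):
    # Sample autocorrelation of series x at the given lag.
    n = len(x)
    m = sum(x) / n
    c0 = sum((v - m) ** 2 for v in x)
    ck = sum((x[i] - m) * (x[i + lag] - m) for i in range(n - lag))
    return ck / c0

random.seed(42)
eps = [random.gauss(0, 1) for _ in range(2000)]
walk = []
for e in eps:                # random walk: y_t = y_{t-1} + e_t, i.e. d = 1
    walk.append((walk[-1] if walk else 0.0) + e)
diff = [b - a for a, b in zip(walk, walk[1:])]   # first difference = white noise

print(acf(walk, 10))   # stays close to 1: signature of an integrated process
print(acf(diff, 10))   # near 0 after differencing, suggesting d = 1
```

    SCAN and ESACF automate a much sharper version of this judgment, selecting the AR, I, and MA orders jointly rather than eyeballing ACF decay.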

  3. Predictive Models and Tools for Assessing Chemicals under the Toxic Substances Control Act (TSCA)

    EPA Pesticide Factsheets

    EPA has developed databases and predictive models to help evaluate the hazard, exposure, and risk of chemicals released to the environment and how workers, the general public, and the environment may be exposed to and affected by them.

  4. COMMUNITY-SCALE MODELING FOR AIR TOXICS AND HOMELAND SECURITY

    EPA Science Inventory

    The purpose of this task is to develop and evaluate numerical and physical modeling tools for simulating ambient concentrations of airborne substances in urban settings at spatial scales ranging from <1-10 km. Research under this task will support client needs in human exposure ...

  5. OBJECTIVE REDUCTION OF THE SPACE-TIME DOMAIN DIMENSIONALITY FOR EVALUATING MODEL PERFORMANCE

    EPA Science Inventory

    In the United States, photochemical air quality models are the principal tools used by governmental agencies to develop emission reduction strategies aimed at achieving National Ambient Air Quality Standards (NAAQS). Before they can be applied with confidence in a regulatory sett...

  6. The Classification and Evaluation of Computer-Aided Software Engineering Tools

    DTIC Science & Technology

    1990-09-01

    …a rapid series of new approaches have been adopted including: information engineering, entity-relationship modeling, automatic code generation… support true information sharing among tools and automated consistency checking. Moreover, the repository must record and manage the relationships and…

  7. Proceedings of the Ninth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Experiences in measurement, utilization, and evaluation of software methodologies, models, and tools are discussed. NASA's involvement in ever larger and more complex systems, like the space station project, provides a motive for the support of software engineering research and the exchange of ideas in such forums. The topics of current SEL research are software error studies, experiments with software development, and software tools.

  8. Public biobanks: calculation and recovery of costs.

    PubMed

    Clément, Bruno; Yuille, Martin; Zaltoukal, Kurt; Wichmann, Heinz-Erich; Anton, Gabriele; Parodi, Barbara; Kozera, Lukasz; Bréchot, Christian; Hofman, Paul; Dagher, Georges

    2014-11-05

    A calculation grid developed by an international expert group was tested across biobanks in six countries to evaluate costs for collections of various types of biospecimens. The assessment yielded a tool for setting specimen-access prices that were transparently related to biobank costs, and the tool was applied across three models of collaborative partnership. Copyright © 2014, American Association for the Advancement of Science.

  9. Farm Simulation: a tool for evaluating the mitigation of greenhouse gas emissions and the adaptation of dairy production to climate change

    USDA-ARS?s Scientific Manuscript database

    Farms both produce greenhouse gas emissions that drive human-induced climate change and are impacted by that climate change. Whole farm and global climate models provide useful tools for studying the benefits and costs of greenhouse gas mitigation and the adaptation of farms to changing climate. The...

  10. Evaluating analytic and risk assessment tools to estimate sediment and nutrients losses from agricultural lands in the southern region of the USA

    USDA-ARS?s Scientific Manuscript database

    Non-point source pollution from agricultural fields is a critical problem associated with water quality impairment in the USA and a low-oxygen environment in the Gulf of Mexico. The use, development and enhancement of qualitative and quantitative models or tools for assessing agricultural runoff qua...

  11. Strategy and gaps for modeling, simulation, and control of hybrid systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rabiti, Cristian; Garcia, Humberto E.; Hovsapian, Rob

    2015-04-01

    The purpose of this report is to establish a strategy for modeling and simulation of candidate hybrid energy systems. Modeling and simulation are necessary to design, evaluate, and optimize the technical and economic performance of such systems. Accordingly, this report first establishes the simulation requirements for analyzing candidate hybrid systems. Simulation fidelity levels are established based on the temporal scale, real and synthetic data availability or needs, solution accuracy, and the output parameters needed to evaluate case-specific figures of merit. The associated computational and co-simulation resources are then established, including physical models where needed, code assembly and integrated solution platforms, mathematical solvers, and data processing. The report describes the figures of merit, system requirements, and constraints that are necessary and sufficient to characterize grid and hybrid system behavior and market interactions. Loss of Load Probability (LOLP) and Effective Cost of Energy (ECE), as opposed to the standard Levelized Cost of Electricity (LCOE), are introduced as technical and economic indices for integrated energy system evaluations. Financial assessment methods are then introduced for the evaluation of non-traditional hybrid energy systems, and algorithms for coupled, iterative evaluation of technical and economic performance are discussed. The report further defines the modeling objectives, computational tools, solution approaches, and real-time data collection and processing (in some cases using real test units) required to model, co-simulate, and optimize (a) energy system components (e.g., power generation units, chemical processes, electricity management units), (b) system domains (e.g., thermal, electrical, or chemical energy generation, conversion, and transport), and (c) system control modules. 
Co-simulation of complex, tightly coupled, dynamic energy systems requires multiple simulation tools, potentially developed in several programming languages and resolved on separate time scales. Whereas further investigation and development of hybrid concepts will provide a more complete understanding of the joint computational and physical modeling needs, this report highlights areas in which co-simulation capabilities are warranted. The current development status, quality assurance, availability, and maintainability of simulation tools currently available for hybrid systems modeling are presented. Existing gaps in the modeling and simulation toolsets and development needs are then discussed. This effort will feed into a broader roadmap activity for designing, developing, and demonstrating hybrid energy systems.
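
    Of the indices mentioned in this record, Loss of Load Probability is straightforward to estimate by Monte Carlo. The generator fleet, forced outage rates, and load profile below are invented for illustration:

```python
import random

def estimate_lolp(units, loads, n_trials=20000, seed=0):
    """Monte Carlo Loss of Load Probability: the fraction of sampled hours
    in which available capacity falls short of load. `units` is a list of
    (capacity_MW, forced_outage_rate) pairs; all values here are invented."""
    rng = random.Random(seed)
    shortfalls = 0
    for _ in range(n_trials):
        load = rng.choice(loads)
        # Each unit is independently unavailable with its forced outage rate.
        avail = sum(cap for cap, forced in units if rng.random() > forced)
        if avail < load:
            shortfalls += 1
    return shortfalls / n_trials

fleet = [(400, 0.05), (400, 0.05), (200, 0.08), (150, 0.10)]
hourly_load = [600, 700, 800, 900, 950]    # simplified load duration curve
print(estimate_lolp(fleet, hourly_load))
```

    A hybrid-system study would extend this with time-correlated renewable output and storage dispatch, which is where the co-simulation needs discussed above arise.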

  12. Identifying influential data points in hydrological model calibration and their impact on streamflow predictions

    NASA Astrophysics Data System (ADS)

    Wright, David; Thyer, Mark; Westra, Seth

    2015-04-01

    Highly influential data points are those that have a disproportionately large impact on model performance, parameters and predictions. However, in current hydrological modelling practice the relative influence of individual data points on hydrological model calibration is not commonly evaluated. This presentation illustrates and evaluates several influence diagnostics tools that hydrological modellers can use to assess the relative influence of data. The feasibility and importance of including influence detection diagnostics as a standard tool in hydrological model calibration is discussed. Two classes of influence diagnostics are evaluated: (1) computationally demanding numerical "case deletion" diagnostics; and (2) computationally efficient analytical diagnostics, based on Cook's distance. These diagnostics are compared against hydrologically orientated diagnostics that describe changes in the model parameters (measured through the Mahalanobis distance), performance (objective function displacement) and predictions (mean and maximum streamflow). These influence diagnostics are applied to two case studies: a stage/discharge rating curve model, and a conceptual rainfall-runoff model (GR4J). Removing a single data point from the calibration resulted in differences to mean flow predictions of up to 6% for the rating curve model, and differences to mean and maximum flow predictions of up to 10% and 17%, respectively, for the hydrological model. When using the Nash-Sutcliffe efficiency in calibration, the computationally cheaper Cook's distance metrics produce similar results to the case-deletion metrics at a fraction of the computational cost. However, Cook's distance is adapted from linear regression, with inherent assumptions about the data, and is therefore less flexible than case deletion. Influential point detection diagnostics show great potential to improve current hydrological modelling practices by identifying highly influential data points. 
The findings of this study establish the feasibility and importance of including influential-point detection diagnostics as a standard tool in hydrological model calibration. They provide the hydrologist with important information on whether model calibration is susceptible to a small number of highly influential data points, enabling a more informed decision on whether to (1) remove or retain the calibration data; or (2) adjust the calibration strategy and/or hydrological model to reduce the susceptibility of model predictions to a small number of influential observations.
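
    For simple linear regression, the analytical Cook's distance that makes the cheap diagnostic class possible has a closed form. A sketch on invented data with one corrupted observation:

```python
def cooks_distance(xs, ys):
    """Cook's distance for each point of a simple linear regression
    y = a + b*x (p = 2 parameters), using the closed-form leverage
    h_i = 1/n + (x_i - xbar)^2 / Sxx."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    a = ybar - b * xbar
    resid = [y - (a + b * x) for x, y in zip(xs, ys)]
    p = 2
    s2 = sum(e * e for e in resid) / (n - p)   # residual variance estimate
    ds = []
    for x, e in zip(xs, resid):
        h = 1.0 / n + (x - xbar) ** 2 / sxx    # leverage of this point
        ds.append(e * e / (p * s2) * h / (1 - h) ** 2)
    return ds

# Points on y = 2x with one corrupted observation at x = 9.
xs = [1, 2, 3, 4, 5, 6, 7, 8, 9]
ys = [2, 4, 6, 8, 10, 12, 14, 16, 40]   # the last point is the outlier
d = cooks_distance(xs, ys)
print(d.index(max(d)))  # -> 8: the corrupted point dominates the diagnostic
```

    Case deletion generalizes this to arbitrary (e.g. rainfall-runoff) models by literally refitting with each point removed, at the computational cost the abstract describes.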

  13. Evaluation in undergraduate medical education: Conceptualizing and validating a novel questionnaire for assessing the quality of bedside teaching.

    PubMed

    Dreiling, Katharina; Montano, Diego; Poinstingl, Herbert; Müller, Tjark; Schiekirka-Schwake, Sarah; Anders, Sven; von Steinbüchel, Nicole; Raupach, Tobias

    2017-08-01

    Evaluation is an integral part of curriculum development in medical education. Given the peculiarities of bedside teaching, specific evaluation tools for this instructional format are needed. Development of these tools should be informed by appropriate frameworks. The purpose of this study was to develop a specific evaluation tool for bedside teaching based on the Stanford Faculty Development Program's clinical teaching framework. Based on a literature review yielding 47 evaluation items, an 18-item questionnaire was compiled and subsequently completed by undergraduate medical students at two German universities. Reliability and validity were assessed in an exploratory full information item factor analysis (study one) and a confirmatory factor analysis as well as a measurement invariance analysis (study two). The exploratory analysis involving 824 students revealed a three-factor structure. Reliability estimates of the subscales were satisfactory (α = 0.71-0.84). The model yielded satisfactory fit indices in the confirmatory factor analysis involving 1043 students. The new questionnaire is short and yet based on a widely-used framework for clinical teaching. The analyses presented here indicate good reliability and validity of the instrument. Future research needs to investigate whether feedback generated from this tool helps to improve teaching quality and student learning outcome.
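
    The subscale reliabilities quoted above (α = 0.71-0.84) are Cronbach's alpha values, which can be computed directly from item scores. The responses below are invented; only the formula reflects the record:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list of
    respondent scores per item):
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
    Sample variances use the n-1 denominator."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Invented Likert-type responses for three items of one hypothetical subscale;
# strongly correlated items yield a high alpha.
item1 = [4, 5, 3, 4, 2, 5, 4, 3]
item2 = [4, 4, 3, 5, 2, 5, 4, 2]
item3 = [5, 5, 2, 4, 3, 5, 3, 3]
print(round(cronbach_alpha([item1, item2, item3]), 2))
```

    Alpha is only one ingredient of the validation reported here; the factor analyses establish that the items group into the three intended subscales in the first place.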

  14. The efficiency of geophysical adjoint codes generated by automatic differentiation tools

    NASA Astrophysics Data System (ADS)

    Vlasenko, A. V.; Köhl, A.; Stammer, D.

    2016-02-01

The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to the model parameters. Typically, the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which combine groups of variables and provide aliases to memory addresses, respectively. We show that, while operator-overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, source transformation tools appear to be the most efficient choice, allowing even large geophysical data assimilation problems to be handled. However, they can only be applied to numerical models written in earlier generations of programming languages.
Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the continuous use of AD tools for solving geophysical problems on modern computer architectures.
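The operator-overloading mechanism the study compares against source transformation can be illustrated with a minimal forward-mode sketch using dual numbers; the toy model function is invented, and real geophysical codes are of course vastly larger:

```python
# Forward-mode AD via dual numbers: each value carries a derivative
# component, and overloaded operators propagate it exactly.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot          # value and derivative part

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.val * o.dot + self.dot * o.val)  # product rule
    __rmul__ = __mul__

def model(p):
    # toy "model": f(p) = 3*p^2 + 2*p, so df/dp = 6*p + 2
    return 3 * p * p + 2 * p

seed = Dual(2.0, 1.0)      # seed the input derivative with 1.0
out = model(seed)
print(out.val, out.dot)    # f(2) = 16.0, f'(2) = 14.0
```

Adjoint (reverse-mode) tools propagate derivatives backwards instead, which is what makes the memory management and code-transformation strategies compared in the study so different.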

  15. Impact of reduced mass of light commercial vehicles on fuel consumption, CO2 emissions, air quality, and socio-economic costs.

    PubMed

    Cecchel, S; Chindamo, D; Turrini, E; Carnevale, C; Cornacchia, G; Gadola, M; Panvini, A; Volta, M; Ferrario, D; Golimbioschi, R

    2018-02-01

    This study presents a modelling system to evaluate the impact of weight reduction in light commercial vehicles with diesel engines on air quality and greenhouse gas emissions. The PROPS model assesses the emissions of one vehicle in the aforementioned category and its corresponding reduced-weight version. The results serve as an input to the RIAT+ tool, an air quality integrated assessment modelling system. This paper applies the tools in a case study in the Lombardy region (Italy) and discusses the input data pre-processing, the PROPS-RIAT+ modelling system runs, and the results. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Towards improved and more routine Earth system model evaluation in CMIP

    DOE PAGES

    Eyring, Veronika; Gleckler, Peter J.; Heinze, Christoph; ...

    2016-11-01

The Coupled Model Intercomparison Project (CMIP) has successfully provided the climate community with a rich collection of simulation output from Earth system models (ESMs) that can be used to understand past climate changes and make projections and uncertainty estimates of the future. Confidence in ESMs can be gained because the models are based on physical principles and reproduce many important aspects of observed climate. More research is required to identify the processes that are most responsible for systematic biases and the magnitude and uncertainty of future projections so that more relevant performance tests can be developed. At the same time, there are many aspects of ESM evaluation that are well established and considered an essential part of systematic evaluation but have been implemented ad hoc with little community coordination. Given the diversity and complexity of ESM analysis, we argue that the CMIP community has reached a critical juncture at which many baseline aspects of model evaluation need to be performed much more efficiently and consistently. We provide a perspective and viewpoint on how a more systematic, open, and rapid performance assessment of the large and diverse number of models that will participate in current and future phases of CMIP can be achieved, and announce our intention to implement such a system for CMIP6. Accomplishing this could also free up valuable resources as many scientists are frequently "re-inventing the wheel" by re-writing analysis routines for well-established analysis methods. A more systematic approach for the community would be to develop and apply evaluation tools that are based on the latest scientific knowledge and observational reference, are well suited for routine use, and provide a wide range of diagnostics and performance metrics that comprehensively characterize model behaviour as soon as the output is published to the Earth System Grid Federation (ESGF). 
The CMIP infrastructure enforces data standards and conventions for model output and documentation accessible via the ESGF, additionally publishing observations (obs4MIPs) and reanalyses (ana4MIPs) for model intercomparison projects using the same data structure and organization as the ESM output. This largely facilitates routine evaluation of the ESMs, but to be able to process the data automatically alongside the ESGF, the infrastructure needs to be extended with processing capabilities at the ESGF data nodes where the evaluation tools can be executed on a routine basis. Efforts are already underway to develop community-based evaluation tools, and we encourage experts to provide additional diagnostic codes that would enhance this capability for CMIP. And, at the same time, we encourage the community to contribute observations and reanalyses for model evaluation to the obs4MIPs and ana4MIPs archives. The intention is to produce through the ESGF a widely accepted quasi-operational evaluation framework for CMIP6 that would routinely execute a series of standardized evaluation tasks. Over time, as this capability matures, we expect to produce an increasingly systematic characterization of models which, compared with early phases of CMIP, will more quickly and openly identify the strengths and weaknesses of the simulations. This will also reveal whether long-standing model errors remain evident in newer models and will assist modelling groups in improving their models. Finally, this framework will be designed to readily incorporate updates, including new observations and additional diagnostics and metrics as they become available from the research community.
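As a minimal illustration of one baseline diagnostic such a framework might routinely execute, the sketch below computes an area-weighted RMSE of a model field against observations; the 4x8 latitude-longitude grid and field values are invented for demonstration:

```python
import numpy as np

lats = np.array([-45.0, -15.0, 15.0, 45.0])   # grid-cell center latitudes
weights = np.cos(np.deg2rad(lats))            # approximate area weighting
model = np.full((4, 8), 288.0)                # e.g. surface temperature (K)
obs = model + np.linspace(-1.0, 1.0, 32).reshape(4, 8)

err2 = (model - obs) ** 2
w2d = np.repeat(weights[:, None], 8, axis=1)  # weight each latitude band
rmse = np.sqrt(np.sum(w2d * err2) / np.sum(w2d))
print(round(rmse, 3))
```

Running such metrics automatically for every published simulation is exactly the kind of standardized evaluation task the authors propose to operationalize at the ESGF data nodes.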

  17. Integrated modeling of advanced optical systems

    NASA Astrophysics Data System (ADS)

    Briggs, Hugh C.; Needels, Laura; Levine, B. Martin

    1993-02-01

    This poster session paper describes an integrated modeling and analysis capability being developed at JPL under funding provided by the JPL Director's Discretionary Fund and the JPL Control/Structure Interaction Program (CSI). The posters briefly summarize the program capabilities and illustrate them with an example problem. The computer programs developed under this effort will provide an unprecedented capability for integrated modeling and design of high performance optical spacecraft. The engineering disciplines supported include structural dynamics, controls, optics and thermodynamics. Such tools are needed in order to evaluate the end-to-end system performance of spacecraft such as OSI, POINTS, and SMMM. This paper illustrates the proof-of-concept tools that have been developed to establish the technology requirements and demonstrate the new features of integrated modeling and design. The current program also includes implementation of a prototype tool based upon the CAESY environment being developed under the NASA Guidance and Control Research and Technology Computational Controls Program. This prototype will be available late in FY-92. The development plan proposes a major software production effort to fabricate, deliver, support and maintain a national-class tool from FY-93 through FY-95.

  18. A Thermal Management Systems Model for the NASA GTX RBCC Concept

    NASA Technical Reports Server (NTRS)

    Traci, Richard M.; Farr, John L., Jr.; Laganelli, Tony; Walker, James (Technical Monitor)

    2002-01-01

    The Vehicle Integrated Thermal Management Analysis Code (VITMAC) was further developed to aid the analysis, design, and optimization of propellant and thermal management concepts for advanced propulsion systems. The computational tool is based on engineering level principles and models. A graphical user interface (GUI) provides a simple and straightforward method to assess and evaluate multiple concepts before undertaking more rigorous analysis of candidate systems. The tool incorporates the Chemical Equilibrium and Applications (CEA) program and the RJPA code to permit heat transfer analysis of both rocket and air breathing propulsion systems. Key parts of the code have been validated with experimental data. The tool was specifically tailored to analyze rocket-based combined-cycle (RBCC) propulsion systems being considered for space transportation applications. This report describes the computational tool and its development and verification for NASA GTX RBCC propulsion system applications.

  19. Designing and encoding models for synthetic biology.

    PubMed

    Endler, Lukas; Rodriguez, Nicolas; Juty, Nick; Chelliah, Vijayalakshmi; Laibe, Camille; Li, Chen; Le Novère, Nicolas

    2009-08-06

    A key component of any synthetic biology effort is the use of quantitative models. These models and their corresponding simulations allow optimization of a system design, as well as guiding their subsequent analysis. Once a domain mostly reserved for experts, dynamical modelling of gene regulatory and reaction networks has been an area of growth over the last decade. There has been a concomitant increase in the number of software tools and standards, thereby facilitating model exchange and reuse. We give here an overview of the model creation and analysis processes as well as some software tools in common use. Using markup language to encode the model and associated annotation, we describe the mining of components, their integration in relational models, formularization and parametrization. Evaluation of simulation results and validation of the model close the systems biology 'loop'.

  20. The Space Environmental Impact System

    NASA Astrophysics Data System (ADS)

    Kihn, E. A.

    2009-12-01

    The Space Environmental Impact System (SEIS) is an operational tool for incorporating environmental data sets into DoD Modeling and Simulation (M&S) which allows for enhanced decision making regarding acquisitions, testing, operations and planning. The SEIS system creates, from the environmental archives and developed rule-base, a tool for describing the effects of the space environment on particular military systems, both historically and in real-time. The system uses data available over the web, and in particular data provided by NASA’s virtual observatory network, as well as modeled data generated specifically for this purpose. The rule base system developed to support SEIS is an open XML based model which can be extended to events from any environmental domain. This presentation will show how the SEIS tool allows users to easily and accurately evaluate the effect of space weather in terms that are meaningful to them as well as discuss the relevant standards used in its construction and go over lessons learned from fielding an operational environmental decision tool.

  1. Airport Viz - a 3D Tool to Enhance Security Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koch, Daniel B

    2006-01-01

In the summer of 2000, the National Safe Skies Alliance (NSSA) awarded a project to the Applied Visualization Center (AVC) at the University of Tennessee, Knoxville (UTK) to develop a 3D computer tool to assist the Federal Aviation Administration security group, now the Transportation Security Administration (TSA), in evaluating new equipment and procedures to improve airport checkpoint security. A preliminary tool was demonstrated at the 2001 International Aviation Security Technology Symposium. Since then, the AVC went on to construct numerous detection equipment models as well as models of several airports. Airport Viz has been distributed by the NSSA to a number of airports around the country, which are able to incorporate their own CAD models into the software due to its unique open architecture. It provides a checkpoint design and passenger flow simulation function, a layout design and simulation tool for checked baggage and cargo screening, and a means to assist in the vulnerability assessment of airport access points for pedestrians and vehicles.

  2. Factors that affect micro-tooling features created by direct printing approach

    NASA Astrophysics Data System (ADS)

    Kumbhani, Mayur N.

Today's market requires the faster production of smaller, better, and improved products. Traditional high-rate manufacturing processes such as hot embossing, injection molding, and compression molding use tooling to replicate features on a product. Miniaturization of many products in the biomedical, electronics, optical, and microfluidic fields is occurring on a daily basis, so there is a constant need to produce cheaper and faster tooling that can be utilized by existing manufacturing processes. Traditionally, micron-size tooling features have been produced by processes such as micro-machining and Electrical Discharge Machining (EDM). Because of the difficulty of producing smaller features and the long production cycle times, various additive manufacturing approaches have been proposed, e.g. selective laser sintering (SLS), inkjet printing (3DP), and fused deposition modeling (FDM). Most of these approaches can produce net-shaped products from different materials such as metals, ceramics, or polymers. Several attempts have been made to produce tooling features using additive manufacturing approaches; most of the resulting tooling was not cost effective, and its life cycle was reported to be short. In this research, a method to produce tooling features using a direct printing approach, in which a highly filled feedstock is dispensed onto a substrate, was developed. Different natural binders, such as guar gum, xanthan gum, and sodium carboxymethyl cellulose (NaCMC), and their combinations were evaluated. The best binder combination was then used to evaluate the effect of different metal particle sizes (316L stainless steel (3 μm), 316 stainless steel (45 μm), and 304 stainless steel (45 μm)) on feature quality. Finally, the effect of direct printing process variables such as dispensing tip internal diameter (500 μm and 333 μm) at different printing speeds was evaluated.

  3. Rat aorta as a pharmacological tool for in vitro and in vivo studies.

    PubMed

    Rameshrad, Maryam; Babaei, Hossein; Azarmi, Yadollah; Fouladi, Daniel Fadaei

    2016-01-15

The rat aorta assay provides a low-cost and rapid platform, especially for preclinical in vivo models. The signaling pathways of an analog in the vessel can be evaluated separately in the endothelium or smooth muscle cells using rings of the rat aorta in vitro. The rat aorta is also used for angiogenesis modeling, integrating the benefits of both in vivo and in vitro models. These points explain the importance and usage of the rat aorta in research. Furthermore, about 4503 articles with the key word "rat aorta" in the title or abstract were published in Medline from 1955 until the end of 2013. In this review, these articles were organized into two main categories: in vivo and in vitro studies. The in vitro section focuses on the rat aorta model as a tool for evaluating the mechanisms of vasodilation, vasoconstriction and angiogenesis. In the in vivo section, the most important uses of this tissue are evaluated. The vasotonic signaling pathways in the vessel are also explained briefly, and some rat aorta applications in vitro and in vivo are discussed. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Development and weighting of a life cycle assessment screening model

    NASA Astrophysics Data System (ADS)

    Bates, Wayne E.; O'Shaughnessy, James; Johnson, Sharon A.; Sisson, Richard

    2004-02-01

    Nearly all life cycle assessment tools available today are high priced, comprehensive and quantitative models requiring a significant amount of data collection and data input. In addition, most of the available software packages require a great deal of training time to learn how to operate the model software. Even after this time investment, results are not guaranteed because of the number of estimations and assumptions often necessary to run the model. As a result, product development, design teams and environmental specialists need a simplified tool that will allow for the qualitative evaluation and "screening" of various design options. This paper presents the development and design of a generic, qualitative life cycle screening model and demonstrates its applicability and ease of use. The model uses qualitative environmental, health and safety factors, based on site or product-specific issues, to sensitize the overall results for a given set of conditions. The paper also evaluates the impact of different population input ranking values on model output. The final analysis is based on site or product-specific variables. The user can then evaluate various design changes and the apparent impact or improvement on the environment, health and safety, compliance cost and overall corporate liability. Major input parameters can be varied, and factors such as materials use, pollution prevention, waste minimization, worker safety, product life, environmental impacts, return of investment, and recycle are evaluated. The flexibility of the model format will be discussed in order to demonstrate the applicability and usefulness within nearly any industry sector. Finally, an example using audience input value scores will be compared to other population input results.
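A minimal sketch of the qualitative weighted-scoring scheme the paper describes; the factor names, weights, and 1-5 scores are invented for illustration:

```python
# Each design option receives qualitative 1-5 scores on EHS-related
# factors; user-supplied weights sensitize the aggregate result.
factors = ["materials use", "pollution prevention", "worker safety", "recyclability"]
weights = {"materials use": 0.3, "pollution prevention": 0.3,
           "worker safety": 0.25, "recyclability": 0.15}

designs = {
    "design A": {"materials use": 4, "pollution prevention": 3,
                 "worker safety": 5, "recyclability": 2},
    "design B": {"materials use": 3, "pollution prevention": 5,
                 "worker safety": 4, "recyclability": 4},
}

def screen(design_scores):
    # weighted sum of qualitative scores; higher is better
    return sum(weights[f] * design_scores[f] for f in factors)

ranked = sorted(designs, key=lambda d: screen(designs[d]), reverse=True)
print(ranked)
```

Varying the weight values, as the paper does with different population input rankings, shows how sensitive the screening outcome is to the assumed priorities.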

  5. Sensitivity of simulated conservation practice effectiveness to representation of field and in-stream processes in the Little River Watershed

    USDA-ARS?s Scientific Manuscript database

    Evaluating the effectiveness of conservation practices (CPs) is an important step to achieving efficient and successful water quality management. Watershed-scale simulation models can provide useful and convenient tools for this evaluation, but simulated conservation practice effectiveness should be...

  6. Designing and Evaluating Representations to Model Pedagogy

    ERIC Educational Resources Information Center

    Masterman, Elizabeth; Craft, Brock

    2013-01-01

    This article presents the case for a theory-informed approach to designing and evaluating representations for implementation in digital tools to support Learning Design, using the framework of epistemic efficacy as an example. This framework, which is rooted in the literature of cognitive psychology, is operationalised through dimensions of fit…

  7. Petrophysical evaluation of subterranean formations

    DOEpatents

    Klein, James D; Schoderbek, David A; Mailloux, Jason M

    2013-05-28

Methods and systems are provided for evaluating petrophysical properties of subterranean formations and comprehensively evaluating hydrate presence through a combination of computer-implemented log modeling and analysis. Certain embodiments include the steps of running a number of logging tools in a wellbore to obtain a variety of wellbore data and logs, and evaluating and modeling the log data to ascertain various petrophysical properties. Examples of suitable logging techniques that may be used in combination with the present invention include, but are not limited to, sonic logs, electrical resistivity logs, gamma ray logs, neutron porosity logs, density logs, NMR logs, or any combination or subset thereof.

  8. Development of Response Surface Models for Rapid Analysis & Multidisciplinary Optimization of Launch Vehicle Design Concepts

    NASA Technical Reports Server (NTRS)

    Unal, Resit

    1999-01-01

Multidisciplinary design optimization (MDO) is an important step in the design and evaluation of launch vehicles, since it has a significant impact on performance and lifecycle cost. The objective in MDO is to search the design space to determine the values of design parameters that optimize the performance characteristics subject to system constraints. The Vehicle Analysis Branch (VAB) at NASA Langley Research Center has computerized analysis tools in many of the disciplines required for the design and analysis of launch vehicles. Vehicle performance characteristics can be determined by the use of these computerized analysis tools. The next step is to optimize the system performance characteristics subject to multidisciplinary constraints. However, most of the complex sizing and performance evaluation codes used for launch vehicle design are stand-alone tools operated by disciplinary experts. They are, in general, difficult to integrate and use directly for MDO. An alternative has been to utilize response surface methodology (RSM) to obtain polynomial models that approximate the functional relationships between performance characteristics and design variables. These approximation models, called response surface models, are then used to integrate the disciplines using mathematical programming methods for efficient system-level design analysis, MDO and fast sensitivity simulations. A second-order response surface model has been commonly used in RSM since in many cases it can provide an adequate approximation, especially if the region of interest is sufficiently limited.
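The standard second-order response surface consists of an intercept plus linear, pure-quadratic, and interaction terms. A minimal sketch fitting it by least squares; the two design variables and the noiseless "performance" response are invented:

```python
import numpy as np

rng = np.random.default_rng(2)
x1 = rng.uniform(-1, 1, 40)
x2 = rng.uniform(-1, 1, 40)
# toy response with known coefficients, used here to check recovery
y = 5 + 2*x1 - 3*x2 + 1.5*x1**2 + 0.5*x2**2 + 0.8*x1*x2

# design matrix columns: 1, x1, x2, x1^2, x2^2, x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coef, 3))
```

Once fitted, such a cheap polynomial surrogate can be queried inside an optimizer in place of the expensive stand-alone disciplinary codes.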

  9. Image analysis-based modelling for flower number estimation in grapevine.

    PubMed

    Millan, Borja; Aquino, Arturo; Diago, Maria P; Tardaguila, Javier

    2017-02-01

Grapevine flower number per inflorescence provides valuable information that can be used for assessing yield. Considerable research has been directed at developing a technological tool based on image analysis and predictive modelling. However, the behaviour of variety-independent predictive models and their yield prediction capabilities on a wide set of varieties had never been evaluated. Inflorescence images from 11 grapevine Vitis vinifera L. varieties were acquired under field conditions. The flower number per inflorescence and the flower number visible in the images were calculated manually, and automatically using an image analysis algorithm. These datasets were used to calibrate and evaluate the behaviour of two linear (single-variable and multivariable) models and a nonlinear variety-independent model. As a result, the integrated tool composed of the image analysis algorithm and the nonlinear approach showed the highest performance and robustness (RPD = 8.32, RMSE = 37.1). The yield estimation capabilities of the flower number in conjunction with fruit set rate (R² = 0.79) and average berry weight (R² = 0.91) were also tested. This study proves the accuracy of flower number per inflorescence estimation using an image analysis algorithm and a nonlinear model that is generally applicable to different grapevine varieties. This provides a fast, non-invasive and reliable tool for estimation of yield at harvest. © 2016 Society of Chemical Industry.
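The two fit statistics quoted above, RMSE and RPD (the ratio of the standard deviation of the reference values to the RMSE), can be sketched as follows; the flower-count data are invented for illustration:

```python
import numpy as np

# hypothetical manual counts vs model estimates of flowers per inflorescence
observed = np.array([310.0, 250.0, 420.0, 180.0, 365.0, 275.0])
predicted = observed + np.array([10.0, -5.0, 20.0, -8.0, 12.0, -6.0])

rmse = np.sqrt(np.mean((observed - predicted) ** 2))
rpd = observed.std(ddof=1) / rmse   # higher RPD = more discriminating model
print(round(rmse, 1), round(rpd, 2))
```

An RPD well above 2 is generally read as a model precise enough to resolve the natural spread of the reference data, which is why the reported RPD of 8.32 indicates a robust tool.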

  10. General MACOS Interface for Modeling and Analysis for Controlled Optical Systems

    NASA Technical Reports Server (NTRS)

    Sigrist, Norbert; Basinger, Scott A.; Redding, David C.

    2012-01-01

The General MACOS Interface (GMI) for Modeling and Analysis for Controlled Optical Systems (MACOS) enables the use of MATLAB as a front-end for JPL's critical optical modeling package, MACOS. MACOS is JPL's in-house optical modeling software, which has proven to be a superb tool for advanced systems engineering of optical systems. GMI, coupled with MACOS, allows for seamless interfacing with modeling tools from other disciplines to make possible integration of dynamics, structures, and thermal models with the addition of control systems for deformable optics and other actuated optics. This software package is designed as a tool for analysts to quickly and easily use MACOS without needing to be an expert at programming MACOS. The strength of MACOS is its ability to interface with various modeling/development platforms, allowing evaluation of system performance with thermal, mechanical, and optical modeling parameter variations. GMI provides an improved means for accessing selected key MACOS functionalities. The main objective of GMI is to marry the vast mathematical and graphical capabilities of MATLAB with the powerful optical analysis engine of MACOS, thereby providing a useful tool to anyone who can program in MATLAB. GMI also improves modeling efficiency by eliminating the need to write an interface function for each task/project, reducing error sources, speeding up user/modeling tasks, and making MACOS well suited for fast prototyping.

  11. A Conceptual Framework for Evaluating the Domains of Applicability of Ecological Models and its Implementation in the Ecological Production Function Library - International Society for Ecological Modelling Conference

    EPA Science Inventory

    The use of computational ecological models to inform environmental management and policy has proliferated in the past 25 years. These models have become essential tools as linkages and feedbacks between human actions and ecological responses can be complex, and as funds for sampl...

  12. Fire and Smoke Model Evaluation Experiment (FASMEE): Modeling gaps and data needs

    Treesearch

    Yongqiang Liu; Adam Kochanski; Kirk Baker; Ruddy Mell; Rodman Linn; Ronan Paugam; Jan Mandel; Aime Fournier; Mary Ann Jenkins; Scott Goodrick; Gary Achtemeier; Andrew Hudak; Matthew Dickson; Brian Potter; Craig Clements; Shawn Urbanski; Roger Ottmar; Narasimhan Larkin; Timothy Brown; Nancy French; Susan Prichard; Adam Watts; Derek McNamara

    2017-01-01

    Fire and smoke models are numerical tools for simulating fire behavior, smoke dynamics, and air quality impacts of wildland fires. Fire models are developed based on the fundamental chemistry and physics of combustion and fire spread or statistical analysis of experimental data (Sullivan 2009). They provide information on fire spread and fuel consumption for safe and...

  13. Evaluation of a Tactical Surface Metering Tool for Charlotte Douglas International Airport via Human-in-the-Loop Simulation

    NASA Technical Reports Server (NTRS)

    Verma, Savita; Lee, Hanbong; Dulchinos, Victoria L.; Martin, Lynne; Stevens, Lindsay; Jung, Yoon; Chevalley, Eric; Jobe, Kim; Parke, Bonny

    2017-01-01

    NASA has been working with the FAA and aviation industry partners to develop and demonstrate new concepts and technologies that integrate arrival, departure, and surface traffic management capabilities. In March 2017, NASA conducted a human-in-the-loop (HITL) simulation for integrated surface and airspace operations, modeling Charlotte Douglas International Airport, to evaluate the operational procedures and information requirements for the tactical surface metering tool, and data exchange elements between the airline controlled ramp and ATC Tower. In this paper, we focus on the calibration of the tactical surface metering tool using various metrics measured from the HITL simulation results. Key performance metrics include gate hold times from pushback advisories, taxi-in-out times, runway throughput, and departure queue size. Subjective metrics presented in this paper include workload, situational awareness, and acceptability of the metering tool and its calibration.

  15. Forward impact extrusion of surface textured steel blanks using coated tooling

    NASA Astrophysics Data System (ADS)

    Hild, Rafael; Feuerhack, Andreas; Trauth, Daniel; Arghavani, Mostafa; Kruppe, Nathan C.; Brögelmann, Tobias; Bobzin, Kirsten; Klocke, Fritz

    2017-10-01

A method to enable dry metal forming by means of a self-lubricating coating and surface textures was researched using an innovative pin-on-cylinder tribometer. The experimental analysis was complemented by a numerical model of the complex contact conditions between coated tools and the surface-textured specimen at the micro-level. Based on these results, the objective of this work is to explain the tribological interactions between surface-textured specimens and the tool in dry full forward extrusion. Therefore, experimental dry extrusion tests were performed using a tool system. The extruded specimens were evaluated with regard to their geometry as well as the required punch force. Thereby, the effectiveness and feasibility of dry metal forming were evaluated using full forward extrusion as an example. Thus, one more step was taken towards the technical realization of dry metal forming of low-alloy steels under industrial conditions.

  16. Application of the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) for Dynamic Systems Analysis

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey; Zinnecker, Alicia

    2014-01-01

Systems analysis involves steady-state simulations of combined components to evaluate the steady-state performance, weight, and cost of a system; dynamic considerations are not included until later in the design process. The Dynamic Systems Analysis task, under NASA's Fixed Wing project, is developing the capability for assessing dynamic issues at earlier stages during systems analysis. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) has been developed to design a single-flight-condition controller (defined as altitude and Mach number) and, ultimately, provide an estimate of the closed-loop performance of the engine model. This tool has been integrated with the Commercial Modular Aero-Propulsion System Simulation 40,000 (CMAPSS40k) engine model to demonstrate the additional information TTECTrA makes available for dynamic systems analysis. This dynamic data can be used to evaluate the trade-off between performance and safety, which could not be done with steady-state systems analysis data. TTECTrA has been designed to integrate with any turbine engine model that is compatible with the MATLAB/Simulink (The MathWorks, Inc.) environment.

  18. Evaluating 3D-printed biomaterials as scaffolds for vascularized bone tissue engineering.

    PubMed

    Wang, Martha O; Vorwald, Charlotte E; Dreher, Maureen L; Mott, Eric J; Cheng, Ming-Huei; Cinar, Ali; Mehdizadeh, Hamidreza; Somo, Sami; Dean, David; Brey, Eric M; Fisher, John P

    2015-01-07

    There is an unmet need for a consistent set of tools for the evaluation of 3D-printed constructs. A toolbox developed to design, characterize, and evaluate 3D-printed poly(propylene fumarate) scaffolds is proposed for vascularized engineered tissues. This toolbox combines modular design and non-destructive fabricated design evaluation, evaluates biocompatibility and mechanical properties, and models angiogenesis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Evaluation and Analysis of F-16XL Wind Tunnel Data From Static and Dynamic Tests

    NASA Technical Reports Server (NTRS)

    Kim, Sungwan; Murphy, Patrick C.; Klein, Vladislav

    2004-01-01

    A series of wind tunnel tests was conducted at the NASA Langley Research Center as part of an ongoing effort to develop and test mathematical models for aircraft rigid-body aerodynamics in nonlinear unsteady flight regimes. Analysis of measurement accuracy, especially for nonlinear dynamic systems that may exhibit complicated behaviors, is an essential component of this ongoing effort. In this report, tools for harmonic analysis of dynamic data and assessing measurement accuracy are presented. A linear aerodynamic model is assumed that is appropriate for conventional forced-oscillation experiments, although more general models can be used with these tools. Application of the tools to experimental data is demonstrated, and results indicate the levels of uncertainty in output measurements that can arise from experimental setup, calibration procedures, mechanical limitations, and input errors.
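At its core, harmonic analysis of forced-oscillation data reduces to extracting the first-harmonic in-phase and out-of-phase components of a measured response. A minimal sketch with synthetic data (the coefficients and sampling choices below are illustrative, not the report's actual tool):

```python
import math

def first_harmonic(t, y, omega):
    """Quadrature estimate of y ~ c0 + a*sin(w t) + b*cos(w t); the sums are
    exact for uniform sampling over an integer number of periods."""
    n = len(y)
    c0 = sum(y) / n
    a = 2.0 / n * sum(yi * math.sin(omega * ti) for ti, yi in zip(t, y))
    b = 2.0 / n * sum(yi * math.cos(omega * ti) for ti, yi in zip(t, y))
    return c0, a, b

# Synthetic forced-oscillation record: 4 full periods at 1 Hz, 400 samples
omega = 2.0 * math.pi
t = [i * 4.0 / 400 for i in range(400)]
y = [1.0 + 2.0 * math.sin(omega * ti) + 0.5 * math.cos(omega * ti) for ti in t]
c0, a, b = first_harmonic(t, y, omega)   # recovers 1.0, 2.0, 0.5
```

The in-phase coefficient `a` and out-of-phase coefficient `b` are the quantities from which dynamic stability derivatives are typically formed.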

  20. Accurate Behavioral Simulator of All-Digital Time-Domain Smart Temperature Sensors by Using SIMULINK

    PubMed Central

    Chen, Chun-Chi; Chen, Chao-Lieh; Lin, You-Ting

    2016-01-01

    This study proposes a new behavioral simulator that uses SIMULINK for all-digital CMOS time-domain smart temperature sensors (TDSTSs) for performing rapid and accurate simulations. Inverter-based TDSTSs, which offer low cost and a simple structure for temperature-to-digital conversion, have been developed. Typically, electronic design automation tools, such as HSPICE, are used to simulate TDSTSs for performance evaluations. However, such tools require extremely long simulation times and complex procedures to analyze the results and generate figures. In this paper, we organize simple but accurate equations into a temperature-dependent model (TDM) by which the temperature behavior of TDSTSs can be evaluated. Furthermore, temperature-sensing models of a single CMOS NOT gate were devised using HSPICE simulations. Using the TDM and these temperature-sensing models, a novel simulator in the SIMULINK environment was developed to substantially accelerate the simulation and simplify the evaluation procedures. Experiments demonstrated that the simulation results of the proposed simulator have favorable agreement with those obtained from HSPICE simulations, showing that the proposed simulator functions successfully. This is the first behavioral simulator addressing the rapid simulation of TDSTSs. PMID:27509507
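The temperature-dependent model idea can be illustrated with a toy behavioral sketch: inverter delay grows roughly linearly with temperature, so counting ring-oscillator periods inside a fixed window yields a temperature-dependent digital code. All coefficients below are hypothetical demonstration values, not the paper's fitted model:

```python
def inverter_delay(temp_c, d0=50e-12, k=2e-3):
    """Single-inverter propagation delay (s) with a linear temperature term.
    d0 and k are illustrative, not extracted HSPICE parameters."""
    return d0 * (1.0 + k * (temp_c - 25.0))

def temperature_code(temp_c, stages=64, window=1e-6):
    """Digital output: ring-oscillator periods counted during a fixed window."""
    period = 2 * stages * inverter_delay(temp_c)   # one oscillation period
    return int(window / period)

code_cold = temperature_code(0.0)
code_hot = temperature_code(85.0)
```

In this toy model the count decreases with temperature because each period lengthens; a real TDSTS maps such a count to a calibrated temperature reading.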

  1. Research evaluation support services in biomedical libraries.

    PubMed

    Gutzman, Karen Elizabeth; Bales, Michael E; Belter, Christopher W; Chambers, Thane; Chan, Liza; Holmes, Kristi L; Lu, Ya-Ling; Palmer, Lisa A; Reznik-Zellen, Rebecca C; Sarli, Cathy C; Suiter, Amy M; Wheeler, Terrie R

    2018-01-01

    The paper provides a review of current practices related to evaluation support services reported by seven biomedical and research libraries. A group of seven libraries from the United States and Canada described their experiences with establishing evaluation support services at their libraries. A questionnaire was distributed among the libraries to elicit information as to program development, service and staffing models, campus partnerships, training, products such as tools and reports, and resources used for evaluation support services. The libraries also reported interesting projects, lessons learned, and future plans. The seven libraries profiled in this paper report a variety of service models in providing evaluation support services to meet the needs of campus stakeholders. The service models range from research center cores, partnerships with research groups, and library programs with staff dedicated to evaluation support services. A variety of products and services were described such as an automated tool to develop rank-based metrics, consultation on appropriate metrics to use for evaluation, customized publication and citation reports, resource guides, classes and training, and others. Implementing these services has allowed the libraries to expand their roles on campus and to contribute more directly to the research missions of their institutions. Libraries can leverage a variety of evaluation support services as an opportunity to successfully meet an array of challenges confronting the biomedical research community, including robust efforts to report and demonstrate tangible and meaningful outcomes of biomedical research and clinical care. These services represent a transformative direction that can be emulated by other biomedical and research libraries.

  2. Tools for Local and Distributed Climate Data Access

    NASA Astrophysics Data System (ADS)

    Schweitzer, R.; O'Brien, K.; Burger, E. F.; Smith, K. M.; Manke, A. B.; Radhakrishnan, A.; Balaji, V.

    2017-12-01

    Last year we reported on our efforts to adapt existing tools to facilitate model development. During the lifecycle of a Climate Model Intercomparison Project (CMIP), data must be quality controlled before it can be published and studied. Like previous efforts, the next CMIP6 will produce an unprecedented volume of data. For an institution, modelling group or modeller the volume of data is unmanageable without tools that organize and automate as many processes as possible. Even if a modelling group has tools for data and metadata management, it often falls on individuals to do the initial quality assessment for a model run with bespoke tools. Using individually crafted tools can lead to interruptions when project personnel change and may result in inconsistencies and duplication of effort across groups. This talk will expand on our experiences using available tools (Ferret/PyFerret, the Live Access Server, the GFDL Curator, the GFDL Model Development Database Interface and the THREDDS Data Server) to seamlessly automate the data assembly process to give users "one-click" access to a rich suite of Web-based analysis and comparison tools. On the surface, it appears that this collection of tools is well suited to the task, but our experience of the last year taught us that the data volume and distributed storage add a number of challenges in adapting the tools for this task. Quality control and initial evaluation add their own set of challenges. We will discuss how we addressed the needs of QC researchers by expanding standard tools to include specialized plots and leveraged the configurability of the tools to add specific user-defined analysis operations so they are available to everyone using the system. We also report on our efforts to overcome some of the technical barriers to wide adoption of the tools by providing pre-built containers that are easily deployed in virtual machine and cloud environments. Finally, we will offer some suggestions for added features, configuration options and improved robustness that can make future implementations of similar systems operate faster and more reliably. Solving these challenges for data sets distributed narrowly across the networks and storage systems of a single institution points the way to solving similar problems associated with sharing data distributed across institutions and continents.

  3. Enroute flight planning: Evaluating design concepts for the development of cooperative problem-solving concepts

    NASA Technical Reports Server (NTRS)

    Smith, Philip J.; Mccoy, C. Elaine

    1991-01-01

    The goals of this research were to develop design concepts to support the task of enroute flight planning and, within this context, to explore and evaluate general design concepts and principles to guide the development of cooperative problem-solving systems. A detailed model is to be developed of the cognitive processes involved in flight planning, including the identification of individual differences among subjects; of particular interest will be differences between pilots and dispatchers. The effect on performance of tools that support planning at different levels of abstraction will also be studied. To conduct this research, the Flight Planning Testbed (FPT) was developed: a fully functional testbed environment for studying advanced design concepts for tools to aid in flight planning.

  4. High Reliability Organizations--Medication Safety.

    PubMed

    Yip, Luke; Farmer, Brenna

    2015-06-01

    High reliability organizations (HROs), such as the aviation industry, successfully engage in high-risk endeavors and have a low incidence of adverse events. HROs have a preoccupation with failure and errors. They analyze each event to effect system-wide change in an attempt to mitigate the occurrence of similar errors. The healthcare industry can adapt HRO practices, specifically with regard to teamwork and communication. Crew resource management concepts can be adapted to healthcare with the use of certain tools, such as checklists and the sterile cockpit, to reduce medication errors. HROs also use the Swiss Cheese Model to evaluate risk and look for vulnerabilities in multiple protective barriers, instead of focusing on one failure. This model can be used in medication safety to evaluate medication management in addition to using the teamwork and communication tools of HROs.
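The arithmetic behind the Swiss Cheese Model is simple: if the "holes" in successive defensive layers line up independently, the chance an error penetrates every barrier is the product of the per-layer failure probabilities. A sketch with purely illustrative numbers:

```python
def penetration_probability(hole_probs):
    """Chance an error slips through every defensive layer, assuming the
    'holes' in the layers line up independently."""
    p = 1.0
    for prob in hole_probs:
        p *= prob
    return p

# Hypothetical medication-use barriers: prescribing check, pharmacist review,
# barcode scan at administration. Probabilities are illustrative only.
layers = [0.05, 0.10, 0.02]
p_reach_patient = penetration_probability(layers)   # about 1 in 10,000
```

The model's practical lesson follows directly: adding or strengthening any single layer multiplicatively reduces the end-to-end risk, rather than relying on one barrier being perfect.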

  5. Membrane Packing Problems: A short Review on computational Membrane Modeling Methods and Tools

    PubMed Central

    Sommer, Björn

    2013-01-01

    The use of model membranes is currently part of the daily workflow in many biochemical and biophysical disciplines. These membranes are used to analyze the behavior of small substances, to simulate transport processes, to study the structure of macromolecules, or for illustrative purposes. But how can these membrane structures be generated? This mini review discusses a number of ways to obtain them. First, the problem will be formulated as the Membrane Packing Problem. It will be shown that placing proteins and placing lipids onto a membrane area are theoretically quite different problems; thus, two sub-problems will be defined and discussed. Then, different – partly historical – membrane modeling methods will be introduced. Finally, membrane modeling tools will be evaluated which are able to semi-automatically generate these model membranes and thus drastically accelerate and simplify the membrane generation process. The mini review concludes with advice about which tool is appropriate for which application case. PMID:24688707

  6. A dynamic simulation based water resources education tool.

    PubMed

    Williams, Alison; Lansey, Kevin; Washburne, James

    2009-01-01

    Educational tools to assist the public in recognizing the impacts of water policy in a realistic context are not generally available. This project developed systems-modeling-based educational decision support simulation tools to satisfy this need. The goal of this model is to teach undergraduate students and the general public about the implications of common water management alternatives so that they can better understand or become involved in water policy and make more knowledgeable personal or community decisions. The model is based on Powersim, a dynamic simulation software package capable of producing web-accessible, intuitive, graphical, user-friendly interfaces. Modules are included to represent residential, agricultural, industrial, and turf uses, as well as non-market values, water quality, reservoir, flow, and climate conditions. Supplementary materials emphasize important concepts and lead learners through the model, culminating in an open-ended water management project. The model is used in a University of Arizona undergraduate class and within the Arizona Master Watershed Stewards Program. Evaluation results demonstrated improved understanding of concepts and system interactions, fulfilling the project's objectives.
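The kind of stock-and-flow dynamic simulation such a tool performs can be sketched in a few lines: reservoir storage is the stock, inflow and sector demands are the flows. All rates and capacities below are hypothetical illustrative values, not the model's calibrated inputs:

```python
def simulate_reservoir(storage, inflow, demands, capacity, years):
    """Step reservoir storage forward one year at a time; spill water above
    capacity and never draw the reservoir below empty."""
    history = [storage]
    for _ in range(years):
        storage += inflow - sum(demands.values())
        storage = max(0.0, min(storage, capacity))
        history.append(storage)
    return history

# Illustrative annual volumes (e.g., thousand acre-feet)
demands = {"residential": 30.0, "agricultural": 50.0, "industrial": 15.0}
trace = simulate_reservoir(storage=500.0, inflow=80.0, demands=demands,
                           capacity=600.0, years=10)
```

Changing one demand module and re-running immediately shows learners how a policy choice propagates through the whole system, which is the pedagogical point of the simulation approach.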

  7. Process-level improvements in CMIP5 models and their impact on tropical variability, the Southern Ocean, and monsoons

    NASA Astrophysics Data System (ADS)

    Lauer, Axel; Jones, Colin; Eyring, Veronika; Evaldsson, Martin; Hagemann, Stefan; Mäkelä, Jarmo; Martin, Gill; Roehrig, Romain; Wang, Shiyu

    2018-01-01

    The performance of updated versions of the four earth system models (ESMs) CNRM, EC-Earth, HadGEM, and MPI-ESM is assessed in comparison to their predecessor versions used in Phase 5 of the Coupled Model Intercomparison Project. The Earth System Model Evaluation Tool (ESMValTool) is applied to evaluate selected climate phenomena in the models against observations. This is the first systematic application of the ESMValTool to assess and document the progress made during an extensive model development and improvement project. This study focuses on the South Asian monsoon (SAM) and the West African monsoon (WAM), the coupled equatorial climate, and Southern Ocean clouds and radiation, which are known to exhibit systematic biases in present-day ESMs. The analysis shows that the tropical precipitation in three out of four models is clearly improved. Two of three updated coupled models show an improved representation of tropical sea surface temperatures with one coupled model not exhibiting a double Intertropical Convergence Zone (ITCZ). Simulated cloud amounts and cloud-radiation interactions are improved over the Southern Ocean. Improvements are also seen in the simulation of the SAM and WAM, although systematic biases remain in regional details and the timing of monsoon rainfall. Analysis of simulations with EC-Earth at different horizontal resolutions from T159 up to T1279 shows that the synoptic-scale variability in precipitation over the SAM and WAM regions improves with higher model resolution. The results suggest that the reasonably good agreement of modeled and observed mean WAM and SAM rainfall in lower-resolution models may be a result of unrealistic intensity distributions.

  8. Variation simulation for compliant sheet metal assemblies with applications

    NASA Astrophysics Data System (ADS)

    Long, Yufeng

    Sheet metals are widely used in discrete products, such as automobiles, aircraft, furniture and electronic appliances, due to their good manufacturability and low cost. A typical automotive body assembly consists of more than 300 parts welded together in more than 200 assembly fixture stations. Such an assembly system is usually quite complex and takes a long time to develop. As automotive customers demand products of increasing quality in a shorter time, engineers in the automotive industry turn to computer-aided engineering (CAE) tools for help. Computers are an invaluable resource for engineers, not only to simplify and automate the design process, but also to share design specifications with manufacturing groups so that production systems can be tooled up quickly and efficiently. Therefore, it is beneficial to develop computerized simulation and evaluation tools for the development of automotive body assembly systems. It is a well-known fact that assembly architectures (joints, fixtures, and assembly lines) have a profound impact on the dimensional quality of compliant sheet metal assemblies. To evaluate sheet metal assembly architectures, a special dimensional analysis tool needs to be developed to predict dimensional variation of the assembly. Corresponding systematic tools can then be established to help engineers select assembly architectures. In this dissertation, a unified variation model is developed to predict variation in compliant sheet metal assemblies by considering fixture-induced rigid-body motion, deformation, and springback. Based on the unified variation model, variation propagation models for multiple assembly stations with various configurations are established. To evaluate the dimensional capability of assembly architectures, quantitative indices are proposed based on the sensitivity matrix, which are independent of the variation level of the process. 
Examples are given to demonstrate their applications in selecting robust assembly architectures, and some useful guidelines for the selection of assembly architectures are summarized. In addition, to enhance fault diagnosis, a systematic methodology is proposed for the selection of measurement configurations. Specifically, principles involved in selecting measurements are generalized first; then, the corresponding quantitative indices are developed to evaluate the measurement configurations; and finally, examples are presented.
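One common form for a sensitivity-based index of the kind described is the largest singular value of the sensitivity matrix, which bounds the worst-case amplification of source variation into assembly deviation independently of the actual variation level. A sketch with a hypothetical 2x2 matrix (not from the dissertation), computed in closed form:

```python
import math

def spectral_norm_2x2(s):
    """Largest singular value of a 2x2 matrix s = [[a, b], [c, d]],
    via the eigenvalues of S^T S and the quadratic formula."""
    (a, b), (c, d) = s
    m11 = a * a + c * c
    m22 = b * b + d * d
    m12 = a * b + c * d
    tr, det = m11 + m22, m11 * m22 - m12 * m12
    eig_max = 0.5 * (tr + math.sqrt(max(tr * tr - 4.0 * det, 0.0)))
    return math.sqrt(eig_max)

# Hypothetical sensitivity of two key product points to two variation sources
S = [[1.2, 0.3],
     [0.4, 0.9]]
index = spectral_norm_2x2(S)   # smaller index -> more robust architecture
```

Because the index depends only on the matrix and not on the input variation level, two candidate assembly architectures can be ranked before any process variation data exist.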

  9. Superlative Model Using Word Cloud for Short Answers Evaluation in eLearning

    ERIC Educational Resources Information Center

    Jayashankar, Shailaja; Sridaran, R.

    2017-01-01

    Teachers face an abundance of free-text answers that are daunting to read and evaluate. Automatic assessment of open-ended answers has been attempted in the past, but none of the approaches guarantees 100% accuracy. To deal with the overload involved in this manual evaluation, a new tool becomes necessary. The unique superlative model…

  10. Combustor liner durability analysis

    NASA Technical Reports Server (NTRS)

    Moreno, V.

    1981-01-01

    An 18-month combustor liner durability analysis program was conducted to evaluate the use of advanced three-dimensional transient heat transfer and nonlinear stress-strain analyses for modeling the cyclic thermomechanical response of a simulated combustor liner specimen. Cyclic life prediction technology for creep/fatigue interaction is evaluated using a variety of state-of-the-art tools for crack initiation and propagation. The sensitivity of the initiation models to a change in the operating conditions is also assessed.

  11. Human performance cognitive-behavioral modeling: a benefit for occupational safety.

    PubMed

    Gore, Brian F

    2002-01-01

    Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.

  12. FY09 Final Report for LDRD Project: Understanding Viral Quasispecies Evolution through Computation and Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, C

    2009-11-12

    In FY09 they will (1) complete the implementation, verification, calibration, and sensitivity and scalability analysis of the in-cell virus replication model; (2) complete the design of the cell culture (cell-to-cell infection) model; (3) continue the research, design, and development of their bioinformatics tools: the Web-based structure-alignment-based sequence variability tool and the functional annotation of the genome database; (4) collaborate with the University of California at San Francisco on areas of common interest; and (5) submit journal articles that describe the in-cell model with simulations and the bioinformatics approaches to evaluation of genome variability and fitness.

  13. Human performance cognitive-behavioral modeling: a benefit for occupational safety

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2002-01-01

    Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.

  14. Modeling of Passive Forces of Machine Tool Covers

    NASA Astrophysics Data System (ADS)

    Kolar, Petr; Hudec, Jan; Sulitka, Matej

    The passive forces acting against the drive force influence the dynamic properties and precision of linear axes equipped with feed drives. Covers are one of the important sources of passive forces in machine tools. The paper describes the virtual evaluation of cover passive forces using a complex model of the cover. The model is able to compute the interaction between flexible cover segments and the sealing wiper. The result is the deformation of the cover segments and wipers, which is used together with a measured friction coefficient to compute the total passive force of the cover. This resulting passive force depends on cover position. A comparison of computational results and measurements on the real cover is presented in the paper.

  15. Development of a program logic model and evaluation plan for a participatory ergonomics intervention in construction.

    PubMed

    Jaegers, Lisa; Dale, Ann Marie; Weaver, Nancy; Buchholz, Bryan; Welch, Laura; Evanoff, Bradley

    2014-03-01

    Intervention studies in participatory ergonomics (PE) are often difficult to interpret due to limited descriptions of program planning and evaluation. In an ongoing PE program with floor layers, we developed a logic model to describe our program plan, and process and summative evaluations designed to describe the efficacy of the program. The logic model was a useful tool for describing the program elements and subsequent modifications. The process evaluation measured how well the program was delivered as intended, and revealed the need for program modifications. The summative evaluation provided early measures of the efficacy of the program as delivered. Inadequate information on program delivery may lead to erroneous conclusions about intervention efficacy due to Type III error. A logic model guided the delivery and evaluation of our intervention and provides useful information to aid interpretation of results. © 2013 Wiley Periodicals, Inc.

  16. Development of a Program Logic Model and Evaluation Plan for a Participatory Ergonomics Intervention in Construction

    PubMed Central

    Jaegers, Lisa; Dale, Ann Marie; Weaver, Nancy; Buchholz, Bryan; Welch, Laura; Evanoff, Bradley

    2013-01-01

    Background Intervention studies in participatory ergonomics (PE) are often difficult to interpret due to limited descriptions of program planning and evaluation. Methods In an ongoing PE program with floor layers, we developed a logic model to describe our program plan, and process and summative evaluations designed to describe the efficacy of the program. Results The logic model was a useful tool for describing the program elements and subsequent modifications. The process evaluation measured how well the program was delivered as intended, and revealed the need for program modifications. The summative evaluation provided early measures of the efficacy of the program as delivered. Conclusions Inadequate information on program delivery may lead to erroneous conclusions about intervention efficacy due to Type III error. A logic model guided the delivery and evaluation of our intervention and provides useful information to aid interpretation of results. PMID:24006097

  17. Did you have an impact? A theory-based method for planning and evaluating knowledge-transfer and exchange activities in occupational health and safety.

    PubMed

    Kramer, Desré M; Wells, Richard P; Carlan, Nicolette; Aversa, Theresa; Bigelow, Philip P; Dixon, Shane M; McMillan, Keith

    2013-01-01

    Few evaluation tools are available to assess knowledge-transfer and exchange interventions. The objective of this paper is to develop and demonstrate a theory-based knowledge-transfer and exchange method of evaluation (KEME) that synthesizes 3 theoretical frameworks: the promoting action on research implementation of health services (PARiHS) model, the transtheoretical model of change, and a model of knowledge use. It proposes a new term, keme, to mean a unit of evidence-based transferable knowledge. The usefulness of the evaluation method is demonstrated with 4 occupational health and safety knowledge transfer and exchange (KTE) implementation case studies that are based upon the analysis of over 50 pre-existing interviews. The usefulness of the evaluation model has enabled us to better understand stakeholder feedback, frame our interpretation, and perform a more comprehensive evaluation of the knowledge use outcomes of our KTE efforts.

  18. Impact of Parameter Uncertainty Assessment of Critical SWAT Output Simulations

    USDA-ARS?s Scientific Manuscript database

    Watershed models are increasingly being utilized to evaluate alternate management scenarios for improving water quality. The concern for using these tools in extensive programs such as the National Total Maximum Daily Load (TMDL) program is that the certainty of model results and efficacy of managem...

  19. Evaluation of transportation/air quality model improvements based on TOTEMS on-road driving style and tailpipe emissions data.

    DOT National Transportation Integrated Search

    2014-06-01

    In June 2012, the Environmental Protection Agency (EPA) released the Operating Mode Distribution Generator (OMDG), a tool for developing an operating mode distribution as an input to the Motor Vehicle Emissions Simulator model (MOVES). The t...

  20. Evaluating hydrological model performance using information theory-based metrics

    USDA-ARS?s Scientific Manuscript database

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...
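One information-theory metric of the kind this line of work uses is the Kullback-Leibler divergence between the distributions of observed and simulated streamflow. A minimal sketch (binning choices and flow values are illustrative, not the manuscript's method):

```python
import math

def histogram(values, width, nbins):
    """Normalized histogram with a tiny pseudo-count so no bin is empty."""
    counts = [0] * nbins
    for v in values:
        counts[min(int(v / width), nbins - 1)] += 1
    eps = 1e-9
    total = len(values) + nbins * eps
    return [(c + eps) / total for c in counts]

def kl_divergence(obs, sim, width=10.0, nbins=5):
    """KL(P_obs || P_sim) of binned flow distributions, in nats."""
    p = histogram(obs, width, nbins)
    q = histogram(sim, width, nbins)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Illustrative daily flows (e.g., cubic feet per second)
observed = [3, 8, 12, 15, 22, 27, 33, 41, 18, 9]
simulated = [5, 7, 14, 16, 20, 25, 30, 44, 21, 11]
d = kl_divergence(observed, simulated)
```

Unlike an accuracy metric such as RMSE, this score compares the shapes of the two flow distributions, so it can flag a model that matches point-by-point values poorly yet reproduces the flow regime well, or vice versa.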

  1. Using WNTR to Model Water Distribution System Resilience

    EPA Science Inventory

    The Water Network Tool for Resilience (WNTR) is a new open source Python package developed by the U.S. Environmental Protection Agency and Sandia National Laboratories to model and evaluate resilience of water distribution systems. WNTR can be used to simulate a wide range of di...

  2. Verification of NWP Cloud Properties using A-Train Satellite Observations

    NASA Astrophysics Data System (ADS)

    Kucera, P. A.; Weeks, C.; Wolff, C.; Bullock, R.; Brown, B.

    2011-12-01

    Recently, the NCAR Model Evaluation Tools (MET) has been enhanced to incorporate satellite observations for the verification of Numerical Weather Prediction (NWP) cloud products. We have developed tools that match fields spatially (both in the vertical and horizontal dimensions) to compare NWP products with satellite observations. These matched fields provide diagnostic evaluation of cloud macro attributes such as vertical distribution of clouds, cloud top height, and the spatial and seasonal distribution of cloud fields. For this research study, we have focused on using CloudSat, CALIPSO, and MODIS observations to evaluate cloud fields for a variety of NWP fields and derived products. We have selected cases ranging from large, mid-latitude synoptic systems to well-organized tropical cyclones. For each case, we matched the observed cloud field with gridded model and/or derived product fields. CloudSat and CALIPSO observations and model fields were matched and compared in the vertical along the orbit track. MODIS data and model fields were matched and compared in the horizontal. We then use MET to compute the verification statistics to quantify the performance of the models in representing the cloud fields. In this presentation we will give a summary of our comparison and show verification results for both synoptic and tropical cyclone cases.
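Once satellite and model cloud fields are matched along the orbit track, standard verification statistics follow directly. A toy sketch of that final scoring step (paired cloud-top heights below are illustrative, not MET output):

```python
import math

def verification_stats(matched):
    """Mean bias and RMSE of model-minus-observation differences
    for pre-matched (observation, model) pairs."""
    diffs = [model - obs for obs, model in matched]
    n = len(diffs)
    bias = sum(diffs) / n
    rmse = math.sqrt(sum(d * d for d in diffs) / n)
    return bias, rmse

# Hypothetical cloud-top heights (km) matched along an orbit track
pairs = [(9.8, 10.4), (11.2, 10.9), (7.5, 8.3), (12.0, 12.6)]
bias, rmse = verification_stats(pairs)
```

The hard part the presentation describes is the matching itself, vertically and horizontally between gridded model fields and CloudSat/CALIPSO/MODIS footprints; the scoring is routine once the pairs exist.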

  3. Evaluation of modeling for groundwater flow and tetrachloroethylene transport in the Milford-Souhegan glacial-drift aquifer at the Savage Municipal Well Superfund site, Milford, New Hampshire, 2011

    USGS Publications Warehouse

    Harte, Philip T.

    2012-01-01

    The U.S. Geological Survey and the New Hampshire Department of Environmental Services entered into a cooperative agreement to assist in the evaluation of remedy simulations of the MSGD aquifer that are being performed by various parties to track the remedial progress of the PCE plume. This report summarizes findings from this evaluation. Topics covered include description of groundwater flow and transport models used in the study of the Savage Superfund site (section 2), evaluation of models and their results (section 3), testing of several new simulations (section 4), an assessment of the representation of models to simulate field conditions (section 5), and an assessment of models as a tool in remedial operational decision making (section 6).

  4. Visual Basic, Excel-based fish population modeling tool - The pallid sturgeon example

    USGS Publications Warehouse

    Moran, Edward H.; Wildhaber, Mark L.; Green, Nicholas S.; Albers, Janice L.

    2016-02-10

The model presented in this report is a spreadsheet-based model using Visual Basic for Applications within Microsoft Excel (http://dx.doi.org/10.5066/F7057D0Z) prepared in cooperation with the U.S. Army Corps of Engineers and U.S. Fish and Wildlife Service. It uses the same model structure and, initially, parameters as used by Wildhaber and others (2015) for pallid sturgeon. The difference between the model structure used for this report and that used by Wildhaber and others (2015) is that variance is not partitioned. For the model of this report, all variance is applied at the iteration and time-step levels of the model. Wildhaber and others (2015) partition variance into parameter variance (uncertainty about the value of a parameter itself) applied at the iteration level and temporal variance (uncertainty caused by random environmental fluctuations with time) applied at the time-step level. They included implicit individual variance (uncertainty caused by differences between individuals) within the time-step level. The interface developed for the model of this report is designed to allow the user the flexibility to change population model structure and parameter values and uncertainty separately for every component of the model. This flexibility makes the modeling tool potentially applicable to any fish species; however, the flexibility inherent in this modeling tool makes it possible for the user to obtain spurious outputs. The value and reliability of the model outputs are only as good as the model inputs. Using this modeling tool with improper or inaccurate parameter values, or for species for which the structure of the model is inappropriate, could lead to untenable management decisions. By facilitating fish population modeling, this modeling tool allows the user to evaluate a range of management options and implications.
The goal is a user-friendly tool for developing fish population models that natural resource managers can use to inform their decision-making processes; however, as with all population models, caution is needed, and the limitations of a model and the veracity of user-supplied parameters should always be fully understood when using such model output in the management of any species.
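The distinction between applying all variance at every time step and partitioning it between iteration and time-step levels can be sketched in a few lines. The structure and all parameter values below are invented for illustration and are not the published pallid sturgeon parameters:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative parameters (hypothetical, NOT the published pallid sturgeon values).
MEAN_SURVIVAL = 0.85   # mean annual survival rate
SURVIVAL_SD = 0.05     # standard deviation of survival uncertainty
MEAN_RECRUITS = 0.3    # mean recruits per adult per year
RECRUIT_SD = 0.1
N0 = 1000              # starting abundance
YEARS = 20
ITERATIONS = 500

def project_population(partition_variance: bool) -> np.ndarray:
    """Project abundance trajectories.

    If partition_variance is True, half the survival uncertainty is drawn once
    per iteration (parameter variance) and half at every time step (temporal
    variance), loosely following the Wildhaber and others (2015) scheme. If
    False, all variance is drawn at every time step of every iteration, as in
    the spreadsheet tool described above.
    """
    out = np.empty((ITERATIONS, YEARS + 1))
    for i in range(ITERATIONS):
        iter_offset = rng.normal(0.0, SURVIVAL_SD / 2) if partition_variance else 0.0
        step_sd = SURVIVAL_SD / 2 if partition_variance else SURVIVAL_SD
        n = float(N0)
        out[i, 0] = n
        for t in range(1, YEARS + 1):
            s = np.clip(MEAN_SURVIVAL + iter_offset + rng.normal(0.0, step_sd), 0.0, 1.0)
            r = max(rng.normal(MEAN_RECRUITS, RECRUIT_SD), 0.0)
            n = n * s * (1.0 + r)   # survivors plus their recruits
            out[i, t] = n
    return out

runs = project_population(partition_variance=False)
print(f"mean abundance after {YEARS} years: {runs[:, -1].mean():.0f}")
```

With `partition_variance=True`, trajectories within one iteration share a survival offset, so spread between iterations grows relative to spread within them.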

  5. Simulating Soil Organic Matter with CQESTR (v.2.0): Model Description and Validation against Long-term Experiments across North America

    USDA-ARS?s Scientific Manuscript database

    Soil carbon (C) models are important tools for examining complex interactions between climate, crop and soil management practices, and to evaluate the long-term effects of management practices on C-storage potential in soils. CQESTR is a process-based carbon balance model that relates crop residue a...

  6. Construct Validation of the Louisiana School Analysis Model (SAM) Instructional Staff Questionnaire

    ERIC Educational Resources Information Center

    Bray-Clark, Nikki; Bates, Reid

    2005-01-01

The purpose of this study was to validate the Louisiana SAM Instructional Staff Questionnaire (SISQ), a key component of the Louisiana School Analysis Model. The model was designed as a comprehensive evaluation tool for schools. Principal axis factoring with oblique rotation was used to uncover the underlying structure of the SISQ. (Contains 1 table.)
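Principal axis factoring can be sketched as an iterated eigendecomposition of the correlation matrix with communality estimates on the diagonal. This is a generic illustration on a toy correlation matrix, not the authors' analysis, and the oblique rotation step used in the study is omitted:

```python
import numpy as np

def principal_axis_factoring(R, n_factors, n_iter=50):
    """Iterated principal-axis factoring of a correlation matrix R.

    Returns the unrotated loading matrix. An oblique rotation (e.g., oblimin,
    as used in the study) would normally be applied to these loadings
    afterwards; that step is omitted here for brevity.
    """
    R = np.asarray(R, dtype=float)
    # Initial communality estimates: largest absolute off-diagonal correlation.
    h2 = np.max(np.abs(R - np.eye(len(R))), axis=1)
    for _ in range(n_iter):
        reduced = R.copy()
        np.fill_diagonal(reduced, h2)          # replace 1s with communalities
        vals, vecs = np.linalg.eigh(reduced)
        idx = np.argsort(vals)[::-1][:n_factors]
        loadings = vecs[:, idx] * np.sqrt(np.clip(vals[idx], 0, None))
        h2_new = (loadings ** 2).sum(axis=1)   # updated communalities
        converged = np.allclose(h2_new, h2, atol=1e-6)
        h2 = h2_new
        if converged:
            break
    return loadings

# Toy correlation matrix with one dominant factor (hypothetical data).
R = np.array([[1.0, 0.6, 0.5],
              [0.6, 1.0, 0.55],
              [0.5, 0.55, 1.0]])
L = principal_axis_factoring(R, n_factors=1)
print(L.ravel())
```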

  7. A Conceptual Framework for Evaluating the Domains of Applicability of Ecological Models and its Implementation in the Ecological Production Function Library

    EPA Science Inventory

    The use of computational ecological models to inform environmental management and policy has proliferated in the past 25 years. These models have become essential tools as linkages and feedbacks between human actions and ecological responses can be complex, and as funds for sampl...

  8. Computational Tools for Probing Interactions in Multiple Linear Regression, Multilevel Modeling, and Latent Curve Analysis

    ERIC Educational Resources Information Center

    Preacher, Kristopher J.; Curran, Patrick J.; Bauer, Daniel J.

    2006-01-01

    Simple slopes, regions of significance, and confidence bands are commonly used to evaluate interactions in multiple linear regression (MLR) models, and the use of these techniques has recently been extended to multilevel or hierarchical linear modeling (HLM) and latent curve analysis (LCA). However, conducting these tests and plotting the…
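The simple-slope machinery such tools automate reduces to a few operations on the OLS coefficient covariance matrix. A minimal sketch on simulated data (not the paper's utilities, whose interfaces differ): for y = b0 + b1·x + b2·z + b3·x·z, the slope of x at moderator value z0 is b1 + b3·z0, with variance Var(b1) + 2·z0·Cov(b1,b3) + z0²·Var(b3).

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data with a genuine x*z interaction (hypothetical example).
n = 400
x = rng.normal(size=n)
z = rng.normal(size=n)
y = 1.0 + 0.5 * x + 0.3 * z + 0.4 * x * z + rng.normal(scale=1.0, size=n)

# OLS fit of y = b0 + b1*x + b2*z + b3*x*z
X = np.column_stack([np.ones(n), x, z, x * z])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
cov = sigma2 * np.linalg.inv(X.T @ X)   # coefficient covariance matrix

def simple_slope(z0):
    """Slope of y on x at moderator value z0, with its standard error."""
    slope = beta[1] + beta[3] * z0
    se = np.sqrt(cov[1, 1] + 2 * z0 * cov[1, 3] + z0 ** 2 * cov[3, 3])
    return slope, se

for z0 in (-1.0, 0.0, 1.0):
    s, se = simple_slope(z0)
    print(f"z = {z0:+.1f}: slope = {s:.3f}, t = {s / se:.2f}")
```

The region of significance is then the set of z0 where |slope/se| exceeds the critical t value.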

  9. Space station interior noise analysis program

    NASA Technical Reports Server (NTRS)

    Stusnick, E.; Burn, M.

    1987-01-01

Documentation is provided for a microcomputer program which was developed to evaluate the effect of the vibroacoustic environment on speech communication inside a space station. The program, entitled Space Station Interior Noise Analysis Program (SSINAP), combines a Statistical Energy Analysis (SEA) prediction of sound and vibration levels within the space station with a speech intelligibility model based on the Modulation Transfer Function and the Speech Transmission Index (MTF/STI). The SEA model provides an effective analysis tool for predicting the acoustic environment based on proposed space station design. The MTF/STI model provides a method for evaluating speech communication in the relatively reverberant and potentially noisy environments that are likely to occur in space stations. The combination of these two models provides a powerful analysis tool for optimizing the acoustic design of space stations from the point of view of speech communications. The mathematical algorithms used in SSINAP are presented to implement the SEA and MTF/STI models. An appendix provides an explanation of the operation of the program along with details of the program structure and code.
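The MTF-to-STI link the abstract describes can be illustrated with the classic unweighted STI calculation: each modulation transfer factor maps to an apparent signal-to-noise ratio, which is clipped to ±15 dB, averaged, and rescaled to 0..1. The full IEC 60268-16 procedure adds octave-band weighting and masking terms omitted here:

```python
import math

def sti_from_modulation(m_values):
    """Simplified Speech Transmission Index from modulation transfer factors.

    Each m is the modulation transfer factor (0..1) at one modulation
    frequency. This is the classic unweighted calculation; the full
    IEC 60268-16 procedure adds octave-band weighting and masking.
    """
    snrs = []
    for m in m_values:
        m = min(max(m, 1e-6), 1 - 1e-6)          # avoid log of 0
        snr = 10.0 * math.log10(m / (1.0 - m))   # apparent SNR, dB
        snrs.append(min(max(snr, -15.0), 15.0))  # clip to +/-15 dB
    mean_snr = sum(snrs) / len(snrs)
    return (mean_snr + 15.0) / 30.0              # map to 0..1

# Moderately degraded transmission (hypothetical modulation factors).
print(sti_from_modulation([0.9, 0.8, 0.7]))
```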

  10. A procedural model for planning and evaluating behavioral interventions.

    PubMed

    Hyner, G C

    2005-01-01

A model for planning, implementing and evaluating health behavior change strategies is proposed. Variables are presented which can be used in the model or serve as examples for how the model is utilized once a theory of health behavior is adopted. Examples of three innovative strategies designed to influence behavior change are presented so that the proposed model can be modified for use following comprehensive screening and baseline measurements. Three measurement priorities (clients, methods, and agency) are subjected to three phases of assessment (goals, implementation, and effects). Lifestyles account for the majority of variability in quality of life and premature morbidity and mortality. Interventions designed to influence healthy behavior changes must be driven by theory and carefully planned and evaluated. The proposed model is offered as a useful tool for the behavior change strategist.

  11. Construct validity of the ovine model in endoscopic sinus surgery training.

    PubMed

    Awad, Zaid; Taghi, Ali; Sethukumar, Priya; Tolley, Neil S

    2015-03-01

    To demonstrate construct validity of the ovine model as a tool for training in endoscopic sinus surgery (ESS). Prospective, cross-sectional evaluation study. Over 18 consecutive months, trainees and experts were evaluated in their ability to perform a range of tasks (based on previous face validation and descriptive studies conducted by the same group) relating to ESS on the sheep-head model. Anonymized randomized video recordings of the above were assessed by two independent and blinded assessors. A validated assessment tool utilizing a five-point Likert scale was employed. Construct validity was calculated by comparing scores across training levels and experts using mean and interquartile range of global and task-specific scores. Subgroup analysis of the intermediate group ascertained previous experience. Nonparametric descriptive statistics were used, and analysis was carried out using SPSS version 21 (IBM, Armonk, NY). Reliability of the assessment tool was confirmed. The model discriminated well between different levels of expertise in global and task-specific scores. A positive correlation was noted between year in training and both global and task-specific scores (P < .001). Experience of the intermediate group was variable, and the number of ESS procedures performed under supervision had the highest impact on performance. This study describes an alternative model for ESS training and assessment. It is also the first to demonstrate construct validity of the sheep-head model for ESS training. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.

  12. The development and validation of the clinicians' awareness towards cognitive errors (CATChES) in clinical decision making questionnaire tool.

    PubMed

    Chew, Keng Sheng; Kueh, Yee Cheng; Abdul Aziz, Adlihafizi

    2017-03-21

Despite their importance to diagnostic accuracy, there is a paucity of literature on questionnaire tools to assess clinicians' awareness of cognitive errors. A validation study was conducted to develop a questionnaire tool to evaluate the Clinicians' Awareness Towards Cognitive Errors (CATChES) in clinical decision making. This questionnaire is divided into two parts. Part A evaluates clinicians' awareness of cognitive errors in clinical decision making, while Part B evaluates their perception of specific cognitive errors. Content validation for both parts was first determined, followed by construct validation for Part A. Construct validation for Part B was not determined as the responses were set in a dichotomous format. For content validation, all items in both Part A and Part B were rated as "excellent" in terms of their relevance in clinical settings. For construct validation using exploratory factor analysis (EFA) for Part A, a two-factor model with total variance extraction of 60% was determined. Two items were deleted. The EFA was then repeated, showing that all factor loadings were above the cut-off value of 0.5. The Cronbach's alpha values for both factors were above 0.6. The CATChES questionnaire is a valid tool for evaluating awareness among clinicians toward cognitive errors in clinical decision making.
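Cronbach's alpha, used above to judge the internal consistency of each factor, is straightforward to compute from an item-score matrix. The response data below are hypothetical:

```python
import numpy as np

def cronbach_alpha(items) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Hypothetical responses from 6 clinicians to 4 Likert items.
scores = np.array([
    [4, 4, 5, 4],
    [2, 3, 2, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 2],
    [4, 5, 4, 4],
    [1, 2, 1, 2],
])
alpha = cronbach_alpha(scores)
print(round(alpha, 3))
```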

  13. Computational modeling of radiofrequency ablation: evaluation on ex vivo data using ultrasound monitoring

    NASA Astrophysics Data System (ADS)

    Audigier, Chloé; Kim, Younsu; Dillow, Austin; Boctor, Emad M.

    2017-03-01

Radiofrequency ablation (RFA) is the most widely used minimally invasive ablative therapy for liver cancer, but it is challenged by a lack of patient-specific monitoring. Inter-patient tissue variability and the presence of blood vessels make the prediction of the RFA difficult. A monitoring tool which can be personalized for a given patient during the intervention would be helpful to achieve a complete tumor ablation. However, clinicians do not have access to such a tool, which results in incomplete treatment and a large number of recurrences. Computational models can simulate the phenomena and mechanisms governing this therapy. The temperature evolution as well as the resulting ablation can be modeled. When combined with intraoperative measurements, computational modeling becomes an accurate and powerful tool to gain quantitative understanding and to enable improvements in the ongoing clinical settings. This paper shows how computational models of RFA can be evaluated using intra-operative measurements. First, simulations are used to demonstrate the feasibility of the method, which is then evaluated on two ex vivo datasets. RFA is simulated on a simplified geometry to generate realistic longitudinal temperature maps and the resulting necrosis. Computed temperatures are compared with the temperature evolution recorded using thermometers, and with temperatures monitored by ultrasound (US) in a 2D plane containing the ablation tip. Two ablations were performed on two cadaveric bovine livers, and we achieved an average error of 2.2 °C between the computed and thermistor temperatures, and average errors of 1.4 °C and 2.7 °C between the computed and US-monitored temperatures at two different time points (t = 240 s and t = 900 s).
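A forward model of the kind evaluated here can be sketched as a finite-difference heat equation with a source term at the electrode. The one-dimensional toy below is only an illustration of that class of model, not the authors' solver: all parameters are assumed, and perfusion and the full bioheat formulation are omitted.

```python
import numpy as np

# Minimal 1-D explicit finite-difference sketch of tissue heating around an
# RF source. All values are assumptions for illustration.
L = 0.04           # domain length, m
NX = 81            # grid points
DX = L / (NX - 1)
ALPHA = 1.4e-7     # thermal diffusivity of soft tissue, m^2/s (approx.)
DT = 0.2 * DX * DX / ALPHA   # stable: below DX^2 / (2 * ALPHA)
T_BODY = 37.0      # baseline temperature, deg C
SOURCE = 2.0       # heating rate at the electrode node, deg C per second (assumed)

T = np.full(NX, T_BODY)
mid = NX // 2
t, t_end = 0.0, 240.0
while t < t_end:
    lap = (T[:-2] - 2 * T[1:-1] + T[2:]) / DX**2
    T[1:-1] += DT * ALPHA * lap    # diffusion step
    T[mid] += DT * SOURCE          # energy deposited at the ablation tip
    T[0] = T[-1] = T_BODY          # far field held at body temperature
    t += DT

print(f"tip temperature after {t_end:.0f} s: {T[mid]:.1f} deg C")
```

Evaluating such a model against the paper's measurements would then be a matter of comparing `T` along the US imaging plane with the monitored temperature maps.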

  14. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation. Volume 1: Study results

    NASA Technical Reports Server (NTRS)

    Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.

    1982-01-01

    A variety of artificial intelligence techniques which could be used with regard to NASA space applications and robotics were evaluated. The techniques studied were decision tree manipulators, problem solvers, rule based systems, logic programming languages, representation language languages, and expert systems. The overall structure of a robotic simulation tool was defined and a framework for that tool developed. Nonlinear and linearized dynamics equations were formulated for n link manipulator configurations. A framework for the robotic simulation was established which uses validated manipulator component models connected according to a user defined configuration.

  15. Ecosystem services valuation to support decisionmaking on public lands—A case study of the San Pedro River watershed, Arizona

    USGS Publications Warehouse

    Bagstad, Kenneth J.; Semmens, Darius; Winthrop, Rob; Jaworksi, Delilah; Larson, Joel

    2012-01-01

This report details the findings of the Bureau of Land Management–U.S. Geological Survey Ecosystem Services Valuation Pilot Study. This project evaluated alternative methods and tools that quantify and value ecosystem services, and it assessed the tools’ readiness for use in the Bureau of Land Management decisionmaking process. We tested these tools on the San Pedro River watershed in northern Sonora, Mexico, and southeast Arizona. The study area includes the San Pedro Riparian National Conservation Area (managed by the Bureau of Land Management), which has been a focal point for conservation activities and scientific research in recent decades. We applied past site-specific primary valuation studies, value transfer, the Wildlife Habitat Benefits Estimation Toolkit, and the Integrated Valuation of Ecosystem Services and Tradeoffs (InVEST) and Artificial Intelligence for Ecosystem Services (ARIES) models to value locally important ecosystem services for the San Pedro River watershed—water, carbon, biodiversity, and cultural values. We tested these approaches on a series of scenarios to evaluate ecosystem service changes and the ability of the tools to accommodate scenarios. Additional tools were either at too early a stage of development to run, were proprietary, or were place-specific tools inappropriate for application to the San Pedro River watershed. We described the strengths and weaknesses of these additional ecosystem service tools against a series of evaluative criteria related to their usefulness for Bureau of Land Management decisionmaking. Using these tools, we quantified gains or losses of ecosystem services under three categories of scenarios: urban growth, mesquite management, and water augmentation. These results quantify tradeoffs and could be useful for decisionmaking within Bureau of Land Management district or field offices.
Results are accompanied by a relatively high level of uncertainty associated with model outputs, valuation methods, and discount rates applied. Further guidance on representing uncertainty and applying uncertain results in decisionmaking would benefit both tool developers and the offices using ecosystem services to compare management tradeoffs. Decisionmakers and Bureau of Land Management managers at the State-, district-, and field-office level would also benefit from continuing model improvements, training, and guidance on tool use that can be provided by the U.S. Geological Survey, the Bureau of Land Management, and the Department of the Interior. Tradeoffs were identified in the level of effort needed to parameterize and run tools and the amount and quality of information they provide to the decision process. We found the Wildlife Habitat Benefits Estimation Toolkit, Ecosystem Services Review, and United Nations Environment Programme–World Conservation Monitoring Centre Ecosystem Services Toolkit to be immediately feasible for application by the Bureau of Land Management, given proper guidance on their use. It is also feasible for the Bureau of Land Management to use the InVEST model, but in early 2012 the process of parameterizing the model required resources and expertise that are unlikely to be available in most Bureau of Land Management district or field offices. Application of past primary valuation is feasible, but developing new primary-valuation studies is too time consuming for regular application. Value transfer approaches (aside from the Wildlife Habitat Benefits Estimation Toolkit) are best applied carefully on the basis of guidelines described in this report, to reduce transfer error.
The ARIES model can provide useful information in regions modeled in the past (Arizona, California, Colorado, and Washington), but it lacks some features that will improve its usability, such as a generalized model that could be applied anywhere in the United States. Eleven other tools described in this report could become useful as the tools more fully develop, in high-profile cases for which additional resources are available for tool application or in case-study regions where place-specific models have already been developed. To improve the value of these tools in decisionmaking, we suggest scientific needs that agencies such as U.S. Geological Survey can help meet—for instance, development and support of data archives. Such archives could greatly reduce resource needs and improve the reliability and consistency of results. Given the rapid state of evolution in the field, periodic follow-up studies on ecosystem services tools would help to ensure that the Bureau of Land Management and other public land management agencies are kept up to date on new tools and features that bring ecosystem services closer to readiness for use in regular decisionmaking.

  16. Reflections on experiential learning in evaluation capacity building with a community organization, Dancing With Parkinson's.

    PubMed

    Nakaima, April; Sridharan, Sanjeev

    2017-05-08

    This paper discusses what was learned about evaluation capacity building with community organizations who deliver services to individuals with neurological disorders. Evaluation specialists engaged by the Ontario Brain Institute Evaluation Support Program were paired with community organizations, such as Dancing With Parkinson's. Some of the learning included: relationship building is key for this model of capacity building; community organizations often have had negative experiences with evaluation and the idea that evaluations can be friendly tools in implementing meaningful programs is one key mechanism by which such an initiative can work; community organizations often need evaluation most to be able to demonstrate their value; a strength of this initiative was that the focus was not just on creating products but mostly on developing a learning process in which capacities would remain; evaluation tools and skills that organizations found useful were developing a theory of change and the concept of heterogeneous mechanisms (informed by a realist evaluation lens). Copyright © 2017. Published by Elsevier Ltd.

  17. Modeling and simulation of five-axis virtual machine based on NX

    NASA Astrophysics Data System (ADS)

    Li, Xiaoda; Zhan, Xianghui

    2018-04-01

Virtual technology plays a growing role in the machinery manufacturing industry. In this paper, the Siemens NX software is used to model a virtual CNC machine tool, and the parameters of the virtual machine are defined according to the actual parameters of the machine tool so that virtual simulation can be carried out without loss of accuracy. The paper describes how to use the machine builder of the CAM module to define the machine's kinematic chain and components. Simulation with the virtual machine can alert users to tool collisions and overcutting during the process, and can evaluate and forecast the soundness of the process plan.
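A kinematic chain like the one defined in the NX machine builder amounts to a product of homogeneous transforms, one per axis. The sketch below chains two rotary axes and one linear axis; the C-A-Z configuration, axis order, and tool length are hypothetical and unrelated to NX's internal format:

```python
import numpy as np

def rot_z(theta):
    """Homogeneous rotation about Z (e.g., a C rotary axis)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0, 0],
                     [s,  c, 0, 0],
                     [0,  0, 1, 0],
                     [0,  0, 0, 1]])

def rot_x(theta):
    """Homogeneous rotation about X (e.g., an A tilt axis)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1, 0,  0, 0],
                     [0, c, -s, 0],
                     [0, s,  c, 0],
                     [0, 0,  0, 1]])

def translate(x, y, z):
    """Homogeneous translation (e.g., a linear Z axis)."""
    M = np.eye(4)
    M[:3, 3] = [x, y, z]
    return M

def tool_tip_position(c_angle, a_angle, z_travel, tool_length=0.1):
    """Tool-tip position in machine coordinates for a hypothetical C-A-Z chain."""
    T = rot_z(c_angle) @ rot_x(a_angle) @ translate(0, 0, z_travel + tool_length)
    tip = T @ np.array([0.0, 0.0, 0.0, 1.0])
    return tip[:3]

print(tool_tip_position(0.0, 0.0, 0.2))   # tip straight along the Z axis
```

Collision and overcut checks then reduce to testing these transformed tool positions against the workpiece and fixture geometry at each simulated step.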

  18. New machine learning tools for predictive vegetation mapping after climate change: Bagging and Random Forest perform better than Regression Tree Analysis

    Treesearch

    L.R. Iverson; A.M. Prasad; A. Liaw

    2004-01-01

More and better machine learning tools are becoming available for landscape ecologists to aid in understanding species-environment relationships and to map probable species occurrence now and potentially into the future. To that end, we evaluated three statistical models: Regression Tree Analysis (RTA), Bagging Trees (BT), and Random Forest (RF) for their utility in...
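The bagging mechanism evaluated above can be demonstrated with single-split regression "stumps": fit many of them on bootstrap resamples and average their predictions. The data below are simulated, not the vegetation data set, and the stump learner is a deliberately minimal stand-in for a full regression tree:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D regression problem (hypothetical data).
x = rng.uniform(0, 10, size=300)
y = np.sin(x) + rng.normal(scale=0.3, size=300)

def fit_stump(xs, ys):
    """Single-split regression tree: threshold minimizing squared error."""
    best = (xs.mean(), ys.mean(), ys.mean(), np.inf)
    for thr in np.quantile(xs, np.linspace(0.05, 0.95, 19)):
        left, right = ys[xs <= thr], ys[xs > thr]
        if len(left) == 0 or len(right) == 0:
            continue
        sse = ((left - left.mean())**2).sum() + ((right - right.mean())**2).sum()
        if sse < best[3]:
            best = (thr, left.mean(), right.mean(), sse)
    return best[:3]

def predict_stump(stump, xs):
    thr, lo, hi = stump
    return np.where(xs <= thr, lo, hi)

def bagged_predict(x_train, y_train, x_test, n_trees=100):
    """Bagging: average stumps fit on bootstrap resamples."""
    preds = np.zeros_like(x_test)
    for _ in range(n_trees):
        idx = rng.integers(0, len(x_train), len(x_train))
        preds += predict_stump(fit_stump(x_train[idx], y_train[idx]), x_test)
    return preds / n_trees

grid = np.linspace(0, 10, 200)
single = predict_stump(fit_stump(x, y), grid)
bagged = bagged_predict(x, y, grid)
mse_single = np.mean((single - np.sin(grid))**2)
mse_bagged = np.mean((bagged - np.sin(grid))**2)
print(f"single stump MSE: {mse_single:.3f}, bagged MSE: {mse_bagged:.3f}")
```

Random Forest adds one further ingredient on top of bagging: each split considers only a random subset of predictors, which further decorrelates the trees.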

  19. Measuring Educators' Perceptions of Their Skills Relative to Response to Intervention: A Psychometric Study of a Survey Tool

    ERIC Educational Resources Information Center

    Castillo, Jose M.; March, Amanda L.; Stockslager, Kevin M.; Hines, Constance V.

    2016-01-01

    The "Perceptions of RtI Skills Survey" is a self-report measure that assesses educators' perceptions of their data-based problem-solving skills--a critical element of many Response-to-Intervention (RtI) models. Confirmatory factor analysis (CFA) was used to evaluate the underlying factor structure of this tool. Educators from 68 (n =…

  20. Pond and Irrigation Model (PIM): a tool for simultaneously evaluating pond water availability and crop irrigation demand

    Treesearch

    Ying Ouyang; Gary Feng; Theodor D. Leininger; John Read; Johnie N. Jenkins

    2018-01-01

    Agricultural ponds are an important alternative source of water for crop irrigation to conserve surface and ground water resources. In recent years more such ponds have been constructed in Mississippi and around the world. There is currently, however, a lack of a tool to simultaneously estimate crop irrigation demand and pond water availability. In this study, a Pond-...

  1. General Tool for Evaluating High-Contrast Coronagraphic Telescope Performance Error Budgets

    NASA Technical Reports Server (NTRS)

    Marchen, Luis F.

    2011-01-01

    The Coronagraph Performance Error Budget (CPEB) tool automates many of the key steps required to evaluate the scattered starlight contrast in the dark hole of a space-based coronagraph. The tool uses a Code V prescription of the optical train, and uses MATLAB programs to call ray-trace code that generates linear beam-walk and aberration sensitivity matrices for motions of the optical elements and line-of-sight pointing, with and without controlled fine-steering mirrors (FSMs). The sensitivity matrices are imported by macros into Excel 2007, where the error budget is evaluated. The user specifies the particular optics of interest, and chooses the quality of each optic from a predefined set of PSDs. The spreadsheet creates a nominal set of thermal and jitter motions, and combines that with the sensitivity matrices to generate an error budget for the system. CPEB also contains a combination of form and ActiveX controls with Visual Basic for Applications code to allow for user interaction in which the user can perform trade studies such as changing engineering requirements, and identifying and isolating stringent requirements. It contains summary tables and graphics that can be instantly used for reporting results in view graphs. The entire process to obtain a coronagraphic telescope performance error budget has been automated into three stages: conversion of optical prescription from Zemax or Code V to MACOS (in-house optical modeling and analysis tool), a linear models process, and an error budget tool process. The first process was improved by developing a MATLAB package based on the Class Constructor Method with a number of user-defined functions that allow the user to modify the MACOS optical prescription. The second process was modified by creating a MATLAB package that contains user-defined functions that automate the process. 
The user interfaces with the process by utilizing an initialization file where the user defines the parameters of the linear model computations. Other than this, the process is fully automated. The third process was developed based on the Terrestrial Planet Finder coronagraph Error Budget Tool, but was fully automated by using VBA code, form, and ActiveX controls.
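The core of such an error-budget rollup is simple: multiply each element's allocated motion by its contrast sensitivity and accumulate. The sketch below assumes contributions add linearly (as incoherent speckle contrast contributions do); the optics, sensitivities, and allocations are invented, not values from CPEB:

```python
import numpy as np

# Hypothetical contrast sensitivities (contrast per nm of motion) and
# allocated jitter motions for a few optical elements.
optics = ["primary", "secondary", "fold flat", "FSM"]
sensitivity = np.array([5e-9, 3e-9, 1e-9, 2e-10])  # contrast per nm
motion_nm = np.array([0.5, 0.8, 2.0, 5.0])          # allocated motion, nm

contribution = sensitivity * motion_nm   # contrast contribution per optic
total_contrast = contribution.sum()      # assumed linear (incoherent) sum

for name, c in zip(optics, contribution):
    print(f"{name:>10}: {c:.2e}")
print(f"{'total':>10}: {total_contrast:.2e}")
```

A trade study of the kind the spreadsheet supports is then just rerunning this rollup with tightened or relaxed `motion_nm` allocations and comparing `total_contrast` against the requirement.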

  2. Clinical Trial Assessment of Infrastructure Matrix Tool to Improve the Quality of Research Conduct in the Community.

    PubMed

Dimond, Eileen P; Zon, Robin T; Weiner, Bryan J; St Germain, Diane; Denicoff, Andrea M; Dempsey, Kandie; Carrigan, Angela C; Teal, Randall W; Good, Marjorie J; McCaskill-Stevens, Worta; Grubbs, Stephen S

    2016-01-01

    Several publications have described minimum standards and exemplary attributes for clinical trial sites to improve research quality. The National Cancer Institute (NCI) Community Cancer Centers Program (NCCCP) developed the clinical trial Best Practice Matrix tool to facilitate research program improvements through annual self-assessments and benchmarking. The tool identified nine attributes, each with three progressive levels, to score clinical trial infrastructural elements from less to more exemplary. The NCCCP sites correlated tool use with research program improvements, and the NCI pursued a formative evaluation to refine the interpretability and measurability of the tool. From 2011 to 2013, 21 NCCCP sites self-assessed their programs with the tool annually. During 2013 to 2014, NCI collaborators conducted a five-step formative evaluation of the matrix tool. Sites reported significant increases in level-three scores across the original nine attributes combined (P<.001). Two specific attributes exhibited significant change: clinical trial portfolio diversity and management (P=.0228) and clinical trial communication (P=.0281). The formative evaluation led to revisions, including renaming the Best Practice Matrix as the Clinical Trial Assessment of Infrastructure Matrix (CT AIM), expanding infrastructural attributes from nine to 11, clarifying metrics, and developing a new scoring tool. Broad community input, cognitive interviews, and pilot testing improved the usability and functionality of the tool. Research programs are encouraged to use the CT AIM to assess and improve site infrastructure. Experience within the NCCCP suggests that the CT AIM is useful for improving quality, benchmarking research performance, reporting progress, and communicating program needs with institutional leaders. The tool model may also be useful in disciplines beyond oncology.

  3. Global review of open access risk assessment software packages valid for global or continental scale analysis

    NASA Astrophysics Data System (ADS)

    Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan

    2015-04-01

Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind, and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and if they were open access. This process was used to select a subset of 31 models that include 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, making the compendium of risk software tools in excess of 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions.
It was seen that many software tools could be improved by enabling user-defined exposure and vulnerability. Without this function, many tools can only be used regionally and not at global or continental scale. It is becoming increasingly easy to use multiple packages for a single region and/or hazard to characterize the uncertainty in the risk, or use as checks for the sensitivities in the analysis. There is a potential for valuable synergy between existing software. A number of open source software packages could be combined to generate a multi-risk model with multiple views of a hazard. This extensive review has simply attempted to provide a platform for dialogue between all open source and open access software packages and to hopefully inspire collaboration between developers, given the great work done by all open access and open source developers.
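A weighted-sum version of the multi-criteria analysis can be sketched in a few lines. The package names, criteria, weights, and scores below are illustrative only and are not taken from the review:

```python
# Hypothetical criteria weights (sum to 1) and 1-5 scores per package.
criteria = {"documentation": 0.3, "interoperability": 0.25,
            "hazard coverage": 0.25, "active support": 0.2}

packages = {
    "ModelA": {"documentation": 4, "interoperability": 3,
               "hazard coverage": 5, "active support": 4},
    "ModelB": {"documentation": 5, "interoperability": 4,
               "hazard coverage": 2, "active support": 3},
    "ModelC": {"documentation": 2, "interoperability": 5,
               "hazard coverage": 4, "active support": 5},
}

def weighted_score(scores):
    """Weighted sum of a package's criterion scores."""
    return sum(criteria[c] * s for c, s in scores.items())

ranking = sorted(packages, key=lambda p: weighted_score(packages[p]), reverse=True)
for name in ranking:
    print(f"{name}: {weighted_score(packages[name]):.2f}")
```

With 100+ criteria, as in the actual review, the mechanics are identical; only the tables grow.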

  4. Evaluating the accuracy of VEMAP daily weather data for application in crop simulations on a regional scale

    USDA-ARS?s Scientific Manuscript database

    Weather plays a critical role in eco-environmental and agricultural systems. Limited availability of meteorological records often constrains the applications of simulation models and related decision support tools. The Vegetation/Ecosystem Modeling and Analysis Project (VEMAP) provides daily weather...

  5. Evaluation of satellite-based, modeled-derived daily solar radiation data for the continental U.S.

    USDA-ARS?s Scientific Manuscript database

    Many applications of simulation models and related decision support tools for agriculture and natural resource management require daily meteorological data as inputs. Availability and quality of such data, however, often constrain research and decision support activities that require use of these to...

  6. Population Models for Stream Fish Response to Habitat and Hydrologic Alteration: the CVI Watershed Tool. EPA/600/R-04/190

    EPA Pesticide Factsheets

Models that predict the responses of fish populations and communities to key habitat characteristics are necessary for CVI's watershed management goals, for determining where to restore and how, as well as for evaluating the most probable outcome.

  7. Helping Students Take Control: A Model of Advising. Student Success Center.

    ERIC Educational Resources Information Center

    Moon, Brenda G.; Boland, Reginald

    This handbook presents Rowan-Cabarrus Community College's (North Carolina) model of academic advising as embodied by its Student Success Center. Divided into seven parts, this handbook contains the following sections: a mission statement, Early Alert forms, evaluation tool, forms section, organizational chart of pre-college program, brochure, and…

  8. The 6-E Learning Model

    ERIC Educational Resources Information Center

    Chessin, Debby A.; Moore, Virginia J.

    2004-01-01

Most teachers are familiar with the 5-E model of science instruction: Engage, Explore, Explain, Expand, and Evaluate (Trowbridge and Bybee 1990). It is a valuable tool that allows teachers to structure science experiences so students use the processes of scientific inquiry to construct and connect ideas rather than simply memorize seemingly…

  9. Evaluating the suitability of the Soil Vulnerability Index (SVI) classification scheme using the SWAT model

    USDA-ARS?s Scientific Manuscript database

    Conservation practices are effective ways to mitigate non-point source pollution, especially when implemented on critical source areas (CSAs) known to be the areas contributing disproportionately to high pollution loads. Although hydrologic models are promising tools to identify CSAs within agricul...

  10. A GIS-assisted regional screening tool to evaluate the leaching potential of volatile and non-volatile pesticides

    NASA Astrophysics Data System (ADS)

    Ki, Seo Jin; Ray, Chittaranjan

    2015-03-01

A regional screening tool, which is useful in cases where few site-specific parameters are available for complex vadose zone models, assesses the leaching potential of pollutants to groundwater over large areas. In this study, the previous pesticide leaching tool used in Hawaii was revised to account for the release of new volatile organic compounds (VOCs) from the soil surface. The tool was modified to introduce expanded terms in the traditional pesticide ranking indices (i.e., retardation and attenuation factors), allowing the estimation of the leaching fraction of volatile chemicals based on recharge, soil, and chemical properties to be updated. Results showed that the previous tool significantly overestimated the mass fraction of VOCs leached through soils as the recharge rates increased above 0.001801 m/d. In contrast, the revised tool successfully delineated areas vulnerable to the selected VOCs based on two reference chemicals, a known leacher and non-leacher, which were determined in local conditions. The sensitivity analysis with the Latin-Hypercube-One-factor-At-a-Time method revealed that the new leaching tool was most sensitive to changes in the soil organic carbon sorption coefficient, fractional organic carbon content, and Henry's law constant; and least sensitive to parameters such as the bulk density, water content at field capacity, and particle density in soils. When the revised tool was compared to the analytical (STANMOD) and numerical (HYDRUS-1D) models as a susceptibility measure, its rankings for particular VOCs (e.g., benzene, carbofuran, and toluene) were consistent with the other two models under the given conditions. Therefore, the new leaching tool can be widely used to address intrinsic groundwater vulnerability to contamination of pesticides and VOCs, along with the DRASTIC method or similar Tier 1 models such as SCI-GROW and WIN-PST.
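The two ranking indices named above have standard closed forms: a retardation factor extended with a vapor-phase (Henry's constant) term, and a Rao-style attenuation factor giving the mass fraction leached past a given depth. The sketch below uses those textbook forms with invented soil and chemical values; it is an illustration of the index type, not the revised Hawaii tool itself:

```python
import math

def retardation_factor(bulk_density, theta, foc, koc, air_content=0.0, kh=0.0):
    """Retardation factor extended for volatile chemicals.

    bulk_density: soil bulk density (g/cm^3); theta: volumetric water content;
    foc: fraction of organic carbon; koc: organic-carbon sorption coefficient
    (cm^3/g); air_content: air-filled porosity; kh: dimensionless Henry's
    constant. The vapor-phase term is the kind of extension the revised tool adds.
    """
    return 1.0 + (bulk_density * foc * koc) / theta + (air_content * kh) / theta

def attenuation_factor(depth, recharge, theta, rf, half_life):
    """Rao-style attenuation factor: fraction of applied mass leached past
    a given depth (depth in m, recharge in m/d, half-life in days)."""
    travel_time = depth * rf * theta / recharge   # days to reach the depth
    return math.exp(-math.log(2) * travel_time / half_life)

# Hypothetical soil and chemical (values are illustrative only).
rf = retardation_factor(bulk_density=1.4, theta=0.3, foc=0.01, koc=100.0,
                        air_content=0.15, kh=0.2)
af = attenuation_factor(depth=1.0, recharge=0.0018, theta=0.3, rf=rf,
                        half_life=60.0)
print(f"RF = {rf:.2f}, AF = {af:.3e}")
```

Ranking chemicals by AF against the two local reference chemicals (the known leacher and non-leacher) is then a simple threshold comparison.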

  11. Integrating and Managing Bim in GIS, Software Review

    NASA Astrophysics Data System (ADS)

    El Meouche, R.; Rezoug, M.; Hijazi, I.

    2013-08-01

    Since the advent of Computer-Aided Design (CAD) and Geographical Information System (GIS) tools, project participants have increasingly leveraged these tools throughout the different phases of a civil infrastructure project. In recent years, the number of GIS packages that provide tools for integrating building information in a geographic context has risen sharply. More and more GIS products have added tools for this purpose, and other software projects are regularly extending them. However, each package has its own strengths, weaknesses, and intended uses. This paper provides a thorough review that investigates the capabilities of these packages and clarifies their purposes. For this study, Autodesk Revit 2012, a BIM editor, was used to create BIMs. In the first step, three building models were created; the resulting models were converted to BIM format, and each package was then used to integrate them. For the evaluation, general characteristics were studied, such as the user interface, the supported import and export formats, and the way building information is imported.

  12. Commercial Building Energy Asset Score

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    This software (Asset Scoring Tool) is designed to help building owners and managers gain insight into the as-built efficiency of their buildings. It is a web tool where users can enter their building information and obtain an asset score report. The asset score report consists of modeled building energy use (by end use and by fuel type), evaluations of building systems (envelope, lighting, heating, cooling, service hot water), and recommended energy efficiency measures. The intended users are building owners and operators who have limited knowledge of building energy efficiency. The scoring tool collects minimal building data (~20 data entries) from users and builds a full-scale energy model using the inference functionality of the Facility Energy Decision System (FEDS). The scoring tool runs a real-time building energy simulation using EnergyPlus and performs life-cycle cost analysis using FEDS. An API is also under development to allow third-party applications to exchange data with the scoring tool's web service.

  13. Master Middle Ware: A Tool to Integrate Water Resources and Fish Population Dynamics Models

    NASA Astrophysics Data System (ADS)

    Yi, S.; Sandoval Solis, S.; Thompson, L. C.; Kilduff, D. P.

    2017-12-01

    Linking models that investigate separate components of ecosystem processes has the potential to unify messages regarding management decisions by evaluating potential trade-offs in a cohesive framework. This project aimed to improve the ability of riparian resource managers to forecast future water availability and the resulting fish habitat suitability, in order to better inform their management decisions. To accomplish this goal, we developed a middleware tool capable of linking and overseeing the operations of two existing models: a water resource planning tool, the Water Evaluation and Planning (WEAP) model, and a habitat-based fish population dynamics model (WEAPhish). First, we designed the Master Middle Ware (MMW) software in Visual Basic for Applications® in one Excel® file, which provided a familiar framework for both data input and output. Second, MMW was used to link and jointly operate WEAP and WEAPhish, using Visual Basic for Applications (VBA) macros to implement system-level calls that run the models. To demonstrate the utility of this approach, hydrological, biological, and middleware model components were developed for the Butte Creek basin. This tributary of the Sacramento River, California, is managed for both hydropower and the persistence of a threatened population of spring-run Chinook salmon (Oncorhynchus tshawytscha). While we have demonstrated the use of MMW for a particular watershed and fish population, MMW can be customized for different rivers and fish populations, provided basic data requirements are met. This model integration improves on ad hoc linkages for managing data transfer between software programs by providing a consistent, user-friendly, and familiar interface across different model implementations. Furthermore, the data-viewing capabilities of MMW facilitate rapid interpretation of model results by hydrologists, fisheries biologists, and resource managers, accelerating learning and management decision making.
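The coupling pattern the abstract describes, a middleware layer that makes system-level calls to run the water model and then feeds its output to the fish model, can be sketched in a language-agnostic way. The actual MMW is Excel/VBA; the command names and file layout below are hypothetical, and only the sequential-linkage pattern is the point.

```python
import subprocess
from pathlib import Path

# Hypothetical command names and file layout -- the real MMW drives WEAP and
# WEAPhish from Excel/VBA via system-level calls; this shows only the pattern.
def run_linked_models(scenario: str, workdir: Path,
                      runner=subprocess.run) -> Path:
    flows = workdir / f"{scenario}_flows.csv"
    habitat = workdir / f"{scenario}_habitat.csv"
    # Step 1: the water-resources model writes simulated flows.
    runner(["weap_cli", "--scenario", scenario, "--out", str(flows)],
           check=True)
    # Step 2: the fish-population model reads those flows and writes
    # habitat suitability for each reach and time step.
    runner(["weaphish_cli", "--flows", str(flows), "--out", str(habitat)],
           check=True)
    return habitat
```

The injectable `runner` makes the linkage testable without the model executables installed, and the file handoff mirrors the middleware's role of managing data transfer between the two programs.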

  14. Interactive visualization to advance earthquake simulation

    USGS Publications Warehouse

    Kellogg, L.H.; Bawden, G.W.; Bernardin, T.; Billen, M.; Cowgill, E.; Hamann, B.; Jadamec, M.; Kreylos, O.; Staadt, O.; Sumner, D.

    2008-01-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations grow in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, to evaluate the underlying models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. Virtual mapping tools allow virtual "field studies" in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists, who are trained to interpret the often limited geological and geophysical data available from field observations. © Birkhäuser 2008.

  15. Software Tools for Developing and Simulating the NASA LaRC CMF Motion Base

    NASA Technical Reports Server (NTRS)

    Bryant, Richard B., Jr.; Carrelli, David J.

    2006-01-01

    The NASA Langley Research Center (LaRC) Cockpit Motion Facility (CMF) motion base has presented many design and analysis challenges. In the process of addressing these challenges, a comprehensive suite of software tools was developed. The software tool development began with a detailed MATLAB/Simulink model of the motion base, which was used primarily for safety loads prediction, design of the closed-loop compensator, and development of the motion base safety systems. A Simulink model of the digital control law, from which a portion of the embedded code is directly generated, was later added to form a closed-loop system model. Concurrently, software that runs on a PC was created to display and record motion base parameters; it includes a user interface for controlling time-history displays, strip-chart displays, data storage, and initialization of the function generators used during motion base testing. Finally, a software tool was developed for kinematic analysis and prediction of mechanical clearances for the motion system. These tools work together as an integrated package to support normal operations of the motion base and to simulate the end-to-end operation of the motion base system, providing facilities for software-in-the-loop testing, visualization of mechanical geometry and sensor data, and function generator setup and evaluation.

  16. FracFit: A Robust Parameter Estimation Tool for Anomalous Transport Problems

    NASA Astrophysics Data System (ADS)

    Kelly, J. F.; Bolster, D.; Meerschaert, M. M.; Drummond, J. D.; Packman, A. I.

    2016-12-01

    Anomalous transport cannot be adequately described with classical Fickian advection-dispersion equations (ADE). Rather, fractional calculus models may be used, which capture non-Fickian behavior (e.g., skewness and power-law tails). FracFit is a robust parameter estimation tool based on space- and time-fractional models of anomalous transport. Currently, four fractional models are supported: 1) the space-fractional advection-dispersion equation (sFADE), 2) the time-fractional dispersion equation with drift (TFDE), 3) the fractional mobile-immobile equation (FMIE), and 4) the tempered fractional mobile-immobile equation (TFMIE); additional models may be added in the future. Model solutions for pulse initial conditions and continuous injections are evaluated using stable-distribution PDFs and CDFs or subordination integrals. Parameter estimates are extracted from measured breakthrough curves (BTCs) using a weighted nonlinear least squares (WNLS) algorithm. Optimal weights for BTCs from pulse initial conditions and continuous injections are presented, facilitating the estimation of power-law tails. Two sample applications are analyzed: 1) continuous-injection laboratory experiments using natural organic matter and 2) pulse-injection BTCs in the Selke River. Model parameters are compared across models, and goodness-of-fit metrics are presented, assisting model evaluation. The sFADE and time-fractional models are compared using space-time duality (Baeumer et al., 2009), which links the two paradigms.
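The weighting idea behind WNLS tail estimation can be illustrated on a toy problem. FracFit fits full fractional-model solutions with its own optimal weights; the sketch below only shows how weights that emphasize small late-time concentrations let a least-squares fit recover a power-law tail exponent. The model and weights are illustrative, not FracFit's.

```python
import numpy as np

def wnls_powerlaw_tail(t, c, w):
    """Weighted least-squares fit of log c = log A - alpha * log t.

    A toy stand-in for WNLS tail fitting: heavier weights on the small
    late-time samples keep the tail from being swamped by the peak.
    """
    X = np.column_stack([np.ones_like(t), -np.log(t)])
    y = np.log(c)
    sw = np.sqrt(w)                       # scale design matrix and target
    coef, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    log_a, alpha = coef
    return np.exp(log_a), alpha

# Synthetic late-time tail c(t) = 2 * t^-1.5, the signature of non-Fickian
# transport that a classical ADE cannot reproduce
t = np.linspace(10.0, 1000.0, 200)
c = 2.0 * t ** -1.5
# Relative-error weights emphasize the small late-time concentrations
a_hat, alpha_hat = wnls_powerlaw_tail(t, c, w=1.0 / c)
```

On this noise-free synthetic tail the fit recovers the generating exponent; on measured BTCs the choice of weights is what determines how well the tail is resolved.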

  17. Towards elicitation of users requirements for hospital information system: from a care process modelling technique to a web based collaborative tool.

    PubMed Central

    Staccini, Pascal M.; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius

    2002-01-01

    Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate their conceptual applicability and practical understandability by clinical staff and members of user teams. Nevertheless, there remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning of the clinical context while providing elements for analysis. One solution for filling this gap is to have the process model itself play the role of a hub, as a centralized means of facilitating communication between team members. Starting with a robust and descriptive technique for process modeling called IDEF0/SADT, we refined the basic data model by extracting concepts from ISO 9000 process analysis and from enterprise ontology. We defined a web-based architecture to serve as a collaborative tool and implemented it using an object-oriented database. The prospects of such a tool are discussed, notably its ability to generate data dictionaries and to be used as a navigation tool through hospital-wide documentation. PMID:12463921

  18. Towards elicitation of users requirements for hospital information system: from a care process modelling technique to a web based collaborative tool.

    PubMed

    Staccini, Pascal M; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius

    2002-01-01

    Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate their conceptual applicability and practical understandability by clinical staff and members of user teams. Nevertheless, there remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning of the clinical context while providing elements for analysis. One solution for filling this gap is to have the process model itself play the role of a hub, as a centralized means of facilitating communication between team members. Starting with a robust and descriptive technique for process modeling called IDEF0/SADT, we refined the basic data model by extracting concepts from ISO 9000 process analysis and from enterprise ontology. We defined a web-based architecture to serve as a collaborative tool and implemented it using an object-oriented database. The prospects of such a tool are discussed, notably its ability to generate data dictionaries and to be used as a navigation tool through hospital-wide documentation.

  19. Modeling potential hydrochemical responses to climate change and rising CO2 at the Hubbard Brook Experimental Forest using a dynamic biogeochemical model (PnET-BGC)

    Treesearch

    Afshin Pourmokhtarian; Charles T. Driscoll; John L. Campbell; Katharine Hayhoe

    2012-01-01

    Dynamic hydrochemical models are useful tools for understanding and predicting the interactive effects of climate change, atmospheric CO2, and atmospheric deposition on the hydrology and water quality of forested watersheds. We used the biogeochemical model, PnET-BGC, to evaluate the effects of potential future changes in temperature,...

  20. Determination and representation of electric charge distributions associated with adverse weather conditions

    NASA Technical Reports Server (NTRS)

    Rompala, John T.

    1992-01-01

    Algorithms are presented for determining the size and location of the electric charges that model storm systems and lightning strikes. The analysis uses readings from a grid of ground-level field mills, together with geometric constraints on the parameters, to arrive at a representative set of charges. This set is used to generate three-dimensional graphical depictions of the charges as well as contour maps of the ground-level electrical environment over the grid. The composite analytic and graphic package is demonstrated and evaluated using controlled input data and archived data from a storm system. The results demonstrate the package's utility as: an operational tool for appraising adverse weather conditions; a research tool for studies of topics such as storm structure, storm dynamics, and lightning; and a tool for designing and evaluating grid systems.
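The core inverse step, inferring charges from ground-level field-mill readings, can be sketched compactly. The paper's algorithm also searches over charge geometry under constraints; the version below fixes candidate charge locations over a conducting ground plane (method of images) so that the field is linear in the magnitudes, which then follow from linear least squares. All positions and values are illustrative, not the paper's data.

```python
import numpy as np

K = 8.988e9  # Coulomb constant, N*m^2/C^2

def ez_at_ground(mill_xy, charge_xyz, q):
    """Vertical field at a ground-level mill from a point charge q (C) at
    height z over a conducting ground plane (method of images)."""
    dx = mill_xy[0] - charge_xyz[0]
    dy = mill_xy[1] - charge_xyz[1]
    h = charge_xyz[2]
    r2 = dx * dx + dy * dy
    return 2.0 * K * q * h / (r2 + h * h) ** 1.5

def fit_charges(mills, charge_positions, readings):
    """With candidate charge locations fixed, the field is linear in the
    charge magnitudes, so they follow from linear least squares."""
    A = np.array([[ez_at_ground(m, p, 1.0) for p in charge_positions]
                  for m in mills])
    q, *_ = np.linalg.lstsq(A, np.asarray(readings), rcond=None)
    return q

# Recover two known charges from synthetic mill readings (illustrative values)
mills = [(0.0, 0.0), (1000.0, 0.0), (0.0, 1000.0),
         (1000.0, 1000.0), (500.0, 500.0), (2000.0, 500.0)]
positions = [(300.0, 300.0, 5000.0), (1500.0, 800.0, 7000.0)]
q_true = np.array([-20.0, 15.0])
readings = [sum(ez_at_ground(m, p, q) for p, q in zip(positions, q_true))
            for m in mills]
q_hat = fit_charges(mills, positions, readings)
```

With more mills than unknown charges, the overdetermined system also gives residuals that can flag where the assumed charge geometry fails to explain the readings.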
