Sample records for validated system models

  1. Model-based verification and validation of the SMAP uplink processes

    NASA Astrophysics Data System (ADS)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

    Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increases geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.

  2. Protocol for Reliability Assessment of Structural Health Monitoring Systems Incorporating Model-assisted Probability of Detection (MAPOD) Approach

    DTIC Science & Technology

    2011-09-01

    ...a quality evaluation with limited data, a model-based assessment must be... that affect system performance, a multistage approach to system validation, a modeling and experimental methodology for efficiently addressing a wide range...

  3. Some guidance on preparing validation plans for the DART Full System Models.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, Genetha Anne; Hough, Patricia Diane; Hills, Richard Guy

    2009-03-01

    Planning is an important part of computational model verification and validation (V&V) and the requisite planning document is vital for effectively executing the plan. The document provides a means of communicating intent to the typically large group of people, from program management to analysts to test engineers, who must work together to complete the validation activities. This report provides guidelines for writing a validation plan. It describes the components of such a plan and includes important references and resources. While the initial target audience is the DART Full System Model teams in the nuclear weapons program, the guidelines are generally applicable to other modeling efforts. Our goal in writing this document is to provide a framework for consistency in validation plans across weapon systems, different types of models, and different scenarios. Specific details contained in any given validation plan will vary according to application requirements and available resources.

  4. The Challenge of Grounding Planning in Simulation with an Interactive Model Development Environment

    NASA Technical Reports Server (NTRS)

    Clement, Bradley J.; Frank, Jeremy D.; Chachere, John M.; Smith, Tristan B.; Swanson, Keith J.

    2011-01-01

    A principal obstacle to fielding automated planning systems is the difficulty of modeling. Physical systems are modeled conventionally based on specification documents and the modeler's understanding of the system. Thus, the model is developed in a way that is disconnected from the system's actual behavior and is vulnerable to manual error. Another obstacle to fielding planners is testing and validation. For a space mission, generated plans must be validated, often by translating them into command sequences that are run in a simulation testbed. Testing in this way is complex and onerous because of the large number of possible plans and states of the spacecraft. However, if used as a source of domain knowledge, the simulator can ease validation. This paper poses a challenge: to ground planning models in the system physics represented by simulation. A proposed, interactive model development environment illustrates the integration of planning and simulation to meet the challenge. This integration reveals research paths for automated model construction and validation.

  5. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems.

    PubMed

    Silva, Lenardo C; Almeida, Hyggo O; Perkusich, Angelo; Perkusich, Mirko

    2015-10-30

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage.

  6. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems

    PubMed Central

    Silva, Lenardo C.; Almeida, Hyggo O.; Perkusich, Angelo; Perkusich, Mirko

    2015-01-01

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage. PMID:26528982

  7. An Approach to Comprehensive and Sustainable Solar Wind Model Validation

    NASA Astrophysics Data System (ADS)

    Rastaetter, L.; MacNeice, P. J.; Mays, M. L.; Boblitt, J. M.; Wiegand, C.

    2017-12-01

    The number of models of the corona and inner heliosphere and of their updates and upgrades grows steadily, as does the number and character of the model inputs. Maintaining up-to-date validation of these models, in the face of this constant model evolution, is a necessary but very labor-intensive activity. In the last year alone, both NASA's LWS program and the CCMC's ongoing support of model forecasting activities at NOAA SWPC have sought model validation reports on the quality of all aspects of the community's coronal and heliospheric models, including both ambient and CME-related wind solutions at L1. In this presentation I will give a brief review of the community's previous model validation results of L1 wind representation. I will discuss the semi-automated web-based system we are constructing at the CCMC to present comparative visualizations of all interesting aspects of the solutions from competing models. This system is designed to be easily queried to provide the essential comprehensive inputs to repeat and update previous validation studies and support extensions to them. I will illustrate this by demonstrating how the system is being used to support the CCMC/LWS Model Assessment Forum teams focused on the ambient and time-dependent corona and solar wind, including CME arrival time and IMF Bz. I will also discuss plans to extend the system to include results from the Forum teams addressing SEP model validation.

  8. Validity of Sensory Systems as Distinct Constructs

    PubMed Central

    Su, Chia-Ting

    2014-01-01

    This study investigated the validity of sensory systems as distinct measurable constructs as part of a larger project examining Ayres’s theory of sensory integration. Confirmatory factor analysis (CFA) was conducted to test whether sensory questionnaire items represent distinct sensory system constructs. Data were obtained from clinical records of two age groups, 2- to 5-yr-olds (n = 231) and 6- to 10-yr-olds (n = 223). With each group, we tested several CFA models for goodness of fit with the data. The accepted model was identical for each group and indicated that tactile, vestibular–proprioceptive, visual, and auditory systems form distinct, valid factors that are not age dependent. In contrast, alternative models that grouped items according to sensory processing problems (e.g., over- or underresponsiveness within or across sensory systems) did not yield valid factors. Results indicate that distinct sensory system constructs can be measured validly using questionnaire data. PMID:25184467
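
    As a rough illustration of the CFA workflow this record describes, the sketch below fits a four-factor model in Python with the semopy package. The package choice, item names, and data file are assumptions for illustration; they are not details from the study.

      # Minimal CFA sketch (assumes: pip install semopy pandas).
      import pandas as pd
      from semopy import Model, calc_stats

      # Four correlated sensory-system factors, lavaan-style syntax;
      # item names (tact1, vp1, ...) are hypothetical placeholders.
      DESC = """
      tactile   =~ tact1 + tact2 + tact3
      vest_prop =~ vp1 + vp2 + vp3
      visual    =~ vis1 + vis2 + vis3
      auditory  =~ aud1 + aud2 + aud3
      """

      data = pd.read_csv("sensory_items.csv")  # one column per questionnaire item
      model = Model(DESC)
      model.fit(data)            # maximum-likelihood estimation
      print(calc_stats(model))   # chi-square, CFI, RMSEA, etc. for fit checks

    Competing specifications, such as grouping items by over- or underresponsiveness, would be fit the same way and compared on those fit statistics.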

  9. Improved Conceptual Models Methodology (ICoMM) for Validation of Non-Observable Systems

    DTIC Science & Technology

    2015-12-01

    ...distribution is unlimited. IMPROVED CONCEPTUAL MODELS METHODOLOGY (ICoMM) FOR VALIDATION OF NON-OBSERVABLE SYSTEMS, dissertation by Sang M. Sok, December 2015. ...importance of the CoM. The improved conceptual model methodology (ICoMM) is developed in support of improving the structure of the CoM for both face and...

  10. Object-Oriented Modeling of an Energy Harvesting System Based on Thermoelectric Generators

    NASA Astrophysics Data System (ADS)

    Nesarajah, Marco; Frey, Georg

    This paper deals with the modeling of an energy harvesting system based on thermoelectric generators (TEG), and the validation of the model by means of a test bench. TEGs are capable of improving the overall energy efficiency of energy systems, e.g. combustion engines or heating systems, by using the remaining waste heat to generate electrical power. Previously, a component-oriented model of the TEG itself was developed in the Modelica® language. With this model any TEG can be described and simulated given the material properties and the physical dimensions. This model has now been extended with the surrounding components into a complete model of a thermoelectric energy harvesting system. In addition to the TEG, the model contains the cooling system, the heat source, and the power electronics. To validate the simulation model, a test bench was built and installed on an oil-fired household heating system. The paper reports results of the measurements and discusses the validity of the developed simulation models. Furthermore, the efficiency of the proposed energy harvesting system is derived and possible improvements based on design variations tested in the simulation model are proposed.
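
    In steady state, the electrical side of such a model reduces to a Seebeck voltage source behind an internal resistance. The Python sketch below shows only that reduced relation; it is not the Modelica® component model, and all parameter values are illustrative.

      def teg_power(alpha, r_int, delta_t, r_load):
          """Electrical power delivered to the load by one TEG module.

          alpha   -- effective Seebeck coefficient of the module [V/K]
          r_int   -- internal electrical resistance [ohm]
          delta_t -- temperature difference across the module [K]
          r_load  -- load resistance [ohm]
          """
          v_oc = alpha * delta_t        # open-circuit voltage
          i = v_oc / (r_int + r_load)   # current in the series circuit
          return i ** 2 * r_load        # power dissipated in the load

      # Power transfer peaks at a matched load (r_load == r_int):
      print(teg_power(alpha=0.05, r_int=2.0, delta_t=100.0, r_load=2.0))  # ~3.1 W

    A sweep over r_load (or over the thermal parameters behind delta_t) is the kind of cheap design variation such a simulation model enables.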

  11. Validation of an Evaluation Model for Learning Management Systems

    ERIC Educational Resources Information Center

    Kim, S. W.; Lee, M. G.

    2008-01-01

    This study aims to validate a model for evaluating learning management systems (LMS) used in e-learning fields. A survey of 163 e-learning experts, regarding 81 validation items developed through literature review, was used to ascertain the importance of the criteria. A concise list of explanatory constructs, including two principal factors, was…

  12. ASTP ranging system mathematical model

    NASA Technical Reports Server (NTRS)

    Ellis, M. R.; Robinson, L. H.

    1973-01-01

    A mathematical model of the VHF ranging system is presented to analyze its performance in the Apollo-Soyuz Test Project (ASTP). The system was adapted for use in the ASTP. The ranging system mathematical model is presented in block diagram form, and a brief description of the overall model is also included. A procedure for implementing the math model is presented along with a discussion of the validation of the math model and the overall summary and conclusions of the study effort. Detailed appendices of the five study tasks are presented: early/late gate model development, unlock probability development, system error model development, probability of acquisition model development, and math model validation testing.

  13. Validation of a Simulation Model of Intrinsic Lutetium-176 Activity in LSO-Based Preclinical PET Systems

    NASA Astrophysics Data System (ADS)

    McIntosh, Bryan

    The LSO scintillator crystal commonly used in PET scanners contains a low level of intrinsic radioactivity due to a small amount of Lu-176. This is not usually a concern in routine scanning but can become an issue in small animal imaging, especially when imaging low tracer activity levels. Previously there had been no systematic validation of simulations of this activity; this thesis discusses the validation of a GATE model of intrinsic Lu-176 against results from a bench-top pair of detectors and a Siemens Inveon preclinical PET system. The simulation results matched those from the bench-top system very well, but did not agree as well with results from the complete Inveon system due to a drop-off in system sensitivity at low energies that was not modelled. With this validation the model can now be used with confidence to predict the effects of Lu-176 activity in future PET systems.
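
    For scale, the intrinsic activity being modeled can be estimated from standard nuclear data. The Python sketch below reproduces the commonly cited figure of roughly 300 Bq per cubic centimetre of LSO; the constants are textbook values, and the calculation is an independent estimate rather than one taken from the thesis.

      import math

      T_HALF_S  = 3.76e10 * 3.156e7    # Lu-176 half-life (~3.76e10 yr) in seconds
      ABUNDANCE = 0.0259               # natural isotopic abundance of Lu-176
      DENSITY   = 7.4                  # LSO density [g/cm^3]
      LU_FRAC   = 2 * 174.97 / 458.0   # lutetium mass fraction in Lu2SiO5
      AVOGADRO  = 6.022e23

      def lso_activity_bq_per_cm3():
          lu_atoms = DENSITY * LU_FRAC / 174.97 * AVOGADRO  # Lu atoms per cm^3
          n176 = lu_atoms * ABUNDANCE                       # Lu-176 atoms per cm^3
          lam = math.log(2) / T_HALF_S                      # decay constant [1/s]
          return lam * n176

      print(f"{lso_activity_bq_per_cm3():.0f} Bq/cm^3")     # roughly 300 Bq/cm^3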

  14. The CMEMS-Med-MFC-Biogeochemistry operational system: implementation of NRT and Multi-Year validation tools

    NASA Astrophysics Data System (ADS)

    Salon, Stefano; Cossarini, Gianpiero; Bolzon, Giorgio; Teruzzi, Anna

    2017-04-01

    The Mediterranean Monitoring and Forecasting Centre (Med-MFC) is one of the regional production centres of the EU Copernicus Marine Environment Monitoring Service (CMEMS). Med-MFC manages a suite of numerical model systems for the operational delivery of the CMEMS products, providing continuous monitoring and forecasting of the Mediterranean marine environment. The CMEMS products of fundamental biogeochemical variables (chlorophyll, nitrate, phosphate, oxygen, phytoplankton biomass, primary productivity, pH, pCO2) are organised as gridded datasets and are available at the marine.copernicus.eu web portal. Quantitative estimates of CMEMS product accuracy are prerequisites to releasing reliable information to intermediate users, end users and other downstream services. In particular, validation activities aim to deliver accuracy information on the model products and to serve as long-term monitoring of the performance of the modelling systems. The quality assessment of model output is implemented using a multiple-stage approach, inspired by the classic "GODAE 4 Classes" metrics and criteria (consistency, quality, performance and benefit). Firstly, pre-operational runs qualify the operational model system against historical data, also providing a verification of the improvements of the new model system release with respect to the previous version. Then, the near real time (NRT) validation aims at delivering a sustained on-line skill assessment of the model analysis and forecast, relying on the relevant observations available in NRT (e.g. in situ, Bio-Argo and satellite observations). NRT validation results are produced on a weekly basis and published on the MEDEAF web portal (www.medeaf.inogs.it). On a quarterly basis, the integration of the NRT validation activities delivers a comprehensive view of the accuracy of the model forecast through the official CMEMS validation webpage. Multi-Year production (e.g. reanalysis runs) follows a similar procedure, and the validation is achieved using the same metrics on available historical observations (e.g. the World Ocean Atlas 2013 dataset). Results of the validation activities show that the comparison of the different variables of the CMEMS products with experimental data is feasible at different levels (i.e. both as skill assessment of the short-term forecast and as model consistency across different system versions) and at different spatial and temporal scales. In particular, the accuracy of some variables (chlorophyll, nitrate, oxygen) can be provided at weekly scale and sub-mesoscale, others (carbonate system, phosphate) at quarterly/annual and sub-basin scale, and others (phytoplankton biomass, primary production) only at the level of consistency of model functioning (e.g. literature- or climatology-based). Although a wide literature on model validation has been produced so far, maintaining a validation framework in the biogeochemical operational context that fulfils the GODAE criteria is still a challenge. Recent results of the validation activities and a new potential validation framework at the Med-MFC will be presented in our contribution.
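
    A minimal example of the kind of weekly skill metric such a system computes is sketched below in Python: bias and root-mean-square difference of a forecast field against matched observations. The file names are hypothetical, and the operational Med-MFC code is of course far more elaborate.

      import numpy as np

      def skill_metrics(model, obs):
          """Bias and root-mean-square difference, ignoring missing observations."""
          mask = ~np.isnan(obs)
          diff = model[mask] - obs[mask]
          return diff.mean(), np.sqrt((diff ** 2).mean())

      # e.g. weekly surface chlorophyll forecast vs. a satellite L3 product
      model_chl = np.load("forecast_chl.npy")   # hypothetical gridded forecast
      sat_chl = np.load("satellite_chl.npy")    # hypothetical matched observations
      bias, rmsd = skill_metrics(model_chl, sat_chl)
      print(f"bias={bias:.3f} mg/m^3  rmsd={rmsd:.3f} mg/m^3")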

  15. Implementing Lumberjacks and Black Swans Into Model-Based Tools to Support Human-Automation Interaction.

    PubMed

    Sebok, Angelia; Wickens, Christopher D

    2017-03-01

    The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems and the effects of different automation designs on that performance.

  16. Ada Compiler Validation Summary Report. Certificate Number: 900726W1. 11017, Verdix Corporation VADS IBM RISC System/6000, AIX 3.1, VAda-110-7171, Version 6.0 IBM RISC System/6000 Model 530 = IBM RISC System/6000 Model 530

    DTIC Science & Technology

    1991-01-22

    Customer Agreement Number: 90-05-29-VRX. See Section 3.1 for any additional information about the testing environment. As a result of this validation... 22 January 1991, 90-05-29-VRX, Ada COMPILER VALIDATION SUMMARY REPORT: Certificate Number: 900726W1.11017, Verdix Corporation VADS IBM RISC System/6000...

  17. Simulation validation of the XV-15 tilt-rotor research aircraft

    NASA Technical Reports Server (NTRS)

    Ferguson, S. W.; Hanson, G. D.; Churchill, G. B.

    1984-01-01

    The results of a simulation validation program of the XV-15 tilt-rotor research aircraft are detailed, covering such simulation aspects as the mathematical model, visual system, motion system, cab aural system, cab control loader system, pilot perceptual fidelity, and generic tilt rotor applications. Simulation validation was performed for the hover, low-speed, and sideward flight modes, with consideration of the in-ground rotor effect. Several deficiencies of the mathematical model and the simulation systems were identified in the course of the simulation validation project, and some were corrected. It is noted that NASA's Vertical Motion Simulator used in the program is an excellent tool for tilt-rotor and rotorcraft design, development, and pilot training.

  18. Economic analysis of model validation for a challenge problem

    DOE PAGES

    Paez, Paul J.; Paez, Thomas L.; Hasselman, Timothy K.

    2016-02-19

    It is now commonplace for engineers to build mathematical models of the systems they are designing, building, or testing. And, it is nearly universally accepted that phenomenological models of physical systems must be validated prior to use for prediction in consequential scenarios. Yet, there are certain situations in which testing only or no testing and no modeling may be economically viable alternatives to modeling and its associated testing. This paper develops an economic framework within which benefit–cost can be evaluated for modeling and model validation relative to other options. The development is presented in terms of a challenge problem. As a result, we provide a numerical example that quantifies when modeling, calibration, and validation yield higher benefit–cost than a testing only or no modeling and no testing option.
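
    The core comparison in such a framework can be caricatured as an expected-cost calculation over the available options. The Python sketch below uses invented numbers purely for illustration; they are not the paper's challenge-problem values.

      # option: (upfront cost [$], probability a consequential failure slips through)
      OPTIONS = {
          "model + calibrate + validate": (2.0e5, 0.02),
          "testing only":                 (1.2e5, 0.08),
          "no modeling, no testing":      (0.0,   0.30),
      }
      FAILURE_COST = 5.0e6  # assumed consequence of an in-service failure

      for name, (cost, p_fail) in OPTIONS.items():
          expected_total = cost + p_fail * FAILURE_COST
          print(f"{name:30s} expected total cost = ${expected_total:,.0f}")

    With these particular numbers the modeling-plus-validation option wins; shifting the failure cost or probabilities can reverse the ranking, which is exactly the trade the framework is meant to expose.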

  19. Model-based Systems Engineering: Creation and Implementation of Model Validation Rules for MOS 2.0

    NASA Technical Reports Server (NTRS)

    Schmidt, Conrad K.

    2013-01-01

    Model-based Systems Engineering (MBSE) is an emerging modeling application that is used to enhance the system development process. MBSE allows for the centralization of project and system information that would otherwise be stored in extraneous locations, yielding better communication, expedited document generation and increased knowledge capture. Based on MBSE concepts and the employment of the Systems Modeling Language (SysML), extremely large and complex systems can be modeled from conceptual design through all system lifecycles. The Operations Revitalization Initiative (OpsRev) seeks to leverage MBSE to modernize the aging Advanced Multi-Mission Operations Systems (AMMOS) into the Mission Operations System 2.0 (MOS 2.0). The MOS 2.0 will be delivered in a series of conceptual and design models and documents built using the modeling tool MagicDraw. To ensure model completeness and cohesiveness, it is imperative that the MOS 2.0 models adhere to the specifications, patterns and profiles of the Mission Service Architecture Framework, thus leading to the use of validation rules. This paper outlines the process by which validation rules are identified, designed, implemented and tested. Ultimately, these rules provide the ability to maintain model correctness and synchronization in a simple, quick and effective manner, thus allowing the continuation of project and system progress.

  20. Theoretical relationship between vibration transmissibility and driving-point response functions of the human body.

    PubMed

    Dong, Ren G; Welcome, Daniel E; McDowell, Thomas W; Wu, John Z

    2013-11-25

    The relationship between the vibration transmissibility and driving-point response functions (DPRFs) of the human body is important for understanding vibration exposures of the system and for developing valid models. This study identified their theoretical relationship and demonstrated that the sum of the DPRFs can be expressed as a linear combination of the transmissibility functions of the individual mass elements distributed throughout the system. The relationship is verified using several human vibration models. This study also clarified the requirements for reliably quantifying transmissibility values used as references for calibrating the system models. As an example application, this study used the developed theory to perform a preliminary analysis of the method for calibrating models using both vibration transmissibility and DPRFs. The results of the analysis show that the combined method can theoretically result in a unique and valid solution of the model parameters, at least for linear systems. However, the validation of the method itself does not guarantee the validation of the calibrated model, because the validation of the calibration also depends on the model structure and the reliability and appropriate representation of the reference functions. The basic theory developed in this study is also applicable to the vibration analyses of other structures.
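
    The identity at the heart of this abstract can be written compactly for a linear lumped-parameter model; the LaTeX sketch below is a hedged reconstruction consistent with the abstract's wording, not an excerpt from the paper.

      % Force balance for N lumped masses m_i driven at a single point,
      % with a_0 the driving-point acceleration and T_i the transmissibility
      % of mass element i:
      \begin{align}
        F(j\omega) &= \sum_{i=1}^{N} m_i \, a_i(j\omega), &
        T_i(j\omega) &= \frac{a_i(j\omega)}{a_0(j\omega)},
      \end{align}
      % so the driving-point apparent mass (one form of DPRF) is a
      % mass-weighted linear combination of the element transmissibilities:
      \begin{equation}
        M(j\omega) = \frac{F(j\omega)}{a_0(j\omega)}
                   = \sum_{i=1}^{N} m_i \, T_i(j\omega).
      \end{equation}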

  1. Development and validation of a piloted simulation of a helicopter and external sling load

    NASA Technical Reports Server (NTRS)

    Shaughnessy, J. D.; Deaux, T. N.; Yenni, K. R.

    1979-01-01

    A generalized, real time, piloted, visual simulation of a single rotor helicopter, suspension system, and external load is described and validated for the full flight envelope of the U.S. Army CH-54 helicopter and cargo container as an example. The mathematical model described uses modified nonlinear classical rotor theory for both the main rotor and tail rotor, nonlinear fuselage aerodynamics, an elastic suspension system, nonlinear load aerodynamics, and a load-ground contact model. The implementation of the mathematical model on a large digital computing system is described, and validation of the simulation is discussed. The mathematical model is validated by comparing measured flight data with simulated data, by comparing linearized system matrices, eigenvalues, and eigenvectors with manufacturers' data, and by the subjective comparison of handling characteristics by experienced pilots. A visual landing display system for use in simulation which generates the pilot's forward-looking real-world display was examined and a special head-up, down-looking load/landing zone display is described.

  2. A model for plant lighting system selection.

    PubMed

    Ciolkosz, D E; Albright, L D; Sager, J C; Langhans, R W

    2002-01-01

    A decision model is presented that compares lighting systems for a plant growth scenario and chooses the most appropriate system from a given set of possible choices. The model utilizes a Multiple Attribute Utility Theory approach, and incorporates expert input and performance simulations to calculate a utility value for each lighting system being considered. The system with the highest utility is deemed the most appropriate system. The model was applied to a greenhouse scenario, and analyses were conducted to test the model's output for validity. Parameter variation indicates that the model performed as expected. Analysis of model output indicates that differences in utility among the candidate lighting systems were sufficiently large to give confidence that the model's order of selection was valid.
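
    A bare-bones version of the additive multi-attribute utility calculation described above is sketched in Python below; the attribute set, weights, and scores are placeholders, not the paper's elicited values.

      WEIGHTS = {"energy_cost": 0.40, "uniformity": 0.35, "capital_cost": 0.25}

      # Single-attribute utilities on [0, 1], e.g. from expert input plus simulation.
      SYSTEMS = {
          "HPS lamps":         {"energy_cost": 0.55, "uniformity": 0.70, "capital_cost": 0.80},
          "fluorescent lamps": {"energy_cost": 0.40, "uniformity": 0.85, "capital_cost": 0.60},
          "LED supplement":    {"energy_cost": 0.90, "uniformity": 0.75, "capital_cost": 0.35},
      }

      def utility(scores):
          """Additive multi-attribute utility: weighted sum of attribute utilities."""
          return sum(WEIGHTS[a] * u for a, u in scores.items())

      for name, scores in SYSTEMS.items():
          print(f"{name:18s} U = {utility(scores):.3f}")
      print("selected:", max(SYSTEMS, key=lambda s: utility(SYSTEMS[s])))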

  3. Validity and Reliability of Chemistry Systemic Multiple Choices Questions (CSMCQs)

    ERIC Educational Resources Information Center

    Priyambodo, Erfan; Marfuatun

    2016-01-01

    Nowadays, Rasch model analysis is used widely in social research, and especially in educational research. In this research, the Rasch model is used to determine the validity and the reliability of systemic multiple choice questions in chemistry teaching and learning. There were 30 multiple choice questions with a systemic approach for high school student…
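
    For readers unfamiliar with the model, the dichotomous Rasch item response function is compact enough to state directly. The Python sketch below is generic; the ability and difficulty values are made up, and the study itself analysed its 30 items with dedicated Rasch software.

      import math

      def rasch_p(theta, b):
          """P(correct) = exp(theta - b) / (1 + exp(theta - b))."""
          return 1.0 / (1.0 + math.exp(-(theta - b)))

      # Rows: hypothetical item difficulties b; columns: abilities theta in -2..+2.
      for b in (-1.0, 0.0, 1.5):
          print(b, [round(rasch_p(th, b), 2) for th in (-2, -1, 0, 1, 2)])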

  4. A Hardware Model Validation Tool for Use in Complex Space Systems

    NASA Technical Reports Server (NTRS)

    Davies, Misty Dawn; Gundy-Burlet, Karen L.; Limes, Gregory L.

    2010-01-01

    One of the many technological hurdles that must be overcome in future missions is the challenge of validating as-built systems against the models used for design. We propose a technique composed of intelligent parameter exploration in concert with automated failure analysis as a scalable method for the validation of complex space systems. The technique is impervious to discontinuities and linear dependencies in the data, and can handle dimensionalities consisting of hundreds of variables over tens of thousands of experiments.
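
    One plausible reading of "intelligent parameter exploration in concert with automated failure analysis" is sketched below: sample the model's input space, label the failed runs, and fit an interpretable classifier that exposes which parameter regions drive failure. The stand-in simulator, the sampling scheme, and the choice of a decision tree are all assumptions for illustration, not the authors' tool.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier, export_text

      rng = np.random.default_rng(0)
      X = rng.uniform(-1.0, 1.0, size=(20_000, 5))   # 5 parameters, 20k experiments

      def simulate_fails(x):
          # Hypothetical as-built behavior: failures in one corner of the space.
          return (x[:, 0] > 0.6) & (x[:, 3] < -0.4)

      y = simulate_fails(X)
      tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
      print(export_text(tree, feature_names=[f"p{i}" for i in range(5)]))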

  5. 40 CFR Table 6 to Subpart Bbbb of... - Model Rule-Requirements for Validating Continuous Emission Monitoring Systems (CEMS)

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...Continuous Emission Monitoring Systems (CEMS) 6 Table 6 to Subpart BBBB of Part 60, Protection of Environment...—Requirements for Validating Continuous Emission Monitoring Systems (CEMS). For the following continuous emission monitoring systems, use the following methods in appendix A of this part to validate pollutant concentration...

  6. 40 CFR Table 6 to Subpart Bbbb of... - Model Rule-Requirements for Validating Continuous Emission Monitoring Systems (CEMS)

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...Continuous Emission Monitoring Systems (CEMS) 6 Table 6 to Subpart BBBB of Part 60, Protection of Environment...—Requirements for Validating Continuous Emission Monitoring Systems (CEMS). For the following continuous emission monitoring systems, use the following methods in appendix A of this part to validate pollutant concentration...

  7. From control to causation: Validating a 'complex systems model' of running-related injury development and prevention.

    PubMed

    Hulme, A; Salmon, P M; Nielsen, R O; Read, G J M; Finch, C F

    2017-11-01

    There is a need for an ecological and complex systems approach for better understanding the development and prevention of running-related injury (RRI). In a previous article, we proposed a prototype model of the Australian recreational distance running system which was based on the Systems Theoretic Accident Mapping and Processes (STAMP) method. That model included the influence of political, organisational, managerial, and sociocultural determinants alongside individual-level factors in relation to RRI development. The purpose of this study was to validate that prototype model by drawing on the expertise of both systems thinking and distance running experts. This study used a modified Delphi technique involving a series of online surveys (December 2016 to March 2017). The initial survey was divided into four sections containing a total of seven questions pertaining to different features associated with the prototype model. Consensus in opinion about the validity of the prototype model was reached when the number of experts who agreed or disagreed with a survey statement was ≥75% of the total number of respondents. A total of two Delphi rounds was needed to validate the prototype model. Out of a total of 51 experts who were initially contacted, 50.9% (n = 26) completed the first round of the Delphi, and 92.3% (n = 24) of those in the first round participated in the second. Most of the 24 full participants considered themselves to be a running expert (66.7%), and approximately a third indicated their expertise as a systems thinker (33.3%). After the second round, 91.7% of the experts agreed that the prototype model was a valid description of the Australian distance running system. This is the first study to formally examine the development and prevention of RRI from an ecological and complex systems perspective. The validated model of the Australian distance running system facilitates theoretical advancement in terms of identifying practical system-wide opportunities for the implementation of sustainable RRI prevention interventions. This 'big picture' perspective represents the first step required when thinking about the range of contributory causal factors that affect other system elements, as well as runners' behaviours in relation to RRI risk.

  8. A Baseline Patient Model to Support Testing of Medical Cyber-Physical Systems.

    PubMed

    Silva, Lenardo C; Perkusich, Mirko; Almeida, Hyggo O; Perkusich, Angelo; Lima, Mateus A M; Gorgônio, Kyller C

    2015-01-01

    Medical Cyber-Physical Systems (MCPS) are currently a trending topic of research. The main challenges are related to the integration and interoperability of connected medical devices, patient safety, physiologic closed-loop control, and the verification and validation of these systems. In this paper, we focus on patient safety and MCPS validation. We present a formal patient model to be used in health care systems validation without jeopardizing the patient's health. To determine the basic patient conditions, our model considers the four main vital signs: heart rate, respiratory rate, blood pressure and body temperature. To generate the vital signs we used regression models based on statistical analysis of a clinical database. Our solution should be used as a starting point for a behavioral patient model and adapted to specific clinical scenarios. We present the modeling process of the baseline patient model and show its evaluation. The conception process may be used to build different patient models. The results show the feasibility of the proposed model as an alternative to the immediate need for clinical trials to test these medical systems.
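
    A toy version of a regression-driven vital-sign generator is sketched below in Python. The coefficients are invented; the paper fitted its regression models to a clinical database.

      import numpy as np

      rng = np.random.default_rng(42)

      def heart_rate(age_years, t_minutes):
          """Hypothetical linear model: baseline HR drifts with age, plus noise."""
          baseline = 82.0 - 0.25 * age_years                        # invented fit
          circadian = 3.0 * np.sin(2 * np.pi * t_minutes / 1440.0)  # daily cycle
          return baseline + circadian + rng.normal(0.0, 2.0)        # sensor noise

      # Stream one reading per minute for a hypothetical 40-year-old patient.
      for t in range(5):
          print(f"t={t} min  HR={heart_rate(40, t):.1f} bpm")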

  9. Validation of the measure automobile emissions model : a statistical analysis

    DOT National Transportation Integrated Search

    2000-09-01

    The Mobile Emissions Assessment System for Urban and Regional Evaluation (MEASURE) model provides an external validation capability for the hot stabilized option; the model is one of several new modal emissions models designed to predict hot stabilized e...

  10. Validation of reactive gases and aerosols in the MACC global analysis and forecast system

    NASA Astrophysics Data System (ADS)

    Eskes, H.; Huijnen, V.; Arola, A.; Benedictow, A.; Blechschmidt, A.-M.; Botek, E.; Boucher, O.; Bouarar, I.; Chabrillat, S.; Cuevas, E.; Engelen, R.; Flentje, H.; Gaudel, A.; Griesfeller, J.; Jones, L.; Kapsomenakis, J.; Katragkou, E.; Kinne, S.; Langerock, B.; Razinger, M.; Richter, A.; Schultz, M.; Schulz, M.; Sudarchikova, N.; Thouret, V.; Vrekoussis, M.; Wagner, A.; Zerefos, C.

    2015-02-01

    The European MACC (Monitoring Atmospheric Composition and Climate) project is preparing the operational Copernicus Atmosphere Monitoring Service (CAMS), one of the services of the European Copernicus Programme on Earth observation and environmental services. MACC uses data assimilation to combine in situ and remote sensing observations with global and regional models of atmospheric reactive gases, aerosols and greenhouse gases, and is based on the Integrated Forecasting System of the ECMWF. The global component of the MACC service has a dedicated validation activity to document the quality of the atmospheric composition products. In this paper we discuss the approach to validation that has been developed over the past three years. Topics discussed are the validation requirements, the operational aspects, the measurement data sets used, the structure of the validation reports, the models and assimilation systems validated, the procedure to introduce new upgrades, and the scoring methods. One specific target of the MACC system concerns forecasting special events with high pollution concentrations. Such events receive extra attention in the validation process. Finally, a summary is provided of the results from the validation of the latest set of daily global analysis and forecast products from the MACC system reported in November 2014.

  11. Flight-Test Validation and Flying Qualities Evaluation of a Rotorcraft UAV Flight Control System

    NASA Technical Reports Server (NTRS)

    Mettler, Bernard; Tuschler, Mark B.; Kanade, Takeo

    2000-01-01

    This paper presents a process of design, flight-test validation, and flying qualities evaluation of a flight control system for a rotorcraft-based unmanned aerial vehicle (RUAV). The keystone of this process is an accurate flight-dynamic model of the aircraft, derived by using system identification modeling. The model captures the most relevant dynamic features of our unmanned rotorcraft, and explicitly accounts for the presence of a stabilizer bar. Using the identified model we were able to determine the performance margins of our original control system and identify limiting factors. The performance limitations were addressed and the attitude control system was optimized for three different performance levels: slow, medium, and fast. The optimized control laws will be implemented in our RUAV. We will first determine the validity of our control design approach by flight-test validating our optimized controllers. Subsequently, we will fly a series of maneuvers with the three optimized controllers to determine the level of flying qualities that can be attained. The outcomes will enable us to draw important conclusions on the flying qualities requirements for small-scale RUAVs.

  12. Model-Based Verification and Validation of Spacecraft Avionics

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Sievers, Michael; Standley, Shaun

    2012-01-01

    Verification and Validation (V&V) at JPL is traditionally performed on flight or flight-like hardware running flight software. For some time, the complexity of avionics has increased exponentially while the time allocated for system integration and associated V&V testing has remained fixed. There is an increasing need to perform comprehensive system level V&V using modeling and simulation, and to use scarce hardware testing time to validate models; the norm for thermal and structural V&V for some time. Our approach extends model-based V&V to electronics and software through functional and structural models implemented in SysML. We develop component models of electronics and software that are validated by comparison with test results from actual equipment. The models are then simulated enabling a more complete set of test cases than possible on flight hardware. SysML simulations provide access and control of internal nodes that may not be available in physical systems. This is particularly helpful in testing fault protection behaviors when injecting faults is either not possible or potentially damaging to the hardware. We can also model both hardware and software behaviors in SysML, which allows us to simulate hardware and software interactions. With an integrated model and simulation capability we can evaluate the hardware and software interactions and identify problems sooner. The primary missing piece is validating SysML model correctness against hardware; this experiment demonstrated such an approach is possible.

  13. Hierarchical multi-scale approach to validation and uncertainty quantification of hyper-spectral image modeling

    NASA Astrophysics Data System (ADS)

    Engel, Dave W.; Reichardt, Thomas A.; Kulp, Thomas J.; Graff, David L.; Thompson, Sandra E.

    2016-05-01

    Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.

  14. Cultural Geography Model Validation

    DTIC Science & Technology

    2010-03-01

    ...the Cultural Geography Model (CGM), a government-owned, open-source multi-agent system utilizing Bayesian networks, queuing systems, the Theory of... referent determined either from theory or SME opinion. CGM Overview: The CGM is a government-owned, open-source, data-driven multi-agent social... HSCB, validation, social network analysis. ABSTRACT: In the current warfighting environment, the military needs robust modeling and simulation (M&S...

  15. Prototype of NASA's Global Precipitation Measurement Mission Ground Validation System

    NASA Technical Reports Server (NTRS)

    Schwaller, M. R.; Morris, K. R.; Petersen, W. A.

    2007-01-01

    NASA is developing a Ground Validation System (GVS) as one of its contributions to the Global Precipitation Mission (GPM). The GPM GVS provides an independent means for evaluation, diagnosis, and ultimately improvement of GPM spaceborne measurements and precipitation products. NASA's GPM GVS consists of three elements: field campaigns/physical validation, direct network validation, and modeling and simulation. The GVS prototype of direct network validation compares Tropical Rainfall Measuring Mission (TRMM) satellite-borne radar data to similar measurements from the U.S. national network of operational weather radars. A prototype field campaign has also been conducted; modeling and simulation prototypes are under consideration.

  16. Validation Test Report for the 1/8 deg Global Navy Coastal Ocean Model Nowcast/Forecast System

    DTIC Science & Technology

    2007-01-24

    Test Report for the 1/8° Global Navy Coastal Ocean Model Nowcast/Forecast System, Charlie N. Barron, A. Birol Kara, Robert C. Rhodes, Clark Rowley... OF ACRONYMS... VALIDATION TEST REPORT FOR THE 1/8° GLOBAL NAVY COASTAL...

  17. Validating the Technology Acceptance Model in the Context of the Laboratory Information System-Electronic Health Record Interface System

    ERIC Educational Resources Information Center

    Aquino, Cesar A.

    2014-01-01

    This study represents research validating the efficacy of Davis' Technology Acceptance Model (TAM) by pairing it with the Organizational Change Readiness Theory (OCRT) to develop another extension to the TAM, using the medical Laboratory Information Systems (LIS)-Electronic Health Records (EHR) interface as the medium. The TAM posits that it is…

  18. Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation & Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsao, Jeffrey Y.; Trucano, Timothy G.; Kleban, Stephen D.

    This report contains the written footprint of a Sandia-hosted workshop held in Albuquerque, New Mexico, June 22-23, 2016 on “Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation and Uncertainty Quantification,” as well as of pre-work that fed into the workshop. The workshop’s intent was to explore and begin articulating research opportunities at the intersection between two important Sandia communities: the complex systems (CS) modeling community, and the verification, validation and uncertainty quantification (VVUQ) community. The overarching research opportunity (and challenge) that we ultimately hope to address is: how can we quantify the credibility of knowledge gained from complex systems models, knowledge that is often incomplete and interim, but will nonetheless be used, sometimes in real-time, by decision makers?

  19. Quantitative model validation of manipulative robot systems

    NASA Astrophysics Data System (ADS)

    Kartowisastro, Iman Herwidiana

    This thesis is concerned with applying the distortion quantitative validation technique to a robot manipulative system with revolute joints. Using the distortion technique to validate a model quantitatively, the model parameter uncertainties are taken into account in assessing the faithfulness of the model, and this approach is more objective than the more common visual comparison method. The industrial robot is represented by the TQ MA2000 robot arm. Details of the mathematical derivation of the distortion technique are given, explaining the required distortion of the constant parameters within the model and the assessment of model adequacy. Due to the complexity of a robot model, only the first three degrees of freedom are considered, where all links are assumed rigid. The modelling involves the Newton-Euler approach to obtain the dynamics model, and the Denavit-Hartenberg convention is used throughout the work. The conventional feedback control system is used in developing the model. The system's sensitivity to parameter changes is investigated, as some parameters are redundant. This work matters because it allows the most important parameters to be distorted to be selected, leading to a new term: the fundamental parameters. The transfer function approach has been chosen to validate an industrial robot quantitatively against the measured data due to its practicality. Initially, the assessment of the model fidelity criterion indicated that the model was not capable of explaining the transient record in terms of the model parameter uncertainties. Further investigations led to significant improvements of the model and better understanding of the model properties. After several improvements in the model, the fidelity criterion obtained was almost satisfied. Although the fidelity criterion is slightly less than unity, it has been shown that the distortion technique can be applied to a robot manipulative system. Using the validated model, the importance of friction terms in the model was highlighted with the aid of the partition control technique. It was also shown that the conventional feedback control scheme was insufficient for a robot manipulative system due to the high nonlinearity inherent in the robot manipulator.
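
    The gist of the distortion technique can be conveyed with a one-parameter toy problem: find the smallest relative change of the nominal parameter that lets the model reproduce the measured record, then read model adequacy from the size of that required distortion. The Python sketch below is illustrative only and bears no relation to the MA2000 dynamics.

      import numpy as np
      from scipy.optimize import minimize_scalar

      t = np.linspace(0.0, 2.0, 200)
      measured = np.exp(-1.3 * t)     # stand-in for a measured transient record
      p_nominal = 1.0                 # nominal model parameter

      def residual(p):
          return np.sum((np.exp(-p * t) - measured) ** 2)

      res = minimize_scalar(residual, bounds=(0.1, 5.0), method="bounded")
      distortion = abs(res.x - p_nominal) / p_nominal
      print(f"required distortion: {distortion:.1%}")  # large => model inadequate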

  20. Validation of Groundwater Models: Meaningful or Meaningless?

    NASA Astrophysics Data System (ADS)

    Konikow, L. F.

    2003-12-01

    Although numerical simulation models are valuable tools for analyzing groundwater systems, their predictive accuracy is limited. People who apply groundwater flow or solute-transport models, as well as those who make decisions based on model results, naturally want assurance that a model is "valid." To many people, model validation implies some authentication of the truth or accuracy of the model. History matching is often presented as the basis for model validation. Although such model calibration is a necessary modeling step, it is simply insufficient for model validation. Because of parameter uncertainty and solution non-uniqueness, declarations of validation (or verification) of a model are not meaningful. Post-audits represent a useful means to assess the predictive accuracy of a site-specific model, but they require the existence of long-term monitoring data. Model testing may yield invalidation, but that is an opportunity to learn and to improve the conceptual and numerical models. Examples of post-audits and of the application of a solute-transport model to a radioactive waste disposal site illustrate deficiencies in model calibration, prediction, and validation.

  1. Development and Validation of Methodology to Model Flow in Ventilation Systems Commonly Found in Nuclear Facilities. Phase I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strons, Philip; Bailey, James L.; Davis, John

    2016-03-01

    In this work, we apply CFD to model airflow and particulate transport. This modeling is then compared to field validation studies to both inform and validate the modeling assumptions. Based on the results of field tests, modeling assumptions and boundary conditions are refined and the process is repeated until the results are found to be reliable with a high level of confidence.

  2. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    NASA Astrophysics Data System (ADS)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most of the existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used as guidance for utility engineers and researchers to systematically examine load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
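
    In the spirit of the paper's statistical (rather than visual) checking, the Python sketch below attaches a confidence interval to the model-versus-measurement error. The file names are hypothetical, and the proposed framework involves considerably more than this single metric.

      import numpy as np
      from scipy import stats

      measured = np.loadtxt("feeder_measurements.csv", delimiter=",")  # hypothetical
      modeled = np.loadtxt("model_response.csv", delimiter=",")        # hypothetical

      err = modeled - measured
      mean, sem = err.mean(), stats.sem(err)
      lo, hi = stats.t.interval(0.95, df=err.size - 1, loc=mean, scale=sem)
      print(f"mean error {mean:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
      # A CI that excludes the accuracy target flags the model for recalibration.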

  3. Modeling the Earth System, volume 3

    NASA Technical Reports Server (NTRS)

    Ojima, Dennis (Editor)

    1992-01-01

    The topics covered fall under the following headings: critical gaps in the Earth system conceptual framework; development needs for simplified models; and validating Earth system models and their subcomponents.

  4. Cost-effective and business-beneficial computer validation for bioanalytical laboratories.

    PubMed

    McDowall, R. D.

    2011-07-01

    Computerized system validation is often viewed as a burden and a waste of time to meet regulatory requirements. This article presents a different approach by looking at validation in a bioanalytical laboratory from the perspective of the business benefits that computer validation can bring. Ask yourself the question: have you ever bought a computerized system that did not meet your initial expectations? This article will look at understanding the process to be automated, the paper to be eliminated and the records to be signed to meet the requirements of the GLP or GCP and Part 11 regulations. This paper will only consider commercial nonconfigurable and configurable software such as plate readers and LC-MS/MS data systems rather than LIMS or custom applications. Two streamlined life cycle models are presented. The first one consists of a single document for validation of nonconfigurable software. The second is for configurable software and is a five-stage model that avoids the need to write functional and design specifications. Both models are aimed at managing the risk each type of software poses whilst reducing the amount of documented evidence required for validation.

  5. Validating a model that predicts daily growth and feed quality of New Zealand dairy pastures.

    PubMed

    Woodward, S J

    2001-09-01

    The Pasture Quality (PQ) model is a simple, mechanistic, dynamical system model that was designed to capture the essential biological processes in grazed grass-clover pasture, and to be optimised to derive improved grazing strategies for New Zealand dairy farms. While the individual processes represented in the model (photosynthesis, tissue growth, flowering, leaf death, decomposition, worms) were based on experimental data, this did not guarantee that the assembled model would accurately predict the behaviour of the system as a whole (i.e., pasture growth and quality). Validation of the whole model was thus a priority, since any strategy derived from the model could impact a farm business in the order of thousands of dollars per annum if adopted. This paper describes the process of defining performance criteria for the model, obtaining suitable data to test the model, and carrying out the validation analysis. The validation process highlighted a number of weaknesses in the model, which will lead to the model being improved. As a result, the model's utility will be enhanced. Furthermore, validation was found to have an unexpected additional benefit, in that despite the model's poor initial performance, support was generated for the model among field scientists involved in the wider project.

  6. Model-Based Verification and Validation of the SMAP Uplink Processes

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun

    2013-01-01

    This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.

  7. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    NASA Technical Reports Server (NTRS)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.

  8. Hierarchical Multi-Scale Approach To Validation and Uncertainty Quantification of Hyper-Spectral Image Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engel, David W.; Reichardt, Thomas A.; Kulp, Thomas J.

    Validating predictive models and quantifying uncertainties inherent in the modeling process is a critical component of the HARD Solids Venture program [1]. Our current research focuses on validating physics-based models predicting the optical properties of solid materials for arbitrary surface morphologies and characterizing the uncertainties in these models. We employ a systematic and hierarchical approach by designing physical experiments and comparing the experimental results with the outputs of computational predictive models. We illustrate this approach through an example comparing a micro-scale forward model to an idealized solid-material system and then propagating the results through a system model to the sensor level. Our efforts should enhance detection reliability of the hyper-spectral imaging technique and the confidence in model utilization and model outputs by users and stakeholders.

  9. Validation of reactive gases and aerosols in the MACC global analysis and forecast system

    NASA Astrophysics Data System (ADS)

    Eskes, H.; Huijnen, V.; Arola, A.; Benedictow, A.; Blechschmidt, A.-M.; Botek, E.; Boucher, O.; Bouarar, I.; Chabrillat, S.; Cuevas, E.; Engelen, R.; Flentje, H.; Gaudel, A.; Griesfeller, J.; Jones, L.; Kapsomenakis, J.; Katragkou, E.; Kinne, S.; Langerock, B.; Razinger, M.; Richter, A.; Schultz, M.; Schulz, M.; Sudarchikova, N.; Thouret, V.; Vrekoussis, M.; Wagner, A.; Zerefos, C.

    2015-11-01

    The European MACC (Monitoring Atmospheric Composition and Climate) project is preparing the operational Copernicus Atmosphere Monitoring Service (CAMS), one of the services of the European Copernicus Programme on Earth observation and environmental services. MACC uses data assimilation to combine in situ and remote sensing observations with global and regional models of atmospheric reactive gases, aerosols, and greenhouse gases, and is based on the Integrated Forecasting System of the European Centre for Medium-Range Weather Forecasts (ECMWF). The global component of the MACC service has a dedicated validation activity to document the quality of the atmospheric composition products. In this paper we discuss the approach to validation that has been developed over the past 3 years. Topics discussed are the validation requirements, the operational aspects, the measurement data sets used, the structure of the validation reports, the models and assimilation systems validated, the procedure to introduce new upgrades, and the scoring methods. One specific target of the MACC system concerns forecasting special events with high-pollution concentrations. Such events receive extra attention in the validation process. Finally, a summary is provided of the results from the validation of the latest set of daily global analysis and forecast products from the MACC system reported in November 2014.

  10. Analytical Formulation for Sizing and Estimating the Dimensions and Weight of Wind Turbine Hub and Drivetrain Components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Y.; Parsons, T.; King, R.

    This report summarizes the theory, verification, and validation of a new sizing tool for wind turbine drivetrain components, the Drivetrain Systems Engineering (DriveSE) tool. DriveSE calculates the dimensions and mass properties of the hub, main shaft, main bearing(s), gearbox, bedplate, transformer if up-tower, and yaw system. The level of fidelity for each component varies depending on whether semiempirical parametric or physics-based models are used. The physics-based models have internal iteration schemes based on system constraints and design criteria. Every model is validated against available industry data or finite-element analysis. The verification and validation results show that the models reasonably capture primary drivers for the sizing and design of major drivetrain components.

  11. Validation of a FAST Model of the SWAY Prototype Floating Offshore Wind Turbine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koh, J. H.; Ng, E. Y. K.; Robertson, Amy

    As part of a collaboration of the National Renewable Energy Laboratory (NREL) and SWAY AS, NREL installed scientific wind, wave, and motion measurement equipment on the spar-type 1/6.5th-scale prototype SWAY floating offshore wind system. The equipment enhanced SWAY's data collection and allowed SWAY to verify the concept and NREL to validate a FAST model of the SWAY design in an open-water condition. Nanyang Technological University (NTU), in collaboration with NREL, assisted with the validation. This final report gives an overview of the SWAY prototype and NREL and NTU's efforts to validate a model of the system. The report provides a summary of the different software tools used in the study, the modeling strategies, and the development of a FAST model of the SWAY prototype wind turbine, including justification of the modeling assumptions. Because of uncertainty in system parameters and modeling assumptions due to the complexity of the design, several system properties were tuned to better represent the system and improve the accuracy of the simulations. Calibration was performed using data from a static equilibrium test and free-decay tests.

  12. Performance Evaluation of a Data Validation System

    NASA Technical Reports Server (NTRS)

    Wong, Edmond (Technical Monitor); Sowers, T. Shane; Santi, L. Michael; Bickford, Randall L.

    2005-01-01

    Online data validation is a performance-enhancing component of modern control and health management systems. It is essential that performance of the data validation system be verified prior to its use in a control and health management system. A new Data Qualification and Validation (DQV) Test-bed application was developed to provide a systematic test environment for this performance verification. The DQV Test-bed was used to evaluate a model-based data validation package known as the Data Quality Validation Studio (DQVS). DQVS was employed as the primary data validation component of a rocket engine health management (EHM) system developed under NASA's NGLT (Next Generation Launch Technology) program. In this paper, the DQVS and DQV Test-bed software applications are described, and the DQV Test-bed verification procedure for this EHM system application is presented. Test-bed results are summarized and implications for EHM system performance improvements are discussed.

  13. Tools for Evaluating Fault Detection and Diagnostic Methods for HVAC Secondary Systems

    NASA Astrophysics Data System (ADS)

    Pourarian, Shokouh

    Although modern buildings are using increasingly sophisticated energy management and control systems that have tremendous control and monitoring capabilities, building systems routinely fail to perform as designed. More advanced building control, operation, and automated fault detection and diagnosis (AFDD) technologies are needed to achieve the goal of net-zero energy commercial buildings. Much effort has been devoted to developing such technologies for primary heating, ventilating, and air conditioning (HVAC) systems and for some secondary systems. However, secondary systems such as fan coil units and dual duct systems, although widely used in commercial, industrial, and multifamily residential buildings, have received very little attention. This research study aims at developing tools that provide simulation capabilities to develop and evaluate advanced control, operation, and AFDD technologies for these less-studied secondary systems. In this study, HVACSIM+ is selected as the simulation environment. Besides developing dynamic models for the above-mentioned secondary systems, two other issues related to the HVACSIM+ environment are also investigated. One issue is the nonlinear equation solver used in HVACSIM+ (Powell's Hybrid method in subroutine SNSQ): several previous research projects (ASHRAE RP 825 and 1312) found that SNSQ is especially unstable at the beginning of a simulation and sometimes unable to converge to a solution. The other issue concerns the zone model in the HVACSIM+ library of components. Dynamic simulation of secondary HVAC systems unavoidably requires a zone model that interacts systematically and dynamically with the building's surroundings, so the accuracy and reliability of the zone model affects the operational data that the developed dynamic tool generates to predict the behavior of HVAC secondary systems. The available model does not simulate the impact of direct solar radiation entering a zone through glazing, and the zone model study is conducted in this direction to modify the existing model. In this research project, the following tasks are completed and summarized in this report: 1. Develop dynamic simulation models in the HVACSIM+ environment for common fan coil unit and dual duct system configurations; the developed simulation models are able to produce both fault-free and faulty operational data under a wide variety of faults and severity levels for advanced control, operation, and AFDD technology development and evaluation purposes. 2. Develop a model structure, including the grouping of blocks and superblocks, treatment of state variables, initial and boundary conditions, and selection of equation solver, that can simulate a dual duct system efficiently with satisfactory stability. 3. Design and conduct a comprehensive and systematic validation procedure using collected experimental data to validate the developed simulation models under both fault-free and faulty operational conditions. 4. Conduct a numerical study to compare two solution techniques, Powell's Hybrid (PH) and Levenberg-Marquardt (LM), in terms of their robustness and accuracy (see the sketch after this record). 5. Modify the thermal state calculation of the existing building zone model in the HVACSIM+ component library; this component is revised to treat the heat transmitted through glazing as a heat source for transient building zone load prediction. In this report, the literature, including existing HVAC dynamic modeling environments and models, HVAC model validation methodologies, and fault modeling and validation methodologies, is reviewed. The overall methodologies used for fault-free and fault model development and validation are introduced. Detailed model development and validation results for the two secondary systems, i.e., the fan coil unit and the dual duct system, are summarized. Experimental data, mostly from the Iowa Energy Center Energy Resource Station, are used to validate the models developed in this project. Satisfactory model performance in both fault-free and fault simulation studies is observed for all studied systems.
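
    The PH/LM comparison in task 4 can be prototyped outside HVACSIM+ with SciPy, whose root finder exposes both MINPACK algorithms: method='hybr' is Powell's hybrid method (the algorithm underlying SNSQ) and method='lm' is Levenberg-Marquardt. The two-equation system below is a toy stand-in for coupled component balances, not an HVACSIM+ model.

      import numpy as np
      from scipy.optimize import root

      def residuals(x):
          # Toy nonlinear system standing in for coupled HVAC component equations.
          f1 = x[0] ** 2 + x[1] ** 2 - 4.0   # e.g., a flow balance
          f2 = np.exp(x[0]) + x[1] - 1.0     # e.g., a heat balance
          return [f1, f2]

      x0 = [1.0, 1.0]
      for method in ("hybr", "lm"):   # Powell's hybrid vs. Levenberg-Marquardt
          sol = root(residuals, x0, method=method)
          print(f"{method}: converged={sol.success}, x={sol.x}, "
                f"max residual={np.max(np.abs(sol.fun)):.2e}")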

  14. Similarity Metrics for Closed Loop Dynamic Systems

    NASA Technical Reports Server (NTRS)

    Whorton, Mark S.; Yang, Lee C.; Bedrossian, Naz; Hall, Robert A.

    2008-01-01

    To what extent and in what ways can two closed-loop dynamic systems be said to be "similar?" This question arises in a wide range of dynamic systems modeling and control system design applications. For example, bounds on error models are fundamental to controller optimization in modern control design methods. Metrics such as the structured singular value are direct measures of the degree to which properties such as stability or performance are maintained in the presence of specified uncertainties or variations in the plant model. Similarly, controls-related areas such as system identification, model reduction, and experimental model validation employ measures of similarity between multiple realizations of a dynamic system. Each area has its tools and approaches, with each tool more or less suited for one application or the other. Similarity in the context of closed-loop model validation via flight test is subtly different from error measures in the typical controls-oriented application. Whereas similarity in a robust control context relates to plant variation and the attendant effect on stability and performance, in this context similarity metrics are sought that assess the relevance of a dynamic system test for the purpose of validating the stability and performance of a "similar" dynamic system. Similarity in the context of system identification is much more relevant than are robust control analogies, in that errors between one dynamic system (the test article) and another (the nominal "design" model) are sought for the purpose of bounding the validity of a model for control design and analysis. Yet system identification typically involves open-loop plant models which are independent of the control system (with the exception of limited developments in closed-loop system identification, which are nonetheless focused on obtaining open-loop plant models from closed-loop data). Moreover, the objectives of system identification are not the same as those of a flight test, and hence system identification error metrics are not directly relevant. In applications such as launch vehicles, where the open-loop plant is unstable, it is the similarity of the closed-loop system dynamics of a flight test that is relevant.
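
    One concrete instance of such a metric is the peak frequency-response difference between two closed-loop systems, an H-infinity-style measure. A minimal sketch, using two second-order transfer functions as hypothetical stand-ins for a design model and a test article:

      import numpy as np
      from scipy import signal

      # Two closed-loop transfer functions that differ slightly in damping.
      sys_a = signal.TransferFunction([1.0], [1.0, 0.8, 1.0])
      sys_b = signal.TransferFunction([1.0], [1.0, 1.0, 1.0])

      w = np.logspace(-2, 2, 2000)
      _, h_a = signal.freqresp(sys_a, w)
      _, h_b = signal.freqresp(sys_b, w)

      # Peak magnitude of the difference over frequency: a simple measure of
      # how far apart the two closed-loop responses are.
      print(f"sup_w |H_a - H_b| = {np.max(np.abs(h_a - h_b)):.4f}")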

  15. A Bayesian framework for adaptive selection, calibration, and validation of coarse-grained models of atomistic systems

    NASA Astrophysics Data System (ADS)

    Farrell, Kathryn; Oden, J. Tinsley; Faghihi, Danial

    2015-08-01

    A general adaptive modeling algorithm for selection and validation of coarse-grained models of atomistic systems is presented. A Bayesian framework is developed to address uncertainties in parameters, data, and model selection. Algorithms for computing output sensitivities to parameter variances, model evidence and posterior model plausibilities for given data, and for computing what are referred to as Occam Categories in reference to a rough measure of model simplicity, make up components of the overall approach. Computational results are provided for representative applications.
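
    A minimal sketch of the model-selection step, assuming the per-model log-evidences have already been computed elsewhere (the numbers here are invented): posterior plausibilities follow from Bayes' rule over the candidate set.

      import numpy as np

      # Log-evidences log p(D | M_j) for three candidate coarse-grained models,
      # assumed here to have been computed already (e.g., by nested sampling).
      log_evidence = np.array([-1052.3, -1049.8, -1050.6])
      log_prior = np.log(np.array([1 / 3, 1 / 3, 1 / 3]))  # uniform model prior

      # Posterior plausibility: rho_j proportional to p(D | M_j) p(M_j),
      # normalized over the model set.
      log_post = log_evidence + log_prior
      log_post -= np.max(log_post)            # stabilize the exponentials
      plaus = np.exp(log_post) / np.exp(log_post).sum()
      for j, p in enumerate(plaus):
          print(f"model M{j}: plausibility = {p:.3f}")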

  16. Documentation and Validation of the Goddard Earth Observing System (GEOS) Data Assimilation System, Version 4

    NASA Technical Reports Server (NTRS)

    Suarez, Max J. (Editor); daSilva, Arlindo; Dee, Dick; Bloom, Stephen; Bosilovich, Michael; Pawson, Steven; Schubert, Siegfried; Wu, Man-Li; Sienkiewicz, Meta; Stajner, Ivanka

    2005-01-01

    This document describes the structure and validation of a frozen version of the Goddard Earth Observing System Data Assimilation System (GEOS DAS): GEOS-4.0.3. Significant features of GEOS-4 include: version 3 of the Community Climate Model (CCM3) with the addition of a finite volume dynamical core; version two of the Community Land Model (CLM2); the Physical-space Statistical Analysis System (PSAS); and an interactive retrieval system (iRET) for assimilating TOVS radiance data. Upon completion of the GEOS-4 validation in December 2003, GEOS-4 became operational on 15 January 2004. Products from GEOS-4 have been used in supporting field campaigns and for reprocessing several years of data for CERES.

  17. LADAR Performance Simulations with a High Spectral Resolution Atmospheric Transmittance and Radiance Model-LEEDR

    DTIC Science & Technology

    2012-03-01

    such as FASCODE is accomplished. The assessment is limited by the correctness of the models used; validating the models is beyond the scope of this... comparisons with other models and validation against data sets (Snell et al. 2000). Several LADAR simulations have been produced... performance models would better capture the atmospheric physics and climatological effects on these systems. Also, further validation needs to be performed

  18. Development and validation of a nursing professionalism evaluation model in a career ladder system.

    PubMed

    Kim, Yeon Hee; Jung, Young Sun; Min, Ja; Song, Eun Young; Ok, Jung Hui; Lim, Changwon; Kim, Kyunghee; Kim, Ji-Su

    2017-01-01

    The clinical ladder system categorizes and rewards degrees of nursing professionalism and is an important human resources tool for nursing management. We developed a model to evaluate nursing professionalism, which determines the clinical ladder system levels, and verified its validity. Data were collected using a clinical competence tool developed in this study, together with existing methods such as the nursing professionalism evaluation tool, peer reviews, and face-to-face interviews used to evaluate promotions and verify the presented content in a medical institution. Reliability and convergent and discriminant validity of the clinical competence evaluation tool were verified using SmartPLS software. The validity of the model for evaluating overall nursing professionalism was also analyzed. Clinical competence was determined by five dimensions of nursing practice: scientific, technical, ethical, aesthetic, and existential. The structural model explained 66% of the variance. Clinical competence scales, peer reviews, and face-to-face interviews directly determined nursing professionalism levels. The evaluation system can be used for evaluating nurses' professionalism in actual medical institutions from a nursing practice perspective. A conceptual framework for establishing a human resources management system for nurses and a tool for evaluating nursing professionalism at medical institutions are provided.

  19. Object-oriented simulation model of a parabolic trough solar collector: Static and dynamic validation

    NASA Astrophysics Data System (ADS)

    Ubieta, Eduardo; Hoyo, Itzal del; Valenzuela, Loreto; Lopez-Martín, Rafael; Peña, Víctor de la; López, Susana

    2017-06-01

    A simulation model of a parabolic-trough solar collector developed in the Modelica® language is calibrated and validated. The calibration is performed in order to approximate the behavior of the solar collector model to that of a real one, given the uncertainty in some of the system parameters; measured data are used during the calibration process. Afterwards, the validation of this calibrated model is performed by comparing the results obtained from the model with those obtained during real operation of a collector at the Plataforma Solar de Almería (PSA).

  20. Skill assessment of the coupled physical-biogeochemical operational Mediterranean Forecasting System

    NASA Astrophysics Data System (ADS)

    Cossarini, Gianpiero; Clementi, Emanuela; Salon, Stefano; Grandi, Alessandro; Bolzon, Giorgio; Solidoro, Cosimo

    2016-04-01

    The Mediterranean Monitoring and Forecasting Centre (Med-MFC) is one of the regional production centres of the European Marine Environment Monitoring Service (CMEMS-Copernicus). Med-MFC operationally manages a suite of numerical model systems (3DVAR-NEMO-WW3 and 3DVAR-OGSTM-BFM) that provides gridded datasets of physical and biogeochemical variables for the Mediterranean marine environment with a horizontal resolution of about 6.5 km. At the present stage, the operational Med-MFC produces a ten-day forecast: daily for physical parameters and bi-weekly for biogeochemical variables. The validation of the coupled model system and the estimation of the accuracy of model products are key issues to ensure reliable information for users and downstream services. Product quality activities at Med-MFC consist of two levels of validation and skill analysis procedures. Pre-operational qualification activities focus on testing the improvement in quality of a new release of the model system and rely on past simulations and historical data. Near-real-time (NRT) validation activities then aim at routine, on-line skill assessment of the model forecast and rely on the NRT available observations. The Med-MFC validation framework uses both independent data (i.e. Bio-Argo float data; in situ mooring and vessel data of oxygen, nutrients and chlorophyll; moored buoys, tide gauges and ADCP measurements of temperature, salinity, sea level and velocity) and semi-independent data (i.e. data already used for assimilation, such as satellite chlorophyll, satellite SLA and SST, and in situ vertical profiles of temperature and salinity from XBT, Argo and gliders). We give evidence that different variables (e.g. CMEMS products) can be validated at different levels (i.e. at the forecast level or at the level of model consistency) and at different spatial and temporal scales. The fundamental physical parameters temperature, salinity and sea level are routinely validated on a daily, weekly and quarterly basis at regional and sub-regional scale and along specific vertical layers (temperature and salinity), while velocity fields are validated daily against in situ coastal moorings. Since the velocity skill cannot be accurately assessed through coastal measurements at the current model horizontal resolution (~6.5 km), new validation metrics and procedures are under investigation. Chlorophyll is the only biogeochemical variable that can be validated routinely at the temporal and spatial scale of the weekly forecast, while nutrient and oxygen predictions can be validated locally or at sub-basin and seasonal scales. For the other biogeochemical variables (i.e. primary production, carbonate system variables) only the accuracy of the average dynamics and model consistency can be evaluated. Finally, we discuss the limiting factors of the present validation framework, and the quality and extension of the observing system that would be needed to improve the reliability of the physical and biogeochemical Mediterranean forecast services.

  1. Parametric model of servo-hydraulic actuator coupled with a nonlinear system: Experimental validation

    NASA Astrophysics Data System (ADS)

    Maghareh, Amin; Silva, Christian E.; Dyke, Shirley J.

    2018-05-01

    Hydraulic actuators play a key role in experimental structural dynamics. In a previous study, a physics-based model for a servo-hydraulic actuator coupled with a nonlinear physical system was developed. Later, this dynamical model was transformed into controllable canonical form for position tracking control purposes. For this study, a nonlinear device is designed and fabricated to exhibit various nonlinear force-displacement profiles depending on the initial condition and the type of materials used as replaceable coupons. Using this nonlinear system, the controllable canonical dynamical model is experimentally validated for a servo-hydraulic actuator coupled with a nonlinear physical system.

  2. Design and landing dynamic analysis of reusable landing leg for a near-space manned capsule

    NASA Astrophysics Data System (ADS)

    Yue, Shuai; Nie, Hong; Zhang, Ming; Wei, Xiaohui; Gan, Shengyong

    2018-06-01

    To improve the landing performance of a near-space manned capsule under various landing conditions, a novel landing system is designed that employs double-chamber and single-chamber dampers in the primary and auxiliary struts, respectively. A dynamic model of the landing system is established, and the damper parameters are determined by employing the design method. A single-leg drop test with different initial pitch angles is then conducted to compare against and validate the simulation model. Based on the validated simulation model, seven critical landing conditions regarding nine crucial landing responses are found by combining a radial basis function (RBF) surrogate model with the adaptive simulated annealing (ASA) optimization method. Subsequently, the adaptability of the landing system under critical landing conditions is analyzed. The results show that the simulation results match the test results well, which validates the accuracy of the dynamic model. In addition, all of the crucial responses under their corresponding critical landing conditions satisfy the design specifications, demonstrating the feasibility of the landing system.
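
    The surrogate-plus-annealing search can be sketched with SciPy, substituting its dual_annealing routine for the ASA implementation used in the paper. The landing-response function, its two inputs, and the bounds below are hypothetical placeholders for the expensive drop-dynamics simulation.

      import numpy as np
      from scipy.interpolate import RBFInterpolator
      from scipy.optimize import dual_annealing

      rng = np.random.default_rng(2)

      def drop_sim(x):
          # Stand-in for an expensive landing-dynamics simulation:
          # x = (vertical speed, pitch angle) -> peak strut load (arbitrary units).
          return (1.0 + 0.5 * x[..., 0] ** 2 + 0.3 * np.abs(x[..., 1])
                  + 0.2 * x[..., 0] * x[..., 1])

      # Build the RBF surrogate from a modest set of simulation runs.
      X = rng.uniform([-1, -10], [3, 10], size=(60, 2))
      surrogate = RBFInterpolator(X, drop_sim(X), kernel="thin_plate_spline")

      # Search the surrogate for the worst-case (critical) landing condition.
      res = dual_annealing(lambda x: -surrogate(x[None])[0],
                           bounds=[(-1, 3), (-10, 10)], seed=3)
      print(f"critical condition ~ {res.x}, predicted peak load ~ {-res.fun:.2f}")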

  3. Mathematical multi-scale model of the cardiovascular system including mitral valve dynamics. Application to ischemic mitral insufficiency

    PubMed Central

    2011-01-01

    Background Valve dysfunction is a common cardiovascular pathology. Despite significant clinical research, there is little formal study of how valve dysfunction affects overall circulatory dynamics. Validated models would offer the ability to better understand these dynamics and thus optimize diagnosis, as well as surgical and other interventions. Methods A cardiovascular and circulatory system (CVS) model has already been validated in silico, and in several animal model studies. It accounts for valve dynamics using Heaviside functions to simulate a physiologically accurate "open on pressure, close on flow" law. However, it does not consider real-time valve opening dynamics and therefore does not fully capture valve dysfunction, particularly where the dysfunction involves partial closure. This research describes an updated version of this previous closed-loop CVS model that includes the progressive opening of the mitral valve, and is defined over the full cardiac cycle. Results Simulations of the cardiovascular system with a healthy mitral valve are performed, and the global hemodynamic behaviour is studied in comparison with previously validated results. The error between the resulting pressure-volume (PV) loops of the already validated CVS model and the new CVS model that includes the progressive opening of the mitral valve is assessed and remains within typical measurement error and variability. Simulations of ischemic mitral insufficiency are also performed. Pressure-volume loops, transmitral flow evolution and mitral valve aperture area evolution follow reported measurements in shape, amplitude and trends. Conclusions The resulting cardiovascular system model including mitral valve dynamics provides a foundation for clinical validation and the study of valvular dysfunction in vivo. The overall models and results could readily be generalised to other cardiac valves. PMID:21942971

  4. Topological characterization versus synchronization for assessing (or not) dynamical equivalence

    NASA Astrophysics Data System (ADS)

    Letellier, Christophe; Mangiarotti, Sylvain; Sendiña-Nadal, Irene; Rössler, Otto E.

    2018-04-01

    Model validation from experimental data is an important and not trivial topic which is too often reduced to a simple visual inspection of the state portrait spanned by the variables of the system. Synchronization was suggested as a possible technique for model validation. By means of a topological analysis, we revisited this concept with the help of an abstract chemical reaction system and data from two electrodissolution experiments conducted by Jack Hudson's group. The fact that it was possible to synchronize topologically different global models led us to conclude that synchronization is not a recommendable technique for model validation. A short historical preamble evokes Jack Hudson's early career in interaction with Otto E. Rössler.

  5. Active imaging system performance model for target acquisition

    NASA Astrophysics Data System (ADS)

    Espinola, Richard L.; Teaney, Brian; Nguyen, Quang; Jacobs, Eddie L.; Halford, Carl E.; Tofsted, David H.

    2007-04-01

    The U.S. Army RDECOM CERDEC Night Vision & Electronic Sensors Directorate has developed a laser-range-gated imaging system performance model for the detection, recognition, and identification of vehicle targets. The model is based on the established US Army RDECOM CERDEC NVESD sensor performance models of the human system response through an imaging system. The Java-based model, called NVLRG, accounts for active illumination, atmospheric attenuation, and turbulence effects relevant to LRG imagers, such as speckle and scintillation, as well as for the critical sensor and display components. This model can be used to assess the performance of recently proposed active SWIR systems through various trade studies. This paper describes the NVLRG model in detail, discusses the validation of recent model components, presents initial trade study results, and outlines plans to validate and calibrate the end-to-end model with field data through human perception testing.

  6. The Facial Expression Coding System (FACES): Development, Validation, and Utility

    ERIC Educational Resources Information Center

    Kring, Ann M.; Sloan, Denise M.

    2007-01-01

    This article presents information on the development and validation of the Facial Expression Coding System (FACES; A. M. Kring & D. Sloan, 1991). Grounded in a dimensional model of emotion, FACES provides information on the valence (positive, negative) of facial expressive behavior. In 5 studies, reliability and validity data from 13 diverse…

  7. Fluid–Structure Interaction Analysis of Papillary Muscle Forces Using a Comprehensive Mitral Valve Model with 3D Chordal Structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toma, Milan; Jensen, Morten Ø.; Einstein, Daniel R.

    2015-07-17

    Numerical models of native heart valves are being used to study valve biomechanics to aid design and development of repair procedures and replacement devices. These models have evolved from simple two-dimensional approximations to complex three-dimensional, fully coupled fluid-structure interaction (FSI) systems. Such simulations are useful for predicting the mechanical and hemodynamic loading on implanted valve devices. A current challenge for improving the accuracy of these predictions is choosing and implementing modeling boundary conditions. In order to address this challenge, we are utilizing an advanced in-vitro system to validate FSI conditions for the mitral valve system. Explanted ovine mitral valves were mounted in an in-vitro setup, and structural data for the mitral valve were acquired with μCT. Experimental data from the in-vitro ovine mitral valve system were used to validate the computational model. As the valve closes, the hemodynamic data, high-speed leaflet dynamics, and force vectors from the in-vitro system were compared to the results of the FSI simulation computational model. The total force of 2.6 N per papillary muscle is matched by the computational model. In vitro and in vivo force measurements are important in validating and adjusting material parameters in computational models. The simulations can then be used to answer questions that are otherwise not possible to investigate experimentally. This work is important to maximize the validity of computational models of not just the mitral valve, but any biomechanical aspect using computational simulation in designing medical devices.

  8. Fluid-Structure Interaction Analysis of Papillary Muscle Forces Using a Comprehensive Mitral Valve Model with 3D Chordal Structure.

    PubMed

    Toma, Milan; Jensen, Morten Ø; Einstein, Daniel R; Yoganathan, Ajit P; Cochran, Richard P; Kunzelman, Karyn S

    2016-04-01

    Numerical models of native heart valves are being used to study valve biomechanics to aid design and development of repair procedures and replacement devices. These models have evolved from simple two-dimensional approximations to complex three-dimensional, fully coupled fluid-structure interaction (FSI) systems. Such simulations are useful for predicting the mechanical and hemodynamic loading on implanted valve devices. A current challenge for improving the accuracy of these predictions is choosing and implementing modeling boundary conditions. In order to address this challenge, we are utilizing an advanced in vitro system to validate FSI conditions for the mitral valve system. Explanted ovine mitral valves were mounted in an in vitro setup, and structural data for the mitral valve were acquired with μCT. Experimental data from the in vitro ovine mitral valve system were used to validate the computational model. As the valve closes, the hemodynamic data, high speed leaflet dynamics, and force vectors from the in vitro system were compared to the results of the FSI simulation computational model. The total force of 2.6 N per papillary muscle is matched by the computational model. In vitro and in vivo force measurements enable validating and adjusting material parameters to improve the accuracy of computational models. The simulations can then be used to answer questions that are otherwise not possible to investigate experimentally. This work is important to maximize the validity of computational models of not just the mitral valve, but any biomechanical aspect using computational simulation in designing medical devices.

  9. IMPACT: a generic tool for modelling and simulating public health policy.

    PubMed

    Ainsworth, J D; Carruthers, E; Couch, P; Green, N; O'Flaherty, M; Sperrin, M; Williams, R; Asghar, Z; Capewell, S; Buchan, I E

    2011-01-01

    Populations are under-served by local health policies and management of resources. This partly reflects a lack of realistically complex models to enable appraisal of a wide range of potential options. Rising computing power coupled with advances in machine learning and healthcare information now enables such models to be constructed and executed. However, such models are not generally accessible to public health practitioners, who often lack the requisite technical knowledge or skills. Our objective was to design and develop a system for creating, executing and analysing the results of simulated public health and healthcare policy interventions, in ways that are accessible and usable by modellers and policy-makers. The system requirements were captured and analysed in parallel with the statistical method development for the simulation engine. From the resulting software requirement specification the system architecture was designed, implemented and tested. A model for Coronary Heart Disease (CHD) was created and validated against empirical data. The system was successfully used to create and validate the CHD model. The initial validation results show concordance between the simulation results and the empirical data. We have demonstrated the ability to connect health policy modellers and policy-makers in a unified system, thereby making population health models easier to share, maintain, reuse and deploy.

  10. Prognostics of Power Electronics, Methods and Validation Experiments

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.; Celaya, Jose R.; Biswas, Gautam; Goebel, Kai

    2012-01-01

    Failure of electronic devices is a concern for future electric aircraft, which will see an increase in the electronics used to drive and control safety-critical equipment throughout the aircraft. As a result, investigation of precursors to failure in electronics and prediction of the remaining life of electronic components are of key importance. DC-DC power converters are power electronics systems typically employed as sourcing elements for avionics equipment. Current research efforts in prognostics for these power systems focus on the identification of failure mechanisms and the development of accelerated aging methodologies and systems to accelerate the aging process of test devices while continuously measuring key electrical and thermal parameters. Preliminary model-based prognostics algorithms have been developed making use of empirical degradation models and physics-inspired degradation models, with a focus on key components like electrolytic capacitors and power MOSFETs (metal-oxide-semiconductor field-effect transistors). This paper presents current results on the development of validation methods for prognostics algorithms for power electrolytic capacitors, particularly the use of accelerated aging systems for algorithm validation. Validation of prognostics algorithms presents difficulties in practice due to the lack of run-to-failure experiments in deployed systems. By using accelerated experiments, we circumvent this problem in order to define initial validation activities.

  11. Validation of GC and HPLC systems for residue studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, M.

    1995-12-01

    For residue studies, GC and HPLC system performance must be validated prior to and during use. One excellent measure of system performance is the standard curve and the associated chromatograms used to construct that curve. The standard curve is a model of system response to an analyte over a specific time period, and is prima facie evidence of system performance beginning at the autosampler and proceeding through the injector, column, detector, electronics, data-capture device, and printer/plotter. This tool measures the performance of the entire chromatographic system; its power negates most of the benefits associated with costly and time-consuming validation of individual system components. Other measures of instrument and method validation will be discussed, including quality control charts and experimental designs for method validation.
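
    A minimal sketch of judging system performance through the standard curve: fit a linear response model to calibration standards and check linearity. The concentrations and peak areas below are invented for illustration.

      import numpy as np

      # Hypothetical calibration standards: analyte concentration vs. peak area.
      conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])        # ng/mL
      area = np.array([52.0, 101.0, 208.0, 498.0, 1010.0, 1985.0])

      slope, intercept = np.polyfit(conc, area, 1)
      fit = slope * conc + intercept
      r2 = 1.0 - np.sum((area - fit) ** 2) / np.sum((area - area.mean()) ** 2)

      print(f"response = {slope:.1f} * conc + {intercept:.1f}, R^2 = {r2:.4f}")
      # A drift in slope or R^2 between runs signals degraded system performance
      # (injector, column, or detector) before sample results are affected.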

  12. Data-Driven Residential Load Modeling and Validation in GridLAB-D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gotseff, Peter; Lundstrom, Blake

    Accurately characterizing the impacts of high penetrations of distributed energy resources (DER) on the electric distribution system has driven modeling methods from traditional static snap shots, often representing a critical point in time (e.g., summer peak load), to quasi-static time series (QSTS) simulations capturing all the effects of variable DER, associated controls and hence, impacts on the distribution system over a given time period. Unfortunately, the high time resolution DER source and load data required for model inputs is often scarce or non-existent. This paper presents work performed within the GridLAB-D model environment to synthesize, calibrate, and validate 1-second residential load models based on measured transformer loads and physics-based models suitable for QSTS electric distribution system modeling. The modeling and validation approach taken was to create a typical GridLAB-D model home that, when replicated to represent multiple diverse houses on a single transformer, creates a statistically similar load to a measured load for a given weather input. The model homes are constructed to represent the range of actual homes on an instrumented transformer: square footage, thermal integrity, heating and cooling system definition as well as realistic occupancy schedules. House model calibration and validation was performed using the distribution transformer load data and corresponding weather. The modeled loads were found to be similar to the measured loads for four evaluation metrics: 1) daily average energy, 2) daily average and standard deviation of power, 3) power spectral density, and 4) load shape.
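
    The four evaluation metrics can be computed directly from paired measured and modeled series. The sketch below does so for synthetic one-day, 1-second load signals; the signal shapes and kW units are assumptions, not the GridLAB-D data.

      import numpy as np
      from scipy.signal import welch

      rng = np.random.default_rng(4)
      fs = 1.0                       # 1-second data, as in the study
      t = np.arange(86_400)          # one day
      measured = 2.0 + 0.5 * np.sin(2 * np.pi * t / 86_400) + rng.normal(0, 0.2, t.size)
      modeled = 2.1 + 0.5 * np.sin(2 * np.pi * t / 86_400) + rng.normal(0, 0.25, t.size)

      # Metrics in the spirit of the four listed above (power assumed in kW).
      daily_energy = lambda p: p.sum() / 3600.0            # kWh over the day
      print(f"energy: {daily_energy(measured):.1f} vs {daily_energy(modeled):.1f} kWh")
      print(f"mean/std: {measured.mean():.2f}/{measured.std():.2f} vs "
            f"{modeled.mean():.2f}/{modeled.std():.2f} kW")

      f_m, pxx_m = welch(measured, fs=fs, nperseg=4096)
      f_s, pxx_s = welch(modeled, fs=fs, nperseg=4096)
      print(f"PSD rms log-ratio: {np.sqrt(np.mean(np.log10(pxx_s / pxx_m) ** 2)):.3f}")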

  13. Towards Validation of an Adaptive Flight Control Simulation Using Statistical Emulation

    NASA Technical Reports Server (NTRS)

    He, Yuning; Lee, Herbert K. H.; Davies, Misty D.

    2012-01-01

    Traditional validation of flight control systems is based primarily upon empirical testing. Empirical testing is sufficient for simple systems in which (a) the behavior is approximately linear and (b) humans are in the loop and responsible for off-nominal flight regimes. A different possible concept of operation is to use adaptive flight control systems with online learning neural networks (OLNNs) in combination with a human pilot for off-nominal flight behavior (such as when a plane has been damaged). Validating these systems is difficult because the controller changes during the flight in a nonlinear way, and because the pilot and the control system have the potential to co-adapt in adverse ways; traditional empirical methods are unlikely to provide any guarantees in this case. Additionally, the time it takes to find unsafe regions within the flight envelope using empirical testing means that the time between adaptive controller design iterations is large. This paper describes a new concept for validating adaptive control systems using methods based on Bayesian statistics. This validation framework allows the analyst to build nonlinear models with modal behavior, and to obtain an uncertainty estimate for the difference between the behaviors of the model and the system under test.
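
    Statistical emulation of this kind is often prototyped with a Gaussian process surrogate trained on a limited set of simulation runs; the sketch below uses scikit-learn for that purpose. The response function, its two inputs, and the kernel settings are placeholders, not the framework of the paper.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, ConstantKernel

      rng = np.random.default_rng(5)

      def closed_loop_response(x):
          # Stand-in for an expensive adaptive-control simulation run:
          # x = (gain error, plant damage fraction) -> tracking-error metric.
          return np.sin(3 * x[:, 0]) + 2.0 * x[:, 1] ** 2

      X = rng.uniform(0, 1, size=(40, 2))          # sampled flight conditions
      y = closed_loop_response(X)

      gp = GaussianProcessRegressor(ConstantKernel() * RBF([0.2, 0.2]),
                                    normalize_y=True)
      gp.fit(X, y)

      # The emulator gives a prediction *and* an uncertainty estimate for the
      # model/system difference, in the spirit of the framework above.
      x_new = np.array([[0.5, 0.9]])
      mean, std = gp.predict(x_new, return_std=True)
      print(f"predicted metric = {mean[0]:.3f} +/- {std[0]:.3f}")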

  14. Incremental checking of Master Data Management model based on contextual graphs

    NASA Astrophysics Data System (ADS)

    Lamolle, Myriam; Menet, Ludovic; Le Duc, Chan

    2015-10-01

    The validation of models is a crucial step in distributed heterogeneous systems. In this paper, an incremental validation method is proposed within the scope of a Model Driven Engineering (MDE) approach, applied to the Master Data Management (MDM) field, where models are represented by XML Schema. The MDE approach presented in this paper is based on the definition of an abstraction layer using UML class diagrams. The validation method aims to minimise model errors and to optimise the process of model checking. Therefore, the notion of validation contexts is introduced, allowing the verification of data model views. Description logics specify the constraints that the models have to check. An experimentation of the approach is presented through an application developed in the ArgoUML IDE.

  15. Calibration and validation of coarse-grained models of atomic systems: application to semiconductor manufacturing

    NASA Astrophysics Data System (ADS)

    Farrell, Kathryn; Oden, J. Tinsley

    2014-07-01

    Coarse-grained models of atomic systems, created by aggregating groups of atoms into molecules to reduce the number of degrees of freedom, have been used for decades in important scientific and technological applications. In recent years, interest in developing a more rigorous theory for coarse graining and in assessing the predictivity of coarse-grained models has arisen. In this work, Bayesian methods for the calibration and validation of coarse-grained models of atomistic systems in thermodynamic equilibrium are developed. For specificity, only configurational models of systems in canonical ensembles are considered. Among major challenges in validating coarse-grained models are (1) the development of validation processes that lead to information essential in establishing confidence in the model's ability predict key quantities of interest and (2), above all, the determination of the coarse-grained model itself; that is, the characterization of the molecular architecture, the choice of interaction potentials and thus parameters, which best fit available data. The all-atom model is treated as the "ground truth," and it provides the basis with respect to which properties of the coarse-grained model are compared. This base all-atom model is characterized by an appropriate statistical mechanics framework in this work by canonical ensembles involving only configurational energies. The all-atom model thus supplies data for Bayesian calibration and validation methods for the molecular model. To address the first challenge, we develop priors based on the maximum entropy principle and likelihood functions based on Gaussian approximations of the uncertainties in the parameter-to-observation error. To address challenge (2), we introduce the notion of model plausibilities as a means for model selection. This methodology provides a powerful approach toward constructing coarse-grained models which are most plausible for given all-atom data. We demonstrate the theory and methods through applications to representative atomic structures and we discuss extensions to the validation process for molecular models of polymer structures encountered in certain semiconductor nanomanufacturing processes. The powerful method of model plausibility as a means for selecting interaction potentials for coarse-grained models is discussed in connection with a coarse-grained hexane molecule. Discussions of how all-atom information is used to construct priors are contained in an appendix.
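
    In its simplest Gaussian form, the calibration step of such a framework reduces to minimizing a weighted misfit between coarse-grained predictions and all-atom data. The sketch below finds a MAP estimate under a flat (maximum-entropy, bounded-support) prior; the model function, parameters, and data are invented stand-ins.

      import numpy as np
      from scipy.optimize import minimize

      # All-atom "ground truth" observables (e.g., mean configurational
      # energies at several states); synthetic numbers for illustration.
      y_obs = np.array([1.02, 1.48, 2.11])
      sigma = 0.05 * np.abs(y_obs)          # assumed observation uncertainty

      def cg_model(theta):
          # Hypothetical coarse-grained predictions as a function of the
          # interaction-potential parameters theta = (epsilon, r0).
          eps, r0 = theta
          states = np.array([1.0, 1.5, 2.0])
          return eps * states + 0.1 * r0

      def neg_log_post(theta):
          # Gaussian likelihood in the parameter-to-observation error; the
          # flat bounded prior is absorbed into the optimizer bounds.
          r = (cg_model(theta) - y_obs) / sigma
          return 0.5 * np.sum(r ** 2)

      map_fit = minimize(neg_log_post, x0=[1.0, 0.5],
                         bounds=[(0.1, 5.0), (0.0, 2.0)])
      print(f"MAP parameters: eps={map_fit.x[0]:.3f}, r0={map_fit.x[1]:.3f}")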

  16. Models and applications for space weather forecasting and analysis at the Community Coordinated Modeling Center.

    NASA Astrophysics Data System (ADS)

    Kuznetsova, Maria

    The Community Coordinated Modeling Center (CCMC, http://ccmc.gsfc.nasa.gov) was established at the dawn of the new millennium as a long-term, flexible solution to the problem of transitioning progress in space environment modeling to operational space weather forecasting. CCMC hosts an expanding collection of state-of-the-art space weather models developed by the international space science community. Over the years the CCMC has acquired unique experience in preparing complex models and model chains for operational environments and in developing and maintaining custom displays and powerful web-based systems and tools ready to be used by researchers, space weather service providers and decision makers. In support of the space weather needs of NASA users, CCMC is developing highly tailored applications and services that target specific orbits or locations in space, and is partnering with NASA mission specialists on linking CCMC space environment modeling with impacts on biological and technological systems in space. Confidence assessment of model predictions is an essential element of space environment modeling. CCMC facilitates interaction between model owners and users in defining physical parameters and metrics formats relevant to specific applications, and leads community efforts to quantify models' ability to simulate and predict space environment events. Interactive on-line model validation systems developed at CCMC make validation a seamless part of the model development cycle. The talk will showcase innovative solutions for space weather research, validation, anomaly analysis and forecasting, and review on-going community-wide model validation initiatives enabled by CCMC applications.

  17. Experimental Validation of a Closed Brayton Cycle System Transient Simulation

    NASA Technical Reports Server (NTRS)

    Johnson, Paul K.; Hervol, David S.

    2006-01-01

    The Brayton Power Conversion Unit (BPCU) located at NASA Glenn Research Center (GRC) in Cleveland, Ohio was used to validate the results of a computational code known as Closed Cycle System Simulation (CCSS). Conversion system thermal transient behavior was the focus of this validation. The BPCU was operated at various steady state points and then subjected to transient changes involving shaft rotational speed and thermal energy input. These conditions were then duplicated in CCSS. Validation of the CCSS BPCU model provides confidence in developing future Brayton power system performance predictions, and helps to guide high power Brayton technology development.

  18. A scoring system to predict breast cancer mortality at 5 and 10 years.

    PubMed

    Paredes-Aracil, Esther; Palazón-Bru, Antonio; Folgado-de la Rosa, David Manuel; Ots-Gutiérrez, José Ramón; Compañ-Rosique, Antonio Fernando; Gil-Guillén, Vicente Francisco

    2017-03-24

    Although predictive models exist for mortality in breast cancer (BC) (generally for all-cause mortality), they are not applicable to all patients, and their statistical methodology is not the most powerful for developing a predictive model. Consequently, we developed a predictive model specific to BC mortality at 5 and 10 years that resolves these issues. This cohort study included 287 patients diagnosed with BC in a Spanish region in 2003-2016. The main outcome was time-to-BC death. Secondary variables were age, personal history of breast surgery, personal history of any cancer/BC, premenopause, postmenopause, grade, estrogen receptor, progesterone receptor, c-erbB2, TNM stage, multicentricity/multifocality, diagnosis and treatment. A points system was constructed to predict BC mortality at 5 and 10 years. The model was internally validated by bootstrapping. The points system was integrated into a mobile application for Android. Mean follow-up was 8.6 ± 3.5 years, and 55 patients died of BC. The points system included age, personal history of BC, grade, TNM stage and multicentricity. Validation was satisfactory, in both discrimination and calibration. In conclusion, we constructed and internally validated a scoring system for predicting BC mortality at 5 and 10 years. External validation studies are needed for its use in other geographical areas.
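
    A common way to build such a points system is to scale fitted Cox coefficients by the smallest one, round to integers, and recover risk from a baseline survival. The coefficients, baseline survival, and predictor names below are illustrative only, not the published model.

      import numpy as np

      # Hypothetical Cox coefficients for predictors like those retained above
      # (illustrative values, not those of the published model).
      coef = {"age>=70": 0.9, "history_BC": 0.7, "grade_3": 1.1,
              "TNM_III_IV": 1.6, "multicentric": 0.5}

      # Framingham-style points: scale each coefficient by the smallest one
      # and round to the nearest integer.
      base = min(coef.values())
      points = {k: int(round(v / base)) for k, v in coef.items()}
      print(points)

      def bc_mortality_risk(patient, s0=0.95):
          # 5-year BC mortality from total points via an assumed baseline
          # survival s0: risk = 1 - s0 ** exp(base * total_points).
          total = sum(points[k] for k, v in patient.items() if v)
          return 1.0 - s0 ** np.exp(base * total)

      print(f"risk = {bc_mortality_risk({'age>=70': 1, 'TNM_III_IV': 1}):.2%}")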

  19. Development, calibration, and validation of performance prediction models for the Texas M-E flexible pavement design system.

    DOT National Transportation Integrated Search

    2010-08-01

    This study was intended to recommend future directions for the development of TxDOT's Mechanistic-Empirical (TexME) design system. For stress predictions, a multi-layer linear elastic system was evaluated and its validity was verified by compar...

  20. An RL10A-3-3A rocket engine model using the rocket engine transient simulator (ROCETS) software

    NASA Technical Reports Server (NTRS)

    Binder, Michael

    1993-01-01

    Steady-state and transient computer models of the RL10A-3-3A rocket engine have been created using the Rocket Engine Transient Simulation (ROCETS) code. These models were created for several purposes. The RL10 engine is a critical component of past, present, and future space missions; the model will give NASA an in-house capability to simulate the performance of the engine under various operating conditions and mission profiles. The RL10 simulation activity is also an opportunity to further validate the ROCETS program. The ROCETS code is an important tool for modeling rocket engine systems at NASA Lewis. ROCETS provides a modular and general framework for simulating the steady-state and transient behavior of any desired propulsion system. Although the ROCETS code is being used in a number of different analysis and design projects within NASA, it has not been extensively validated for any system using actual test data. The RL10A-3-3A has a ten year history of test and flight applications; it should provide sufficient data to validate the ROCETS program capability. The ROCETS models of the RL10 system were created using design information provided by Pratt & Whitney, the engine manufacturer. These models are in the process of being validated using test-stand and flight data. This paper includes a brief description of the models and comparison of preliminary simulation output against flight and test-stand data.

  1. Experimental Validation of a Thermoelastic Model for SMA Hybrid Composites

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.

    2001-01-01

    This study presents results from experimental validation of a recently developed model for predicting the thermomechanical behavior of shape memory alloy hybrid composite (SMAHC) structures, composite structures with an embedded SMA constituent. The model captures the material nonlinearity of the material system with temperature and is capable of modeling constrained, restrained, or free recovery behavior from experimental measurement of fundamental engineering properties. A brief description of the model and analysis procedures is given, followed by an overview of a parallel effort to fabricate and characterize the material system of SMAHC specimens. Static and dynamic experimental configurations for the SMAHC specimens are described and experimental results for thermal post-buckling and random response are presented. Excellent agreement is achieved between the measured and predicted results, fully validating the theoretical model for constrained recovery behavior of SMAHC structures.

  2. Identification of patients at high risk for Clostridium difficile infection: development and validation of a risk prediction model in hospitalized patients treated with antibiotics.

    PubMed

    van Werkhoven, C H; van der Tempel, J; Jajou, R; Thijsen, S F T; Diepersloot, R J A; Bonten, M J M; Postma, D F; Oosterheert, J J

    2015-08-01

    To develop and validate a prediction model for Clostridium difficile infection (CDI) in hospitalized patients treated with systemic antibiotics, we performed a case-cohort study in a tertiary (derivation) and secondary care hospital (validation). Cases had a positive Clostridium test and were treated with systemic antibiotics before suspicion of CDI. Controls were randomly selected from hospitalized patients treated with systemic antibiotics. Potential predictors were selected from the literature. Logistic regression was used to derive the model. Discrimination and calibration of the model were tested in internal and external validation. A total of 180 cases and 330 controls were included for derivation. Age >65 years, recent hospitalization, CDI history, malignancy, chronic renal failure, use of immunosuppressants, receipt of antibiotics before admission, nonsurgical admission, admission to the intensive care unit, gastric tube feeding, treatment with cephalosporins and presence of an underlying infection were independent predictors of CDI. The area under the receiver operating characteristic curve of the model in the derivation cohort was 0.84 (95% confidence interval 0.80-0.87), and was reduced to 0.81 after internal validation. In external validation, consisting of 97 cases and 417 controls, the model area under the curve was 0.81 (95% confidence interval 0.77-0.85) and model calibration was adequate (Brier score 0.004). A simplified risk score was derived. Using a cutoff of 7 points, the positive predictive value, sensitivity and specificity were 1.0%, 72% and 73%, respectively. In conclusion, a risk prediction model was developed and validated, with good discrimination and calibration, that can be used to target preventive interventions in patients with increased risk of CDI.
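
    The reported operating characteristics at the 7-point cutoff can be reproduced in form with a few lines. Note that in a case-cohort design the PPV must be rescaled to the population prevalence, which is how a 72%-sensitive score can have a PPV near 1%. All numbers below are synthetic.

      import numpy as np

      def screen_metrics(scores, cdi, cutoff=7, prev=0.004):
          # Sensitivity, specificity, and PPV of 'score >= cutoff' as a flag;
          # PPV is rescaled via Bayes' rule to an assumed population prevalence.
          flag = scores >= cutoff
          sens = flag[cdi].mean()
          spec = (~flag[~cdi]).mean()
          ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
          return sens, spec, ppv

      rng = np.random.default_rng(6)
      cdi = rng.random(5000) < 0.25                      # case-enriched sample
      scores = np.where(cdi, rng.normal(9, 3, 5000), rng.normal(5, 3, 5000)).round()
      sens, spec, ppv = screen_metrics(scores, cdi)
      print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, PPV={ppv:.2%}")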

  3. NREL, EPRI Validate Advanced Microgrid Controller with ESIF's Virtual

    Science.gov Websites

    NREL is working with the Electric Power Research Institute (EPRI) to validate an advanced microgrid controller at the Energy Systems Integration Facility (ESIF) by connecting it to a virtual model of a microgrid.

  4. Institutional Effectiveness: A Model for Planning, Assessment & Validation.

    ERIC Educational Resources Information Center

    Truckee Meadows Community Coll., Sparks, NV.

    The report presents Truckee Meadows Community College's (Nevada) model for assessing institutional effectiveness and validating the College's mission and vision, and the strategic plan for carrying out the institutional effectiveness model. It also outlines strategic goals for the years 1999-2001. From the system-wide directive that education…

  5. Quality evaluation of health information system's architectures developed using the HIS-DF methodology.

    PubMed

    López, Diego M; Blobel, Bernd; Gonzalez, Carolina

    2010-01-01

    Requirement analysis, design, implementation, evaluation, use, and maintenance of semantically interoperable Health Information Systems (HIS) have to be based on eHealth standards. HIS-DF is a comprehensive approach for HIS architectural development based on standard information models and vocabulary. The empirical validity of HIS-DF has not been demonstrated so far. Through an empirical experiment, this paper demonstrates that, using HIS-DF and HL7 information models, the semantic quality of an HIS architecture can be improved compared to architectures developed using the traditional RUP process. Semantic quality of the architecture has been measured in terms of model completeness and validity metrics. The experimental results demonstrated an increase in completeness of 14.38% and an increase in validity of 16.63% when using HIS-DF and HL7 information models in a sample HIS development project. Quality assurance of the system architecture in earlier stages of HIS development suggests an increased quality of final HIS systems, with an indirect impact on patient care.

  6. Validation of PV-RPM Code in the System Advisor Model.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Geoffrey Taylor; Lavrova, Olga; Freeman, Janine

    2017-04-01

    This paper describes efforts made by Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) to validate the SNL-developed PV Reliability Performance Model (PV-RPM) algorithm as implemented in the NREL System Advisor Model (SAM). The PV-RPM model is a library of functions that estimates component failure and repair in a photovoltaic system over a desired simulation period. The failure and repair distributions in this paper are probabilistic representations of component failure and repair based on data collected by SNL for a PV power plant operating in Arizona. The validation effort focuses on whether the failure and repair distributions used in the SAM implementation result in estimated failures that match the expected failures developed in the proof-of-concept implementation. Results indicate that the SAM implementation of PV-RPM provides the same results as the proof-of-concept implementation, indicating the algorithms were reproduced successfully.
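
    At its core, a model of this kind draws failure times and repair durations from probability distributions and replays them over the simulation period. The sketch below is a minimal renewal process for a single component; the distribution families and parameter values are assumptions, not the PV-RPM library.

      import numpy as np

      rng = np.random.default_rng(7)

      def simulate_component(years=25.0, mttf_years=8.0, repair_days=14.0):
          # Minimal failure/repair renewal process for one component:
          # exponential time-to-failure, lognormal repair time (assumed shapes).
          t, failures = 0.0, 0
          while t < years:
              t += rng.exponential(mttf_years)          # next failure
              if t >= years:
                  break
              failures += 1
              dt = rng.lognormal(np.log(repair_days), 0.5) / 365.25
              t += dt                                   # component back in service
          return failures

      fails = np.array([simulate_component() for _ in range(10_000)])
      print(f"expected failures over 25 y: {fails.mean():.2f} "
            f"(95% of runs between {np.percentile(fails, 2.5):.0f} and "
            f"{np.percentile(fails, 97.5):.0f})")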

  7. OC5 Project Phase II: Validation of Global Loads of the DeepCwind Floating Semisubmersible Wind Turbine

    DOE PAGES

    Robertson, Amy N.; Wendt, Fabian; Jonkman, Jason M.; ...

    2017-10-01

    This paper summarizes the findings from Phase II of the Offshore Code Comparison, Collaboration, Continued, with Correlation project. The project is run under the International Energy Agency Wind Research Task 30, and is focused on validating the tools used for modeling offshore wind systems through the comparison of simulated responses of select system designs to physical test data. Validation activities such as these lead to improvement of offshore wind modeling tools, which will enable the development of more innovative and cost-effective offshore wind designs. For Phase II of the project, numerical models of the DeepCwind floating semisubmersible wind system were validated using measurement data from a 1/50th-scale validation campaign performed at the Maritime Research Institute Netherlands offshore wave basin. Validation of the models was performed by comparing the calculated ultimate and fatigue loads for eight different wave-only and combined wind/wave test cases against the measured data, after calibration was performed using free-decay, wind-only, and wave-only tests. The results show a decent estimation of both the ultimate and fatigue loads for the simulated results, but with a fairly consistent underestimation in the tower and upwind mooring line loads that can be attributed to an underestimation of wave-excitation forces outside the linear wave-excitation region, and the presence of broadband frequency excitation in the experimental measurements from wind. Participant results showed varied agreement with the experimental measurements based on the modeling approach used. Modeling attributes that enabled better agreement included: the use of a dynamic mooring model; wave stretching, or some other hydrodynamic modeling approach that excites frequencies outside the linear wave region; nonlinear wave kinematics models; and unsteady aerodynamics models. Also, it was observed that a Morison-only hydrodynamic modeling approach could create excessive pitch excitation and resulting tower loads in some frequency bands.

  9. Scoring and staging systems using cox linear regression modeling and recursive partitioning.

    PubMed

    Lee, J W; Um, S H; Lee, J B; Mun, J; Cho, H

    2006-01-01

    Scoring and staging systems are used to determine the order and class of data according to predictors. Systems used for medical data, such as the Child-Turcotte-Pugh scoring and staging systems for ordering and classifying patients with liver disease, are often derived strictly from physicians' experience and intuition. We construct objective and data-based scoring/staging systems using statistical methods. We consider Cox linear regression modeling and recursive partitioning techniques for censored survival data. In particular, to obtain a target number of stages we propose cross-validation and amalgamation algorithms. We also propose an algorithm for constructing scoring and staging systems by integrating local Cox linear regression models into recursive partitioning, so that we can retain the merits of both methods such as superior predictive accuracy, ease of use, and detection of interactions between predictors. The staging system construction algorithms are compared by cross-validation evaluation of real data. The data-based cross-validation comparison shows that Cox linear regression modeling is somewhat better than recursive partitioning when there are only continuous predictors, while recursive partitioning is better when there are significant categorical predictors. The proposed local Cox linear recursive partitioning has better predictive accuracy than Cox linear modeling and simple recursive partitioning. This study indicates that integrating local linear modeling into recursive partitioning can significantly improve prediction accuracy in constructing scoring and staging systems.
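
    As a concrete illustration of the cross-validated comparison described above, the sketch below fits a Cox proportional-hazards model to synthetic censored data and scores it by k-fold concordance index. It assumes the lifelines package (CoxPHFitter and k_fold_cross_validation); the predictors and coefficients are invented stand-ins for the liver-disease cohort.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter
        from lifelines.utils import k_fold_cross_validation

        # Synthetic censored-survival data; all column names are hypothetical.
        rng = np.random.default_rng(0)
        n = 300
        df = pd.DataFrame({
            "bilirubin": rng.lognormal(0.5, 0.6, n),
            "albumin": rng.normal(3.5, 0.5, n),
            "ascites": rng.integers(0, 2, n),  # a categorical predictor
        })
        risk = 0.8 * np.log(df["bilirubin"]) - 0.6 * df["albumin"] + 0.9 * df["ascites"]
        df["duration"] = rng.exponential(np.exp(-risk))
        df["event"] = (rng.random(n) < 0.7).astype(int)  # roughly 30% censoring

        # Cross-validated predictive accuracy (concordance index) of the Cox model
        scores = k_fold_cross_validation(
            CoxPHFitter(), df, duration_col="duration", event_col="event",
            k=5, scoring_method="concordance_index")
        print(f"5-fold c-index: {np.mean(scores):.3f} +/- {np.std(scores):.3f}")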

  10. Establishment of a VISAR Measurement System for Material Model Validation in DSTO

    DTIC Science & Technology

    2013-02-01

    advancements published in the works by L. M. Barker, R. E. Hollenbach, and W. F. Hemsing [1-3] and results in the user-friendly interface and configuration of the...VISAR system [4] used in the current work. VISAR tests are among the mandatory instrumentation techniques when validating material models and...The present work reports on preliminary tests using the recently commissioned DSTO VISAR system, providing an assessment of the experimental set-up

  11. Validation of a physically based catchment model for application in post-closure radiological safety assessments of deep geological repositories for solid radioactive wastes.

    PubMed

    Thorne, M C; Degnan, P; Ewen, J; Parkin, G

    2000-12-01

    The physically based river catchment modelling system SHETRAN incorporates components representing water flow, sediment transport and radionuclide transport both in solution and bound to sediments. The system has been applied to simulate hypothetical future catchments in the context of post-closure radiological safety assessments of a potential site for a deep geological disposal facility for intermediate and certain low-level radioactive wastes at Sellafield, west Cumbria. In order to have confidence in the application of SHETRAN for this purpose, various blind validation studies have been undertaken. In earlier studies, the validation was undertaken against uncertainty bounds in model output predictions set by the modelling team on the basis of how well they expected the model to perform. However, validation can also be carried out with bounds set on the basis of how well the model is required to perform in order to constitute a useful assessment tool. Herein, such an assessment-based validation exercise is reported. This exercise related to a field plot experiment conducted at Calder Hollow, west Cumbria, in which the migration of strontium and lanthanum in subsurface Quaternary deposits was studied on a length scale of a few metres. Blind predictions of tracer migration were compared with experimental results using bounds set by a small group of assessment experts independent of the modelling team. Overall, the SHETRAN system performed well, failing only two out of seven of the imposed tests. Furthermore, of the five tests that were not failed, three were positively passed even when a pessimistic view was taken as to how measurement errors should be taken into account. It is concluded that the SHETRAN system, which is still being developed further, is a powerful tool for application in post-closure radiological safety assessments.
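
    The assessment-based test logic above is simple enough to state in a few lines: each blind prediction either falls inside the expert-set bounds or fails. The sketch below shows that bookkeeping with invented quantities and bounds; it is not the SHETRAN/Calder Hollow data.

        # Pass/fail bookkeeping for an assessment-based validation exercise: each
        # blind prediction is tested against bounds set by assessment experts.
        # The quantities and numbers below are illustrative, not the field data.
        tests = {
            "Sr breakthrough time (h)":  {"predicted": 41.0, "bounds": (30.0, 55.0)},
            "Sr peak concentration":     {"predicted": 0.8,  "bounds": (0.5, 2.0)},
            "Sr migration distance (m)": {"predicted": 2.6,  "bounds": (1.5, 3.0)},
            "La migration distance (m)": {"predicted": 0.9,  "bounds": (0.1, 0.5)},
        }

        failed = [name for name, t in tests.items()
                  if not t["bounds"][0] <= t["predicted"] <= t["bounds"][1]]
        print(f"failed {len(failed)} of {len(tests)} tests: {failed}")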

  12. Two-Speed Gearbox Dynamic Simulation Predictions and Test Validation

    NASA Technical Reports Server (NTRS)

    Lewicki, David G.; DeSmidt, Hans; Smith, Edward C.; Bauman, Steven W.

    2010-01-01

    Dynamic simulations and experimental validation tests were performed on a two-stage, two-speed gearbox as part of the drive system research activities of the NASA Fundamental Aeronautics Subsonics Rotary Wing Project. The gearbox was driven by two electromagnetic motors and had two electromagnetic, multi-disk clutches to control output speed. A dynamic model of the system was created which included a direct current electric motor with proportional-integral-derivative (PID) speed control, a two-speed gearbox with dual electromagnetically actuated clutches, and an eddy current dynamometer. A six degree-of-freedom model of the gearbox accounted for the system torsional dynamics and included gear, clutch, shaft, and load inertias as well as shaft flexibilities and a dry clutch stick-slip friction model. Experimental validation tests were performed on the gearbox in the NASA Glenn gear noise test facility. Gearbox output speed and torque as well as drive motor speed and current were compared to those from the analytical predictions. The experiments correlate very well with the predictions, thus validating the dynamic simulation methodologies.
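
    A stripped-down version of such a torsional model can be written directly as an ODE. The sketch below couples a PI-speed-controlled motor inertia to a load inertia through a flexible shaft and integrates it with SciPy; inertias, stiffness, and gains are illustrative values (not the test-gearbox data), and the clutch stick-slip friction of the real model is omitted.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Minimal 2-inertia torsional sketch: PI-controlled motor inertia J1 drives
        # load inertia J2 through a flexible shaft (stiffness k, damping c).
        # States: [theta1, w1, theta2, w2, integral of speed error].
        J1, J2, k, c = 0.05, 0.20, 80.0, 0.5   # illustrative values
        Kp, Ki = 2.0, 10.0
        w_cmd, T_load = 100.0, 1.0             # rad/s command, constant load torque

        def rhs(t, x):
            th1, w1, th2, w2, ie = x
            err = w_cmd - w1
            T_motor = Kp * err + Ki * ie       # PI speed control on the motor
            T_shaft = k * (th1 - th2) + c * (w1 - w2)
            return [w1, (T_motor - T_shaft) / J1,
                    w2, (T_shaft - T_load) / J2,
                    err]

        sol = solve_ivp(rhs, (0.0, 5.0), [0, 0, 0, 0, 0], max_step=1e-3)
        print(f"final load speed: {sol.y[3, -1]:.1f} rad/s (command {w_cmd})")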

  13. LIVVkit 2: An extensible land ice verification and validation toolkit for comparing observations and models

    NASA Astrophysics Data System (ADS)

    Kennedy, J. H.; Bennett, A. R.; Evans, K. J.; Fyke, J. G.; Vargo, L.; Price, S. F.; Hoffman, M. J.

    2016-12-01

    Accurate representation of ice sheets and glaciers is essential for robust predictions of Arctic climate within Earth system models. Verification and Validation (V&V) is a set of techniques used to quantify the correctness and accuracy of a model, which builds developer/modeler confidence and can be used to enhance the credibility of the model. Fundamentally, V&V is a continuous process because each model change requires a new round of V&V testing. The Community Ice Sheet Model (CISM) development community is actively developing LIVVkit, the Land Ice Verification and Validation toolkit, which is designed to integrate easily into an ice-sheet model's development workflow (on both personal and high-performance computers) to provide continuous V&V testing. LIVVkit is a robust and extensible Python package for V&V, which has components for both software V&V (construction and use) and model V&V (mathematics and physics). The model verification component is used, for example, to verify model results against community intercomparisons such as ISMIP-HOM. The model validation component is used, for example, to generate a series of diagnostic plots showing the differences between model results and observations for variables such as thickness, surface elevation, basal topography, surface velocity, surface mass balance, etc. Because many different ice-sheet models are under active development, new validation datasets are becoming available, and new methods of analysing these models are actively being researched, LIVVkit includes a framework that lets ice-sheet modelers easily extend the model V&V analyses. This allows modelers and developers to develop evaluations of parameters, implement changes, and quickly see how those changes affect the ice-sheet model and Earth system model (when coupled). Furthermore, LIVVkit outputs a portable hierarchical website allowing evaluations to be easily shared, published, and analysed throughout the Arctic and Earth system communities.
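
    The heart of the model-validation component described above is model-minus-observation diagnostics. The sketch below computes bias, RMSE, and maximum absolute difference for a gridded field; it mimics the spirit of such a toolkit but is not the actual LIVVkit API.

        import numpy as np

        def validation_summary(model, obs, name):
            """Basic model-vs-observation diagnostics of the sort a land-ice
            validation suite might tabulate (not the actual LIVVkit API)."""
            diff = model - obs
            return {
                "field": name,
                "bias": float(np.nanmean(diff)),
                "rmse": float(np.sqrt(np.nanmean(diff ** 2))),
                "max_abs": float(np.nanmax(np.abs(diff))),
            }

        # Toy 2-D thickness fields standing in for model output and observations
        rng = np.random.default_rng(1)
        obs = 1000.0 + 50.0 * rng.standard_normal((64, 64))
        model = obs + 10.0 + 5.0 * rng.standard_normal((64, 64))  # biased model

        row = validation_summary(model, obs, "ice thickness (m)")
        print("{field}: bias={bias:.1f}, rmse={rmse:.1f}, "
              "max|diff|={max_abs:.1f}".format(**row))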

  14. Virtual Model Validation of Complex Multiscale Systems: Applications to Nonlinear Elastostatics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oden, John Tinsley; Prudencio, Ernest E.; Bauman, Paul T.

    We propose a virtual statistical validation process as an aid to the design of experiments for the validation of phenomenological models of the behavior of material bodies, with focus on those cases in which knowledge of the fabrication process used to manufacture the body can provide information on the micro-molecular-scale properties underlying macroscale behavior. One example is given by models of elastomeric solids fabricated using polymerization processes. We describe a framework for model validation that involves Bayesian updates of parameters in statistical calibration and validation phases. The process enables the quantification of uncertainty in quantities of interest (QoIs) and the determination of model consistency using tools of statistical information theory. We assert that microscale information drawn from molecular models of the fabrication of the body provides a valuable source of prior information on parameters as well as a means for estimating model bias and designing virtual validation experiments to provide information gain over calibration posteriors.
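
    The calibration phase of such a framework can be illustrated with a one-parameter grid example: a fabrication-informed prior is updated by calibration data, and the information gain is the Kullback-Leibler divergence of posterior from prior. Everything in the sketch below (forward model, prior width, noise level) is a synthetic placeholder.

        import numpy as np

        # Grid-based Bayesian calibration of one parameter theta; the prior plays
        # the role of micro-scale (fabrication-informed) knowledge.
        theta = np.linspace(0.0, 2.0, 2001)
        dx = theta[1] - theta[0]
        prior = np.exp(-0.5 * ((theta - 1.0) / 0.3) ** 2)
        prior /= prior.sum() * dx

        def model_qoi(th):  # toy forward model for the quantity of interest
            return 3.0 * th ** 2

        rng = np.random.default_rng(3)
        data = model_qoi(0.9) + 0.2 * rng.standard_normal(20)
        sigma = 0.2
        loglike = np.array([-0.5 * np.sum((data - model_qoi(th)) ** 2) / sigma ** 2
                            for th in theta])
        post = prior * np.exp(loglike - loglike.max())
        post /= post.sum() * dx

        # Information gain of calibration over the prior: KL(posterior || prior)
        mask = post > 0
        kl = np.sum(post[mask] * np.log(post[mask] / prior[mask])) * dx
        mean = np.sum(theta * post) * dx
        print(f"posterior mean: {mean:.3f}, information gain: {kl:.2f} nats")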

  15. Validation of Bioreactor and Human-on-a-Chip Devices for Chemical Safety Assessment.

    PubMed

    Rebelo, Sofia P; Dehne, Eva-Maria; Brito, Catarina; Horland, Reyk; Alves, Paula M; Marx, Uwe

    2016-01-01

    Equipment and device qualification and test assay validation in the field of tissue engineered human organs for substance assessment remain formidable tasks with only a few successful examples so far. The hurdles seem to increase with the growing complexity of the biological systems, emulated by the respective models. Controlled single tissue or organ culture in bioreactors improves the organ-specific functions and maintains their phenotypic stability for longer periods of time. The reproducibility attained with bioreactor operations is, per se, an advantage for the validation of safety assessment. Regulatory agencies have gradually altered the validation concept from exhaustive "product" to rigorous and detailed process characterization, valuing reproducibility as a standard for validation. "Human-on-a-chip" technologies applying micro-physiological systems to the in vitro combination of miniaturized human organ equivalents into functional human micro-organisms are nowadays thought to be the most elaborate solution created to date. They target the replacement of the current most complex models: laboratory animals. Therefore, we provide here a road map towards the validation of such "human-on-a-chip" models and qualification of their respective bioreactor and microchip equipment along a path currently used for the respective animal models.

  16. Helicopter simulation validation using flight data

    NASA Technical Reports Server (NTRS)

    Key, D. L.; Hansen, R. S.; Cleveland, W. B.; Abbott, W. Y.

    1982-01-01

    A joint NASA/Army effort to perform a systematic ground-based piloted simulation validation assessment is described. The best available mathematical model for the subject helicopter (UH-60A Black Hawk) was programmed for real-time operation. Flight data were obtained to validate the math model, and to develop models for the pilot control strategy while performing mission-type tasks. The validated math model is to be combined with motion and visual systems to perform ground-based simulation. Comparisons of the control strategy obtained in flight with that obtained on the simulator are to be used as the basis for assessing the fidelity of the results obtained in the simulator.

  17. Impact of Cross-Axis Structural Dynamics on Validation of Linear Models for Space Launch System

    NASA Technical Reports Server (NTRS)

    Pei, Jing; Derry, Stephen D.; Zhou, Zhiqiang; Newsom, Jerry R.

    2014-01-01

    A feasibility study was performed to examine the advisability of incorporating a set of Programmed Test Inputs (PTIs) during the Space Launch System (SLS) vehicle flight. The intent of these inputs is to provide validation of the preflight models for control system stability margins, aerodynamics, and structural dynamics. In October 2009, the Ares I-X program successfully carried out a series of PTI maneuvers which provided a significant amount of valuable data for post-flight analysis. The resulting data comparisons showed excellent agreement with the preflight linear models across the frequency spectrum of interest. However, unlike Ares I-X, the structural dynamics associated with the SLS boost phase configuration are far more complex and highly coupled in all three axes. This presents a challenge when applying a similar system identification technique to SLS. Preliminary simulation results show noticeable mismatches between PTI validation results and analytical linear models in the frequency range of the structural dynamics. An alternate approach was examined which demonstrates the potential for better overall characterization of the system frequency response as well as robustness of the control design.

  18. Modeling Terrorism Risk to the Air Transportation System: An Independent Assessment of TSA’s Risk Management Analysis Tool and Associated Methods

    DTIC Science & Technology

    2012-01-01

    our own work for this discussion. DoD Instruction 5000.61 defines model validation as “the process of determining the degree to which a model and its... determined that RMAT is highly concrete code, potentially leading to redundancies in the code itself and making RMAT more difficult to maintain...system conceptual models valid, and are the data used to support them adequate? (Chapters Two and Three) 2. Are the sources and methods for populating

  19. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yuhe; Mazur, Thomas R.; Green, Olga

    Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian’s KMC. Results: An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems.
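
    The 2%/2 mm gamma criterion used above combines a dose-difference tolerance with a distance-to-agreement search. The sketch below implements a simplified 1-D global gamma analysis on toy depth-dose curves; clinical analyses are 3-D and use dedicated tools, so treat this only as a statement of the metric.

        import numpy as np

        def gamma_pass_rate_1d(dose_ref, dose_eval, dx_mm, dd=0.02, dta_mm=2.0):
            """Simplified 1-D global gamma analysis (2%/2 mm by default): for each
            reference point, find the minimum combined dose-difference /
            distance-to-agreement metric over all evaluated points."""
            x = np.arange(len(dose_ref)) * dx_mm
            d_max = dose_ref.max()  # global normalization
            passes = []
            for xi, di in zip(x, dose_ref):
                dose_term = (dose_eval - di) / (dd * d_max)
                dist_term = (x - xi) / dta_mm
                gamma = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
                passes.append(gamma <= 1.0)
            return np.mean(passes)

        # Toy depth-dose curves: evaluated dose shifted by 1 mm and scaled by 1%
        x = np.linspace(0.0, 100.0, 501)  # 0.2 mm grid
        ref = np.exp(-((x - 30.0) / 15.0) ** 2)
        ev = 1.01 * np.exp(-((x - 31.0) / 15.0) ** 2)
        print(f"gamma pass rate: {100 * gamma_pass_rate_1d(ref, ev, 0.2):.1f}%")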

  2. System Identification Methods for Aircraft Flight Control Development and Validation

    DOT National Transportation Integrated Search

    1995-10-01

    System-identification methods compose a mathematical model, or series of models, from measurements of inputs and outputs of dynamic systems. This paper discusses the use of frequency-domain system-identification methods for the development and ...

  3. Evaluation of MuSyQ land surface albedo based on LAnd surface Parameters VAlidation System (LAPVAS)

    NASA Astrophysics Data System (ADS)

    Dou, B.; Wen, J.; Xinwen, L.; Zhiming, F.; Wu, S.; Zhang, Y.

    2016-12-01

    Satellite-derived land surface albedo is an essential climate variable that controls the Earth's energy budget, and it is used in applications such as climate change, hydrology, and numerical weather prediction. However, the accuracy and uncertainty of surface albedo products should be evaluated against reliable reference truth data prior to such applications. A new comprehensive and systematic project in China, the Remote Sensing Application Network (CRSAN), has been launched in recent years. Two components of this project are the Multi-source data Synergized Quantitative Remote Sensing Production System (MuSyQ) and a web-based validation system named the LAnd surface remote sensing Product VAlidation System (LAPVAS), which aim, respectively, to generate quantitative remote sensing products for ecosystem and environmental monitoring and to validate them against reference data within a standard validation system. Land surface BRDF/albedo is one of the product datasets of MuSyQ; it has a pentad (5-day) period with 1 km spatial resolution and is derived with the Multi-sensor Combined BRDF Inversion (MCBI) model. In this evaluation of MuSyQ albedo, a multi-validation strategy is implemented through LAPVAS, including direct and multi-scale validation with field-measured albedo and cross validation with the MODIS albedo product over different land covers. The results reveal that the 5-day MuSyQ albedo data show higher sensitivity and accuracy during periods of land cover change, e.g., snowfall. Setting aside snow and land cover change, MuSyQ albedo is generally of similar accuracy to MODIS albedo and meets the climate-modeling requirement of an absolute accuracy of 0.05.
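
    The accuracy statements above reduce to a few matchup statistics. The sketch below computes bias, RMSE, and the fraction of matchups within the 0.05 absolute-accuracy target for synthetic product-versus-reference pairs; the data and function name are invented.

        import numpy as np

        def albedo_validation(product, reference):
            """Accuracy metrics for judging an albedo product against a reference
            truth (field measurements, or another product such as MODIS)."""
            diff = product - reference
            return {"bias": diff.mean(),
                    "rmse": np.sqrt((diff ** 2).mean()),
                    "within_0.05": (np.abs(diff) <= 0.05).mean()}

        # Toy matchups between a 1 km albedo product and tower measurements
        rng = np.random.default_rng(7)
        truth = rng.uniform(0.08, 0.45, 200)         # snow-free surface albedos
        prod = truth + rng.normal(0.0, 0.025, 200)   # product with random error
        m = albedo_validation(prod, truth)
        print(f"bias={m['bias']:+.3f}  rmse={m['rmse']:.3f}  "
              f"fraction within 0.05: {m['within_0.05']:.0%}")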

  4. Validation of a Scalable Solar Sailcraft

    NASA Technical Reports Server (NTRS)

    Murphy, D. M.

    2006-01-01

    The NASA In-Space Propulsion (ISP) program sponsored intensive solar sail technology and systems design, development, and hardware demonstration activities over the past 3 years. Efforts to validate a scalable solar sail system by functional demonstration in relevant environments, together with test-analysis correlation activities, have recently been successfully completed. A review of the program is presented, with descriptions of the design, results of testing, and analytical model validations of component and assembly functional, strength, stiffness, shape, and dynamic behavior. The scaled performance of the validated system is projected to demonstrate the applicability to flight demonstration and important NASA road-map missions.

  5. Combat Simulation Using Breach Computer Language

    DTIC Science & Technology

    1979-09-01

    simulation and weapon system analysis computer language. Two types of models were constructed: a stochastic duel and a dynamic engagement model. The... duel model validates the BREACH approach by comparing results with mathematical solutions. The dynamic model shows the capability of the BREACH...

  6. Description of a Website Resource for Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Smith, Brian R.; Huang, George P.

    2010-01-01

    The activities of the Turbulence Model Benchmarking Working Group - which is a subcommittee of the American Institute of Aeronautics and Astronautics (AIAA) Fluid Dynamics Technical Committee - are described. The group's main purpose is to establish a web-based repository for Reynolds-averaged Navier-Stokes turbulence model documentation, including verification and validation cases. This turbulence modeling resource has been established based on feedback from a survey on what is needed to achieve consistency and repeatability in turbulence model implementation and usage, and to document and disseminate information on new turbulence models or improvements to existing models. The various components of the website are described in detail: description of turbulence models, turbulence model readiness rating system, verification cases, validation cases, validation databases, and turbulence manufactured solutions. An outline of future plans of the working group is also provided.

  7. Numerical modelling of transdermal delivery from matrix systems: parametric study and experimental validation with silicone matrices.

    PubMed

    Snorradóttir, Bergthóra S; Jónsdóttir, Fjóla; Sigurdsson, Sven Th; Másson, Már

    2014-08-01

    A model is presented for transdermal drug delivery from single-layered silicone matrix systems. The work is based on our previous results that, in particular, extend the well-known Higuchi model. Recently, we have introduced a numerical transient model describing matrix systems where the drug dissolution can be non-instantaneous. Furthermore, our model can describe complex interactions within a multi-layered matrix and the matrix to skin boundary. The power of the modelling approach presented here is further illustrated by allowing the possibility of a donor solution. The model is validated by a comparison with experimental data, as well as validating the parameter values against each other, using various configurations with donor solution, silicone matrix and skin. Our results show that the model is a good approximation to real multi-layered delivery systems. The model offers the ability of comparing drug release for ibuprofen and diclofenac, which cannot be analysed by the Higuchi model because the dissolution in the latter case turns out to be limited. The experiments and numerical model outlined in this study could also be adjusted to more general formulations, which enhances the utility of the numerical model as a design tool for the development of drug-loaded matrices for trans-membrane and transdermal delivery. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
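
    The transient matrix model described above can be caricatured with an explicit finite-difference scheme: solid drug dissolves at a finite rate into a dissolved phase that diffuses toward a perfect-sink boundary. The sketch below is such a caricature; the geometry, rate constants, and boundary conditions are illustrative assumptions, not the authors' calibrated silicone-matrix model.

        import numpy as np

        # Explicit finite differences for 1-D matrix delivery: solid drug dissolves
        # at a finite rate into a dissolved phase that diffuses (Fick's second law)
        # toward a perfect-sink boundary on the skin side.
        L, nx = 1e-3, 101                    # matrix thickness (m), grid points
        D, k_dis, c_sat = 1e-11, 5e-4, 1.0   # diffusivity, dissolution rate, solubility
        dx = L / (nx - 1)
        dt = 0.4 * dx ** 2 / D               # explicit stability requires <= 0.5

        c = np.zeros(nx)                     # dissolved drug (normalized)
        s = np.full(nx, 2.0)                 # solid drug loading (above solubility)
        total0 = c.sum() + s.sum()

        for _ in range(50000):               # roughly 2.3 days of release at this dt
            dis = np.where(s > 0.0, k_dis * np.maximum(c_sat - c, 0.0), 0.0)
            lap = np.zeros(nx)
            lap[1:-1] = (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx ** 2
            c += dt * (D * lap + dis)
            s = np.maximum(s - dt * dis, 0.0)
            c[0] = c[1]                      # no-flux backing layer
            c[-1] = 0.0                      # skin side treated as a perfect sink

        print(f"fraction released: {1.0 - (c.sum() + s.sum()) / total0:.3f}")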

  8. Using Lunar Observations to Validate In-Flight Calibrations of Clouds and Earth Radiant Energy System Instruments

    NASA Technical Reports Server (NTRS)

    Daniels, Janet L.; Smith, G. Louis; Priestley, Kory J.; Thomas, Susan

    2014-01-01

    The validation of in-orbit instrument performance requires stability in both instrument and calibration source. This paper describes a method of validation using lunar observations scanning near full moon by the Clouds and Earth Radiant Energy System (CERES) instruments. Unlike internal calibrations, the Moon offers an external source whose signal variance is predictable and non-degrading. From 2006 to present, in-orbit observations have become standardized and compiled for the Flight Models-1 and -2 aboard the Terra satellite, for Flight Models-3 and -4 aboard the Aqua satellite, and beginning 2012, for Flight Model-5 aboard Suomi-NPP. Instrument performance parameters which can be gleaned are detector gain, pointing accuracy and static detector point response function validation. Lunar observations are used to examine the stability of all three detectors on each of these instruments from 2006 to present. This validation method has yielded results showing trends per CERES data channel of 1.2% per decade or less.
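
    Trend figures like "1.2% per decade" come from fitting a line to the gain time series inferred from the lunar scans. The sketch below fits and reports such a trend on synthetic gain values; the numbers are placeholders, not CERES data.

        import numpy as np

        # Fit a linear drift to detector gains inferred from lunar scans and quote
        # it in percent per decade, the stability figure of merit used above.
        rng = np.random.default_rng(5)
        years = np.arange(2006, 2015) + 0.5
        gain = 1.0 - 0.0009 * (years - years[0])       # ~0.9%/decade drift, faked
        gain += 0.0005 * rng.standard_normal(years.size)

        slope, _ = np.polyfit(years, gain, 1)
        trend = 100.0 * slope * 10.0 / gain.mean()     # percent per decade
        print(f"gain trend: {trend:+.2f}% per decade")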

  10. Comparison and validation of acoustic response models for wind noise reduction pipe arrays

    DOE PAGES

    Marty, Julien; Denis, Stéphane; Gabrielson, Thomas; ...

    2017-02-13

    The detection capability of the infrasound component of the International Monitoring System (IMS) is tightly linked to the performance of its wind noise reduction systems. The wind noise reduction solution implemented at all IMS infrasound measurement systems consists of a spatial distribution of air inlets connected to the infrasound sensor through a network of pipes. This system, usually referred to as “pipe array,” has proven its efficiency in operational conditions. The objective of this paper is to present the results of the comparison and validation of three distinct acoustic response models for pipe arrays. The characteristics of the models and the results obtained for a defined set of pipe array configurations are described. A field experiment using a newly developed infrasound generator, dedicated to the validation of these models, is then presented. The comparison between the modeled and empirical acoustic responses shows that two of the three models can be confidently used to estimate pipe array acoustic responses. Lastly, this study paves the way to the deconvolution of IMS infrasound data from pipe array responses and to the optimization of pipe array design to IMS applications.

  11. Examining construct and predictive validity of the Health-IT Usability Evaluation Scale: confirmatory factor analysis and structural equation modeling results

    PubMed Central

    Yen, Po-Yin; Sousa, Karen H; Bakken, Suzanne

    2014-01-01

    Background In a previous study, we developed the Health Information Technology Usability Evaluation Scale (Health-ITUES), which is designed to support customization at the item level. Such customization matches the specific tasks/expectations of a health IT system while retaining comparability at the construct level, and provides evidence of its factorial validity and internal consistency reliability through exploratory factor analysis. Objective In this study, we advanced the development of Health-ITUES to examine its construct validity and predictive validity. Methods The health IT system studied was a web-based communication system that supported nurse staffing and scheduling. Using Health-ITUES, we conducted a cross-sectional study to evaluate users’ perception toward the web-based communication system after system implementation. We examined Health-ITUES's construct validity through first and second order confirmatory factor analysis (CFA), and its predictive validity via structural equation modeling (SEM). Results The sample comprised 541 staff nurses in two healthcare organizations. The CFA (n=165) showed that a general usability factor accounted for 78.1%, 93.4%, 51.0%, and 39.9% of the explained variance in ‘Quality of Work Life’, ‘Perceived Usefulness’, ‘Perceived Ease of Use’, and ‘User Control’, respectively. The SEM (n=541) supported the predictive validity of Health-ITUES, explaining 64% of the variance in intention for system use. Conclusions The results of CFA and SEM provide additional evidence for the construct and predictive validity of Health-ITUES. The customizability of Health-ITUES has the potential to support comparisons at the construct level, while allowing variation at the item level. We also illustrate application of Health-ITUES across stages of system development. PMID:24567081

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rainer, Leo I.; Hoeschele, Marc A.; Apte, Michael G.

    This report addresses the results of detailed monitoring completed under Program Element 6 of Lawrence Berkeley National Laboratory's High Performance Commercial Building Systems (HPCBS) PIER program. The purpose of the Energy Simulations and Projected State-Wide Energy Savings project is to develop reasonable energy performance and cost models for high performance relocatable classrooms (RCs) across California climates. A key objective of the energy monitoring was to validate DOE2 simulations for comparison to initial DOE2 performance projections. The validated DOE2 model was then used to develop statewide savings projections by modeling base case and high performance RC operation in the 16 California climate zones. The primary objective of this phase of work was to utilize detailed field monitoring data to modify DOE2 inputs and generate performance projections based on a validated simulation model. Additional objectives include the following: (1) Obtain comparative performance data on base case and high performance HVAC systems to determine how they are operated, how they perform, and how the occupants respond to the advanced systems. This was accomplished by installing both HVAC systems side-by-side (i.e., one per module of a standard two module, 24 ft by 40 ft RC) on the study RCs and switching HVAC operating modes on a weekly basis. (2) Develop projected statewide energy and demand impacts based on the validated DOE2 model. (3) Develop cost effectiveness projections for the high performance HVAC system in the 16 California climate zones.

  13. An integrated radar model solution for mission level performance and cost trades

    NASA Astrophysics Data System (ADS)

    Hodge, John; Duncan, Kerron; Zimmerman, Madeline; Drupp, Rob; Manno, Mike; Barrett, Donald; Smith, Amelia

    2017-05-01

    A fully integrated Mission-Level Radar model is in development as part of a multi-year effort under the Northrop Grumman Mission Systems (NGMS) sector's Model Based Engineering (MBE) initiative to digitally interconnect and unify previously separate performance and cost models. In 2016, an NGMS internal research and development (IR&D) funded multidisciplinary team integrated radio frequency (RF), power, control, size, weight, thermal, and cost models together using the commercial off-the-shelf software ModelCenter for an Active Electronically Scanned Array (AESA) radar system. Each represented model was digitally connected with standard interfaces and unified to allow end-to-end mission system optimization and trade studies. The radar model was then linked to the Air Force's own mission modeling framework (AFSIM). The team first had to identify the necessary models, and with the aid of subject matter experts (SMEs) understand and document the inputs, outputs, and behaviors of the component models. This agile development process and collaboration enabled rapid integration of disparate models and the validation of their combined system performance. This MBE framework will allow NGMS to design systems more efficiently and affordably, optimize architectures, and provide increased value to the customer. The model integrates detailed component models that validate cost and performance at the physics level with high-level models that provide visualization of a platform mission. This connectivity of component to mission models allows hardware and software design solutions to be better optimized to meet mission needs, creating cost-optimal solutions for the customer, while reducing design cycle time through risk mitigation and early validation of design decisions.

  14. Development and validation of an automated delirium risk assessment system (Auto-DelRAS) implemented in the electronic health record system.

    PubMed

    Moon, Kyoung-Ja; Jin, Yinji; Jin, Taixian; Lee, Sun-Mi

    2018-01-01

    A key component of delirium management is prevention and early detection. The aims of this study were to develop an automated delirium risk assessment system (Auto-DelRAS) that alerts health care providers to an intensive care unit (ICU) patient's delirium risk based only on data collected in an electronic health record (EHR) system, and to evaluate the clinical validity of this system. Cohort and system development designs were used. The setting was medical and surgical ICUs in two university hospitals in Seoul, Korea. A total of 3284 patients were included for the development of Auto-DelRAS, 325 for external validation, and 694 for validation after clinical application. In total, 4211 data items were extracted from the EHR system, and delirium was measured using the CAM-ICU (Confusion Assessment Method for the Intensive Care Unit). Potential predictors were selected and a logistic regression model was established to create a delirium risk scoring algorithm for Auto-DelRAS. Auto-DelRAS was evaluated three months and one year after its application to clinical practice to establish the predictive validity of the system. Eleven predictors were finally included in the logistic regression model. The results of the Auto-DelRAS risk assessment were shown as high/moderate/low risk on a Kardex screen. The predictive validity, analyzed one year after the clinical application of Auto-DelRAS, showed a sensitivity of 0.88, specificity of 0.72, positive predictive value of 0.53, negative predictive value of 0.94, and a Youden index of 0.59. A relatively high level of predictive validity was maintained with the Auto-DelRAS system, even one year after it was applied to clinical practice. Copyright © 2017. Published by Elsevier Ltd.
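
    The validation metrics quoted above follow directly from a hold-out confusion matrix. The sketch below fits a logistic model on synthetic "EHR" predictors and reports sensitivity, specificity, PPV, NPV, and the Youden index with scikit-learn; the data and coefficients are illustrative, with only the 11-predictor count taken from the record.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import confusion_matrix

        # Fit a logistic risk model on synthetic predictors, then compute the
        # validation metrics used in the paper on a hold-out set.
        rng = np.random.default_rng(11)
        n = 2000
        X = rng.standard_normal((n, 11))             # 11 predictors, as in the model
        logit = 0.9 * X[:, 0] - 0.7 * X[:, 1] + 0.5 * X[:, 2] - 1.5
        y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

        clf = LogisticRegression(max_iter=1000).fit(X[:1500], y[:1500])
        pred = clf.predict(X[1500:])                 # hold-out "external" set
        tn, fp, fn, tp = confusion_matrix(y[1500:], pred).ravel()

        sens, spec = tp / (tp + fn), tn / (tn + fp)
        ppv, npv = tp / (tp + fp), tn / (tn + fn)
        print(f"sens={sens:.2f} spec={spec:.2f} ppv={ppv:.2f} npv={npv:.2f} "
              f"Youden={sens + spec - 1:.2f}")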

  15. Warfighter IT Interoperability Standards Study

    DTIC Science & Technology

    2012-07-22

    data (e.g., messages) between systems? ii) What process did you use to validate and certify semantic interoperability between your...other systems at this time. There was no requirement to validate and certify semantic interoperability. The DLS program exchanges data with... semantics. Testing for System Compliance with Data Models. Verify and Certify Interoperability Using Data

  17. OC5 Project Phase Ib: Validation of hydrodynamic loading on a fixed, flexible cylinder for offshore wind applications

    DOE PAGES

    Robertson, Amy N.; Wendt, Fabian; Jonkman, Jason M.; ...

    2016-10-13

    This paper summarizes the findings from Phase Ib of the Offshore Code Comparison, Collaboration, Continued with Correlation (OC5) project. OC5 is a project run under the International Energy Agency (IEA) Wind Research Task 30, and is focused on validating the tools used for modelling offshore wind systems through the comparison of simulated responses of select offshore wind systems (and components) to physical test data. For Phase Ib of the project, simulated hydrodynamic loads on a flexible cylinder fixed to a sloped bed were validated against test measurements made in the shallow water basin at the Danish Hydraulic Institute (DHI) with support from the Technical University of Denmark (DTU). The first phase of OC5 examined two simple cylinder structures (Phase Ia and Ib) to focus on validation of hydrodynamic models used in the various tools before moving on to more complex offshore wind systems and the associated coupled physics. As a result, verification and validation activities such as these lead to improvement of offshore wind modelling tools, which will enable the development of more innovative and cost-effective offshore wind designs.
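
    A standard first-order description of the wave loads on a slender cylinder that benchmarks like this exercise is the Morison equation: an inertia term plus quadratic drag. The sketch below evaluates it for a regular wave on a vertical cylinder; the diameter, coefficients, and wave kinematics are illustrative placeholders, not the DHI test conditions.

        import numpy as np

        # Morison-equation force per unit length on a vertical cylinder:
        # f = rho*Cm*A*a + 0.5*rho*Cd*D*u*|u|  (inertia + quadratic drag)
        rho, D_cyl = 1025.0, 6.5            # seawater density, cylinder diameter (m)
        Cm, Cd = 2.0, 1.0                   # inertia and drag coefficients
        A = np.pi * D_cyl ** 2 / 4.0        # cross-sectional area

        t = np.linspace(0.0, 20.0, 2001)
        omega = 2.0 * np.pi / 10.0          # 10-s regular wave
        u = 1.5 * np.sin(omega * t)         # horizontal particle velocity (m/s)
        a = 1.5 * omega * np.cos(omega * t) # particle acceleration (m/s^2)

        f = rho * Cm * A * a + 0.5 * rho * Cd * D_cyl * u * np.abs(u)
        print(f"peak Morison load: {np.abs(f).max() / 1e3:.1f} kN/m")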

  18. Model-Based Thermal System Design Optimization for the James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Cataldo, Giuseppe; Niedner, Malcolm B.; Fixsen, Dale J.; Moseley, Samuel H.

    2017-01-01

    Spacecraft thermal model validation is normally performed by comparing model predictions with thermal test data and reducing their discrepancies to meet the mission requirements. Based on thermal engineering expertise, the model input parameters are adjusted to tune the model output response to the test data. The end result is not guaranteed to be the best solution in terms of reduced discrepancy and the process requires months to complete. A model-based methodology was developed to perform the validation process in a fully automated fashion and provide mathematical bases to the search for the optimal parameter set that minimizes the discrepancies between model and data. The methodology was successfully applied to several thermal subsystems of the James Webb Space Telescope (JWST). Global or quasiglobal optimal solutions were found and the total execution time of the model validation process was reduced to about two weeks. The model sensitivities to the parameters, which are required to solve the optimization problem, can be calculated automatically before the test begins and provide a library for sensitivity studies. This methodology represents a crucial commodity when testing complex, large-scale systems under time and budget constraints. Here, results for the JWST Core thermal system will be presented in detail.

  20. Simulation of daily streamflows at gaged and ungaged locations within the Cedar River Basin, Iowa, using a Precipitation-Runoff Modeling System model

    USGS Publications Warehouse

    Christiansen, Daniel E.

    2012-01-01

    The U.S. Geological Survey, in cooperation with the Iowa Department of Natural Resources, conducted a study to examine techniques for estimation of daily streamflows using hydrologic models and statistical methods. This report focuses on the use of a hydrologic model, the U.S. Geological Survey's Precipitation-Runoff Modeling System, to estimate daily streamflows at gaged and ungaged locations. The Precipitation-Runoff Modeling System is a modular, physically based, distributed-parameter modeling system developed to evaluate the impacts of various combinations of precipitation, climate, and land use on surface-water runoff and general basin hydrology. The Cedar River Basin was selected to construct a Precipitation-Runoff Modeling System model that simulates the period from January 1, 2000, to December 31, 2010. The calibration period was from January 1, 2000, to December 31, 2004, and the validation periods were from January 1, 2005, to December 31, 2010, and January 1, 2000, to December 31, 2010. A Geographic Information System tool was used to delineate the Cedar River Basin and subbasins for the Precipitation-Runoff Modeling System model and to derive parameters based on the physical geographic features. Calibration of the Precipitation-Runoff Modeling System model was completed using a U.S. Geological Survey calibration software tool. The main objective of the calibration was to match the daily streamflow simulated by the Precipitation-Runoff Modeling System model with streamflow measured at U.S. Geological Survey streamflow gages. The Cedar River Basin daily streamflow model performed with Nash-Sutcliffe efficiencies ranging from 0.82 to 0.33 during the calibration period and from 0.77 to -0.04 during the validation period. The Cedar River Basin model meets the criterion of a Nash-Sutcliffe efficiency greater than 0.50 and is a good fit for streamflow conditions during the calibration period at all but one location, Austin, Minnesota. The Precipitation-Runoff Modeling System model accurately simulated streamflow at four of six uncalibrated sites within the basin. Overall, there was good agreement between simulated and measured seasonal and annual volumes throughout the basin for calibration and validation sites. Differences ranged from 0.2 to 20.8 percent for the calibration period and from 0.0 to 19.5 percent for the validation period across all seasons and total annual runoff. The Precipitation-Runoff Modeling System model tended to underestimate lower streamflows compared with the observed streamflow values, an indication that the model needs more detailed groundwater and storage information to properly simulate low-flow conditions in the Cedar River Basin.
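
    The Nash-Sutcliffe efficiency used throughout this record has a one-line definition: one minus the ratio of squared model error to squared deviation of the observations from their mean. The sketch below computes NSE and a total-volume percent difference for toy daily streamflows, which are invented values, not Cedar River data.

        import numpy as np

        def nash_sutcliffe(observed, simulated):
            """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model is
            no better than the observed mean, and negative is worse than the mean."""
            observed, simulated = np.asarray(observed), np.asarray(simulated)
            return 1.0 - np.sum((observed - simulated) ** 2) / \
                         np.sum((observed - observed.mean()) ** 2)

        # Toy daily streamflow (m^3/s): a model that tracks peaks but misses low flows
        obs = np.array([12, 15, 40, 120, 80, 35, 20, 14, 10, 9], dtype=float)
        sim = np.array([14, 16, 38, 110, 85, 40, 28, 22, 18, 16], dtype=float)
        nse = nash_sutcliffe(obs, sim)
        pct_diff = 100.0 * (sim.sum() - obs.sum()) / obs.sum()
        print(f"NSE = {nse:.2f}, total-volume difference = {pct_diff:+.1f}%")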

  1. Demonstration of successful malaria forecasts for Botswana using an operational seasonal climate model

    NASA Astrophysics Data System (ADS)

    MacLeod, Dave A.; Jones, Anne; Di Giuseppe, Francesca; Caminade, Cyril; Morse, Andrew P.

    2015-04-01

    The severity and timing of seasonal malaria epidemics are strongly linked with temperature and rainfall. Advance warning of meteorological conditions from seasonal climate models can therefore potentially anticipate unusually strong epidemic events, building resilience and adapting to possible changes in the frequency of such events. Here we present validation of a process-based, dynamic malaria model driven by hindcasts from a state-of-the-art seasonal climate model from the European Centre for Medium-Range Weather Forecasts. We validate the climate and malaria models against observed meteorological and incidence data for Botswana over the period 1982-2006, the longest record of observed incidence data that has been used to validate a modeling system of this kind. We consider the impact of climate model biases, the relationship between climate and epidemiological predictability, and the potential for skillful malaria forecasts. Forecast skill is demonstrated for upper-tercile malaria incidence for the Botswana malaria season (January-May), using forecasts issued at the start of November; the forecast system anticipates six out of the seven upper-tercile malaria seasons in the observational period. The length of the validation time series gives confidence in the conclusion that it is possible to make reliable forecasts of seasonal malaria risk, forming a key part of a health early warning system for Botswana and contributing to efforts to adapt to climate change.
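
    The skill claim above (six of seven upper-tercile seasons anticipated) is, at bottom, a tercile hit count. The sketch below shows that computation on synthetic incidence and forecast series; the thresholds are the empirical upper-tercile quantiles, and all values are placeholders.

        import numpy as np

        # Count how often a November-issued forecast anticipates an observed
        # upper-tercile malaria season.
        rng = np.random.default_rng(2)
        obs = rng.gamma(2.0, 1.0, 25)           # 25 seasons of incidence (fake)
        fcst = obs + rng.normal(0.0, 0.6, 25)   # a correlated forecast (fake)

        upper = obs >= np.quantile(obs, 2 / 3)      # observed upper-tercile seasons
        warned = fcst >= np.quantile(fcst, 2 / 3)   # seasons the forecast flags
        hits = np.sum(upper & warned)
        print(f"anticipated {hits} of {upper.sum()} upper-tercile seasons")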

  2. Integrating technologies for oil spill response in the SW Iberian coast

    NASA Astrophysics Data System (ADS)

    Janeiro, J.; Neves, A.; Martins, F.; Relvas, P.

    2017-09-01

    An operational oil spill modelling system developed for the SW Iberian coast is used to investigate the relative importance of the different components and technologies integrating an oil spill monitoring and response structure. A backtrack of a CleanSeaNet oil detection in the region is used to demonstrate the concept. Taking advantage of the regional operational products available, the system provides the resolution necessary to go from regional to coastal scales using a downscaling approach, while a multi-grid methodology allows the oil spill model to span model domains, taking full advantage of the increasing resolution between the model grids. An extensive validation procedure using a multiplicity of sensors, with good spatial and temporal coverage, strengthens the operational system's ability to accurately resolve coastal-scale processes. The model is validated using available trajectories from satellite-tracked drifters. Finally, a methodology for identifying potential origins of the CleanSeaNet oil detection was developed by combining model backtrack results with ship trajectories supplied by AIS, including the error estimates found in the backtrack validation.

  3. Validation of Salinity Data from the Soil Moisture and Ocean Salinity (SMOS) and Aquarius Satellites in the Agulhas Current System

    NASA Astrophysics Data System (ADS)

    Button, N.

    2016-02-01

    The Agulhas Current System is an important western boundary current, particularly because of its vital role in the transport of heat and salt from the Indian Ocean to the Atlantic Ocean, such as through Agulhas rings. Accurate measurements of salinity are necessary for assessing the role of the Agulhas Current System and these rings in the global climate system. With ESA's Soil Moisture and Ocean Salinity (SMOS) and NASA's Aquarius/SAC-D satellites, we now have complete spatial and temporal (since 2009 and 2011, respectively) coverage of salinity data. To use these data to understand the role of the Agulhas Current System in the context of salinity within the global climate system, we must first validate the satellite data using in situ and model comparisons. In situ comparisons are important because of their accuracy, but they lack the spatial and temporal coverage needed to validate the satellite data; for example, there are only approximately 100 floats in the Agulhas Return Current. Therefore, model comparisons, such as with the Hybrid Coordinate Ocean Model (HYCOM), are used along with the in situ data for the validation. For the validation, the satellite data, Argo float data, and HYCOM simulations were compared within box regions both inside and outside of the Agulhas Current. These boxed regions include the main Agulhas Current, the Agulhas Return Current, the Agulhas Retroflection, and Agulhas rings, as well as a low-salinity and a high-salinity region outside of the current system. This analysis reveals the accuracy of the salinity measurements from the Aquarius/SAC-D and SMOS satellites within the Agulhas Current, which then provides accurate salinity data that can be used to understand the role of the Agulhas Current System in the global climate system.

  4. Analysis of propulsion system dynamics in the validation of a high-order state space model of the UH-60

    NASA Technical Reports Server (NTRS)

    Kim, Frederick D.

    1992-01-01

    Frequency responses generated from a high-order linear model of the UH-60 Black Hawk have shown that the propulsion system significantly influences the vertical and yaw dynamics of the aircraft at frequencies important to high-bandwidth control law designs. The inclusion of the propulsion system is the latest step in the development of a high-order linear model of the UH-60 that additionally models the dynamics of the fuselage, rotor, and inflow. A complete validation study of the linear model is presented in the frequency domain for both on-axis and off-axis coupled responses in the hover flight condition, and for on-axis responses at forward speeds of 80 and 120 knots.

  5. Pre-engineering Spaceflight Validation of Environmental Models and the 2005 HZETRN Simulation Code

    NASA Technical Reports Server (NTRS)

    Nealy, John E.; Cucinotta, Francis A.; Wilson, John W.; Badavi, Francis F.; Dachev, Ts. P.; Tomov, B. T.; Walker, Steven A.; DeAngelis, Giovanni; Blattnig, Steve R.; Atwell, William

    2006-01-01

    The HZETRN code has been identified by NASA for engineering design in the next phase of space exploration, highlighting a return to the Moon in preparation for a Mars mission. In response, a new series of algorithms, beginning with 2005 HZETRN, will be issued, correcting some prior limitations and improving control of propagated errors, along with established code verification processes. Code validation will use new and improved low Earth orbit (LEO) environmental models, together with a recently improved International Space Station (ISS) shield model, to validate computational models and procedures against measured data taken aboard ISS. These validated models will provide a basis for flight-testing the designs of future space vehicles and systems of the Constellation program in the LEO environment.

  6. Predeployment validation of fault-tolerant systems through software-implemented fault insertion

    NASA Technical Reports Server (NTRS)

    Czeck, Edward W.; Siewiorek, Daniel P.; Segall, Zary Z.

    1989-01-01

    The fault-injection-based automated testing (FIAT) environment, which can be used to experimentally characterize and evaluate distributed real-time systems under fault-free and faulted conditions, is described. A survey of validation methodologies is presented, and the need for fault insertion within these methodologies is demonstrated. The origins and models of faults, and the motivation for the FIAT concept, are reviewed. FIAT employs a validation methodology which builds confidence in the system by first providing a baseline of fault-free performance data and then characterizing the behavior of the system with faults present. Fault insertion is accomplished through software and allows faults, or the manifestations of faults, to be inserted either by seeding faults into memory or by triggering error detection mechanisms. FIAT is capable of emulating a variety of fault-tolerant strategies and architectures, can monitor system activity, and can automatically orchestrate experiments involving insertion of faults. A common system interface provides ease of use and decreases experiment development and run time. Fault models chosen for experiments on FIAT have generated system responses which parallel those observed in real systems under faulty conditions. These capabilities are demonstrated by two example experiments, each using a different fault-tolerance strategy.
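
    Memory-seeded fault insertion of the kind FIAT performs can be illustrated with a bit-flip on a simulated memory image. This is only a sketch of the concept, not FIAT's actual interface:

    ```python
    import random

    def flip_bit(memory: bytearray, byte_index: int, bit: int) -> None:
        """Seed a fault by flipping one bit of a simulated memory image."""
        memory[byte_index] ^= (1 << bit)

    def inject_random_fault(memory: bytearray, rng=random):
        """Insert a single random bit-flip and return its location, so the
        experiment can correlate the fault with observed system behaviour."""
        idx = rng.randrange(len(memory))
        bit = rng.randrange(8)
        flip_bit(memory, idx, bit)
        return idx, bit

    # Baseline run uses the fault-free image; the faulted run uses this one
    image = bytearray(64)
    print("fault injected at byte %d, bit %d" % inject_random_fault(image))
    ```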

  7. Verifying Stability of Dynamic Soft-Computing Systems

    NASA Technical Reports Server (NTRS)

    Wen, Wu; Napolitano, Marcello; Callahan, John

    1997-01-01

    Soft computing is a general term for algorithms that learn from human knowledge and mimic human skills. Examples of such algorithms are fuzzy inference systems and neural networks. Many applications, especially in control engineering, have demonstrated their appropriateness for building intelligent systems that are flexible and robust. Although recent research has shown that certain classes of neuro-fuzzy controllers can be proven bounded and stable, these proofs are implementation dependent and difficult to apply in the design and validation process. Many practitioners adopt a trial-and-error approach to system validation or resort to exhaustive testing using prototypes. In this paper, we describe our ongoing research towards establishing the necessary theoretical foundation, as well as building practical tools, for the verification and validation of soft-computing systems. A unified model for general neuro-fuzzy systems is adopted. Classic nonlinear control theory and recent results on its application to neuro-fuzzy systems are incorporated and applied to the unified model. It is hoped that general tools can be developed to help the designer visualize and manipulate the regions of stability and boundedness, much the same way Bode plots and root locus plots have helped conventional control design and validation.

  8. Comparison of 3D dynamic virtual model to link segment model for estimation of net L4/L5 reaction moments during lifting.

    PubMed

    Abdoli-Eramaki, Mohammad; Stevenson, Joan M; Agnew, Michael J; Kamalzadeh, Amin

    2009-04-01

    The purpose of this study was to validate a 3D dynamic virtual model for lifting tasks against a previously validated link segment model (LSM). A face validation study was conducted by collecting x, y, z coordinate data and using them in both the virtual and LSM models. An upper body virtual model was needed to calculate the 3D torques about human joints for use in simulated lifting styles and to estimate the effect of external mechanical devices on the human body. First, the model had to be validated to ensure it provided accurate estimates of 3D moments in comparison with the previously validated LSM. Three synchronised Fastrak units with nine sensors were used to record data from one male subject who completed dynamic box lifting under 27 different load conditions (box weights (3), lifting techniques (3) and rotations (3)). The external moments about the three axes of L4/L5 were compared for both models. A pressure switch on the box was used to denote the start and end of the lift. An excellent agreement [image omitted] was found between the two models for dynamic lifting tasks, especially for the larger moments in flexion and extension. This virtual model was considered valid for use in a complete simulation of the upper body skeletal system. This biomechanical virtual model of the musculoskeletal system gives researchers and practitioners a better tool to study the causes of LBP and the effects of intervention strategies, by permitting the researcher to see and control a virtual subject's motions.

  9. Validation of X1 motorcycle model in industrial plant layout by using WITNESSTM simulation software

    NASA Astrophysics Data System (ADS)

    Hamzas, M. F. M. A.; Bareduan, S. A.; Zakaria, M. Z.; Tan, W. J.; Zairi, S.

    2017-09-01

    This paper demonstrates a case study on simulation, modelling and analysis for the X1 motorcycle model. In this research, a motorcycle assembly plant was selected as the main site of the research study. Simulation techniques using the Witness software were applied to evaluate the performance of the existing manufacturing system. The main objective is to validate the data and identify the significant impacts on the overall performance of the system for future improvement. The validation process starts with identification of the assembly line layout. All components are evaluated to determine whether the data are significant for future improvement. Machine and labour statistics are among the parameters that were evaluated for process improvement. The average total cycle time for given workstations is used as the criterion for comparison of possible variants. The simulation process shows that the data used are appropriate and meet the criteria for two-sided assembly line problems.

  10. FAST Model Calibration and Validation of the OC5-DeepCwind Floating Offshore Wind System Against Wave Tank Test Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendt, Fabian F; Robertson, Amy N; Jonkman, Jason

    During the course of the Offshore Code Comparison Collaboration, Continued, with Correlation (OC5) project, which focused on the validation of numerical methods through comparison against tank test data, the authors created a numerical FAST model of the 1:50-scale DeepCwind semisubmersible system that was tested at the Maritime Research Institute Netherlands ocean basin in 2013. This paper discusses several model calibration studies that were conducted to identify model adjustments that improve the agreement between the numerical simulations and the experimental test data. These calibration studies cover wind-field-specific parameters (coherence, turbulence), hydrodynamic and aerodynamic modeling approaches, as well as rotor model (blade-pitch and blade-mass imbalances) and tower model (structural tower damping coefficient) adjustments. These calibration studies were conducted based on relatively simple calibration load cases (wave only/wind only). The agreement between the final FAST model and experimental measurements is then assessed based on more-complex combined wind and wave validation cases.

  11. Discrete and continuous dynamics modeling of a mass moving on a flexible structure

    NASA Technical Reports Server (NTRS)

    Herman, Deborah Ann

    1992-01-01

    A general discrete methodology for modeling the dynamics of a mass that moves on the surface of a flexible structure is developed. This problem was motivated by the Space Station/Mobile Transporter system. A model reduction approach is developed to make the methodology applicable to large structural systems. To validate the discrete methodology, continuous formulations are also developed. Three different systems are examined: (1) simply-supported beam, (2) free-free beam, and (3) free-free beam with two points of contact between the mass and the flexible beam. In addition to validating the methodology, parametric studies were performed to examine how the system's physical properties affect its dynamics.

  12. Modeling complex treatment strategies: construction and validation of a discrete event simulation model for glaucoma.

    PubMed

    van Gestel, Aukje; Severens, Johan L; Webers, Carroll A B; Beckers, Henny J M; Jansonius, Nomdo M; Schouten, Jan S A G

    2010-01-01

    Discrete event simulation (DES) modeling has several advantages over simpler modeling techniques in health economics, such as increased flexibility and the ability to model complex systems. Nevertheless, these benefits may come at the cost of reduced transparency, which may compromise the model's face validity and credibility. We aimed to produce a transparent report on the construction and validation of a DES model, using a recently developed model of ocular hypertension and glaucoma. Current evidence of associations between prognostic factors and disease progression in ocular hypertension and glaucoma was translated into DES model elements. The model was extended to simulate treatment decisions and effects. Utility and costs were linked to disease status and treatment, and clinical and health economic outcomes were defined. The model was validated at several levels. The soundness of design and the plausibility of input estimates were evaluated in interdisciplinary meetings (face validity). Individual patients were traced throughout the simulation under a multitude of model settings to debug the model, and the model was run with a variety of extreme scenarios to compare the outcomes with prior expectations (internal validity). Finally, several intermediate (clinical) outcomes of the model were compared with those observed in experimental or observational studies (external validity), and the feasibility of evaluating hypothetical treatment strategies was tested. The model performed well in all validity tests. Analyses of hypothetical treatment strategies took about 30 minutes per cohort and led to plausible health-economic outcomes. DES models add value in modeling complex treatment strategies such as those for glaucoma. Achieving transparency in model structure and outcomes may require some effort in reporting and validating the model, but it is feasible.
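
    At its core, a DES model is an event queue processed in time order. The generic skeleton below shows that pattern with illustrative event names; it is not the glaucoma model itself.

    ```python
    import heapq

    def run_des(initial_events, handlers, horizon):
        """Minimal discrete event simulation loop.

        initial_events : list of (time, event_name, patient_id) tuples
        handlers       : dict mapping event_name to a function that
                         returns follow-up events (possibly none)
        horizon        : stop time, e.g. in years
        """
        queue = list(initial_events)
        heapq.heapify(queue)
        log = []
        while queue:
            time, name, pid = heapq.heappop(queue)
            if time > horizon:
                break
            log.append((time, name, pid))
            for event in handlers[name](time, pid):
                heapq.heappush(queue, event)
        return log

    # Illustrative handler: each visit schedules the next one a year later
    handlers = {"visit": lambda t, pid: [(t + 1.0, "visit", pid)]}
    print(run_des([(0.0, "visit", 1)], handlers, horizon=3.0))
    ```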

  13. Validation techniques of agent based modelling for geospatial simulations

    NASA Astrophysics Data System (ADS)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena within the framework of a model, in order to simulate the real phenomena, is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Because ABMS is inherently similar to human cognition, it can be built easily and applied to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of validation and verification. Because of frequently emergent patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, an attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  14. A new system model for radar polarimeters

    NASA Technical Reports Server (NTRS)

    Freeman, Anthony

    1991-01-01

    The validity of the 2 x 2 receive R and transmit T model for radar polarimeter systems, first proposed by Zebker et al. (1987), is questioned. The model is found to be invalid for many practical realizations of radar polarimeters, which can lead to significant errors in the calibration of polarimetric radar images. A more general model is put forward, which addresses the system defects which cause the 2 x 2 model to break down. By measuring one simple parameter from a polarimetric active radar calibration (PARC), it is possible to transform the scattering matrix measurements made by a radar polarimeter to a format compatible with a 2 x 2 R and T matrix model. Alternatively, the PARC can be used to verify the validity of the 2 x 2 model for any polarimetric radar system. Recommendations for the use of PARCs in polarimetric calibration and to measure the orientation angle of the horizontal (H) and vertical (V) coordinate system are also presented.

  15. A new system model for radar polarimeters

    NASA Astrophysics Data System (ADS)

    Freeman, Anthony

    1991-09-01

    The validity of the 2 x 2 receive R and transmit T model for radar polarimeter systems, first proposed by Zebker et al. (1987), is questioned. The model is found to be invalid for many practical realizations of radar polarimeters, which can lead to significant errors in the calibration of polarimetric radar images. A more general model is put forward, which addresses the system defects which cause the 2 x 2 model to break down. By measuring one simple parameter from a polarimetric active radar calibration (PARC), it is possible to transform the scattering matrix measurements made by a radar polarimeter to a format compatible with a 2 x 2 R and T matrix model. Alternatively, the PARC can be used to verify the validity of the 2 x 2 model for any polarimetric radar system. Recommendations for the use of PARCs in polarimetric calibration and to measure the orientation angle of the horizontal (H) and vertical (V) coordinate system are also presented.

  16. Real-time remote scientific model validation

    NASA Technical Reports Server (NTRS)

    Frainier, Richard; Groleau, Nicolas

    1994-01-01

    This paper describes flight results from the use of a CLIPS-based validation facility to compare analyzed data from a space life sciences (SLS) experiment to an investigator's preflight model. The comparison, performed in real time, either confirms or refutes the model and its predictions. This result then becomes the basis for continuing or modifying the investigator's experiment protocol. Typically, neither the astronaut crew in Spacelab nor the ground-based investigator team is able to react to experiment data in real time. This facility, part of a larger science advisor system called Principal Investigator in a Box, was flown on the space shuttle in October 1993. The software system aided the conduct of a human vestibular physiology experiment and was able to outperform humans in the tasks of data integrity assurance, data analysis, and scientific model validation. Of twelve preflight hypotheses associated with the investigator's model, seven were confirmed and five were rejected or compromised.

  17. Monte Carlo simulation of Ray-Scan 64 PET system and performance evaluation using GATE toolkit

    NASA Astrophysics Data System (ADS)

    Li, Suying; Zhang, Qiushi; Vuletic, Ivan; Xie, Zhaoheng; Yang, Kun; Ren, Qiushi

    2017-02-01

    In this study, we aimed to develop a GATE model for the simulation of the Ray-Scan 64 PET scanner and to model its performance characteristics. A detailed implementation of the system geometry and physical processes was included in the simulation model. We then modeled the performance characteristics of the Ray-Scan 64 PET system for the first time, based on National Electrical Manufacturers Association (NEMA) NU-2 2007 protocols, and validated the model against experimental measurements, including spatial resolution, sensitivity, counting rates and noise equivalent count rate (NECR). Moreover, an accurate dead time module was investigated to simulate the counting rate performance. Overall, the results showed reasonable agreement between simulated and experimental data. The validation results show the reliability and feasibility of the GATE model for evaluating the major performance characteristics of the Ray-Scan 64 PET system, providing a useful tool for a wide range of research applications.

  18. Jason Jonkman | NREL

    Science.gov Websites

    Jason Jonkman's work at NREL centers on developing, verifying, and validating simulation models for land-based and offshore wind turbines, and he guides projects aimed at verifying, validating, and applying these models. He is the principal investigator for a DOE-funded project to improve the modeling of offshore floating wind system dynamics.

  19. A validation procedure for a LADAR system radiometric simulation model

    NASA Astrophysics Data System (ADS)

    Leishman, Brad; Budge, Scott; Pack, Robert

    2007-04-01

    The USU LadarSIM software package is a ladar system engineering tool that has recently been enhanced to include modeling of the radiometry of ladar beam footprints. This paper discusses our validation of the radiometric model and presents a practical approach to future validation work. In order to validate complicated and interrelated factors affecting radiometry, a systematic approach had to be developed. Data for known parameters were gathered first; unknown parameters of the system were then determined from simulation test scenarios. This was done in a way that isolated as many unknown variables as possible, then built on the previously obtained results. First, the appropriate voltage threshold levels of the discrimination electronics were set by analyzing the number of false alarms seen in actual data sets. With this threshold set, the system noise was then adjusted to achieve the appropriate number of dropouts. Once a suitable noise level was found, the range errors of the simulated and actual data sets were compared and studied. Predicted errors in range measurements were analyzed using two methods: first by examining the range error of a surface with known reflectivity, and second by examining the range errors for specific detectors with known responsivities. This provided insight into the discrimination method and receiver electronics used in the actual system.
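
    The threshold-setting step described above can be illustrated for Gaussian receiver noise: choose the voltage threshold that yields a desired per-sample false-alarm probability. The parameters below are made up for illustration; this is not LadarSIM's actual electronics model.

    ```python
    from statistics import NormalDist

    def threshold_for_far(noise_mean, noise_sigma, p_false_alarm):
        """Voltage threshold giving the requested per-sample false-alarm
        probability for Gaussian noise: P(V > threshold) = p_false_alarm."""
        return NormalDist(noise_mean, noise_sigma).inv_cdf(1.0 - p_false_alarm)

    print(threshold_for_far(0.0, 0.1, 1e-4))  # ~0.372 V for sigma = 0.1 V
    ```

    In the reverse direction, counting false alarms in real data at a fixed threshold pins down the effective noise statistics, which is essentially the isolation strategy the paper describes.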

  20. Developing and Validating the Socio-Technical Model in Ontology Engineering

    NASA Astrophysics Data System (ADS)

    Silalahi, Mesnan; Indra Sensuse, Dana; Giri Sucahyo, Yudho; Fadhilah Akmaliah, Izzah; Rahayu, Puji; Cahyaningsih, Elin

    2018-03-01

    This paper describes results from an attempt to develop a model in ontology engineering methodology and a way to validate the model. The methodology for ontology engineering is approached from the point of view of socio-technical system theory. Qualitative research synthesis, using meta-ethnography, is employed to build the model. In order to ensure the objectivity of the measurement, an inter-rater reliability method was applied using a multi-rater Fleiss kappa. The results show the accordance of the research output with the diamond model in socio-technical system theory, as evidenced by the interdependency of the four socio-technical variables, namely people, technology, structure and task.
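
    Fleiss' kappa can be computed directly from the subject-by-category table of rating counts. A minimal sketch with toy data (not the study's ratings):

    ```python
    def fleiss_kappa(ratings):
        """Fleiss' kappa for a table ratings[i][j] giving the number of
        raters who assigned subject i to category j (rows sum to r)."""
        n = len(ratings)
        r = sum(ratings[0])
        k = len(ratings[0])
        p_j = [sum(row[j] for row in ratings) / (n * r) for j in range(k)]
        p_i = [(sum(c * c for c in row) - r) / (r * (r - 1)) for row in ratings]
        p_bar = sum(p_i) / n
        p_e = sum(p * p for p in p_j)
        return (p_bar - p_e) / (1 - p_e)

    # Three raters, four subjects, two categories (toy data)
    print(round(fleiss_kappa([[3, 0], [2, 1], [0, 3], [1, 2]]), 3))  # 0.333
    ```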

  1. Control Activity in Support of NASA Turbine Based Combined Cycle (TBCC) Research

    NASA Technical Reports Server (NTRS)

    Stueber, Thomas J.; Vrnak, Daniel R.; Le, Dzu K.; Ouzts, Peter J.

    2010-01-01

    Control research for a Turbine Based Combined Cycle (TBCC) propulsion system is the current focus of the Hypersonic Guidance, Navigation, and Control (GN&C) discipline team. The ongoing work at the NASA Glenn Research Center (GRC) supports the Hypersonic GN&C effort in developing tools to aid the design of control algorithms that manage a TBCC airbreathing propulsion system during a critical operating period. The critical operating period addressed in this paper is the span when the propulsion system transitions from one cycle to another, referred to as mode transition. One such tool, a basic need for control system design activities, is computational models (henceforth referred to as models) of the propulsion system. The models of interest for designing and testing controllers are Control Development Models (CDMs) and Control Validation Models (CVMs). CDMs and CVMs are needed for each of the following propulsion system elements: inlet, turbine engine, ram/scram dual-mode combustor, and nozzle. This paper presents an overall architecture for a TBCC propulsion system model that includes all of the propulsion system elements. Efforts are under way, focusing on one of the propulsion system elements, to develop CDMs and CVMs for a TBCC propulsion system inlet. The TBCC inlet aerodynamic design being modeled is that of the Combined-Cycle Engine (CCE) Testbed, a large-scale model of an aerodynamic design that was verified in a small-scale screening experiment. The modeling approach includes employing existing state-of-the-art simulation codes, developing new dynamic simulations, and performing system identification experiments on the hardware in the NASA GRC 10- by 10-Foot Supersonic Wind Tunnel. The developed CDMs and CVMs will be available for control studies prior to hardware buildup. The system identification experiments on the CCE Testbed will characterize the dynamics that need to be represented in CDMs for control design. These system identification models will also serve as the reference models for validating the CDMs and CVMs. Validated models will give value to the tools used to develop them.

  2. Multi-platform operational validation of the Western Mediterranean SOCIB forecasting system

    NASA Astrophysics Data System (ADS)

    Juza, Mélanie; Mourre, Baptiste; Renault, Lionel; Tintoré, Joaquin

    2014-05-01

    The development of science-based ocean forecasting systems at global, regional, and local scales can support better management of the marine environment (maritime security, environmental and resource protection, maritime and commercial operations, tourism, ...). In this context, SOCIB (the Balearic Islands Coastal Observing and Forecasting System, www.socib.es) has developed an operational ocean forecasting system for the Western Mediterranean Sea (WMOP). WMOP uses a regional configuration of the Regional Ocean Modelling System (ROMS, Shchepetkin and McWilliams, 2005) nested in the larger-scale Mediterranean Forecasting System (MFS), with a spatial resolution of 1.5-2 km. WMOP aims at reproducing both the basin-scale ocean circulation and the mesoscale variability, which is known to play a crucial role in this region due to its strong interaction with the large-scale circulation. An operational validation system has been developed to systematically assess the model outputs at daily, monthly and seasonal time scales. Multi-platform observations are used for this validation, including satellite products (Sea Surface Temperature, Sea Level Anomaly), in situ measurements (from gliders, Argo floats, drifters and fixed moorings) and High-Frequency radar data. The validation procedures allow the general realism of the daily production of the ocean forecasting system to be monitored and certified before its distribution to users. Additionally, different indicators (Sea Surface Temperature and Salinity, Eddy Kinetic Energy, Mixed Layer Depth, Heat Content, transports in key sections) are computed every day, both at basin scale and in several sub-regions (Alboran Sea, Balearic Sea, Gulf of Lion). The daily forecasts, validation diagnostics and indicators from the operational model over the last months are available at www.socib.es.

  3. Antenna gain of actively compensated free-space optical communication systems under strong turbulence conditions.

    PubMed

    Juarez, Juan C; Brown, David M; Young, David W

    2014-05-19

    Current Strehl ratio models for actively compensated free-space optical communications terminals do not accurately predict system performance under strong turbulence conditions, as they are based on weak-turbulence theory. For evaluation of compensated systems, we present an approach for simulating the Strehl ratio with both low-order (tip/tilt) and higher-order (adaptive optics) correction. Our simulation results are then compared to the published models, and their range of turbulence validity is assessed. Finally, we propose a new Strehl ratio model and antenna gain equation that are valid for general turbulence conditions, independent of the degree of compensation.
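
    For reference, weak-turbulence Strehl models of the general kind being tested often combine the extended Maréchal approximation with Noll's residual phase variances after low-order Zernike correction. The sketch below shows that textbook form; it is not necessarily one of the specific published models evaluated in the paper.

    ```python
    import math

    # Noll (1976) residual variance coefficients, in units of (D/r0)^(5/3):
    # piston removed only, and piston + tip/tilt removed
    NOLL_RESIDUAL = {"none": 1.0299, "tip_tilt": 0.134}

    def strehl_weak_turbulence(d_over_r0, correction="none"):
        """Extended Marechal approximation S = exp(-sigma_phi^2); only
        meaningful for weak turbulence (small residual phase variance)."""
        sigma2 = NOLL_RESIDUAL[correction] * d_over_r0 ** (5.0 / 3.0)
        return math.exp(-sigma2)

    print(strehl_weak_turbulence(1.0, "tip_tilt"))  # ~0.87
    ```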

  4. Rethinking modeling framework design: object modeling system 3.0

    USDA-ARS?s Scientific Manuscript database

    The Object Modeling System (OMS) is a framework for environmental model development, data provisioning, testing, validation, and deployment. It provides a bridge for transferring technology from the research organization to the program delivery agency. The framework provides a consistent and efficie...

  5. Modeling and experimental validation of a Hybridized Energy Storage System for automotive applications

    NASA Astrophysics Data System (ADS)

    Fiorenti, Simone; Guanetti, Jacopo; Guezennec, Yann; Onori, Simona

    2013-11-01

    This paper presents the development and experimental validation of a dynamic model of a Hybridized Energy Storage System (HESS), consisting of a parallel connection of a lead-acid (PbA) battery and double layer capacitors (DLCs), for automotive applications. The dynamic modeling of both the PbA battery and the DLC is tackled via an equivalent-electric-circuit-based approach. Experimental tests were designed for identification purposes. Parameters of the PbA battery model are identified as functions of state of charge and current direction, whereas parameters of the DLC model are identified for different temperatures. A physical HESS was assembled at the Center for Automotive Research at The Ohio State University and used as a test bench to validate the model against a typical current profile generated for Start&Stop applications. The HESS model is then integrated into a vehicle simulator to assess the effects of battery hybridization on vehicle fuel economy and mitigation of battery stress.
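
    A minimal flavor of the equivalent-circuit approach: a battery branch (open-circuit voltage plus series resistance) in parallel with a capacitor branch (capacitance plus ESR) feeding a shared load. The parameter values below are illustrative, and the paper's model additionally makes them depend on state of charge, current direction, and temperature.

    ```python
    def simulate_hess(i_load, dt, ocv=12.6, r_b=0.05, c_dlc=100.0, r_c=0.01):
        """Forward-Euler simulation of a PbA battery (OCV + series R) in
        parallel with a double layer capacitor (C + ESR), both feeding a
        common load current profile i_load [A]."""
        v_c = ocv                                # capacitor starts at OCV
        out = []
        for i in i_load:
            # Node equation: (ocv - v)/r_b + (v_c - v)/r_c = i
            v = (ocv / r_b + v_c / r_c - i) / (1.0 / r_b + 1.0 / r_c)
            i_c = (v_c - v) / r_c                # current delivered by the DLC
            v_c -= i_c / c_dlc * dt              # capacitor voltage update
            out.append((v, (ocv - v) / r_b, i_c))
        return out                               # (terminal V, batt A, DLC A)

    # 200 A, 0.5 s cranking pulse at 1 ms steps (Start&Stop-like event)
    print(simulate_hess([200.0] * 500, dt=1e-3)[0])
    ```

    In this toy run the DLC initially supplies about 167 A of the 200 A pulse versus roughly 33 A from the battery, which is exactly the battery stress mitigation that hybridization targets.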

  6. A predictive model for recurrence in patients with glottic cancer implemented in a mobile application for Android.

    PubMed

    Jover-Esplá, Ana Gabriela; Palazón-Bru, Antonio; Folgado-de la Rosa, David Manuel; Severá-Ferrándiz, Guillermo; Sancho-Mestre, Manuela; de Juan-Herrero, Joaquín; Gil-Guillén, Vicente Francisco

    2018-05-01

    The existing predictive models of laryngeal cancer recurrence present limitations for clinical practice. Therefore, we constructed, internally validated and implemented in a mobile application (Android) a new model based on a points system, taking into account the internationally recommended statistical methodology. This longitudinal prospective study included 189 patients with glottic cancer in 2004-2016 in a Spanish region. The main variable was time-to-recurrence, and its potential predictors were: age, gender, TNM classification, stage, smoking, alcohol consumption, and histology. A points system was developed to predict five-year risk of recurrence based on a Cox model. This was validated internally by bootstrapping, assessing discrimination (C-statistic) and calibration (smooth curves). A total of 77 patients presented recurrence (40.7%) over a mean follow-up period of 3.4 ± 3.0 years. The factors in the model were: age, lymph node stage, alcohol consumption and stage. Discrimination and calibration were satisfactory. A points system was developed to obtain the probability of recurrence of laryngeal glottic cancer within five years, using five clinical variables. The system should be validated externally in other geographical areas.
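
    Points systems of this general type are often derived from the regression coefficients by expressing each predictor's contribution in multiples of a base risk unit and rounding (the Sullivan et al. approach). A hedged sketch with invented coefficients, not the published model:

    ```python
    def to_points(betas, reference, values, base=None):
        """Convert regression coefficients into an integer points system.

        betas     : {predictor: coefficient}
        reference : {predictor: reference value scoring 0 points}
        values    : {predictor: this patient's value}
        base      : risk per point (defaults to smallest non-zero contribution)
        """
        contrib = {k: betas[k] * (values[k] - reference[k]) for k in betas}
        if base is None:
            base = min(abs(v) for v in contrib.values() if v != 0)
        return {k: round(v / base) for k, v in contrib.items()}

    # Invented coefficients for illustration only
    betas = {"age_per_10y": 0.25, "node_stage": 0.80, "alcohol": 0.40}
    print(to_points(betas,
                    reference={"age_per_10y": 0, "node_stage": 0, "alcohol": 0},
                    values={"age_per_10y": 2, "node_stage": 1, "alcohol": 1}))
    # -> {'age_per_10y': 1, 'node_stage': 2, 'alcohol': 1}
    ```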

  7. An Investigation to Advance the Technology Readiness Level of the Centaur Derived On-orbit Propellant Storage and Transfer System

    NASA Astrophysics Data System (ADS)

    Silvernail, Nathan L.

    This research was carried out in collaboration with the United Launch Alliance (ULA) to advance an innovative Centaur-based on-orbit propellant storage and transfer system that takes advantage of rotational settling to simplify Fluid Management (FM), specifically enabling settled fluid transfer between two tanks and settled pressure control. This research consists of two specific objectives: (1) technique and process validation and (2) computational model development. In order to raise the Technology Readiness Level (TRL) of this technology, the corresponding FM techniques and processes must be validated in a series of experimental tests, including laboratory/ground testing, microgravity flight testing, suborbital flight testing, and orbital testing. Researchers from Embry-Riddle Aeronautical University (ERAU) have joined with the Massachusetts Institute of Technology (MIT) Synchronized Position Hold Engage and Reorient Experimental Satellites (SPHERES) team to develop a prototype FM system for operations aboard the International Space Station (ISS). Testing of the integrated system in a representative environment will raise the FM system to TRL 6. The tests will demonstrate the FM system and provide unique data on the vehicle's rotational dynamics while undergoing fluid transfer operations. These data sets provide insight into the behavior and physical tendencies of the on-orbit refueling system. Furthermore, they provide a baseline for comparison against the data produced by various computational models, thus verifying the accuracy of the models' output and validating the modeling approach. Once these preliminary models have been validated, the parameters defined by them will provide the basis for developing accurate simulations of full-scale, on-orbit systems. The completion of this project and the models being developed will accelerate the commercialization of on-orbit propellant storage and transfer technologies, as well as all in-space technologies that use or will use similar FM techniques and processes.

  8. Establishment and validation of the scoring system for preoperative prediction of central lymph node metastasis in papillary thyroid carcinoma.

    PubMed

    Liu, Wen; Cheng, Ruochuan; Ma, Yunhai; Wang, Dan; Su, Yanjun; Diao, Chang; Zhang, Jianming; Qian, Jun; Liu, Jin

    2018-05-03

    Early preoperative diagnosis of central lymph node metastasis (CNM) is crucial to improving survival rates among patients with papillary thyroid carcinoma (PTC). Here, we analyzed clinical data from 2862 PTC patients and developed a scoring system using multivariable logistic regression, which was then tested on a validation group. The predictive diagnostic effectiveness of the scoring system was evaluated based on consistency, discrimination ability, and accuracy. The scoring system considered seven variables: gender, age, tumor size, microcalcification, resistance index >0.7, multiple nodular lesions, and extrathyroid extension. The area under the receiver operating characteristic curve (AUC) was 0.742, indicating good discrimination. Using 5 points as a diagnostic threshold, the validation results for the validation group had an AUC of 0.758, indicating good discrimination and consistency in the scoring system. The sensitivity of this predictive model for preoperative diagnosis of CNM was four times higher than that of a direct ultrasound diagnosis. These data indicate that the CNM prediction model would improve preoperative diagnostic sensitivity for CNM in patients with papillary thyroid carcinoma.

  9. Validation of the DeLone and McLean Information Systems Success Model.

    PubMed

    Ojo, Adebowale I

    2017-01-01

    This study is an adaptation of the widely used DeLone and McLean information system success model in the context of hospital information systems in a developing country. A survey research design was adopted in the study. A structured questionnaire was used to collect data from 442 health information management personnel in five Nigerian teaching hospitals. A structural equation modeling technique was used to validate the model's constructs. It was revealed that system quality significantly influenced use (β = 0.53, p < 0.001) and user satisfaction (β = 0.17, p < 0.001). Information quality significantly influenced use (β = 0.24, p < 0.001) and user satisfaction (β = 0.17, p < 0.001). Also, service quality significantly influenced use (β = 0.22, p < 0.001) and user satisfaction (β = 0.51, p < 0.001). However, use did not significantly influence user satisfaction (β = 0.00, p > 0.05), but it significantly influenced perceived net benefits (β = 0.21, p < 0.001). Furthermore, user satisfaction did not significantly influence perceived net benefits (β = 0.00, p > 0.05). The study validates the DeLone and McLean information system success model in the context of a hospital information system in a developing country. Importantly, system quality and use were found to be important measures of hospital information system success. It is, therefore, imperative that hospital information systems are designed in such ways that are easy to use, flexible, and functional to serve their purpose.

  10. Validation of the DeLone and McLean Information Systems Success Model

    PubMed Central

    2017-01-01

    Objectives This study is an adaptation of the widely used DeLone and McLean information system success model in the context of hospital information systems in a developing country. Methods A survey research design was adopted in the study. A structured questionnaire was used to collect data from 442 health information management personnel in five Nigerian teaching hospitals. A structural equation modeling technique was used to validate the model's constructs. Results It was revealed that system quality significantly influenced use (β = 0.53, p < 0.001) and user satisfaction (β = 0.17, p < 0.001). Information quality significantly influenced use (β = 0.24, p < 0.001) and user satisfaction (β = 0.17, p < 0.001). Also, service quality significantly influenced use (β = 0.22, p < 0.001) and user satisfaction (β = 0.51, p < 0.001). However, use did not significantly influence user satisfaction (β = 0.00, p > 0.05), but it significantly influenced perceived net benefits (β = 0.21, p < 0.001). Furthermore, user satisfaction did not significantly influence perceived net benefits (β = 0.00, p > 0.05). Conclusions The study validates the DeLone and McLean information system success model in the context of a hospital information system in a developing country. Importantly, system quality and use were found to be important measures of hospital information system success. It is, therefore, imperative that hospital information systems are designed in such ways that are easy to use, flexible, and functional to serve their purpose. PMID:28261532

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Epiney, A.; Canepa, S.; Zerkak, O.

    The STARS project at the Paul Scherrer Institut (PSI) has adopted the TRACE thermal-hydraulic (T-H) code for best-estimate system transient simulations of the Swiss Light Water Reactors (LWRs). For analyses involving interactions between system and core, a coupling of TRACE with the SIMULATE-3K (S3K) LWR core simulator has also been developed. In this configuration, the TRACE code and associated nuclear power reactor simulation models play a central role in achieving a comprehensive safety analysis capability. Thus, efforts have now been undertaken to consolidate the validation strategy by implementing a more rigorous and structured assessment approach for TRACE applications involving either only system T-H evaluations or interfaces to e.g. detailed core or fuel behavior models. The first part of this paper presents the preliminary concepts of this validation strategy. The principle is to systematically track the evolution of a given set of predicted physical Quantities of Interest (QoIs) over a multidimensional parametric space where each of the dimensions represents the evolution of specific analysis aspects, including e.g. code version, transient-specific simulation methodology and model "nodalisation". If properly set up, such an environment should provide code developers and code users with persistent (less affected by user effect) and quantified information (sensitivity of QoIs) on the applicability of a simulation scheme (codes, input models, methodology) for steady-state and transient analysis of full LWR systems. Through this, for each given transient/accident, critical paths of the validation process can be identified that could then translate into defining reference schemes to be applied for downstream predictive simulations. In order to illustrate this approach, the second part of this paper presents a first application of this validation strategy to an inadvertent blowdown event that occurred in a Swiss BWR/6. The transient was initiated by the spurious actuation of the Automatic Depressurization System (ADS). The validation approach progresses through a number of dimensions: first, the same BWR system simulation model is assessed for different versions of the TRACE code, up to the most recent one. The second dimension is the "nodalisation" dimension, where changes to the input model are assessed. The third dimension is the "methodology" dimension; in this case, imposed power and an updated TRACE core model are investigated. For each step in each validation dimension, a common set of QoIs is investigated. For the steady-state results, these include fuel temperature distributions. For the transient part of the present study, the evaluated QoIs include the system pressure evolution and water carry-over into the steam line.

  12. Excavator Design Validation

    NASA Technical Reports Server (NTRS)

    Pholsiri, Chalongrath; English, James; Seberino, Charles; Lim, Yi-Je

    2010-01-01

    The Excavator Design Validation tool verifies excavator designs by automatically generating control systems and modeling their performance in an accurate simulation of their expected environment. Part of this software design includes interfacing with human operations that can be included in simulation-based studies and validation. This is essential for assessing productivity, versatility, and reliability. This software combines automatic control system generation from CAD (computer-aided design) models, rapid validation of complex mechanism designs, and detailed models of the environment including soil, dust, temperature, remote supervision, and communication latency to create a system of high value. Unique algorithms have been created for controlling and simulating complex robotic mechanisms automatically from just a CAD description. These algorithms are implemented as a commercial cross-platform C++ software toolkit that is configurable using the Extensible Markup Language (XML). The algorithms work with virtually any mobile robotic mechanisms using module descriptions that adhere to the XML standard. In addition, high-fidelity, real-time physics-based simulation algorithms have also been developed that include models of internal forces and the forces produced when a mechanism interacts with the outside world. This capability is combined with an innovative organization for simulation algorithms, new regolith simulation methods, and a unique control and study architecture to make powerful tools with the potential to transform the way NASA verifies and compares excavator designs. Energid's Actin software has been leveraged for this design validation. The architecture includes parametric and Monte Carlo studies tailored for validation of excavator designs and their control by remote human operators. It also includes the ability to interface with third-party software and human-input devices. Two types of simulation models have been adapted: high-fidelity discrete element models and fast analytical models. By using the first to establish parameters for the second, a system has been created that can be executed in real time, or faster than real time, on a desktop PC. This allows Monte Carlo simulations to be performed on a computer platform available to all researchers, and it allows human interaction to be included in a real-time simulation process. Metrics on excavator performance are established that work with the simulation architecture. Both static and dynamic metrics are included.

  13. Verification and Validation of COAMPS: Results from a Fully-Coupled Air/Sea/Wave Modeling System

    NASA Astrophysics Data System (ADS)

    Smith, T.; Allard, R. A.; Campbell, T. J.; Chu, Y. P.; Dykes, J.; Zamudio, L.; Chen, S.; Gabersek, S.

    2016-02-01

    The Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS) is a state-of-the-art, fully-coupled air/sea/wave modeling system that is currently being validated for operational transition to both the Naval Oceanographic Office (NAVO) and the Fleet Numerical Meteorology and Oceanography Center (FNMOC). COAMPS is run at the Department of Defense Supercomputing Resource Center (DSRC) operated by the DoD High Performance Computing Modernization Program (HPCMP). A total of four models, including the Naval Coastal Ocean Model (NCOM), Simulating Waves Nearshore (SWAN), WaveWatch III, and the COAMPS atmospheric model, are coupled through the Earth System Modeling Framework (ESMF). Results from regions of naval operational interest, including the Western Atlantic (U.S. East Coast), RIMPAC (Hawaii), and DYNAMO (Indian Ocean), will show the advantages of utilizing a coupled modeling system versus an uncoupled or stand-alone model. Statistical analyses, which include model/observation comparisons, will be presented in the form of operationally approved scorecards for both the atmospheric and oceanic output. Also, computational logistics involving the HPC resources for the COAMPS simulations will be shown.

  14. Evaluation of the DAVROS (Development And Validation of Risk-adjusted Outcomes for Systems of emergency care) risk-adjustment model as a quality indicator for healthcare

    PubMed Central

    Wilson, Richard; Goodacre, Steve W; Klingbajl, Marcin; Kelly, Anne-Maree; Rainer, Tim; Coats, Tim; Holloway, Vikki; Townend, Will; Crane, Steve

    2014-01-01

    Background and objective Risk-adjusted mortality rates can be used as a quality indicator if it is assumed that the discrepancy between predicted and actual mortality can be attributed to the quality of healthcare (ie, the model has attributional validity). The Development And Validation of Risk-adjusted Outcomes for Systems of emergency care (DAVROS) model predicts 7-day mortality in emergency medical admissions. We aimed to test this assumption by evaluating the attributional validity of the DAVROS risk-adjustment model. Methods We selected cases that had the greatest discrepancy between observed mortality and predicted probability of mortality from seven hospitals involved in validation of the DAVROS risk-adjustment model. Reviewers at each hospital assessed hospital records to determine whether the discrepancy between predicted and actual mortality could be explained by the healthcare provided. Results We received 232/280 (83%) completed review forms relating to 179 unexpected deaths and 53 unexpected survivors. The healthcare system was judged to have potentially contributed to 10/179 (8%) of the unexpected deaths and 26/53 (49%) of the unexpected survivors. Failure of the model to appropriately predict risk was judged to be responsible for 135/179 (75%) of the unexpected deaths and 2/53 (4%) of the unexpected survivors. Some 10/53 (19%) of the unexpected survivors died within a few months of the 7-day period of model prediction. Conclusions We found little evidence that deaths occurring in patients with a low predicted mortality from risk-adjustment could be attributed to the quality of healthcare provided. PMID:23605036

  15. Developing and validating a novel metabolic tumor volume risk stratification system for supplementing non-small cell lung cancer staging.

    PubMed

    Pu, Yonglin; Zhang, James X; Liu, Haiyan; Appelbaum, Daniel; Meng, Jianfeng; Penney, Bill C

    2018-06-07

    We hypothesized that whole-body metabolic tumor volume (MTVwb) could be used to supplement non-small cell lung cancer (NSCLC) staging due to its independent prognostic value. The goal of this study was to develop and validate a novel MTVwb risk stratification system to supplement NSCLC staging. We performed an IRB-approved retrospective review of 935 patients with NSCLC and FDG-avid tumors, divided into modeling and validation cohorts based on the type of PET/CT scanner used for imaging. In addition, a sensitivity analysis was conducted by dividing the patient population into two randomized cohorts. Cox regression and Kaplan-Meier survival analyses were performed to determine the prognostic value of the MTVwb risk stratification system. The cut-off values (10.0, 53.4 and 155.0 mL) between the MTVwb quartiles of the modeling cohort were applied to both the modeling and validation cohorts to determine each patient's MTVwb risk stratum. The survival analyses showed that a lower MTVwb risk stratum was associated with better overall survival (all p < 0.01), independent of TNM stage and other clinical prognostic factors, and the discriminatory power of the MTVwb risk stratification system, as measured by Gönen and Heller's concordance index, was not significantly different from that of TNM stage in either cohort. The prognostic value of the MTVwb risk stratum was also robust in the two randomized cohorts. The discordance rate between the MTVwb risk stratum and TNM stage or substage was 45.1% in the modeling cohort and 50.3% in the validation cohort. This study developed and validated a novel MTVwb risk stratification system with prognostic value independent of TNM stage and other clinical prognostic factors in NSCLC, suggesting that it could be used for pretreatment assessment and for refining treatment decisions in individual patients.

  16. Certification in Structural Health Monitoring Systems

    DTIC Science & Technology

    2011-09-01

    validation [3,8]. This may be accomplished by computing the sum of squares of pure error (SSPE) and its associated squared correlation [3,8]. To compute ... these values, a cross-validation sample must be established. In general, if the SSPE is high, the model does not predict well on independent data ... plethora of cross-validation methods, some of which are more useful for certain models than others [3,8]. When possible, a disclosure of the SSPE

  17. Design and validation of a questionnaire to evaluate the usability of computerized critical care information systems.

    PubMed

    von Dincklage, Falk; Lichtner, Gregor; Suchodolski, Klaudiusz; Ragaller, Maximilian; Friesdorf, Wolfgang; Podtschaske, Beatrice

    2017-08-01

    The implementation of computerized critical care information systems (CCIS) can improve the quality of clinical care and staff satisfaction, but it also carries the risk of disrupting workflow, with consequent negative impacts. The usability of CCIS is one of the key factors determining their benefits and weaknesses. However, no tailored instrument exists to measure the usability of such systems. Therefore, the aim of this study was to design and validate a questionnaire that measures the usability of CCIS. Following a mixed-method design approach, we developed a questionnaire comprising two evaluation models to assess the usability of CCIS: (1) the task-specific model rates the usability individually for several tasks which CCIS could support, derived by analyzing work processes in the ICU; (2) the characteristic-specific model rates the different aspects of usability as defined by the international standard "ergonomics of human-system interaction". We tested the validity and reliability of the digital version of the questionnaire in a sample population. In the sample population of 535 participants, both usability evaluation models showed a strong correlation with the overall rating of the system (multiple correlation coefficients ≥0.80) as well as very high internal consistency (Cronbach's alpha ≥0.93). The novel questionnaire is a valid and reliable instrument for measuring the usability of CCIS and can be used to study the influence of usability on implementation benefits and weaknesses.
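
    Internal-consistency figures like the Cronbach's alpha reported above can be computed directly from the respondent-by-item score matrix. A minimal sketch with toy data:

    ```python
    import numpy as np

    def cronbach_alpha(scores):
        """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1)
        total_var = scores.sum(axis=1).var(ddof=1)
        return float(k / (k - 1) * (1 - item_vars.sum() / total_var))

    # Toy data: 4 respondents rating 3 usability items
    print(round(cronbach_alpha([[4, 5, 4], [2, 3, 2], [5, 5, 4], [3, 3, 3]]), 2))
    # -> 0.96
    ```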

  18. Evaluation of load flow and grid expansion in a unit-commitment and expansion optimization model (SciGRID International Conference on Power Grid Modelling)

    NASA Astrophysics Data System (ADS)

    Senkpiel, Charlotte; Biener, Wolfgang; Shammugam, Shivenes; Längle, Sven

    2018-02-01

    Energy system models serve as a basis for long-term system planning. Joint optimization of electricity generating technologies, storage systems and the electricity grid leads to lower total system cost compared to an approach in which grid expansion follows a given technology portfolio and its distribution. Modelers often face the problem of finding a good trade-off between computational time and the level of detail that can be modeled. This paper analyses the differences between a transport model and a DC load flow model to evaluate the validity, in terms of system reliability, of using a simple but faster transport model within the system optimization model. The main findings are that a higher regional resolution leads to better results than an approach in which regions are clustered, as more overloads can be detected. An aggregation of lines between two model regions, compared to a line-sharp representation, has little influence on grid expansion within a system optimizer. In a DC load flow model, overloads can be detected in the line-sharp case, which is therefore preferred. Overall, the regions that need grid reinforcement are identified within the system optimizer. Finally, the paper recommends the use of a load flow model to test the validity of the model results.
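
    The DC load flow referred to above linearizes the AC equations into a single linear solve for bus voltage angles, from which line flows (and hence overloads) follow. A minimal sketch on an illustrative three-bus example:

    ```python
    import numpy as np

    def dc_power_flow(lines, injections, slack=0):
        """Linear DC load flow: solve B * theta = P for bus angles, then
        line flows f_ij = (theta_i - theta_j) / x_ij.

        lines      : list of (from_bus, to_bus, reactance_pu)
        injections : net injection per bus in pu (must sum to zero)
        """
        n = len(injections)
        B = np.zeros((n, n))
        for i, j, x in lines:
            b = 1.0 / x
            B[i, i] += b
            B[j, j] += b
            B[i, j] -= b
            B[j, i] -= b
        keep = [k for k in range(n) if k != slack]   # remove slack bus
        theta = np.zeros(n)
        theta[keep] = np.linalg.solve(B[np.ix_(keep, keep)],
                                      np.asarray(injections, float)[keep])
        return [((i, j), float((theta[i] - theta[j]) / x)) for i, j, x in lines]

    # 1 pu generated at bus 0, consumed at bus 2; equal line reactances
    print(dc_power_flow([(0, 1, 0.1), (1, 2, 0.1), (0, 2, 0.1)],
                        injections=[1.0, 0.0, -1.0]))
    # direct line carries 2/3 pu, the two-hop path 1/3 pu
    ```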

  19. Bond Graph Modeling and Validation of an Energy Regenerative System for Emulsion Pump Tests

    PubMed Central

    Li, Yilei; Zhu, Zhencai; Chen, Guoan

    2014-01-01

    The test system for emulsion pumps faces serious challenges due to its huge energy consumption and waste. To address this energy issue, a novel energy regenerative system (ERS) for emulsion pump tests is briefly introduced. Modeling such an ERS, which spans multiple energy domains, needs a unified and systematic approach, and bond graph modeling is well suited for this task. The bond graph model of the ERS is developed by first considering the separate components before assembling them together, and the state-space equation is derived in the same manner. Both numerical simulation and experiments are carried out to validate the bond graph model. Moreover, the simulation and experimental results show that this ERS not only satisfies the test requirements, but could also save at least 25% of energy consumption compared to the original test system, demonstrating that it is a promising method of energy regeneration for emulsion pump tests. PMID:24967428

  20. Coupling a regional warning system to a semantic engine on online news for enhancing landslide prediction

    NASA Astrophysics Data System (ADS)

    Battistini, Alessandro; Rosi, Ascanio; Segoni, Samuele; Catani, Filippo; Casagli, Nicola

    2017-04-01

    Landslide inventories are basic data for large-scale landslide modelling; for example, they are needed to calibrate and validate rainfall thresholds, physically based models and early warning systems. Setting up landslide inventories with traditional methods (e.g. remote sensing, field surveys and manual retrieval of data from technical reports and local newspapers) is time consuming. The objective of this work is to automatically set up a landslide inventory using a state-of-the-art semantic engine based on data mining of online news (Battistini et al., 2013) and to evaluate whether the automatically generated inventory can be used to validate a regional-scale landslide warning system based on rainfall thresholds. The semantic engine scanned internet news in real time over a 50-month test period. At the end of the process, an inventory of approximately 900 landslides was set up for the Tuscany region (23,000 km2, Italy). The inventory was compared with the outputs of the regional landslide early warning system based on rainfall thresholds, and a good correspondence was found: e.g. 84% of the events reported in the news are correctly identified by the model. In addition, the cases of non-correspondence were forwarded to the rainfall threshold developers, who used these inputs to update some of the thresholds. On the basis of the results obtained, we conclude that automatic validation of landslide models using geolocalized landslide event feedback is possible. The source of data for validation can be obtained directly from the internet using an appropriate semantic engine. We also automated the validation procedure, which is based on a comparison between forecasts and reported events. We verified that our approach can be used for near-real-time validation of the warning system and for a semi-automatic update of the rainfall thresholds, which could lead to an improvement of the forecasting effectiveness of the warning system. In the near future, the proposed procedure could operate continuously and could allow for a periodic update of landslide hazard models and landslide early warning systems with minimum human intervention. References: Battistini, A., Segoni, S., Manzo, G., Catani, F., Casagli, N. (2013). Web data mining for automatic inventory of geohazards at national scale. Applied Geography, 43, 147-158.
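
    The automatic validation step amounts to matching geolocated news events against issued warnings and computing a hit rate. A minimal sketch (dates and alert-zone names are purely illustrative):

    ```python
    def warning_hit_rate(warnings, events):
        """Fraction of reported landslide events covered by a warning.

        warnings : set of (date, alert_zone) pairs issued by the system
        events   : list of (date, alert_zone) pairs from the news inventory
        """
        hits = sum(1 for ev in events if ev in warnings)
        return hits / len(events)

    issued = {("2014-01-18", "zone_A"), ("2014-02-10", "zone_B")}
    reported = [("2014-01-18", "zone_A"), ("2014-02-11", "zone_B")]
    print(warning_hit_rate(issued, reported))  # 0.5
    ```

    The misses (reported events with no matching warning) are exactly the cases that would be forwarded to the threshold developers for review.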

  1. The Integrated Airframe/Propulsion Control System Architecture program (IAPSA)

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.; Cohen, Gerald C.; Meissner, Charles W.

    1990-01-01

    The Integrated Airframe/Propulsion Control System Architecture program (IAPSA) is a two-phase program which was initiated by NASA in the early 1980s. The first phase, IAPSA 1, studied different architectural approaches to the problem of integrating engine control systems with airframe control systems in an advanced tactical fighter. One of the conclusions of IAPSA 1 was that the technology to construct a suitable system was available, yet the ability to create these complex computer architectures had outpaced the ability to analyze the resulting system's performance. With this in mind, the second phase of IAPSA approached the same problem with the added constraint that the system be designed for validation. The intent of the design-for-validation requirement is that validation requirements should be shown to be achievable early in the design process. IAPSA 2 demonstrated that, despite diligent efforts, integrated systems can retain characteristics which are difficult to model and, therefore, difficult to validate.

  2. Development of Flight-Test Performance Estimation Techniques for Small Unmanned Aerial Systems

    NASA Astrophysics Data System (ADS)

    McCrink, Matthew Henry

    This dissertation provides a flight-testing framework for assessing the performance of fixed-wing, small-scale unmanned aerial systems (sUAS) by leveraging sub-system models of components unique to these vehicles. The development of the sub-system models, and their links to broader impacts on sUAS performance, is the key contribution of this work. The sub-system modeling and analysis focuses on the vehicle's propulsion, navigation and guidance, and airframe components. Quantification of the uncertainty in the vehicle's power available and control states is essential for assessing the validity of both the methods and results obtained from flight-tests. Therefore, detailed propulsion and navigation system analyses are presented to validate the flight testing methodology. Propulsion system analysis required the development of an analytic model of the propeller in order to predict the power available over a range of flight conditions. The model is based on the blade element momentum (BEM) method. Additional corrections are added to the basic model in order to capture the Reynolds-dependent scale effects unique to sUAS. The model was experimentally validated using a ground based testing apparatus. The BEM predictions and experimental analysis allow for a parameterized model relating the electrical power, measurable during flight, to the power available required for vehicle performance analysis. Navigation system details are presented with a specific focus on the sensors used for state estimation, and the resulting uncertainty in vehicle state. Uncertainty quantification is provided by detailed calibration techniques validated using quasi-static and hardware-in-the-loop (HIL) ground based testing. The HIL methods introduced use a soft real-time flight simulator to provide inertial quality data for assessing overall system performance. Using this tool, the uncertainty in vehicle state estimation based on a range of sensors, and vehicle operational environments is presented. The propulsion and navigation system models are used to evaluate flight-testing methods for evaluating fixed-wing sUAS performance. A brief airframe analysis is presented to provide a foundation for assessing the efficacy of the flight-test methods. The flight-testing presented in this work is focused on validating the aircraft drag polar, zero-lift drag coefficient, and span efficiency factor. Three methods are detailed and evaluated for estimating these design parameters. Specific focus is placed on the influence of propulsion and navigation system uncertainty on the resulting performance data. Performance estimates are used in conjunction with the propulsion model to estimate the impact sensor and measurement uncertainty on the endurance and range of a fixed-wing sUAS. Endurance and range results for a simplistic power available model are compared to the Reynolds-dependent model presented in this work. Additional parameter sensitivity analysis related to state estimation uncertainties encountered in flight-testing are presented. Results from these analyses indicate that the sub-system models introduced in this work are of first-order importance, on the order of 5-10% change in range and endurance, in assessing the performance of a fixed-wing sUAS.
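
    As an illustration of the drag-polar validation described above, the parabolic polar CD = CD0 + CL^2/(pi*e*AR) can be fitted to flight-test points by linear least squares, yielding the zero-lift drag coefficient and span efficiency factor. A sketch with made-up data points and an assumed aspect ratio (not the dissertation's measurements):

        import numpy as np

        # Hypothetical flight-test points: lift and drag coefficients from steady glides.
        CL = np.array([0.3, 0.5, 0.7, 0.9, 1.1])
        CD = np.array([0.035, 0.045, 0.060, 0.082, 0.108])

        # Fit CD = CD0 + k*CL^2 by least squares; span efficiency e = 1/(pi*AR*k).
        AR = 9.0  # assumed wing aspect ratio
        Amat = np.vstack([np.ones_like(CL), CL**2]).T
        (CD0, k), *_ = np.linalg.lstsq(Amat, CD, rcond=None)
        e = 1.0 / (np.pi * AR * k)
        print(f"CD0={CD0:.4f}, k={k:.4f}, span efficiency e={e:.2f}")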

  3. Recent developments of DMI's operational system: Coupled Ecosystem-Circulation-and SPM model.

    NASA Astrophysics Data System (ADS)

    Murawski, Jens; Tian, Tian; Dobrynin, Mikhail

    2010-05-01

    ECOOP is a pan-European project with 72 partners from 29 countries around the Baltic Sea, the North Sea, the Iberia-Biscay-Ireland region, the Mediterranean Sea and the Black Sea. The project aims at the development and integration of the different coastal and regional observation and forecasting systems. The Danish Meteorological Institute (DMI) coordinates the project and is responsible for the Baltic Sea regional forecasting system. Over the project period, the Baltic Sea system was developed from a purely hydrodynamic model (version V1), running operationally since summer 2009, into a coupled model platform (version V2) that includes model components for the simulation of suspended particles, data assimilation and ecosystem variables. The ECOOP V2 model is currently being tested and validated, and will soon replace the V1 version. The coupled biogeochemical and circulation model has run operationally since November 2009. The daily forecasts are presented on DMI's homepage http://ocean.dmi.dk. The presentation includes a short description of the ECOOP forecasting system, discusses the model results and shows the outcome of the model validation.

  4. Prediction of thermal behaviors of an air-cooled lithium-ion battery system for hybrid electric vehicles

    NASA Astrophysics Data System (ADS)

    Choi, Yong Seok; Kang, Dal Mo

    2014-12-01

    Thermal management has been one of the major issues in developing a lithium-ion (Li-ion) hybrid electric vehicle (HEV) battery system, since the Li-ion battery is vulnerable to excessive heat load under abnormal or severe operating conditions. In this work, in order to design a suitable thermal management system, a simple modeling methodology describing the thermal behavior of an air-cooled Li-ion battery system was proposed from a vehicle component designer's point of view. The proposed mathematical model was constructed from the battery's electrical and mechanical properties. Validation test results for the Li-ion battery system were also presented. A pulse current duty and an adjusted US06 current cycle for a two-mode HEV system were used to validate the accuracy of the model prediction. Results showed that the present model gives good estimates of convective heat transfer cooling during battery operation. The developed thermal model is useful for structuring the flow system and determining the appropriate cooling capacity for a specified design prerequisite of the battery system.
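
    A lumped thermal model of this kind balances ohmic heat generation against convective cooling, dT/dt = (I^2*R - hA*(T - T_air))/(m*cp). A sketch with illustrative parameters and a hypothetical pulse current duty (none of the values are the paper's):

        import numpy as np
        from scipy.integrate import solve_ivp

        # Illustrative parameters: cell mass, heat capacity, internal resistance,
        # convective conductance (h * area) and cooling-air temperature.
        m, cp = 1.0, 900.0        # kg, J/(kg K)
        R = 0.01                  # ohm
        hA = 2.5                  # W/K
        T_air = 298.0             # K

        def I(t):                 # hypothetical pulse current duty, A
            return 40.0 if (t % 60.0) < 10.0 else 5.0

        def dTdt(t, T):
            q_gen = I(t)**2 * R               # ohmic heat generation
            q_out = hA * (T[0] - T_air)       # convective cooling
            return [(q_gen - q_out) / (m * cp)]

        sol = solve_ivp(dTdt, (0.0, 1800.0), [298.0], max_step=1.0)
        print(f"peak cell temperature: {sol.y[0].max():.1f} K")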

  5. Experimental Validation of a Closed Brayton Cycle System Transient Simulation

    NASA Technical Reports Server (NTRS)

    Johnson, Paul K.; Hervol, David S.

    2006-01-01

    The Brayton Power Conversion Unit (BPCU) is a closed-cycle system with an inert gas working fluid, located in Vacuum Facility 6 at NASA Glenn Research Center. It was used in previous solar dynamic technology efforts (SDGTD) and was modified to its present configuration by replacing the solar receiver with an electrical resistance heater, making it the first closed Brayton cycle to be coupled with an ion propulsion system. It has been used to examine mechanical dynamic characteristics and responses. The focus of this work was the validation of a computer model of the BPCU. The model was built using the Closed Cycle System Simulation (CCSS) design and analysis tool, and test conditions were duplicated in CCSS for various steady-state points and for transients involving changes in shaft rotational speed and heat input. Testing to date has shown that the BPCU is able to generate meaningful, repeatable data that can be used for computer model validation. Results generated by CCSS demonstrated that the model sufficiently reproduced the thermal transients exhibited by the BPCU system. CCSS was also used to match BPCU steady-state operating points: cycle temperatures were within 4.1% of the data (most were within 1%), and cycle pressures were all within 3.2%. Error in alternator power (as much as 13.5%) was attributed to uncertainties in the compressor and turbine maps and in the alternator and bearing loss models. The acquired understanding of the BPCU's behavior gives useful insight for improvements to the CCSS model, as well as ideas for future testing and possible system modifications.

  6. Moving alcohol prevention research forward-Part II: new directions grounded in community-based system dynamics modeling.

    PubMed

    Apostolopoulos, Yorghos; Lemke, Michael K; Barry, Adam E; Lich, Kristen Hassmiller

    2018-02-01

    Given the complexity of factors contributing to alcohol misuse, appropriate epistemologies and methodologies are needed to understand and intervene meaningfully. We aimed to (1) provide an overview of computational modeling methodologies, with an emphasis on system dynamics modeling; (2) explain how community-based system dynamics modeling can forge new directions in alcohol prevention research; and (3) present a primer on how to build alcohol misuse simulation models using system dynamics modeling, with an emphasis on stakeholder involvement, data sources and model validation. Throughout, we use alcohol misuse among college students in the United States as a heuristic example for demonstrating these methodologies. System dynamics modeling employs a top-down aggregate approach to understanding dynamically complex problems. Its three foundational properties (stocks, flows and feedbacks) capture non-linearity, time-delayed effects and other system characteristics. As a methodological choice, system dynamics modeling is amenable to participatory approaches; in particular, community-based system dynamics modeling has been used to build impactful models for addressing dynamically complex problems. The process of community-based system dynamics modeling consists of numerous stages: (1) creating model boundary charts, behavior-over-time graphs and preliminary system dynamics models using group model-building techniques; (2) model formulation; (3) model calibration; (4) model testing and validation; and (5) model simulation using learning-laboratory techniques. Community-based system dynamics modeling can provide powerful tools for policy and intervention decisions that can ultimately result in sustainable changes in research and action in alcohol misuse prevention. © 2017 Society for the Study of Addiction.
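
    A stock-and-flow model of the kind described can be simulated with simple Euler integration; the sketch below uses two hypothetical stocks and a reinforcing social feedback, purely for illustration and not the calibrated model above:

        # Hypothetical two-stock system dynamics sketch (Euler integration).
        dt, t_end = 0.1, 52.0                 # weeks
        at_risk, misusers = 900.0, 100.0      # stocks: students at risk / misusing

        for _ in range(int(t_end / dt)):
            # Inflow to misuse grows with contact between the two groups (feedback).
            initiation = 0.04 * at_risk * misusers / (at_risk + misusers)
            recovery = 0.05 * misusers        # outflow back to the at-risk stock
            at_risk += (recovery - initiation) * dt
            misusers += (initiation - recovery) * dt

        print(f"misusers after one year: {misusers:.0f}")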

  7. Injector Design Tool Improvements: User's manual for FDNS V.4.5

    NASA Technical Reports Server (NTRS)

    Chen, Yen-Sen; Shang, Huan-Min; Wei, Hong; Liu, Jiwen

    1998-01-01

    The major emphasis of the current effort is the development and validation of an efficient parallel-machine computational model, based on the FDNS code, to analyze the fluid dynamics of a wide variety of liquid jet configurations for general liquid rocket engine injection system applications. This model includes physical models for droplet atomization, breakup/coalescence, evaporation, turbulence mixing and gas-phase combustion. Benchmark validation cases for liquid rocket engine chamber combustion conditions will be performed for model validation purposes. Test cases may include shear coaxial, swirl coaxial and impinging injection systems with combinations of LOX/H2 or LOX/RP-1 propellant injector elements used in rocket engine designs. As the final goal of this project, a well-tested parallel CFD methodology, together with a user's operation description, will be documented in a final technical report at the end of the research effort.

  8. Improvement of web-based data acquisition and management system for GOSAT validation lidar data analysis

    NASA Astrophysics Data System (ADS)

    Okumura, Hiroshi; Takubo, Shoichiro; Kawasaki, Takeru; Abdullah, Indra Nugraha; Uchino, Osamu; Morino, Isamu; Yokota, Tatsuya; Nagai, Tomohiro; Sakai, Tetsu; Maki, Takashi; Arai, Kohei

    2013-01-01

    A web-based data acquisition and management system for GOSAT (Greenhouse gases Observing SATellite) validation lidar data analysis has been developed. The system consists of a data acquisition sub-system (DAS) and a data management sub-system (DMS). DAS, written in Perl, acquires AMeDAS (Automated Meteorological Data Acquisition System) ground-level local meteorological data, GPS radiosonde upper-air meteorological data, ground-level oxidant data, skyradiometer data, skyview camera images, meteorological satellite IR image data and GOSAT validation lidar data. DMS, written in PHP, displays satellite-pass dates and all acquired data. In this article, we briefly describe some improvements made for higher performance and higher data usability. DAS now automatically calculates molecular number density profiles from the GPS radiosonde upper-air meteorological data and the U.S. Standard Atmosphere model. Predicted ozone density profile images above Saga city are also calculated using the Meteorological Research Institute (MRI) chemistry-climate model version 2 for comparison with actual ozone DIAL data.

  9. Towards a Consolidated Approach for the Assessment of Evaluation Models of Nuclear Power Reactors

    DOE PAGES

    Epiney, A.; Canepa, S.; Zerkak, O.; ...

    2016-11-02

    The STARS project at the Paul Scherrer Institut (PSI) has adopted the TRACE thermal-hydraulic (T-H) code for best-estimate system transient simulations of the Swiss Light Water Reactors (LWRs). For analyses involving interactions between system and core, a coupling of TRACE with the SIMULATE-3K (S3K) LWR core simulator has also been developed. In this configuration, the TRACE code and associated nuclear power reactor simulation models play a central role to achieve a comprehensive safety analysis capability. Thus, efforts have now been undertaken to consolidate the validation strategy by implementing a more rigorous and structured assessment approach for TRACE applications involving either only system T-H evaluations or requiring interfaces to e.g. detailed core or fuel behavior models. The first part of this paper presents the preliminary concepts of this validation strategy. The principle is to systematically track the evolution of a given set of predicted physical Quantities of Interest (QoIs) over a multidimensional parametric space where each of the dimensions represent the evolution of specific analysis aspects, including e.g. code version, transient specific simulation methodology and model "nodalisation". If properly set up, such environment should provide code developers and code users with persistent (less affected by user effect) and quantified information (sensitivity of QoIs) on the applicability of a simulation scheme (codes, input models, methodology) for steady state and transient analysis of full LWR systems. Through this, for each given transient/accident, critical paths of the validation process can be identified that could then translate into defining reference schemes to be applied for downstream predictive simulations. In order to illustrate this approach, the second part of this paper presents a first application of this validation strategy to an inadvertent blowdown event that occurred in a Swiss BWR/6. The transient was initiated by the spurious actuation of the Automatic Depressurization System (ADS). The validation approach progresses through a number of dimensions here: First, the same BWR system simulation model is assessed for different versions of the TRACE code, up to the most recent one. The second dimension is the "nodalisation" dimension, where changes to the input model are assessed. The third dimension is the "methodology" dimension. In this case imposed power and an updated TRACE core model are investigated. For each step in each validation dimension, a common set of QoIs are investigated. For the steady-state results, these include fuel temperatures distributions. For the transient part of the present study, the evaluated QoIs include the system pressure evolution and water carry-over into the steam line.

  10. Ensemble assimilation of ARGO temperature profile, sea surface temperature, and altimetric satellite data into an eddy permitting primitive equation model of the North Atlantic Ocean

    NASA Astrophysics Data System (ADS)

    Yan, Y.; Barth, A.; Beckers, J. M.; Candille, G.; Brankart, J. M.; Brasseur, P.

    2015-07-01

    Sea surface height, sea surface temperature, and temperature profiles at depth collected between January and December 2005 are assimilated into a realistic eddy-permitting primitive equation model of the North Atlantic Ocean using the Ensemble Kalman Filter. Sixty ensemble members are generated by adding realistic noise to the forcing parameters related to the temperature. Before the assimilation experiments, the ensemble is diagnosed and validated by comparison between the ensemble spread and the model/observation difference, as well as by rank histograms. An incremental analysis update scheme is applied in order to reduce spurious oscillations due to the model state correction. The results of the assimilation are assessed according to both deterministic and probabilistic metrics with independent/semi-independent observations. For deterministic validation, the ensemble means, together with the ensemble spreads, are compared to the observations in order to diagnose the ensemble distribution properties in a deterministic way. For probabilistic validation, the continuous ranked probability score (CRPS) is used to evaluate the ensemble forecast system according to reliability and resolution. The reliability is further decomposed into bias and dispersion by the reduced centered random variable (RCRV) score in order to investigate the reliability properties of the ensemble forecast system. The improvement from the assimilation is demonstrated using these validation metrics. Finally, the deterministic validation and the probabilistic validation are analyzed jointly, and the consistency and complementarity between both validations are highlighted.
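
    The CRPS for an ensemble forecast can be estimated directly from the members as E|X - y| - (1/2)E|X - X'|. A sketch with a synthetic 60-member ensemble, mirroring the ensemble size above but with invented values:

        import numpy as np

        def crps_ensemble(members, obs):
            """Sample-based CRPS: E|X - y| - 0.5 E|X - X'| for ensemble X, obs y."""
            members = np.asarray(members, dtype=float)
            term1 = np.abs(members - obs).mean()
            term2 = 0.5 * np.abs(members[:, None] - members[None, :]).mean()
            return term1 - term2

        # Hypothetical 60-member SST forecast at one grid point, verified at 19.4 degC.
        rng = np.random.default_rng(0)
        ensemble = rng.normal(19.0, 0.5, size=60)
        print(f"CRPS = {crps_ensemble(ensemble, 19.4):.3f}")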

  11. Modeling validation and control analysis for controlled temperature and humidity of air conditioning system.

    PubMed

    Lee, Jing-Nang; Lin, Tsung-Min; Chen, Chien-Chih

    2014-01-01

    This study constructs an energy-based model of the thermal system for a controlled-temperature-and-humidity air conditioning system, and introduces the influence of the mass flow rate, heater, and humidifier in the proposed control criteria to achieve controlled temperature and humidity. The reliability of the proposed thermal system model is then established by both MATLAB dynamic simulation and validation against the literature. Finally, a PID control strategy is applied to control the air mass flow rate, humidifying capacity, and heating capacity. The simulation results show that the temperature and humidity are stable at 541 s, the disturbance of temperature is only 0.14 °C, the steady-state error of the humidity ratio is 0.0006 kg(w)/kg(da), and the error rate is only 7.5%. The results show that the proposed system provides effective control of the temperature and humidity of an air conditioning system.
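
    The PID strategy applied here can be illustrated with a minimal discrete loop around a first-order thermal plant; the gains and plant constants below are invented for the sketch and are not the paper's:

        # Minimal discrete PID loop regulating temperature toward a setpoint.
        dt = 1.0
        Kp, Ki, Kd = 2.0, 0.05, 0.5
        setpoint, T = 24.0, 30.0
        integral, prev_err = 0.0, setpoint - T

        for _ in range(600):
            err = setpoint - T
            integral += err * dt
            deriv = (err - prev_err) / dt
            u = Kp * err + Ki * integral + Kd * deriv    # signed effort: + heats, - cools
            T += (0.05 * (28.0 - T) + 0.02 * u) * dt     # first-order plant, 28 degC ambient
            prev_err = err

        print(f"temperature after 10 minutes: {T:.2f} degC")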

  12. Modeling Validation and Control Analysis for Controlled Temperature and Humidity of Air Conditioning System

    PubMed Central

    Lee, Jing-Nang; Lin, Tsung-Min

    2014-01-01

    This study constructs an energy-based model of the thermal system for a controlled-temperature-and-humidity air conditioning system, and introduces the influence of the mass flow rate, heater, and humidifier in the proposed control criteria to achieve controlled temperature and humidity. The reliability of the proposed thermal system model is then established by both MATLAB dynamic simulation and validation against the literature. Finally, a PID control strategy is applied to control the air mass flow rate, humidifying capacity, and heating capacity. The simulation results show that the temperature and humidity are stable at 541 s, the disturbance of temperature is only 0.14°C, the steady-state error of the humidity ratio is 0.0006 kgw/kgda, and the error rate is only 7.5%. The results show that the proposed system provides effective control of the temperature and humidity of an air conditioning system. PMID:25250390

  13. The Politics and Statistics of Value-Added Modeling for Accountability of Teacher Preparation Programs

    ERIC Educational Resources Information Center

    Lincove, Jane Arnold; Osborne, Cynthia; Dillon, Amanda; Mills, Nicholas

    2014-01-01

    Despite questions about validity and reliability, the use of value-added estimation methods has moved beyond academic research into state accountability systems for teachers, schools, and teacher preparation programs (TPPs). Prior studies of value-added measurement for TPPs test the validity of researcher-designed models and find that measuring…

  14. A new extranodal scoring system based on the prognostically relevant extranodal sites in diffuse large B-cell lymphoma, not otherwise specified treated with chemoimmunotherapy.

    PubMed

    Hwang, Hee Sang; Yoon, Dok Hyun; Suh, Cheolwon; Huh, Jooryung

    2016-08-01

    Extranodal involvement is a well-known prognostic factor in patients with diffuse large B-cell lymphomas (DLBCL). Nevertheless, the prognostic impact of the extranodal scoring system included in the conventional international prognostic index (IPI) has been questioned in an era where rituximab treatment has become widespread. We investigated the prognostic impacts of individual sites of extranodal involvement in 761 patients with DLBCL who received rituximab-based chemoimmunotherapy. Subsequently, we established a new extranodal scoring system based on extranodal sites, showing significant prognostic correlation, and compared this system with conventional scoring systems, such as the IPI and the National Comprehensive Cancer Network-IPI (NCCN-IPI). An internal validation procedure, using bootstrapped samples, was also performed for both univariate and multivariate models. Using multivariate analysis with a backward variable selection, we found nine extranodal sites (the liver, lung, spleen, central nervous system, bone marrow, kidney, skin, adrenal glands, and peritoneum) that remained significant for use in the final model. Our newly established extranodal scoring system, based on these sites, was better correlated with patient survival than standard scoring systems, such as the IPI and the NCCN-IPI. Internal validation by bootstrapping demonstrated an improvement in model performance of our modified extranodal scoring system. Our new extranodal scoring system, based on the prognostically relevant sites, may improve the performance of conventional prognostic models of DLBCL in the rituximab era and warrants further external validation using large study populations.
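
    Internal validation by bootstrapping, as used above, resamples the cohort to gauge the stability of a performance metric. A toy sketch with synthetic binary site data and a simple concordance-style metric; the study itself used survival endpoints and a fitted multivariate model:

        import numpy as np

        rng = np.random.default_rng(7)
        X = rng.integers(0, 2, size=(200, 9))            # binary involvement of 9 sites
        score = X.sum(axis=1)                            # toy prognostic score
        event = score + rng.normal(0.0, 2.0, 200) > 4.0  # synthetic adverse outcome

        def auc(s, y):
            pos, neg = s[y], s[~y]                       # concordance of score with outcome
            return (pos[:, None] > neg[None, :]).mean()

        apparent = auc(score, event)
        boots = [auc(score[i], event[i])
                 for i in (rng.integers(0, 200, 200) for _ in range(200))]
        lo, hi = np.percentile(boots, [2.5, 97.5])
        print(f"apparent AUC {apparent:.2f}, bootstrap 95% CI ({lo:.2f}, {hi:.2f})")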

  15. Validation of BEHAVE fire behavior predictions in oak savannas using five fuel models

    Treesearch

    Keith Grabner; John Dwyer; Bruce Cutter

    1997-01-01

    Prescribed fire is a valuable tool in the restoration and management of oak savannas. BEHAVE, a fire behavior prediction system developed by the United States Forest Service, can be a useful tool when managing oak savannas with prescribed fire. BEHAVE predictions of fire rate-of-spread and flame length were validated using four standardized fuel models: Fuel Model 1 (...

  16. FAST Model Calibration and Validation of the OC5- DeepCwind Floating Offshore Wind System Against Wave Tank Test Data: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendt, Fabian F; Robertson, Amy N; Jonkman, Jason

    During the course of the Offshore Code Comparison Collaboration, Continued, with Correlation (OC5) project, which focused on the validation of numerical methods through comparison against tank test data, the authors created a numerical FAST model of the 1:50-scale DeepCwind semisubmersible system that was tested at the Maritime Research Institute Netherlands ocean basin in 2013. This paper discusses several model calibration studies that were conducted to identify model adjustments that improve the agreement between the numerical simulations and the experimental test data. These calibration studies cover wind-field-specific parameters (coherence, turbulence), hydrodynamic and aerodynamic modeling approaches, as well as rotor model (blade-pitch and blade-mass imbalances) and tower model (structural tower damping coefficient) adjustments. These calibration studies were conducted based on relatively simple calibration load cases (wave only/wind only). The agreement between the final FAST model and experimental measurements is then assessed based on more-complex combined wind and wave validation cases.

  17. IMPLEMENTATION AND VALIDATION OF A FULLY IMPLICIT ACCUMULATOR MODEL IN RELAP-7

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Haihua; Zou, Ling; Zhang, Hongbin

    2016-01-01

    This paper presents the implementation and validation of an accumulator model in RELAP-7 under the framework of the preconditioned Jacobian-free Newton-Krylov (JFNK) method, based on the similar model used in RELAP5. RELAP-7 is a new nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). RELAP-7 is a fully implicit system code. The JFNK and preconditioning methods used in RELAP-7 are briefly discussed. The slightly modified accumulator model is summarized for completeness. The implemented model was validated with the LOFT L3-1 test and benchmarked with RELAP5 results. RELAP-7 and RELAP5 had almost identical results for the accumulator gas pressure and water level, although there were some minor differences in other parameters such as accumulator gas temperature and tank wall temperature. One advantage of the JFNK method is the ease of maintaining and modifying models due to the full separation of numerical methods from physical models. It would be straightforward to extend the current RELAP-7 accumulator model to simulate the advanced accumulator design.
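
    The JFNK idea is that Newton's method only needs Jacobian-vector products, which a Krylov solver can approximate by finite differences of the residual, so no analytic Jacobian is ever formed. A toy sketch using SciPy's newton_krylov on an illustrative nonlinear system (not RELAP-7's conservation equations):

        import numpy as np
        from scipy.optimize import newton_krylov

        # Toy residual; newton_krylov needs only this function and approximates
        # Jacobian-vector products internally by finite differences.
        def residual(x):
            return np.array([x[0]**2 + x[1] - 3.0,
                             x[0] + x[1]**2 - 5.0])

        x = newton_krylov(residual, np.array([1.5, 1.5]), f_tol=1e-10)
        print(x, residual(x))   # converges to (1, 2) for this system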

  18. Nonlinear system identification of smart structures under high impact loads

    NASA Astrophysics Data System (ADS)

    Sarp Arsava, Kemal; Kim, Yeesock; El-Korchi, Tahar; Park, Hyo Seon

    2013-05-01

    The main purpose of this paper is to develop numerical models for the prediction and analysis of the highly nonlinear behavior of integrated structure control systems subjected to high impact loading. A time-delayed adaptive neuro-fuzzy inference system (TANFIS) is proposed for modeling of the complex nonlinear behavior of smart structures equipped with magnetorheological (MR) dampers under high impact forces. Experimental studies are performed to generate sets of input and output data for training and validation of the TANFIS models. The high impact load and current signals are used as the input disturbance and control signals while the displacement and acceleration responses from the structure-MR damper system are used as the output signals. The benchmark adaptive neuro-fuzzy inference system (ANFIS) is used as a baseline. Comparisons of the trained TANFIS models with experimental results demonstrate that the TANFIS modeling framework is an effective way to capture nonlinear behavior of integrated structure-MR damper systems under high impact loading. In addition, the performance of the TANFIS model is much better than that of ANFIS in both the training and the validation processes.

  19. Examining construct and predictive validity of the Health-IT Usability Evaluation Scale: confirmatory factor analysis and structural equation modeling results.

    PubMed

    Yen, Po-Yin; Sousa, Karen H; Bakken, Suzanne

    2014-10-01

    In a previous study, we developed the Health Information Technology Usability Evaluation Scale (Health-ITUES), which is designed to support customization at the item level. Such customization matches the specific tasks/expectations of a health IT system while retaining comparability at the construct level. That study also provided evidence of the scale's factorial validity and internal consistency reliability through exploratory factor analysis. In this study, we advanced the development of Health-ITUES by examining its construct validity and predictive validity. The health IT system studied was a web-based communication system that supported nurse staffing and scheduling. Using Health-ITUES, we conducted a cross-sectional study to evaluate users' perceptions of the web-based communication system after system implementation. We examined Health-ITUES's construct validity through first- and second-order confirmatory factor analysis (CFA), and its predictive validity via structural equation modeling (SEM). The sample comprised 541 staff nurses in two healthcare organizations. The CFA (n=165) showed that a general usability factor accounted for 78.1%, 93.4%, 51.0%, and 39.9% of the explained variance in 'Quality of Work Life', 'Perceived Usefulness', 'Perceived Ease of Use', and 'User Control', respectively. The SEM (n=541) supported the predictive validity of Health-ITUES, explaining 64% of the variance in intention for system use. The results of CFA and SEM provide additional evidence for the construct and predictive validity of Health-ITUES. The customizability of Health-ITUES has the potential to support comparisons at the construct level, while allowing variation at the item level. We also illustrate application of Health-ITUES across stages of system development. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  20. Validation Of The Airspace Concept Evaluation System Using Real World Data

    NASA Technical Reports Server (NTRS)

    Zelinski, Shannon

    2005-01-01

    This paper discusses the process of validating the Airspace Concept Evaluation System (ACES) using real-world historical flight operational data. ACES inputs are generated from select real-world data and processed to create a realistic reproduction of a single day of operations within the National Airspace System (NAS). ACES outputs are then compared to real-world operational metrics and delay statistics for the reproduced day. Preliminary results indicate that ACES produces delays and airport operational metrics similar to the real world, with minor variations of delay by phase of flight. ACES is a nation-wide fast-time simulation tool developed at NASA Ames Research Center. ACES models and simulates the NAS using interacting agents representing center control, terminal flow management, airports, individual flights, and other NAS elements. These agents pass messages between one another similar to real-world communications. This distributed agent-based system is designed to emulate the highly unpredictable nature of the NAS, making it a suitable tool to evaluate current and envisioned airspace concepts. To ensure that ACES produces the most realistic results, the system must be validated. There is no way to validate future-concept scenarios using real-world historical data, but current-day scenario validations increase confidence in the validity of future scenario results. Each operational day has unique weather and traffic demand schedules; the more a simulation utilizes the unique characteristics of a specific day, the more realistic the results should be. ACES is able to simulate the full-scale demand traffic necessary to perform a validation using real-world data. Through direct comparison with the real world, models may continue to be improved, and unusual trends and biases may be filtered out of the system or used to normalize the results of future concept simulations.

  1. In-line pressure-flow module for in vitro modelling of haemodynamics and biosensor validation

    NASA Technical Reports Server (NTRS)

    Koenig, S. C.; Schaub, J. D.; Ewert, D. L.; Swope, R. D.; Convertino, V. A. (Principal Investigator)

    1997-01-01

    An in-line pressure-flow module for in vitro modelling of haemodynamics and biosensor validation has been developed. Studies show that good accuracy can be achieved in the measurement of pressure and of flow, in both steady and pulsatile flow systems. The module can be used for the development, testing and evaluation of cardiovascular mechanical-electrical analogue models, cardiovascular prosthetics (i.e. valves, vascular grafts) and pressure and flow biosensors.

  2. Parameter Selection Methods in Inverse Problem Formulation

    DTIC Science & Technology

    2010-11-03

    Examples used to illustrate the parameter selection methods include: a) a recently developed in-host model for HIV dynamics, which has been successfully validated with clinical data and used for prediction [4, 8]; and b) a global model for the reaction of the cardiovascular system to an ergometric workload. Key Words: Parameter selection...

  3. Demonstrating the Alaska Ocean Observing System in Prince William Sound

    NASA Astrophysics Data System (ADS)

    Schoch, G. Carl; McCammon, Molly

    2013-07-01

    The Alaska Ocean Observing System and the Oil Spill Recovery Institute developed a demonstration project over a 5-year period in Prince William Sound. The primary goal was to develop a quasi-operational system that delivers weather and ocean information in near real time to diverse user communities. This observing system now consists of atmospheric and oceanic sensors, and a new generation of computer models to numerically simulate and forecast weather, waves, and ocean circulation. A state-of-the-art data management system provides access to these products from one internet portal at http://www.aoos.org. The project culminated in a 2009 field experiment that evaluated the observing system and the performance of the model forecasts. Observations from terrestrial weather stations and weather buoys validated atmospheric circulation forecasts. Observations from wave gages on weather buoys validated forecasts of significant wave heights and periods. There was an emphasis on validation of surface currents forecast by the ocean circulation model for oil spill response and search and rescue applications. During the 18-day field experiment a radar array mapped surface currents and drifting buoys were deployed. Hydrographic profiles at fixed stations, and by autonomous vehicles along transects, were made to acquire measurements through the water column. Terrestrial weather stations were the most reliable and least costly to operate, while in situ ocean sensors were more costly and considerably less reliable. The radar surface current mappers were the least reliable and most costly, but provided the assimilation and validation data that most improved ocean circulation forecasts. We describe the setting of Prince William Sound and the various observational platforms and forecast models of the observing system, and discuss recommendations for future development.

  4. Calibration and validation of an activated sludge model for greenhouse gases no. 1 (ASMG1): prediction of temperature-dependent N₂O emission dynamics.

    PubMed

    Guo, Lisha; Vanrolleghem, Peter A

    2014-02-01

    The Activated Sludge Model for Greenhouse Gases No. 1 (ASMG1) was calibrated with data from a wastewater treatment plant (WWTP) without control systems and validated with data from three similar plants equipped with control systems. A special feature of the calibration/validation approach adopted in this paper is that the data were obtained from simulations with a mathematical model that is widely accepted to describe effluent quality and operating costs of actual WWTPs, the Benchmark Simulation Model No. 2 (BSM2). The calibration also aimed at fitting the model to typical observed nitrous oxide (N₂O) emission data, i.e., a yearly average of 0.5% of the influent total nitrogen load emitted as N₂O-N. Model validation was performed by challenging the model in configurations with different control strategies. The kinetic term describing the dissolved oxygen effect on denitrification by ammonia-oxidizing bacteria (AOB) was modified into a Haldane term. Both the original and the Haldane-modified models passed calibration and validation. Even though their yearly averaged values were similar, the two models presented different dynamic N₂O emissions under cold temperature conditions and control; therefore, data collected in such situations can potentially permit model discrimination. Observed seasonal trends in N₂O emissions are simulated well with both the original and Haldane-modified models. A mechanistic explanation based on the temperature-dependent interaction between heterotrophic and autotrophic N₂O pathways was provided. Finally, while adding the AOB denitrification pathway to a model with only heterotrophic N₂O production showed little impact on effluent quality and operating cost criteria, it clearly affected the N₂O emissions.
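
    The modification described replaces a Monod-type dissolved-oxygen term with a Haldane (substrate-inhibition) term, S/(K + S + S^2/K_I), which peaks at intermediate oxygen levels instead of saturating. A sketch comparing the two; the constants are illustrative, not the calibrated values:

        # Monod vs. Haldane dissolved-oxygen terms (illustrative constants).
        def monod(S, K):
            return S / (K + S)

        def haldane(S, K, K_I):
            return S / (K + S + S**2 / K_I)

        for DO in (0.05, 0.2, 0.5, 1.0, 2.0):   # mg O2/L
            print(f"DO={DO:4.2f}  Monod={monod(DO, 0.2):.2f}  "
                  f"Haldane={haldane(DO, 0.2, 0.5):.2f}")
        # The Haldane term peaks at intermediate DO and falls off at high DO,
        # capturing suppression of AOB denitrification under fully aerobic conditions.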

  5. 42 CFR § 414.1390 - Data validation and auditing.

    Code of Federal Regulations, 2010 CFR

    2017-10-01

    ... SERVICES (CONTINUED) MEDICARE PROGRAM (CONTINUED) PAYMENT FOR PART B MEDICAL AND OTHER HEALTH SERVICES Merit-Based Incentive Payment System and Alternative Payment Model Incentive § 414.1390 Data validation...

  6. Modelling dimercaptosuccinic acid (DMSA) plasma kinetics in humans.

    PubMed

    van Eijkeren, Jan C H; Olie, J Daniël N; Bradberry, Sally M; Vale, J Allister; de Vries, Irma; Meulenbelt, Jan; Hunault, Claudine C

    2016-11-01

    No kinetic models presently exist that simulate the effect of chelation therapy on blood lead concentrations in lead poisoning. Our aim was to develop a kinetic model describing the kinetics of dimercaptosuccinic acid (DMSA; succimer), a commonly used chelating agent, that could later be used in developing a lead chelation model. This was a kinetic modelling study. We used a two-compartment model, with a non-systemic gastrointestinal compartment (gut lumen) and the whole body as one systemic compartment. The only data available from the literature were used to calibrate the unknown model parameters. The calibrated model was then validated by comparing its predictions with measured data from three different experimental human studies. The model predicted the total DMSA plasma and urine concentrations measured in three healthy volunteers after ingestion of DMSA 10 mg/kg. It was then further validated using data from three other published studies; it predicted concentrations within a factor of two, representing inter-human variability. A simple kinetic model simulating the kinetics of DMSA in humans has thus been developed and validated. The value of this model lies in its future potential to predict blood lead concentrations in lead-poisoned patients treated with DMSA.
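
    The two-compartment structure described (gut lumen plus one systemic compartment) leads to a pair of first-order ODEs. A sketch with illustrative rate constants, not the calibrated parameters of the study:

        import numpy as np
        from scipy.integrate import solve_ivp

        # Two-compartment sketch: first-order absorption from the gut lumen and
        # first-order urinary elimination from the body. Values are illustrative.
        ka, ke = 0.8, 0.3            # 1/h: absorption and elimination rates
        F = 0.2                      # assumed fraction of the oral dose absorbed
        dose = 10.0 * 70.0           # mg: 10 mg/kg for a 70 kg subject

        def rhs(t, y):
            gut, body = y
            return [-ka * gut,                    # loss from gut lumen
                    F * ka * gut - ke * body]     # absorption into, elimination from body

        sol = solve_ivp(rhs, (0.0, 24.0), [dose, 0.0], dense_output=True)
        print(np.round(sol.sol(np.linspace(0.0, 24.0, 7))[1], 1))  # body amount, mg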

  7. Commercial Supersonics Technology Project - Status of Airport Noise

    NASA Technical Reports Server (NTRS)

    Bridges, James

    2016-01-01

    The Commercial Supersonic Technology Project has been developing databases, computational tools, and system models to prepare for a level 1 milestone, the Low Noise Propulsion Tech Challenge, to be delivered Sept 2016. Steps taken to prepare for the final validation test are given, including system analysis, code validation, and risk reduction testing.

  8. XML and Bibliographic Data: The TVS (Transport, Validation and Services) Model.

    ERIC Educational Resources Information Center

    de Carvalho, Joaquim; Cordeiro, Maria Ines

    This paper discusses the role of XML in library information systems at three major levels: as a representation language that enables the transport of bibliographic data in a way that is technologically independent and universally understood across systems and domains; as a language that enables the specification of complex validation rules…

  9. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS assemblies and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.

  10. FDA 2011 process validation guidance: lifecycle compliance model.

    PubMed

    Campbell, Cliff

    2014-01-01

    This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th-century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and a quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.

  11. TH-AB-BRA-07: PENELOPE-Based GPU-Accelerated Dose Calculation System Applied to MRI-Guided Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Y; Mazur, T; Green, O

    Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: We first translated PENELOPE from FORTRAN to C++ and validated that the translation produced equivalent results. Then we adapted the C++ code to CUDA in a workflow optimized for GPU architecture. We expanded upon the original code to include voxelized transport boosted by Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, we incorporated the vendor-provided MRIdian head model into the code. We performed a set of experimental measurements on MRIdian to examine the accuracy of both the head model and gPENELOPE, and then applied gPENELOPE toward independent validation of patient doses calculated by MRIdian’s KMC. Results: We achieve an average acceleration factor of 152 compared to the original single-thread FORTRAN implementation with the original accuracy preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1) and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: We developed a Monte Carlo simulation platform based on a GPU-accelerated version of PENELOPE. We validated that both the vendor provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems.
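
    Woodcock (delta) tracking, mentioned above, samples free paths using a single majorant cross-section for the whole geometry and accepts real collisions with probability sigma(x)/sigma_max, avoiding per-voxel boundary bookkeeping. A 1-D toy sketch with invented cross-sections, not gPENELOPE's physics:

        import numpy as np

        rng = np.random.default_rng(1)
        edges = np.array([0.0, 2.0, 4.0, 6.0])   # voxel boundaries, cm
        sigma = np.array([0.1, 0.5, 0.2])        # total cross-section per voxel, 1/cm
        sigma_max = sigma.max()                  # majorant used for all flights

        def first_real_collision(x=0.0):
            while x < edges[-1]:
                x += -np.log(rng.random()) / sigma_max   # tentative flight with majorant
                if x >= edges[-1]:
                    return None                          # particle escaped the slab
                voxel = np.searchsorted(edges, x) - 1
                if rng.random() < sigma[voxel] / sigma_max:
                    return x                             # real collision accepted
            return None

        depths = [first_real_collision() for _ in range(10000)]
        collided = [d for d in depths if d is not None]
        print(f"collision fraction: {len(collided) / 10000:.3f}")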

  12. Empirical evaluation of decision support systems: Needs, definitions, potential methods, and an example pertaining to waterfowl management

    USGS Publications Warehouse

    Sojda, R.S.

    2007-01-01

    Decision support systems are often not empirically evaluated, especially the underlying modelling components. This can be attributed to such systems necessarily being designed to handle complex and poorly structured problems and decision making. Nonetheless, evaluation is critical and should be focused on empirical testing whenever possible. Verification and validation, in combination, comprise such evaluation. Verification is ensuring that the system is internally complete, coherent, and logical from a modelling and programming perspective. Validation is examining whether the system is realistic and useful to the user or decision maker, and should answer the question: “Was the system successful at addressing its intended purpose?” A rich literature exists on verification and validation of expert systems and other artificial intelligence methods; however, no single evaluation methodology has emerged as preeminent. At least five approaches to validation are feasible. First, under some conditions, decision support system performance can be tested against a preselected gold standard. Second, real-time and historic data sets can be used for comparison with simulated output. Third, panels of experts can be judiciously used, but often are not an option in some ecological domains. Fourth, sensitivity analysis of system outputs in relation to inputs can be informative. Fifth, when validation of a complete system is impossible, examining major components can be substituted, recognizing the potential pitfalls. I provide an example of evaluation of a decision support system for trumpeter swan (Cygnus buccinator) management that I developed using interacting intelligent agents, expert systems, and a queuing system. Predicted swan distributions over a 13-year period were assessed against observed numbers. Population survey numbers and banding (ringing) studies may provide long term data useful in empirical evaluation of decision support.

  13. Information system end-user satisfaction and continuance intention: A unified modeling approach.

    PubMed

    Hadji, Brahim; Degoulet, Patrice

    2016-06-01

    Continuous evaluation of end-user satisfaction and continuance intention is a critical issue at each phase of a clinical information system (CIS) project, but most validation studies are concerned with the pre- or early post-adoption phases. The purpose of this study was twofold: to validate, at the Pompidou University Hospital (HEGP), an information technology late post-adoption model built from four validated models, and to propose a unified metamodel of evaluation that could be adapted to each context or deployment phase of a CIS project. Five dimensions, i.e., CIS quality (CISQ), perceived usefulness (PU), confirmation of expectations (CE), user satisfaction (SAT), and continuance intention (CI), were selected to constitute the CI evaluation model. The validity of the model was tested using the combined answers to four surveys performed between 2011 and 2015, i.e., more than ten years after the opening of HEGP in July 2000. Structural equation modeling was used to test the eight model-associated hypotheses. The multi-professional study group of 571 responders consisted of 158 doctors, 282 nurses, and 131 secretaries. The evaluation model accounted for 84% of the variance of satisfaction and 53% of CI variance for the period 2011-2015, and for 92% and 69%, respectively, for the period 2014-2015. In very late post-adoption, CISQ appears to be the major determinant of satisfaction and CI. Combining the results obtained at various phases of CIS deployment, a Unified Model of Information System Continuance (UMISC) is proposed. In a meaningful CIS use situation at HEGP, this study confirms the importance of CISQ in explaining satisfaction and CI. The proposed UMISC model, which can be adapted to each phase of CIS deployment, could facilitate the necessary efforts of ongoing CIS acceptance and continuance evaluation. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Development and validation of a mass casualty conceptual model.

    PubMed

    Culley, Joan M; Effken, Judith A

    2010-03-01

    To develop and validate a conceptual model that provides a framework for the development and evaluation of information systems for mass casualty events. The model was designed based on extant literature and existing theoretical models. A purposeful sample of 18 experts validated the model. Open-ended questions, as well as a 7-point Likert scale, were used to measure expert consensus on the importance of each construct and its relationship in the model and the usefulness of the model to future research. Computer-mediated applications were used to facilitate a modified Delphi technique through which a panel of experts provided validation for the conceptual model. Rounds of questions continued until consensus was reached, as measured by an interquartile range (no more than 1 scale point for each item); stability (change in the distribution of responses less than 15% between rounds); and percent agreement (70% or greater) for indicator questions. Two rounds of the Delphi process were needed to satisfy the criteria for consensus or stability related to the constructs, relationships, and indicators in the model. The panel reached consensus or sufficient stability to retain all 10 constructs, 9 relationships, and 39 of 44 indicators. Experts viewed the model as useful (mean of 5.3 on a 7-point scale). Validation of the model provides the first step in understanding the context in which mass casualty events take place and identifying variables that impact outcomes of care. This study provides a foundation for understanding the complexity of mass casualty care, the roles that nurses play in mass casualty events, and factors that must be considered in designing and evaluating information-communication systems to support effective triage under these conditions.

  15. Validation of Computational Models in Biomechanics

    PubMed Central

    Henninger, Heath B.; Reese, Shawn P.; Anderson, Andrew E.; Weiss, Jeffrey A.

    2010-01-01

    The topics of verification and validation (V&V) have increasingly been discussed in the field of computational biomechanics, and many recent articles have applied these concepts in an attempt to build credibility for models of complex biological systems. V&V are evolving techniques that, if used improperly, can lead to false conclusions about a system under study. In basic science these erroneous conclusions may lead to failure of a subsequent hypothesis, but they can have more profound effects if the model is designed to predict patient outcomes. While several authors have reviewed V&V as they pertain to traditional solid and fluid mechanics, it is the intent of this manuscript to present them in the context of computational biomechanics. Specifically, the task of model validation will be discussed with a focus on current techniques. It is hoped that this review will encourage investigators to engage and adopt the V&V process in an effort to increase peer acceptance of computational biomechanics models. PMID:20839648

  16. Modeling Piezoelectric Stack Actuators for Control of Micromanipulation

    NASA Technical Reports Server (NTRS)

    Goldfarb, Michael; Celanovic, Nikola

    1997-01-01

    A nonlinear lumped-parameter model of a piezoelectric stack actuator has been developed to describe actuator behavior for purposes of control system analysis and design, and, in particular, for microrobotic applications requiring accurate position and/or force control. In formulating this model, the authors propose a generalized Maxwell resistive capacitor as a lumped-parameter causal representation of rate-independent hysteresis. Model formulation is validated by comparing results of numerical simulations to experimental data. Validation is followed by a discussion of model implications for purposes of actuator control.
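
    A generalized Maxwell resistive capacitor can be viewed as a parallel bank of elasto-slide elements: each element deflects elastically until its break force is reached and then slips, and the superposition of the element forces yields rate-independent hysteresis. A toy sketch with invented stiffnesses and slip thresholds, not the identified actuator parameters:

        import numpy as np

        # Toy Maxwell-slip bank: element stiffnesses and slip (break) forces.
        k = np.array([4.0, 2.0, 1.0])
        f_max = np.array([0.5, 1.0, 2.0])
        defl = np.zeros_like(k)                # elastic deflection state of each element

        # Triangular displacement sweep: load, reverse, reload.
        xs = np.concatenate([np.linspace(0.0, 2.0, 50),
                             np.linspace(2.0, -2.0, 100),
                             np.linspace(-2.0, 2.0, 100)])
        forces, x_prev = [], 0.0
        for x in xs:
            defl += x - x_prev                            # all elements move elastically
            defl = np.clip(defl, -f_max / k, f_max / k)   # elements at their limit slip
            forces.append(float(np.sum(k * defl)))        # total transmitted force
            x_prev = x
        # Plotting force vs. displacement traces a rate-independent hysteresis loop.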

  17. Model Validation Status Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E.L. Hardin

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and engineered barriers, plus the TSPA model itself. Description of the model areas is provided in Section 3, and the documents reviewed are described in Section 4. The responsible manager for the Model Validation Status Review was the Chief Science Officer (CSO) for Bechtel-SAIC Co. (BSC). The team lead was assigned by the CSO. A total of 32 technical specialists were engaged to evaluate model validation status in the 21 model areas. The technical specialists were generally independent of the work reviewed, meeting technical qualifications as discussed in Section 5.

  18. Description and Validation of a Dynamical Systems Model of Presynaptic Serotonin Function: Genetic Variation, Brain Activation and Impulsivity

    PubMed Central

    Stoltenberg, Scott F.; Nag, Parthasarathi

    2010-01-01

    Despite more than a decade of empirical work on the role of genetic polymorphisms in the serotonin system on behavior, the details across levels of analysis are not well understood. We describe a mathematical model of the genetic control of presynaptic serotonergic function that is based on control theory, implemented using systems of differential equations, and focused on better characterizing pathways from genes to behavior. We present the results of model validation tests that include the comparison of simulation outcomes with empirical data on genetic effects on brain response to affective stimuli and on impulsivity. Patterns of simulated neural firing were consistent with recent findings of additive effects of serotonin transporter and tryptophan hydroxylase-2 polymorphisms on brain activation. In addition, simulated levels of cerebral spinal fluid 5-hydroxyindoleacetic acid (CSF 5-HIAA) were negatively correlated with Barratt Impulsiveness Scale (Version 11) Total scores in college students (r = −.22, p = .002, N = 187), which is consistent with the well-established negative correlation between CSF 5-HIAA and impulsivity. The results of the validation tests suggest that the model captures important aspects of the genetic control of presynaptic serotonergic function and behavior via brain activation. The proposed model can be: (1) extended to include other system components, neurotransmitter systems, behaviors and environmental influences; (2) used to generate testable hypotheses. PMID:20111992

  19. Technology Readiness of the NEXT Ion Propulsion System

    NASA Technical Reports Server (NTRS)

    Benson, Scott W.; Patterson, Michael J.

    2008-01-01

    NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system has been in advanced technology development under the NASA In-Space Propulsion Technology project. The highest fidelity hardware planned has now been completed by the government/industry team, including: a flight prototype model (PM) thruster, an engineering model (EM) power processing unit, EM propellant management assemblies, a breadboard gimbal, and control unit simulators. Subsystem and system level technology validation testing is in progress. To achieve the objective Technology Readiness Level 6, environmental testing is being conducted to qualification levels in ground facilities simulating the space environment. Additional tests have been conducted to characterize the performance range and life capability of the NEXT thruster. This paper presents the status and results of technology validation testing accomplished to date, the validated subsystem and system capabilities, and the plans for completion of this phase of NEXT development. The next round of competed planetary science mission announcements of opportunity, and directed mission decisions, are anticipated to occur in 2008 and 2009. Progress to date, and the success of on-going technology validation, indicate that the NEXT ion propulsion system will be a primary candidate for mission consideration in these upcoming opportunities.

  20. Assessing Requirements Quality through Requirements Coverage

    NASA Technical Reports Server (NTRS)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

    In model-based development, the development effort is centered around a formal description of the proposed software system, the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
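    As a concrete illustration of the coverage metrics involved, the sketch below checks an MC/DC-style criterion over a test suite for a small formalized requirement. The requirement, the test vectors, and the helper name are invented for illustration; real MC/DC tooling for avionics software is considerably more involved.

      # Sketch: MC/DC-style coverage of a formalized boolean requirement.
      # A condition is covered when two tests differ only in that condition
      # and flip the requirement's outcome (it "independently affects" it).
      from itertools import combinations

      def requirement(alt_low, gear_down, flaps_set):
          # Hypothetical requirement: warn when altitude is low and the
          # aircraft is not configured to land.
          return alt_low and not (gear_down and flaps_set)

      tests = [
          (True, True, True), (True, True, False),
          (True, False, True), (False, True, False),
      ]

      def mcdc_covered(fn, tests, n_conditions):
          covered = set()
          for a, b in combinations(tests, 2):
              diff = [i for i in range(n_conditions) if a[i] != b[i]]
              if len(diff) == 1 and fn(*a) != fn(*b):
                  covered.add(diff[0])
          return covered

      print(mcdc_covered(requirement, tests, 3))  # {0, 1, 2}: all conditions covered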

  1. Application of Petri net theory for modelling and validation of the sucrose breakdown pathway in the potato tuber.

    PubMed

    Koch, Ina; Junker, Björn H; Heiner, Monika

    2005-04-01

    Because of the complexity of metabolic networks and their regulation, formal modelling is a useful method to improve the understanding of these systems. An essential step in network modelling is to validate the network model. Petri net theory provides algorithms and methods, which can be applied directly to metabolic network modelling and analysis in order to validate the model. The metabolism between sucrose and starch in the potato tuber is of great research interest. Even though this metabolism is among the best studied in sink organs, it is not yet fully understood. We provide an approach for model validation of metabolic networks using Petri net theory, which we demonstrate for the sucrose breakdown pathway in the potato tuber. We start with hierarchical modelling of the metabolic network as a Petri net and continue with the analysis of qualitative properties of the network. The results characterize the net structure and give insights into the complex net behaviour.
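    One standard qualitative check in such an analysis is the computation of place invariants (P-invariants): token-conserving place weightings obtained from the nullspace of the incidence matrix. The toy net below is only a sketch of the idea, not the published pathway model; its places and stoichiometry are invented.

      # Sketch: qualitative Petri net validation via place invariants.
      import sympy as sp

      # Incidence matrix C (rows = places, columns = transitions): firing a
      # transition changes the marking by the corresponding column of C.
      C = sp.Matrix([
          [-1,  0,  1],   # sucrose
          [ 1, -1,  0],   # glucose
          [ 1,  0, -1],   # fructose          (toy stoichiometry)
          [ 0,  1, -1],   # hexose phosphate
      ])

      # P-invariants are nonzero solutions y of C^T y = 0; coverage of the
      # net by positive P-invariants indicates token (mass) conservation.
      for y in C.T.nullspace():
          print(y.T)      # e.g. [1, 1, 0, 1] and [1, 0, 1, 0]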

  2. Simulation of fMRI signals to validate dynamic causal modeling estimation

    NASA Astrophysics Data System (ADS)

    Anandwala, Mobin; Siadat, Mohamad-Reza; Hadi, Shamil M.

    2012-03-01

    Through cognitive tasks, certain brain areas are activated and also receive increased blood flow. This is modeled through a state system consisting of two separate parts: one that deals with neural node stimulation, and the other with the blood response during that stimulation. The rationale behind using this state system is to validate existing analysis methods such as DCM to see what levels of noise they can handle. Using the forward Euler method, this system was approximated as a series of difference equations. What was obtained was the hemodynamic response for each brain area, and this was used to test an analysis tool that estimates functional connectivity between brain areas under a given amount of noise. The importance of modeling this system is not only to have a model for neural response but also to enable comparison with actual data obtained through functional imaging scans.
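    A minimal sketch of the discretization described above, assuming a generic linear neural state driven by a boxcar stimulus and a slower blood-response state driven by neural activity; the equations and constants are stand-ins, not the authors' exact model.

      # Sketch: forward-Euler difference equations for a two-part state
      # system (neural stimulation + blood response), with additive noise
      # so the output can stress-test a DCM-style estimator.
      import numpy as np

      dt, T = 0.01, 30.0
      n = int(T / dt)
      z, h = 0.0, 0.0                 # neural and hemodynamic states
      tau_z, tau_h = 0.8, 4.0         # hypothetical time constants (s)
      bold = np.empty(n)

      for k in range(n):
          t = k * dt
          u = 1.0 if 5.0 <= t < 10.0 else 0.0        # task stimulus window
          z += dt * (-z / tau_z + u)                 # neural node dynamics
          h += dt * (-h / tau_h + z)                 # blood response dynamics
          bold[k] = h + np.random.normal(0.0, 0.01)  # noisy observed signal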

  3. A mechanistic model for electricity consumption on dairy farms: definition, validation, and demonstration.

    PubMed

    Upton, J; Murphy, M; Shalloo, L; Groot Koerkamp, P W G; De Boer, I J M

    2014-01-01

    Our objective was to define and demonstrate a mechanistic model that enables dairy farmers to explore the impact of a technical or managerial innovation on electricity consumption, associated CO2 emissions, and electricity costs. We, therefore, (1) defined a model for electricity consumption on dairy farms (MECD) capable of simulating total electricity consumption along with related CO2 emissions and electricity costs on dairy farms on a monthly basis; (2) validated the MECD using 1 yr of empirical data from commercial spring-calving, grass-based dairy farms with 45, 88, and 195 milking cows; and (3) demonstrated the functionality of the model by applying 2 electricity tariffs to the electricity consumption data and examining the effect on total dairy farm electricity costs. The MECD was developed using a mechanistic modeling approach and required the key inputs of milk production, cow number, and details relating to the milk-cooling system, milking machine system, water-heating system, lighting systems, water pump systems, and the winter housing facilities, as well as details relating to the management of the farm (e.g., season of calving). Model validation showed an overall relative prediction error (RPE) of less than 10% for total electricity consumption. More than 87% of the mean square prediction error of total electricity consumption was accounted for by random variation. The RPE values of the milk-cooling systems, water-heating systems, and milking machine systems were less than 20%. The RPE values for automatic scraper systems, lighting systems, and water pump systems varied from 18 to 113%, indicating a poor prediction for these metrics. However, automatic scrapers, lighting, and water pumps made up only 14% of total electricity consumption across all farms, reducing the overall impact of these poor predictions. Demonstration of the model showed that total farm electricity costs increased by between 29 and 38% by moving from a day and night tariff to a flat tariff. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
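    The validation statistics quoted above are straightforward to reproduce. The sketch below computes the relative prediction error (RPE) and the standard decomposition of mean square prediction error into mean-bias, slope-bias, and random components; the monthly values are invented for illustration.

      # Sketch: RPE and MSPE decomposition used to judge model adequacy.
      import numpy as np

      observed = np.array([2100., 2350., 1980., 2600., 2450., 2200.])   # kWh
      predicted = np.array([2010., 2400., 2050., 2550., 2300., 2150.])  # kWh

      mspe = np.mean((observed - predicted) ** 2)
      rpe = 100.0 * np.sqrt(mspe) / observed.mean()   # < 10% is a good fit

      r = np.corrcoef(observed, predicted)[0, 1]
      s_o, s_p = observed.std(), predicted.std()      # population SDs
      mean_bias = (predicted.mean() - observed.mean()) ** 2
      slope_bias = (s_p - r * s_o) ** 2
      random_var = (1.0 - r ** 2) * s_o ** 2          # three terms sum to MSPE

      print(f"RPE = {rpe:.1f}%, random fraction = {random_var / mspe:.2f}")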

  4. Uniting statistical and individual-based approaches for animal movement modelling.

    PubMed

    Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel

    2014-01-01

    The dynamic nature of their internal states and the environment directly shape animals' spatial behaviours and give rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, because field work to access internal states is practically impossible and current statistical models are unable to produce dynamic outputs. To address these issues, we developed a robust method, which combines statistical and individual-based modelling. Using a statistical technique for forward modelling of the individual-based model (IBM) has the advantage of being faster for parameterization than a pure inverse modelling technique and allows for robust selection of parameters. Using GPS locations from caribou monitored in Québec, caribou movements were modelled based on generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then empirically parameterized using Step Selection Functions. The final IBM was validated using both k-fold cross-validation and emergent-pattern validation and was tested for two different scenarios with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system and adequately provided projections on future possible states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems.

  5. Uniting Statistical and Individual-Based Approaches for Animal Movement Modelling

    PubMed Central

    Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel

    2014-01-01

    The dynamic nature of their internal states and the environment directly shape animals' spatial behaviours and give rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, because field work to access internal states is practically impossible and current statistical models are unable to produce dynamic outputs. To address these issues, we developed a robust method, which combines statistical and individual-based modelling. Using a statistical technique for forward modelling of the individual-based model (IBM) has the advantage of being faster for parameterization than a pure inverse modelling technique and allows for robust selection of parameters. Using GPS locations from caribou monitored in Québec, caribou movements were modelled based on generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then empirically parameterized using Step Selection Functions. The final IBM was validated using both k-fold cross-validation and emergent-pattern validation and was tested for two different scenarios with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system and adequately provided projections on future possible states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems. PMID:24979047

  6. Calibration and Validation of the Checkpoint Model to the Air Force Electronic Systems Center Software Database

    DTIC Science & Technology

    1997-09-01

    Illinois Institute of Technology Research Institute (IITRI) calibrated seven parametric models including SPQR/20, the forerunner of CHECKPOINT. The ... a semicolon); thus, SPQR/20 was calibrated using SLOC sizing data (IITRI, 1989: 3-4). The results showed only slight overall improvements in accuracy ... even when validating the calibrated models with the same data sets. The IITRI study demonstrated SPQR/20 to be one of two models that were most

  7. Modeling and Simulation Behavior Validation Methodology and Extension Model Validation for the Individual Soldier

    DTIC Science & Technology

    2015-03-01

    domains. Major model functions include: • Ground combat: light and heavy forces. • Air mobile forces. • Future forces. • Fixed-wing and rotary-wing ... Constraints: • Study must be completed no later than 31 December 2014. • Entity behavior limited to select COMBATXXI Mobility, Unmanned Aerial System ... and SQL backend, as well as any open application programming interface (API). • Allows data transparency and data-driven navigation through the model

  8. Flight Testing an Iced Business Jet for Flight Simulation Model Validation

    NASA Technical Reports Server (NTRS)

    Ratvasky, Thomas P.; Barnhart, Billy P.; Lee, Sam; Cooper, Jon

    2007-01-01

    A flight test of a business jet aircraft with various ice accretions was performed to obtain data to validate flight simulation models developed through wind tunnel tests. Three types of ice accretions were tested: pre-activation roughness, runback shapes that form downstream of the thermal wing ice protection system, and a wing ice protection system failure shape. The high-fidelity flight simulation models of this business jet aircraft were validated using a software tool called "Overdrive." Through comparisons of flight-extracted aerodynamic forces and moments to simulation-predicted forces and moments, the simulation models were successfully validated. Only minor adjustments in the simulation database were required to obtain an adequate match, signifying that the process used to develop the simulation models was successful. The simulation models were implemented in the NASA Ice Contamination Effects Flight Training Device (ICEFTD) to enable company pilots to evaluate flight characteristics of the simulation models. By and large, the pilots confirmed good similarities in the flight characteristics when compared to the real airplane. However, pilots noted pitch-up tendencies at stall with the flaps extended that were not representative of the airplane and identified some differences in pilot forces. The elevator hinge moment model and the implementation of the control forces on the ICEFTD were identified as drivers in the pitch-ups and control force issues, and will be an area for future work.

  9. Identifying model error in metabolic flux analysis - a generalized least squares approach.

    PubMed

    Sokolenko, Stanislav; Quattrociocchi, Marco; Aucoin, Marc G

    2016-09-13

    The estimation of intracellular flux through traditional metabolic flux analysis (MFA) using an overdetermined system of equations is a well-established practice in metabolic engineering. Despite the continued evolution of the methodology since its introduction, there has been little focus on validation and identification of poor model fit outside of identifying "gross measurement error". The growing complexity of metabolic models, which are increasingly generated from genome-level data, has necessitated robust validation that can directly assess model fit. In this work, MFA calculation is framed as a generalized least squares (GLS) problem, highlighting the applicability of the common t-test for model validation. To differentiate between measurement and model error, we simulate ideal flux profiles directly from the model, perturb them with estimated measurement error, and compare their validation to real data. Application of this strategy to an established Chinese Hamster Ovary (CHO) cell model shows how fluxes validated by traditional means may be largely non-significant due to a lack of model fit. With further simulation, we explore how t-test significance relates to calculation error and show that fluxes found to be non-significant have 2- to 4-fold larger error (if measurement uncertainty is in the 5-10% range). The proposed validation method goes beyond traditional detection of "gross measurement error" to identify lack of fit between model and data. Although the focus of this work is on t-test validation and traditional MFA, the presented framework is readily applicable to other regression analysis methods and MFA formulations.
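    The GLS framing is compact enough to sketch. Below, an overdetermined toy system (not the CHO model) is solved by generalized least squares and each estimated flux is t-tested against its standard error; the matrix, rates, and covariance values are invented.

      # Sketch: metabolic flux estimation as generalized least squares,
      # with a t-test on each estimated flux.
      import numpy as np
      from scipy import stats

      A = np.array([[1., -1.,  0.],      # balance/measurement equations A v = b
                    [0.,  1., -1.],
                    [1.,  0., -1.],
                    [1.,  0.,  0.],
                    [0.,  0.,  1.]])
      b = np.array([0., 0., 0., 2.1, 1.9])                  # measured rates
      Sigma = np.diag([0.05, 0.05, 0.05, 0.1, 0.1]) ** 2    # meas. covariance

      W = np.linalg.inv(Sigma)
      cov_v = np.linalg.inv(A.T @ W @ A)   # covariance of the GLS estimate
      v_hat = cov_v @ A.T @ W @ b

      dof = len(b) - A.shape[1]            # residual degrees of freedom
      for i, (v, se) in enumerate(zip(v_hat, np.sqrt(np.diag(cov_v)))):
          p = 2.0 * stats.t.sf(abs(v / se), dof)
          print(f"v{i}: {v:.2f} +/- {se:.2f} (p = {p:.3g})")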

  10. Participation in Decision Making as a Property of Complex Adaptive Systems: Developing and Testing a Measure

    PubMed Central

    Anderson, Ruth A.; Hsieh, Pi-Ching; Su, Hui Fang; Landerman, Lawrence R.; McDaniel, Reuben R.

    2013-01-01

    Objectives. To (1) describe participation in decision-making as a systems-level property of complex adaptive systems and (2) present empirical evidence of reliability and validity of a corresponding measure. Method. Study 1 was a mail survey of a single respondent (administrators or directors of nursing) in each of 197 nursing homes. Study 2 was a field study using a random, proportionally stratified sampling procedure that included 195 organizations with 3,968 respondents. Analysis. In Study 1, we analyzed the data to reduce the number of scale items and establish initial reliability and validity. In Study 2, we strengthened the psychometric test using a large sample. Results. Results demonstrated validity and reliability of the participation in decision-making instrument (PDMI) while measuring participation of workers in two distinct job categories (RNs and CNAs). We established reliability at the organizational level using aggregated item scores. We established validity of the multidimensional properties using convergent and discriminant validity and confirmatory factor analysis. Conclusions. Participation in decision making, when modeled as a systems-level property of organization, has multiple dimensions and is more complex than is being traditionally measured. Managers can use this model to form decision teams that maximize the depth and breadth of expertise needed and to foster connection among them. PMID:24349771

  11. Participation in decision making as a property of complex adaptive systems: developing and testing a measure.

    PubMed

    Anderson, Ruth A; Plowman, Donde; Corazzini, Kirsten; Hsieh, Pi-Ching; Su, Hui Fang; Landerman, Lawrence R; McDaniel, Reuben R

    2013-01-01

    Objectives. To (1) describe participation in decision-making as a systems-level property of complex adaptive systems and (2) present empirical evidence of reliability and validity of a corresponding measure. Method. Study 1 was a mail survey of a single respondent (administrators or directors of nursing) in each of 197 nursing homes. Study 2 was a field study using a random, proportionally stratified sampling procedure that included 195 organizations with 3,968 respondents. Analysis. In Study 1, we analyzed the data to reduce the number of scale items and establish initial reliability and validity. In Study 2, we strengthened the psychometric test using a large sample. Results. Results demonstrated validity and reliability of the participation in decision-making instrument (PDMI) while measuring participation of workers in two distinct job categories (RNs and CNAs). We established reliability at the organizational level using aggregated item scores. We established validity of the multidimensional properties using convergent and discriminant validity and confirmatory factor analysis. Conclusions. Participation in decision making, when modeled as a systems-level property of organization, has multiple dimensions and is more complex than is being traditionally measured. Managers can use this model to form decision teams that maximize the depth and breadth of expertise needed and to foster connection among them.

  12. Genomic Prediction Accounting for Genotype by Environment Interaction Offers an Effective Framework for Breeding Simultaneously for Adaptation to an Abiotic Stress and Performance Under Normal Cropping Conditions in Rice.

    PubMed

    Ben Hassen, Manel; Bartholomé, Jérôme; Valè, Giampiero; Cao, Tuong-Vi; Ahmadi, Nourollah

    2018-05-09

    Developing rice varieties adapted to alternate wetting and drying water management is crucial for the sustainability of irrigated rice cropping systems. Here we report the first study exploring the feasibility of breeding rice for adaptation to alternate wetting and drying using genomic prediction methods that account for genotype by environment interactions. Two breeding populations (a reference panel of 284 accessions and a progeny population of 97 advanced lines) were evaluated under alternate wetting and drying and continuous flooding management systems. The predictive ability of genomic prediction for response variables (index of relative performance and the slope of the joint regression) and for multi-environment genomic prediction models were compared. For the three traits considered (days to flowering, panicle weight and nitrogen-balance index), significant genotype by environment interactions were observed in both populations. In cross-validation, predictive ability for the index was on average lower (0.31) than that of the slope of the joint regression (0.64), regardless of the trait considered. Similar results were found for progeny validation. Both cross-validation and progeny validation experiments showed that the performance of multi-environment models predicting unobserved phenotypes of untested entries was similar to the performance of single-environment models, with differences in predictive ability ranging from -6% to 4% depending on the trait and on the statistical model concerned. The predictive ability of multi-environment models predicting unobserved phenotypes of entries evaluated under both water management systems outperformed single-environment models by an average of 30%. Practical implications for breeding rice for adaptation to the alternate wetting and drying system are discussed. Copyright © 2018, G3: Genes, Genomes, Genetics.

  13. Model Validation of an RSRM Transporter Through Full-scale Operational and Modal Testing

    NASA Technical Reports Server (NTRS)

    Brillhart, Ralph; Davis, Joshua; Allred, Bradley

    2009-01-01

    The Reusable Solid Rocket Motor (RSRM) segments, which are part of the current Space Shuttle system and will provide the first stage of the Ares launch vehicle, must be transported from their manufacturing facility in Promontory, Utah, to a railhead in Corinne, Utah. This approximately 25-mile trip on secondary paved roads is accomplished using a special transporter system which lifts and conveys each individual segment. ATK Launch Systems (ATK) has recently obtained a new set of these transporters from Scheuerle, a company in Germany. The transporter is a 96-wheel, dual tractor vehicle that supports the payload via a hydraulic suspension. Since this system is a different design than was previously used, computer modeling with validation via test is required to ensure that the environment to which the segment is exposed is not too severe for this space-critical hardware. Accurate prediction of the loads imparted to the rocket motor is essential in order to prevent damage to the segment. To develop and validate a finite element model capable of such accurate predictions, ATA Engineering, Inc., teamed with ATK to perform a modal survey of the transport system, including a forward RSRM segment. A set of electrodynamic shakers was placed around the transporter at locations capable of exciting the transporter vehicle dynamics. Forces from the shakers with varying phase combinations were applied using sinusoidal sweep excitation. The relative phase of the shaker forcing functions was adjusted to match the shape characteristics of each of several target modes, thereby customizing each sweep run for exciting a particular mode. The resulting frequency response functions (FRF) from this series of sine sweeps allowed identification of all target modes and other higher-order modes, allowing good comparison to the finite element model. Furthermore, the survey-derived modal frequencies were correlated with peak frequencies observed during road-going operating tests. This correlation enabled verification of the most significant modes contributing to real-world loading of the motor segment under transport. After traditional model updating, dynamic simulation of the transportation environment was compared to the measured operating data to provide further validation of the analysis model. KEYWORDS: validation, correlation, modal test, rocket motor, transporter

  14. Aeroservoelastic Model Validation and Test Data Analysis of the F/A-18 Active Aeroelastic Wing

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.; Prazenica, Richard J.

    2003-01-01

    Model validation and flight test data analysis require careful consideration of the effects of uncertainty, noise, and nonlinearity. Uncertainty prevails in the data analysis techniques and results in a composite model uncertainty from unmodeled dynamics, assumptions and mechanics of the estimation procedures, noise, and nonlinearity. A fundamental requirement for reliable and robust model development is an attempt to account for each of these sources of error, in particular, for model validation, robust stability prediction, and flight control system development. This paper is concerned with data processing procedures for uncertainty reduction in model validation for stability estimation and nonlinear identification. F/A-18 Active Aeroelastic Wing (AAW) aircraft data is used to demonstrate signal representation effects on uncertain model development, stability estimation, and nonlinear identification. Data are denoised using adaptive orthonormal best-basis and wavelet-basis signal decompositions before being passed to linear and nonlinear identification algorithms. Nonlinear identification from a wavelet-based Volterra kernel procedure is used to extract nonlinear dynamics from aeroelastic responses, and to assist model development and uncertainty reduction for model validation and stability prediction by removing a class of nonlinearity from the uncertainty.

  15. Validating a biometric authentication system: sample size requirements.

    PubMed

    Dass, Sarat C; Zhu, Yongfang; Jain, Anil K

    2006-12-01

    Authentication systems based on biometric features (e.g., fingerprint impressions, iris scans, human face images, etc.) are increasingly gaining widespread use and popularity. Often, vendors and owners of these commercial biometric systems claim impressive performance that is estimated based on some proprietary data. In such situations, there is a need to independently validate the claimed performance levels. System performance is typically evaluated by collecting biometric templates from n different subjects, and for convenience, acquiring multiple instances of the biometric for each of the n subjects. Very little work has been done in (1) constructing confidence regions based on the ROC curve for validating the claimed performance levels and (2) determining the required number of biometric samples needed to establish confidence regions of prespecified width for the ROC curve. To simplify the analysis that addresses these two problems, several previous studies have assumed that multiple acquisitions of the biometric entity are statistically independent. This assumption is too restrictive and is generally not valid. We have developed a validation technique based on multivariate copula models for correlated biometric acquisitions. Based on the same model, we also determine the minimum number of samples required to achieve confidence bands of desired width for the ROC curve. We illustrate the estimation of the confidence bands as well as the required number of biometric samples using a fingerprint matching system that is applied on samples collected from a small population.
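    A percentile-bootstrap confidence band conveys the basic idea. The sketch below resamples genuine and impostor scores independently, which is exactly the simplification the copula model above is designed to avoid, so treat it only as a baseline; all scores are synthetic.

      # Sketch: bootstrap confidence band for an ROC curve from match scores.
      import numpy as np

      rng = np.random.default_rng(0)
      genuine = rng.normal(2.0, 1.0, 400)       # stand-in matcher scores
      impostor = rng.normal(0.0, 1.0, 4000)
      thresholds = np.linspace(-3.0, 5.0, 100)

      def roc(gen, imp, thr):
          tpr = np.array([(gen >= t).mean() for t in thr])
          fpr = np.array([(imp >= t).mean() for t in thr])
          return fpr, tpr

      tprs = []
      for _ in range(500):                      # bootstrap replicates
          g = rng.choice(genuine, genuine.size, replace=True)
          i = rng.choice(impostor, impostor.size, replace=True)
          tprs.append(roc(g, i, thresholds)[1])

      lower, upper = np.percentile(tprs, [2.5, 97.5], axis=0)
      # Band width at the claimed operating threshold indicates whether the
      # sample size supports the vendor's claimed true-accept rate.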

  16. Development and Validation of Linear Alternator Models for the Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Metscher, Jonathan F.; Lewandowski, Edward

    2014-01-01

    Two models of the linear alternator of the Advanced Stirling Convertor (ASC) have been developed using the Sage 1-D modeling software package. The first model relates the piston motion to electric current by means of a motor constant. The second uses electromagnetic model components to model the magnetic circuit of the alternator. The models are tuned and validated using test data and compared against each other. Results show both models can be tuned to achieve results within 7% of ASC test data under normal operating conditions. Using Sage enables a complete ASC model to be developed and simulations to be completed quickly compared to more complex multi-dimensional models. These models allow for better insight into overall Stirling convertor performance, aid with Stirling power system modeling, and will, in the future, support NASA mission planning for Stirling-based power systems.

  17. Development and Validation of Linear Alternator Models for the Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Metscher, Jonathan F.; Lewandowski, Edward J.

    2015-01-01

    Two models of the linear alternator of the Advanced Stirling Convertor (ASC) have been developed using the Sage 1-D modeling software package. The first model relates the piston motion to electric current by means of a motor constant. The second uses electromagnetic model components to model the magnetic circuit of the alternator. The models are tuned and validated using test data and also compared against each other. Results show both models can be tuned to achieve results within 7% of ASC test data under normal operating conditions. Using Sage enables a complete ASC model to be developed and simulations to be completed quickly compared to more complex multi-dimensional models. These models allow for better insight into overall Stirling convertor performance, aid with Stirling power system modeling, and will, in the future, support NASA mission planning for Stirling-based power systems.

  18. Application of Game Theory to Improve the Defense of the Smart Grid

    DTIC Science & Technology

    2012-03-01

    Computer Systems and Networks ... Trust Models ... systems. In this environment, developers assumed deterministic communications mediums rather than the "best effort" models provided in most modern ... models or computational models to validate the SPSs design. Finally, the study reveals concerns about the performance of load rejection schemes

  19. SU-F-J-41: Experimental Validation of a Cascaded Linear System Model for MVCBCT with a Multi-Layer EPID

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Y; Rottmann, J; Myronakis, M

    2016-06-15

    Purpose: The purpose of this study was to validate the use of a cascaded linear system model for MV cone-beam CT (CBCT) using a multi-layer (MLI) electronic portal imaging device (EPID) and provide experimental insight into image formation. A validated 3D model provides insight into salient factors affecting reconstructed image quality, allowing potential for optimizing detector design for CBCT applications. Methods: A cascaded linear system model was developed to investigate the potential improvement in reconstructed image quality for MV CBCT using an MLI EPID. Inputs to the three-dimensional (3D) model include projection-space MTF and NPS. Experimental validation was performed on a prototype MLI detector installed on the portal imaging arm of a Varian TrueBeam radiotherapy system. CBCT scans of up to 898 projections over 360 degrees were acquired at exposures of 16 and 64 MU. Image volumes were reconstructed using a Feldkamp-type (FDK) filtered backprojection (FBP) algorithm. Flat field images and scans of a Catphan model 604 phantom were acquired. The effect of 2×2 and 4×4 detector binning was also examined. Results: Using projection flat fields as an input, examination of the modeled and measured NPS in the axial plane exhibits good agreement. Binning projection images was shown to improve axial slice SDNR by a factor of approximately 1.4. This improvement is largely driven by a decrease in image noise of roughly 20%. However, this effect is accompanied by a subsequent loss in image resolution. Conclusion: The measured axial NPS shows good agreement with the theoretical calculation using a linear system model. Binning of projection images improves SNR of large objects on the Catphan phantom by decreasing noise. Specific imaging tasks will dictate the implementation of image binning for two-dimensional projection images. The project was partially supported by a grant from Varian Medical Systems, Inc. and grant No. R01CA188446-01 from the National Cancer Institute.

  20. A ferrofluid based energy harvester: Computational modeling, analysis, and experimental validation

    NASA Astrophysics Data System (ADS)

    Liu, Qi; Alazemi, Saad F.; Daqaq, Mohammed F.; Li, Gang

    2018-03-01

    A computational model is described and implemented in this work to analyze the performance of a ferrofluid based electromagnetic energy harvester. The energy harvester converts ambient vibratory energy into an electromotive force through a sloshing motion of a ferrofluid. The computational model solves the coupled Maxwell's equations and Navier-Stokes equations for the dynamic behavior of the magnetic field and fluid motion. The model is validated against experimental results for eight different configurations of the system. The validated model is then employed to study the underlying mechanisms that determine the electromotive force of the energy harvester. Furthermore, computational analysis is performed to test the effect of several modeling aspects, such as three-dimensional effects, surface tension, and the type of ferrofluid-magnetic field coupling, on the accuracy of the model prediction.

  1. Agent-Based Simulation for Interconnection-Scale Renewable Integration and Demand Response Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chassin, David P.; Behboodi, Sahand; Crawford, Curran

    This paper collects and synthesizes the technical requirements, implementation, and validation methods for quasi-steady agent-based simulations of interconnection-scale models, with particular attention to the integration of renewable generation and controllable loads. Approaches for modeling aggregated controllable loads are presented and placed in the same control and economic modeling framework as generation resources for interconnection planning studies. Model performance is examined with system parameters that are typical for an interconnection approximately the size of the Western Electricity Coordinating Council (WECC) and a control area about 1/100 the size of the system. These results are used to demonstrate and validate the methods presented.

  2. Agent-Based Simulation for Interconnection-Scale Renewable Integration and Demand Response Studies

    DOE PAGES

    Chassin, David P.; Behboodi, Sahand; Crawford, Curran; ...

    2015-12-23

    This paper collects and synthesizes the technical requirements, implementation, and validation methods for quasi-steady agent-based simulations of interconnection-scale models, with particular attention to the integration of renewable generation and controllable loads. Approaches for modeling aggregated controllable loads are presented and placed in the same control and economic modeling framework as generation resources for interconnection planning studies. Model performance is examined with system parameters that are typical for an interconnection approximately the size of the Western Electricity Coordinating Council (WECC) and a control area about 1/100 the size of the system. These results are used to demonstrate and validate the methods presented.

  3. Analysis, testing, and evaluation of faulted and unfaulted Wye, Delta, and open Delta connected electromechanical actuators

    NASA Technical Reports Server (NTRS)

    Nehl, T. W.; Demerdash, N. A.

    1983-01-01

    Mathematical models capable of simulating the transient, steady state, and faulted performance characteristics of various brushless dc machine-PSA (power switching assembly) configurations were developed. These systems are intended for possible future use as prime movers in EMAs (electromechanical actuators) for flight control applications. These machine-PSA configurations include wye, delta, and open-delta connected systems. The research performed under this contract was initially broken down into the following six tasks: development of mathematical models for various machine-PSA configurations; experimental validation of the model for failure modes; experimental validation of the mathematical model for shorted turn-failure modes; tradeoff study; and documentation of results and methodology.

  4. Verification and Validation for Flight-Critical Systems (VVFCS)

    NASA Technical Reports Server (NTRS)

    Graves, Sharon S.; Jacobsen, Robert A.

    2010-01-01

    On March 31, 2009 a Request for Information (RFI) was issued by NASA's Aviation Safety Program to gather input on the subject of Verification and Validation (V&V) of Flight-Critical Systems. The responses were provided to NASA on or before April 24, 2009. The RFI asked for comments in three topic areas: Modeling and Validation of New Concepts for Vehicles and Operations; Verification of Complex Integrated and Distributed Systems; and Software Safety Assurance. There were a total of 34 responses to the RFI, representing a cross-section of academia (26%), small and large industry (47%), and government agencies (27%).

  5. Evaluation of the lambda model for human postural control during ankle strategy.

    PubMed

    Micheau, Philippe; Kron, Aymeric; Bourassa, Paul

    2003-09-01

    An accurate modeling of human stance might be helpful in assessing postural deficits. The objective of this article is to validate a mathematical postural control model for quiet standing posture. The postural dynamics is modeled in the sagittal plane as an inverted pendulum with torque applied at the ankle joint. The torque control system is represented by the physiological lambda model. Two neurophysiological command variables of the central nervous system, designated lambda and mu, establish the dynamic muscle threshold at which motoneuron recruitment begins. Kinematic data and electromyographic signals were collected on four young males in order to measure small voluntary sway and quiet standing posture. Validation of the mathematical model was achieved through comparison of the experimental and simulated results. The mathematical model allows computation of the unmeasurable neurophysiological commands lambda and mu that control the equilibrium position and stability. Furthermore, with the model it is possible to conclude that low-amplitude body sway during quiet stance is commanded by the central nervous system.
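    A minimal sketch of the plant and control law described above, assuming a point-mass inverted pendulum and a threshold-style ankle controller in which torque is recruited only when the angle-plus-velocity signal exceeds the command lambda; all parameter values are illustrative, not fitted values from the study.

      # Sketch: quiet stance as an inverted pendulum with a lambda-style
      # threshold controller at the ankle.
      import numpy as np

      m, h, g = 70.0, 0.9, 9.81        # body mass (kg), CoM height (m)
      I = m * h ** 2                   # point-mass pendulum inertia
      lam, mu = 0.02, 0.3              # threshold command and velocity weight
      k, c = 1200.0, 150.0             # recruitment gain, damping

      theta, omega, dt = 0.03, 0.0, 0.001
      for _ in range(5000):            # simulate 5 s of stance
          activation = (theta + mu * omega) - lam   # dynamic threshold signal
          torque = -k * max(activation, 0.0) - c * omega
          omega += dt * (m * g * h * np.sin(theta) + torque) / I
          theta += dt * omega
      print(f"settled sway angle: {np.degrees(theta):.2f} deg")

    Because the gravitational stiffness m*g*h is smaller than the recruitment gain k, the loop settles where the two torques balance, and shifting lambda shifts that equilibrium; this is the sense in which the commands control both equilibrium position and stability.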

  6. Langley 16- Ft. Transonic Tunnel Pressure Sensitive Paint System

    NASA Technical Reports Server (NTRS)

    Sprinkle, Danny R.; Obara, Clifford J.; Amer, Tahani R.; Leighty, Bradley D.; Carmine, Michael T.; Sealey, Bradley S.; Burkett, Cecil G.

    2001-01-01

    This report describes the NASA Langley 16-Ft. Transonic Tunnel Pressure Sensitive Paint (PSP) System and presents results of a test conducted June 22-23, 2000 in the tunnel to validate the PSP system. The PSP system provides global surface pressure measurements on wind tunnel models. The system was developed and installed by PSP Team personnel of the Instrumentation Systems Development Branch and the Advanced Measurement and Diagnostics Branch. A discussion of the results of the validation test follows a description of the system and a description of the test.

  7. Servo-hydraulic actuator in controllable canonical form: Identification and experimental validation

    NASA Astrophysics Data System (ADS)

    Maghareh, Amin; Silva, Christian E.; Dyke, Shirley J.

    2018-02-01

    Hydraulic actuators have been widely used to experimentally examine structural behavior at multiple scales. Real-time hybrid simulation (RTHS) is one innovative testing method that largely relies on such servo-hydraulic actuators. In RTHS, interface conditions must be enforced in real time, and controllers are often used to achieve tracking of the desired displacements. Thus, neglecting the dynamics of the hydraulic transfer system may result either in system instability or in sub-optimal performance. Herein, we propose a nonlinear dynamical model for a servo-hydraulic actuator (a.k.a. hydraulic transfer system) coupled with a nonlinear physical specimen. The nonlinear dynamical model is transformed into controllable canonical form for subsequent tracking-control design. Through a number of experiments, the controllable canonical model is validated.
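    For readers unfamiliar with the form, the sketch below places a generic third-order transfer function into controllable canonical state-space form; the coefficients are placeholders, not identified values from the paper.

      # Sketch: controllable canonical realization of
      # G(s) = b0 / (s^3 + a2*s^2 + a1*s + a0).
      import numpy as np

      a2, a1, a0 = 40.0, 1500.0, 9000.0   # hypothetical denominator
      b0 = 9000.0                          # hypothetical numerator gain

      A = np.array([[0.0, 1.0, 0.0],
                    [0.0, 0.0, 1.0],
                    [-a0, -a1, -a2]])      # companion matrix
      B = np.array([[0.0], [0.0], [1.0]])
      C = np.array([[b0, 0.0, 0.0]])

      # In this form the controllability matrix [B, AB, A^2B] is always
      # full rank, which is what makes it convenient for control design.
      ctrb = np.hstack([B, A @ B, A @ A @ B])
      print("controllable:", np.linalg.matrix_rank(ctrb) == 3)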

  8. Verification, Validation and Sensitivity Studies in Computational Biomechanics

    PubMed Central

    Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.

    2012-01-01

    Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646

  9. Designing an Agent-Based Model Using Group Model Building: Application to Food Insecurity Patterns in a U.S. Midwestern Metropolitan City.

    PubMed

    Koh, Keumseok; Reno, Rebecca; Hyder, Ayaz

    2018-04-01

    Recent advances in computing resources have increased interest in systems modeling and population health. While group model building (GMB) has been effectively applied in developing system dynamics (SD) models, few studies have used GMB for developing an agent-based model (ABM). This article explores the use of a GMB approach to develop an ABM focused on food insecurity. In our GMB workshops, we modified a set of the standard GMB scripts to develop and validate an ABM in collaboration with local experts and stakeholders. Based on this experience, we learned that GMB is a useful collaborative modeling platform for modelers and community experts to address local population health issues. We also provide suggestions for increasing the use of the GMB approach to develop rigorous, useful, and validated ABMs.

  10. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.

  11. Operational characterisation of requirements and early validation environment for high demanding space systems

    NASA Technical Reports Server (NTRS)

    Barro, E.; Delbufalo, A.; Rossi, F.

    1993-01-01

    The definition of some modern high demanding space systems requires a different approach to system definition and design from that adopted for traditional missions. System functionality is strongly coupled to the operational analysis, aimed at characterizing the dynamic interactions of the flight element with its surrounding environment and its ground control segment. Unambiguous functional, operational and performance requirements are to be defined for the system, thus also improving the successive development stages. This paper proposes a Petri net based methodology for the operational analysis of space systems through the dynamic modeling of their functions, together with two related prototype applications (ARISTOTELES orbit control and Hermes telemetry generation) and a computer-aided environment (ISIDE). ISIDE is able to make the dynamic model work, thus enabling an early validation of the system functional representation, and provides a structured system requirements database: the shared knowledge base interconnecting static and dynamic applications, fully traceable with the models and interfaceable with the external world.

  12. Evaluation of dynamical models: dissipative synchronization and other techniques.

    PubMed

    Aguirre, Luis Antonio; Furtado, Edgar Campos; Tôrres, Leonardo A B

    2006-12-01

    Some recent developments for the validation of nonlinear models built from data are reviewed. Besides giving an overall view of the field, a procedure is proposed and investigated based on the concept of dissipative synchronization between the data and the model, which is very useful in validating models that should reproduce dominant dynamical features, like bifurcations, of the original system. In order to assess the discriminating power of the procedure, four well-known benchmarks have been used: namely, Duffing-Ueda, Duffing-Holmes, and van der Pol oscillators, plus the Hénon map. The procedure, developed for discrete-time systems, is focused on the dynamical properties of the model, rather than on statistical issues. For all the systems investigated, it is shown that the discriminating power of the procedure is similar to that of bifurcation diagrams--which in turn is much greater than, say, that of correlation dimension--but at a much lower computational cost.
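    The dissipative-synchronization idea is easy to demonstrate: drive a copy of the candidate model with the measured series through a dissipative coupling term and judge the model by its residual synchronization error. The sketch below does this for a Duffing-Ueda oscillator, using a deliberately mis-parameterized candidate model; the gain and parameter values are illustrative.

      # Sketch: model validation by dissipative synchronization. The "data"
      # stream comes from a reference oscillator; the candidate model (with a
      # perturbed cubic stiffness) is coupled to it and its sync error logged.
      import numpy as np

      dt, n, k = 0.01, 60000, 10.0     # step (s), samples, coupling gain

      def duffing(x, v, t, mu):
          return v, -mu * x ** 3 - 0.1 * v + 11.0 * np.cos(t)

      x, v = 0.1, 0.0                  # reference system  (mu = 1.00)
      xm, vm = 0.0, 0.0                # candidate model   (mu = 1.05)
      err = np.empty(n)
      for i in range(n):
          t = i * dt
          dx, dv = duffing(x, v, t, 1.00)
          dxm, dvm = duffing(xm, vm, t, 1.05)
          x, v = x + dt * dx, v + dt * dv
          xm = xm + dt * (dxm + k * (x - xm))   # dissipative coupling terms
          vm = vm + dt * (dvm + k * (v - vm))
          err[i] = np.hypot(x - xm, v - vm)

      print("mean sync error:", err[n // 2:].mean())  # small => dynamics match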

  13. Development of an Effective System Identification and Control Capability for Quad-copter UAVs

    NASA Astrophysics Data System (ADS)

    Wei, Wei

    In recent years, with the promise of extensive commercial applications, the popularity of Unmanned Aerial Vehicles (UAVs) has dramatically increased, as witnessed by publications and mushrooming research and educational programs. Over the years, multi-copter aircraft have been chosen as a viable configuration for small-scale VTOL UAVs in the form of quad-copters, hexa-copters and octo-copters. Compared to a single main rotor configuration such as the conventional helicopter, multi-copter airframes require a simpler feedback control system and fewer mechanical parts. These characteristics make these UAV platforms, such as the quad-copter emphasized in this dissertation, rugged and competitive candidates for many applications in both military and civil areas. Because of its configuration and relative size, the small-scale quad-copter UAV system is inherently very unstable. In order to develop an effective control system through simulation techniques, obtaining an accurate dynamic model of a given quad-copter is imperative. Moreover, given the anticipated stringent safety requirements, fault tolerance will be a crucial component of UAV certification. Accurate dynamic modeling and control of this class of UAV is an enabling technology and is imperative for future commercial applications. In this work, the dynamic model of a quad-copter system in hover flight was identified using frequency-domain system identification techniques. A new and unique experimental system, data acquisition and processing procedure was developed catering specifically to the class of electric powered multi-copter UAV systems. The Comprehensive Identification from FrEquency Responses (CIFER) software package, developed by the US Army Aviation Development Directorate (AFDD), was utilized along with flight tests to develop dynamic models of the quad-copter system. A new set of flight tests was conducted and the predictive capability of the dynamic models was successfully validated. A PID controller and two fuzzy logic controllers were developed based on the validated dynamic models. The controller performances were evaluated and compared in both a simulation environment and flight testing. Flight controllers were optimized to comply with the US Aeronautical Design Standard Performance Specification Handling Qualities Requirements for Military Rotorcraft (ADS-33E-PRF). Results showed a substantial improvement for the developed controllers when compared to the nominal controllers based on hand tuning. The scope of this research involves experimental system hardware and software development, flight instrumentation, flight testing, dynamics modeling, system identification, dynamic model validation, control system modeling using PID and fuzzy logic, analysis of handling qualities, flight control optimization and validation. Both closed-loop and open-loop dynamics of the quad-copter system were analyzed. A cost-effective and high-quality system identification procedure was applied, and the results were proven in simulations as well as in flight tests.

  14. Developing and validating a model to predict the success of an IHCS implementation: the Readiness for Implementation Model.

    PubMed

    Wen, Kuang-Yi; Gustafson, David H; Hawkins, Robert P; Brennan, Patricia F; Dinauer, Susan; Johnson, Pauley R; Siegler, Tracy

    2010-01-01

    To develop and validate the Readiness for Implementation Model (RIM). This model predicts a healthcare organization's potential for success in implementing an interactive health communication system (IHCS). The model consists of seven weighted factors, with each factor containing five to seven elements. Two decision-analytic approaches, self-explicated and conjoint analysis, were used to measure the weights of the RIM with a sample of 410 experts. The weighted RIM was then validated in a prospective study of 25 IHCS implementation cases. An orthogonal main-effects design was used to develop 700 conjoint-analysis profiles, which varied on seven factors. Each of the 410 experts rated the importance and desirability of the factors and their levels, as well as a set of 10 different profiles. For the prospective 25-case validation, three time-repeated measures of the RIM scores were collected for comparison with the implementation outcomes. Two of the seven factors, 'organizational motivation' and 'meeting user needs,' were found to be most important in predicting implementation readiness. No statistically significant difference was found in the predictive validity of the two approaches (self-explicated and conjoint analysis). The RIM was a better predictor of the 1-year implementation outcome than of the half-year outcome. The expert sample, the order of the survey tasks, the additive model, and basing the RIM cut-off score on experience are possible limitations of the study. The RIM needs to be empirically evaluated in institutions adopting an IHCS and sustaining the system in the long term.
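    The additive structure is simple to sketch. Below, seven factors are combined as a weighted sum into a readiness score; only the two factor names reported above come from the study, and all other names, weights, and ratings are invented placeholders.

      # Sketch: a weighted additive readiness score in the shape of the RIM
      # (seven weighted factors, each scored from rated elements).
      weights = {
          "organizational motivation": 0.22,   # reported as most important
          "meeting user needs": 0.20,          # reported as most important
          "leadership support": 0.12,          # remaining names are invented
          "resources": 0.12,
          "technical readiness": 0.12,
          "workflow fit": 0.11,
          "external environment": 0.11,
      }
      ratings = {                              # element ratings averaged to 0-1
          "organizational motivation": 0.8, "meeting user needs": 0.7,
          "leadership support": 0.6, "resources": 0.5,
          "technical readiness": 0.9, "workflow fit": 0.6,
          "external environment": 0.4,
      }
      score = sum(weights[f] * ratings[f] for f in weights)
      print(f"readiness score: {score:.2f}")   # compare to an experience-based cut-off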

  15. Simulation of Climate Change Impacts on Wheat-Fallow Cropping Systems

    USDA-ARS?s Scientific Manuscript database

    Agricultural system simulation models are predictive tools for assessing climate change impacts on crop production. In this study, RZWQM2, which contains the DSSAT 4.0-CERES model, was evaluated for simulating climate change impacts on wheat growth. The model was calibrated and validated using data fro...

  16. System Identification of a Heaving Point Absorber: Design of Experiment and Device Modeling

    DOE PAGES

    Bacelli, Giorgio; Coe, Ryan; Patterson, David; ...

    2017-04-01

    Empirically based modeling is an essential aspect of design for a wave energy converter. These models are used in structural, mechanical and control design processes, as well as for performance prediction. The design of experiments and the methods used to produce models from collected data have a strong impact on the quality of the model. This study considers the system identification and model validation process based on data collected from a wave tank test of a model-scale wave energy converter. Experimental design and data processing techniques based on general system identification procedures are discussed and compared with the practices often followed for wave tank testing. The general system identification processes are shown to have a number of advantages. The experimental data is then used to produce multiple models for the dynamics of the device. These models are validated and their performance is compared against one another. Furthermore, while most models of wave energy converters use a formulation with wave elevation as an input, this study shows that a model using a hull pressure sensor to incorporate the wave excitation phenomenon has better accuracy.
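    A common building block in this kind of campaign is a nonparametric frequency-response estimate between a measured input channel (wave elevation or hull pressure) and the device response, H(f) = S_xy(f)/S_xx(f), with coherence as a data-quality check. The sketch below runs the estimate on synthetic stand-in signals; the resonance and all constants are invented.

      # Sketch: empirical transfer function estimate from tank-test channels.
      import numpy as np
      from scipy import signal

      fs = 50.0                                  # sample rate (Hz)
      rng = np.random.default_rng(1)
      x = rng.normal(size=int(600 * fs))         # broadband excitation channel

      f0, r = 1.0, 0.98                          # synthetic "device": resonant
      theta = 2 * np.pi * f0 / fs                # AR(2) filter peaking near 1 Hz
      y = signal.lfilter([1.0], [1.0, -2 * r * np.cos(theta), r * r], x)
      y = y + 0.05 * rng.normal(size=x.size)     # sensor noise

      f, Sxy = signal.csd(x, y, fs=fs, nperseg=2048)
      _, Sxx = signal.welch(x, fs=fs, nperseg=2048)
      H = Sxy / Sxx                              # empirical transfer function
      _, coh = signal.coherence(x, y, fs=fs, nperseg=2048)
      print("peak |H| at %.2f Hz" % f[np.argmax(np.abs(H))])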

  17. System Identification of a Heaving Point Absorber: Design of Experiment and Device Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bacelli, Giorgio; Coe, Ryan; Patterson, David

    Empirically based modeling is an essential aspect of design for a wave energy converter. These models are used in structural, mechanical and control design processes, as well as for performance prediction. The design of experiments and the methods used to produce models from collected data have a strong impact on the quality of the model. This study considers the system identification and model validation process based on data collected from a wave tank test of a model-scale wave energy converter. Experimental design and data processing techniques based on general system identification procedures are discussed and compared with the practices often followed for wave tank testing. The general system identification processes are shown to have a number of advantages. The experimental data is then used to produce multiple models for the dynamics of the device. These models are validated and their performance is compared against one another. Furthermore, while most models of wave energy converters use a formulation with wave elevation as an input, this study shows that a model using a hull pressure sensor to incorporate the wave excitation phenomenon has better accuracy.

  17. Technical Note: Procedure for the calibration and validation of kilo-voltage cone-beam CT models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vilches-Freixas, Gloria; Létang, Jean Michel; Rit,

    2016-09-15

    Purpose: The aim of this work is to propose a general and simple procedure for the calibration and validation of kilo-voltage cone-beam CT (kV CBCT) models against experimental data. Methods: The calibration and validation of the CT model is a two-step procedure: first the source model, then the detector model. The source is described by the direction-dependent photon energy spectrum at each voltage, while the detector is described by the pixel intensity value as a function of the direction and energy of incident photons. The measurements for the source consist of a series of dose measurements in air performed at each voltage with varying filter thicknesses and materials in front of the x-ray tube. The measurements for the detector are acquisitions of projection images using the same filters and several tube voltages. The proposed procedure has been applied to calibrate and assess the accuracy of simple models of the source and the detector of three commercial kV CBCT units. If the CBCT system models had been calibrated differently, the current procedure would have been used exclusively to validate the models. Several high-purity attenuation filters of aluminum, copper, and silver, combined with a dosimeter sensitive to the range of voltages of interest, were used. A sensitivity analysis of the model has also been conducted for each parameter of the source and the detector models. Results: Average deviations between experimental and theoretical dose values are below 1.5% after calibration for the three x-ray sources. The predicted energy deposited in the detector agrees with experimental data within 4% for all imaging systems. Conclusions: The authors developed and applied an experimental procedure to calibrate and validate any model of the source and the detector of a CBCT unit. The present protocol has been successfully applied to three x-ray imaging systems. The minimum requirements in terms of material and equipment would make its implementation suitable in most clinical environments.
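
    To make the calibration step concrete: recovering spectrum-bin weights from filtered dose measurements can be posed as a small non-negative least-squares problem. The sketch below uses made-up attenuation coefficients, filter thicknesses, and a synthetic "true" spectrum purely for illustration; the paper's actual source and detector parameterizations are not reproduced.

        import numpy as np
        from scipy.optimize import nnls

        # Hypothetical energy bins and a stand-in Al attenuation curve (1/mm).
        E = np.linspace(20.0, 120.0, 21)            # keV
        mu_al = 0.5 * np.exp(-E / 30.0) + 0.02

        # Filter thicknesses used for the dose measurements (mm Al).
        thickness = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 12.0, 16.0])

        # System matrix: A[i, j] = transmission of bin j through filter i.
        A = np.exp(-np.outer(thickness, mu_al))

        # Synthetic "measured" doses from a hidden true spectrum (demo only).
        rng = np.random.default_rng(1)
        w_true = np.exp(-0.5 * ((E - 60.0) / 15.0) ** 2)
        d_meas = A @ w_true * (1.0 + 0.01 * rng.standard_normal(thickness.size))

        # Calibration: non-negative least squares recovers spectrum weights.
        w_fit, _ = nnls(A, d_meas)

        # Validation: predicted vs. measured doses (target deviation ~1.5%).
        rel_dev = (A @ w_fit - d_meas) / d_meas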

  18. Validation of a FAST model of the Statoil-Hywind Demo floating wind turbine

    DOE PAGES

    Driscoll, Frederick; Jonkman, Jason; Robertson, Amy; ...

    2016-10-13

    To assess the accuracy of the National Renewable Energy Laboratory's (NREL's) FAST simulation tool for modeling the coupled response of floating offshore wind turbines under realistic open-ocean conditions, NREL developed a FAST model of the Statoil Hywind Demo floating offshore wind turbine and validated simulation results against field measurements. Field data were provided by Statoil, which conducted a comprehensive measurement campaign of its demonstration system, a 2.3-MW Siemens turbine mounted on a spar substructure deployed about 10 km off the island of Karmoy in Norway. A top-down approach was used to develop the FAST model, starting with modeling the blades and working down to the mooring system. Design data provided by Siemens and Statoil were used to specify the structural, aerodynamic, and dynamic properties. Measured wind speeds and wave spectra were used to develop the wind and wave conditions used in the model. The overall system performance and behavior were validated for eight sets of field measurements that span a wide range of operating conditions. The simulated controller response accurately reproduced the measured blade pitch and power, and the structural and blade loads and the spectra of platform motion agree well with the measured data.

  19. Development and validation of a modified Hybrid-III six-year-old dummy model for simulating submarining in motor-vehicle crashes.

    PubMed

    Hu, Jingwen; Klinich, Kathleen D; Reed, Matthew P; Kokkolaras, Michael; Rupp, Jonathan D

    2012-06-01

    In motor-vehicle crashes, young school-aged children restrained by vehicle seat belt systems often suffer abdominal injuries due to submarining. However, the current anthropomorphic test device, the so-called "crash dummy," is not adequate for properly simulating submarining. In this study, a modified Hybrid-III six-year-old dummy model capable of simulating and predicting submarining was developed using MADYMO (TNO Automotive Safety Solutions). The model incorporated improved pelvis and abdomen geometry and properties previously tested in a modified physical dummy. The model was calibrated and validated against four sled tests under two test conditions, with and without submarining, using a multi-objective optimization method. A sensitivity analysis using this validated child dummy model showed that dummy knee excursion, torso rotation angle, and the difference between head and knee excursions were good predictors of submarining status. It was also shown that restraint system design variables, such as lap belt angle, D-ring height, and seat coefficient of friction (COF), may have opposite effects on head and abdomen injury risks; therefore, child dummies and dummy models capable of simulating submarining are crucial for future restraint system design optimization for young school-aged children. Copyright © 2011 IPEM. Published by Elsevier Ltd. All rights reserved.

  1. Infrared thermography for detection of laminar-turbulent transition in low-speed wind tunnel testing

    NASA Astrophysics Data System (ADS)

    Joseph, Liselle A.; Borgoltz, Aurelien; Devenport, William

    2016-05-01

    This work presents the details of a system for experimentally identifying laminar-to-turbulent transition using infrared thermography applied to large, metal models in low-speed wind tunnel tests. Key elements of the transition detection system include infrared cameras with sensitivity in the 7.5- to 14.0-µm spectral range and a thin, insulating coat for the model. The fidelity of the system was validated through experiments on two wind-turbine blade airfoil sections tested at Reynolds numbers between Re = 1.5 × 10⁶ and 3 × 10⁶. Results compare well with measurements from surface pressure distributions and stethoscope observations. However, the infrared-based system provides data over a much broader range of conditions and locations on the model. This paper chronicles the design, implementation and validation of the infrared transition detection system, a subject which has not been widely detailed in the literature to date.

  2. MATLAB/Simulink Pulse-Echo Ultrasound System Simulator Based on Experimentally Validated Models.

    PubMed

    Kim, Taehoon; Shin, Sangmin; Lee, Hyongmin; Lee, Hyunsook; Kim, Heewon; Shin, Eunhee; Kim, Suhwan

    2016-02-01

    A flexible clinical ultrasound system must operate with different transducers, which have characteristic impulse responses and widely varying impedances. The impulse response determines the shape of the high-voltage pulse that is transmitted and the specifications of the front-end electronics that receive the echo; the impedance determines the specification of the matching network through which the transducer is connected. System-level optimization of these subsystems requires accurate modeling of pulse-echo (two-way) response, which in turn demands a unified simulation of the ultrasonics and electronics. In this paper, this is realized by combining MATLAB/Simulink models of the high-voltage transmitter, the transmission interface, the acoustic subsystem which includes wave propagation and reflection, the receiving interface, and the front-end receiver. To demonstrate the effectiveness of our simulator, the models are experimentally validated by comparing the simulation results with the measured data from a commercial ultrasound system. This simulator could be used to quickly provide system-level feedback for an optimized tuning of electronic design parameters.
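
    In its crudest linear form, the unified pulse-echo simulation idea reduces to a chain of convolutions: the transmit pulse passes through the transducer response twice (transmit and receive) and through the medium's reflectivity. The Python sketch below stands in for the paper's MATLAB/Simulink models; the pulse shape, center frequency, and reflectivity values are invented for the demo.

        import numpy as np

        fs = 100e6                      # sample rate, Hz (assumed)
        t = np.arange(0.0, 4e-6, 1.0 / fs)

        # Stand-in transducer impulse response: Gaussian-windowed 5 MHz burst.
        f0 = 5e6
        h = np.exp(-((t - 1e-6) ** 2) / (2 * (0.15e-6) ** 2)) \
            * np.sin(2 * np.pi * f0 * t)

        # Idealized transmit excitation: a single 0.1 us rectangular pulse.
        x = np.zeros_like(t)
        x[: int(0.1e-6 * fs)] = 1.0

        # Sparse reflectivity along the beam, mapped to round-trip delay.
        r = np.zeros(4000)
        r[[800, 2000, 3500]] = [1.0, 0.5, 0.25]

        # Pulse-echo (two-way) response: excitation convolved with the
        # transducer response twice and with the reflectivity sequence.
        echo = np.convolve(np.convolve(np.convolve(x, h), h), r)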

  3. On the validity of specific rate constants (kSA) in Fe0/H2O systems.

    PubMed

    Noubactep, C

    2009-05-30

    The validity of the specific reaction rate constants (k(SA)) in modelling contaminant removal in Fe(0)/H(2)O systems is questioned. It is shown that the current k(SA) model does not consider the large reactive surface area provided by the in-situ formed oxide film, and thus neglects the adsorptive interactions between contaminants and film materials. Furthermore, neither the dynamic nature of film formation nor the fact that the Fe(0) surface is shielded by the film is considered. Suggestions are made as to how the k(SA) model could be further developed to meet its original goal.
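
    For reference, the k(SA) formulation under discussion is usually written as a first-order rate law normalized to the iron surface area concentration; a common form from the literature (notation varies between authors) is:

        \frac{dC}{dt} = -k_{SA}\,\rho_a\,C
        \quad\Longrightarrow\quad
        C(t) = C_0\,e^{-k_{SA}\,\rho_a\,t},
        \qquad
        \rho_a = a_s\,\rho_m

    where C is the contaminant concentration, k_{SA} the surface-area-normalized rate constant (L m^-2 h^-1), a_s the specific surface area of the iron (m^2 g^-1), and rho_m the iron mass loading (g L^-1). The abstract's point is that this lumped first-order form accounts only for the bare Fe(0) surface, ignoring the oxide film's own surface area and its adsorption of contaminants.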

  4. FIRE Science Results 1989

    NASA Technical Reports Server (NTRS)

    Mcdougal, David S. (Editor)

    1990-01-01

    FIRE (First ISCCP Regional Experiment) is a U.S. cloud-radiation research program formed in 1984 to increase the basic understanding of cirrus and marine stratocumulus cloud systems, to develop realistic parameterizations for these systems, and to validate and improve ISCCP cloud product retrievals. Presentations of results culminating the first 5 years of FIRE research activities were highlighted. The 1986 Cirrus Intensive Field Observations (IFO), the 1987 Marine Stratocumulus IFO, the Extended Time Observations (ETO), and modeling activities are described. Collaborative efforts involving the comparison of multiple data sets, incorporation of data measurements into modeling activities, validation of ISCCP cloud parameters, and development of parameterization schemes for General Circulation Models (GCMs) are described.

  5. Application of a Computer Model to Various Specifications of Fuel Injection System for DI Diesel Engines

    NASA Astrophysics Data System (ADS)

    Yamanishi, Manabu

    A combined experimental and computational investigation was performed in order to evaluate the effects of various design parameters of an in-line injection pump on the nozzle exit characteristics for DI diesel engines. Measurements of the pump chamber pressure and the delivery valve lift were included for validation, using specially designed transducers installed inside the pump. The results confirm that the simulation model is capable of predicting the pump operation for all of the designs and operating conditions investigated. Following the successful validation of this model, parametric studies were performed that allow for improved fuel injection system design.

  6. How is the surface Atlantic water inflow through the Gibraltar Strait forecasted? A lagrangian validation of operational oceanographic services in the Alboran Sea and the Western Mediterranean

    NASA Astrophysics Data System (ADS)

    Sotillo, M. G.; Amo-Baladrón, A.; Padorno, E.; Garcia-Ladona, E.; Orfila, A.; Rodríguez-Rubio, P.; Conti, D.; Madrid, J. A. Jiménez; de los Santos, F. J.; Fanjul, E. Alvarez

    2016-11-01

    An exhaustive validation of some of the operational ocean forecast products available in the Gibraltar Strait and the Alboran Sea is presented here. The skill of two Eulerian ocean model solutions (the regional CMEMS IBI forecast system and the high-resolution PdE SAMPA system) in reproducing the complex surface dynamics of these areas is evaluated. To this aim, in-situ measurements from the MEDESS-GIB drifter buoy database (comprising Lagrangian positions, derived velocities and SST values) are used as the observational reference, and the temporal coverage of the validation is 3 months (September to December 2014). Two metrics, a Lagrangian separation distance and a skill score, are applied to evaluate the performance of the modelling systems in reproducing the observed trajectories. In addition, the model SST solutions are validated against the in-situ drifter data and complemented with L3 satellite SST products. The Copernicus regional IBI products are evaluated in an extended domain, beyond the Alboran Sea, covering western Mediterranean waters. This analysis reveals some strengths of the regional solution (i.e. realistic values of the Atlantic Jet in the Strait of Gibraltar area, realistic simulation of the Algerian Current). However, some shortcomings are also identified, the major one being related to the simulated geographical position and intensity of the Alboran Gyres, particularly the western one. This performance limitation affects the IBI-modelled surface circulation in the entire Alboran Sea. On the other hand, the SAMPA system shows more accurate performance and realistically reproduces the observed surface circulation in the area. The results reflect the effectiveness of the dynamical downscaling performed by the SAMPA system with respect to the regional IBI solution (in which SAMPA is nested), providing an objective measure of the potential added value introduced by the SAMPA downscaling solution in the Alboran Sea.
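
    The two trajectory metrics mentioned are straightforward to compute. A minimal sketch, assuming the commonly used Liu-Weisberg normalized cumulative separation skill score (the paper's exact definitions may differ), follows:

        import numpy as np

        def haversine(lat1, lon1, lat2, lon2, R=6371e3):
            # Great-circle distance (m) between coordinate arrays in degrees.
            p1, p2 = np.radians(lat1), np.radians(lat2)
            dl = np.radians(np.asarray(lon2) - np.asarray(lon1))
            a = (np.sin((p2 - p1) / 2) ** 2
                 + np.cos(p1) * np.cos(p2) * np.sin(dl / 2) ** 2)
            return 2 * R * np.arcsin(np.sqrt(a))

        def trajectory_skill(obs_lat, obs_lon, mod_lat, mod_lon, n=1.0):
            # Separation d_i between modelled and observed fixes, normalized
            # by the cumulative length l_i of the observed track; skill
            # ss = 1 - c/n for c <= n, else 0 (Liu-Weisberg form).
            d = haversine(obs_lat, obs_lon, mod_lat, mod_lon)
            seg = haversine(obs_lat[:-1], obs_lon[:-1],
                            obs_lat[1:], obs_lon[1:])
            l = np.cumsum(seg)
            c = np.sum(d[1:]) / np.sum(l)
            return max(0.0, 1.0 - c / n)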

  7. Multi-Node Thermal System Model for Lithium-Ion Battery Packs: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Ying; Smith, Kandler; Wood, Eric

    Temperature is one of the main factors that controls degradation in lithium-ion batteries. Accurate knowledge and control of cell temperatures in a pack helps the battery management system (BMS) maximize cell utilization and ensure pack safety and service life. In a pack with arrays of cells, a cell's temperature is affected not only by its own thermal characteristics but also by its neighbors, the cooling system, and the pack configuration, which increase the noise level and the complexity of cell-temperature prediction. This work proposes to model the thermal behavior of lithium-ion packs using a multi-node thermal network model, which predicts the cell temperatures by zones. The model was parametrized and validated using commercial lithium-ion battery packs.
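
    A multi-node (lumped RC) thermal network reduces to a small ODE system: each zone has a heat capacity, conduction paths to neighboring zones, and a convective path to the coolant. The sketch below integrates a hypothetical three-zone network with forward Euler; all parameter values are invented, not the paper's.

        import numpy as np

        C = np.array([800.0, 800.0, 800.0])  # J/K per zone (assumed)
        G = 1.5                              # W/K between adjacent zones
        G_c = 0.8                            # W/K zone-to-coolant
        T_c = 25.0                           # coolant temperature, degC
        Q = np.array([2.0, 3.0, 2.0])        # heat generation per zone, W

        T = np.full(3, 25.0)                 # initial zone temperatures
        dt = 1.0                             # time step, s
        for _ in range(3600):                # one hour of simulated time
            dT = np.zeros(3)
            dT[0] = (Q[0] + G * (T[1] - T[0]) + G_c * (T_c - T[0])) / C[0]
            dT[1] = (Q[1] + G * (T[0] - T[1]) + G * (T[2] - T[1])
                     + G_c * (T_c - T[1])) / C[1]
            dT[2] = (Q[2] + G * (T[1] - T[2]) + G_c * (T_c - T[2])) / C[2]
            T += dT * dt                     # forward-Euler update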

  8. The Model Analyst’s Toolkit: Scientific Model Development, Analysis, and Validation

    DTIC Science & Technology

    2014-05-20

    but there can still be many recommendations generated. Therefore, the recommender results are displayed in a sortable table where each row is a...reporting period. Since the synthesis graph can be complex and have many dependencies, the system must determine the order of evaluation of nodes, and...validation failure, if any. 3.1. Automatic Feature Extraction In many domains, causal models can often be more readily described as patterns of

  9. Proposal and validation of a new model to estimate survival for hepatocellular carcinoma patients.

    PubMed

    Liu, Po-Hong; Hsu, Chia-Yang; Hsia, Cheng-Yuan; Lee, Yun-Hsuan; Huang, Yi-Hsiang; Su, Chien-Wei; Lee, Fa-Yauh; Lin, Han-Chieh; Huo, Teh-Ia

    2016-08-01

    The survival of hepatocellular carcinoma (HCC) patients is heterogeneous. We aim to develop and validate a simple prognostic model to estimate survival for HCC patients (MESH score). A total of 3182 patients were randomised into derivation and validation cohorts. Multivariate analysis was used to identify independent predictors of survival in the derivation cohort. The validation cohort was employed to examine the prognostic capabilities. The MESH score allocates 1 point for each of the following parameters: large tumour (beyond the Milan criteria), presence of vascular invasion or metastasis, Child-Turcotte-Pugh score ≥6, performance status ≥2, serum alpha-fetoprotein level ≥20 ng/ml, and serum alkaline phosphatase ≥200 IU/L, up to a maximum of 6 points. In the validation cohort, significant survival differences were found across all MESH scores from 0 to 6 (all p < 0.01). The MESH system was associated with the highest homogeneity and lowest corrected Akaike information criterion when compared with the Barcelona Clínic Liver Cancer, Hong Kong Liver Cancer (HKLC), Cancer of the Liver Italian Program, Taipei Integrated Scoring, and model to estimate survival in ambulatory HCC patients systems. The prognostic accuracy of the MESH score remained constant in patients with hepatitis B- or hepatitis C-related HCC. The MESH score can also discriminate survival for patients from early to advanced stages of HCC. This newly proposed simple and accurate survival model provides enhanced prognostic accuracy for HCC. The MESH system is a useful supplement to the BCLC and HKLC classification schemes in refining treatment strategies. Copyright © 2016 Elsevier Ltd. All rights reserved.
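
    The scoring rule itself is simple enough to state as code. The sketch below encodes the six one-point criteria exactly as listed in the abstract; the variable names are illustrative.

        def mesh_score(beyond_milan: bool,
                       vascular_invasion_or_metastasis: bool,
                       ctp_score: int,
                       performance_status: int,
                       afp_ng_ml: float,
                       alp_iu_l: float) -> int:
            """One point per criterion, 0-6 total (abstract thresholds)."""
            return sum([
                beyond_milan,
                vascular_invasion_or_metastasis,
                ctp_score >= 6,
                performance_status >= 2,
                afp_ng_ml >= 20,
                alp_iu_l >= 200,
            ])

        # Example: beyond Milan criteria with AFP 35 ng/ml scores 2 points.
        assert mesh_score(True, False, 5, 0, 35.0, 120.0) == 2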

  10. Construction, internal validation and implementation in a mobile application of a scoring system to predict nonadherence to proton pump inhibitors.

    PubMed

    Mares-García, Emma; Palazón-Bru, Antonio; Folgado-de la Rosa, David Manuel; Pereira-Expósito, Avelino; Martínez-Martín, Álvaro; Cortés-Castell, Ernesto; Gil-Guillén, Vicente Francisco

    2017-01-01

    Other studies have assessed nonadherence to proton pump inhibitors (PPIs), but none has developed a screening test for its detection. To construct and internally validate a predictive model for nonadherence to PPIs. This prospective observational study with a one-month follow-up was carried out in 2013 in Spain, and included 302 patients with a prescription for PPIs. The primary variable was nonadherence to PPIs (pill count). Secondary variables were gender, age, antidepressants, type of PPI, non-guideline-recommended prescription (NGRP) of PPIs, and total number of drugs. With the secondary variables, a binary logistic regression model to predict nonadherence was constructed and adapted to a points system. The ROC curve, with its area (AUC), was calculated and the optimal cut-off point was established. The points system was internally validated through 1,000 bootstrap samples and implemented in a mobile application (Android). The points system had three prognostic variables: total number of drugs, NGRP of PPIs, and antidepressants. The AUC was 0.87 (95% CI [0.83-0.91], p < 0.001). The test yielded a sensitivity of 0.80 (95% CI [0.70-0.87]) and a specificity of 0.82 (95% CI [0.76-0.87]). The three parameters were very similar in the bootstrap validation. A points system to predict nonadherence to PPIs has been constructed, internally validated and implemented in a mobile application. Provided similar results are obtained in external validation studies, we will have a screening tool to detect nonadherence to PPIs.
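
    A points system of this kind maps each predictor to an integer score and compares the total to a cut-off. The sketch below mirrors the three reported predictors only in shape; every point value and the threshold are hypothetical placeholders, since the fitted coefficients are not reproduced in the abstract.

        def ppi_nonadherence_points(total_drugs: int,
                                    ngrp: bool,
                                    antidepressants: bool) -> int:
            """Hypothetical point assignment over the three reported
            predictors (total number of drugs, NGRP of PPIs,
            antidepressant use)."""
            points = 2 * total_drugs          # hypothetical: 2 points/drug
            points += 5 if ngrp else 0        # hypothetical weight
            points += 4 if antidepressants else 0
            return points

        def screen_positive(points: int, cutoff: int = 10) -> bool:
            """Flag likely nonadherence at a hypothetical cut-off; the
            study derived its own optimal cut-off from the ROC curve."""
            return points >= cutoff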

  11. Formal specification and design techniques for wireless sensor and actuator networks.

    PubMed

    Martínez, Diego; González, Apolinar; Blanes, Francisco; Aquino, Raúl; Simo, José; Crespo, Alfons

    2011-01-01

    A current trend in the development and implementation of industrial applications is to use wireless networks to connect the system nodes, mainly to increase application flexibility, reliability and portability, as well as to reduce the implementation cost. However, the nondeterministic and concurrent behavior of distributed systems makes their analysis and design complex, often resulting in less than satisfactory performance in simulation and test bed scenarios; this is caused by using imprecise models to analyze, validate and design these systems. Moreover, some simulation platforms do not support these models. This paper presents a design and validation method for Wireless Sensor and Actuator Networks (WSAN) which is supported by a minimal set of wireless components represented in Colored Petri Nets (CPN). In summary, the model presented allows users to verify the design properties and structural behavior of the system.

  12. Analysis of various quality attributes of sunflower and soybean plants by near infra-red reflectance spectroscopy: Development and validation of calibration models

    USDA-ARS's Scientific Manuscript database

    Sunflower and soybean are summer annuals that can be grown as an alternative to corn and may be particularly useful in organic production systems. Rapid and low cost methods of analyzing plant quality would be helpful for crop management. We developed and validated calibration models for Near-infrar...

  13. Accelerated Aging in Electrolytic Capacitors for Prognostics

    NASA Technical Reports Server (NTRS)

    Celaya, Jose R.; Kulkarni, Chetan; Saha, Sankalita; Biswas, Gautam; Goebel, Kai Frank

    2012-01-01

    The focus of this work is the analysis of different degradation phenomena based on thermal-overstress and electrical-overstress accelerated aging systems, and the use of accelerated aging techniques for prognostics algorithm development. Results from thermal-overstress and electrical-overstress experiments are presented. In addition, preliminary results toward the development of physics-based degradation models are presented, focusing on the electrolyte evaporation failure mechanism. An empirical degradation model based on percentage capacitance loss under electrical overstress is presented and used in: (i) a Bayesian implementation of model-based prognostics using a discrete Kalman filter for health state estimation, and (ii) a dynamic system representation of the degradation model for forecasting and remaining useful life (RUL) estimation. A leave-one-out validation methodology is used to assess the validity of the methodology under the small-sample-size constraint. The RUL estimation results are consistent across the validation tests when comparing relative accuracy and prediction error. The inaccuracy of the model in representing the change in degradation behavior observed at the end of the test data is also consistent throughout the validation tests, indicating the need for a more detailed degradation model or for an algorithm that could estimate model parameters online. The need for more sophisticated degradation models is further supported by the degradation observed under different stress intensities with rest periods: the current model does not represent the capacitance recovery over rest periods following an accelerated aging stress period.
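
    The pairing of an empirical degradation model with a discrete Kalman filter can be sketched with a scalar filter tracking percentage capacitance loss; the degradation rate, noise variances, and failure threshold below are hypothetical stand-ins, not the paper's fitted values.

        import numpy as np

        dt = 1.0            # hours between measurements (assumed)
        rate = 0.05         # % capacitance loss per hour (hypothetical)
        q, r = 1e-4, 0.2    # process / measurement noise variances (assumed)

        x, p = 0.0, 1.0     # initial loss estimate and its variance
        fail_at = 20.0      # % loss failure threshold (assumed)

        rng = np.random.default_rng(2)
        for k in range(1, 201):
            # Predict: constant-rate empirical degradation model.
            x, p = x + rate * dt, p + q
            # Simulated noisy measurement of capacitance loss.
            z = rate * dt * k + rng.normal(0.0, np.sqrt(r))
            # Update with the Kalman gain.
            K = p / (p + r)
            x, p = x + K * (z - x), (1.0 - K) * p

        # RUL: time until the estimated loss reaches the threshold.
        rul = max(0.0, (fail_at - x) / rate)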

  14. Reverse engineering biomolecular systems using -omic data: challenges, progress and opportunities.

    PubMed

    Quo, Chang F; Kaddi, Chanchala; Phan, John H; Zollanvari, Amin; Xu, Mingqing; Wang, May D; Alterovitz, Gil

    2012-07-01

    Recent advances in high-throughput biotechnologies have led to rapidly growing research interest in reverse engineering of biomolecular systems (REBMS). 'Data-driven' approaches, i.e. data mining, can be used to extract patterns from large volumes of biochemical data at molecular-level resolution, while 'design-driven' approaches, i.e. systems modeling, can be used to simulate emergent system properties. Consequently, both data- and design-driven approaches applied to -omic data may lead to novel insights in reverse engineering biological systems that could not be expected before using low-throughput platforms. However, several challenges remain in this fast-growing field: (i) integrating heterogeneous biochemical data for data mining, (ii) combining top-down and bottom-up approaches for systems modeling and (iii) validating system models experimentally. In addition to reviewing progress made by the community and opportunities encountered in addressing these challenges, we explore the emerging field of synthetic biology, which is an exciting approach to validate and analyze theoretical system models directly through experimental synthesis, i.e. analysis-by-synthesis. The ultimate goal is to address the present and future challenges in reverse engineering biomolecular systems (REBMS) using an integrated workflow of data mining, systems modeling and synthetic biology.

  15. Testability of evolutionary game dynamics based on experimental economics data

    NASA Astrophysics Data System (ADS)

    Wang, Yijia; Chen, Xiaojie; Wang, Zhijian

    2017-11-01

    Understanding the dynamic processes of a real game system requires an appropriate dynamics model, and rigorously testing a dynamics model is nontrivial. In our methodological research, we develop an approach to testing the validity of game dynamics models that considers the dynamic patterns of angular momentum and speed as measurement variables. Using Rock-Paper-Scissors (RPS) games as an example, we illustrate the geometric patterns in the experiment data. We then derive the related theoretical patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, we show that the validity of these models can be evaluated quantitatively. Our approach establishes a link between dynamics models and experimental systems, which is, to the best of our knowledge, the most effective and rigorous strategy for ascertaining the testability of evolutionary game dynamics models.

  16. Development and validation of quasi-steady-state heat pump water heater model having stratified water tank and wrapped-tank condenser

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Bo; Nawaz, Kashif; Baxter, Van D.

    Heat pump water heater (HPWH) systems introduce new challenges for design and modeling tools, because they require a vapor compression system balanced with a water storage tank. In addition, a wrapped-tank condenser coil is strongly coupled with a stratified water tank, which makes HPWH simulation a transient process. To tackle these challenges and deliver an effective, hardware-based HPWH equipment design tool, a quasi-steady-state HPWH model was developed based on the DOE/ORNL Heat Pump Design Model (HPDM). Two new component models were added in this study. One is a one-dimensional stratified water tank model, an improvement on the open-source EnergyPlus water tank model, that introduces a calibration factor to account for the bulk mixing effect due to water draws, circulations, etc. The other is a wrapped-tank condenser coil model, using a segment-to-segment modeling approach. The HPWH system model was validated against available experimental data and then used in parametric simulations to determine the effects of various design factors.
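
    The one-dimensional stratified tank idea, including a mixing calibration factor, can be sketched as a stack of nodes exchanging heat vertically. Every parameter below (node count, tank size, conductance, the f_mix factor) is a hypothetical placeholder, not a value from the HPDM-based model.

        import numpy as np

        N = 10
        T = np.linspace(55.0, 45.0, N)  # node temperatures, top to bottom, degC
        m = 200.0 / N                   # kg of water per node (200 L tank)
        cp = 4186.0                     # J/(kg K)
        UA = 0.5                        # W/K node-to-node conductance (assumed)
        f_mix = 3.0                     # calibration multiplier on mixing (>=1)

        dt = 10.0                       # time step, s
        for _ in range(int(3600 / dt)):
            # Inter-node heat flow, scaled by the bulk-mixing calibration
            # factor that mimics draw-induced destratification.
            q = f_mix * UA * np.diff(T)
            dT = np.zeros(N)
            dT[:-1] += q / (m * cp)
            dT[1:] -= q / (m * cp)
            T += dT * dt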

  17. Multi-Evaporator Miniature Loop Heat Pipe for Small Spacecraft Thermal Control. Part 1; New Technologies and Validation Approach

    NASA Technical Reports Server (NTRS)

    Ku, Jentung; Ottenstein, Laura; Douglas, Donya; Hoang, Triem

    2010-01-01

    Under NASA's New Millennium Program Space Technology 8 (ST 8) Project, four experiments (Thermal Loop, Dependable Microprocessor, SAILMAST, and UltraFlex) were conducted to advance the maturity of individual technologies from proof of concept to prototype demonstration in a relevant environment, i.e. from a technology readiness level (TRL) of 3 to a level of 6. This paper presents the new technologies and validation approach of the Thermal Loop experiment. The Thermal Loop is an advanced thermal control system consisting of a miniature loop heat pipe (MLHP) with multiple evaporators and multiple condensers designed for future small-system applications requiring low mass, low power, and compactness. The MLHP retains all features of state-of-the-art loop heat pipes (LHPs) and offers additional advantages to enhance the functionality, performance, versatility, and reliability of the system. Details of the thermal loop concept, technical advances, benefits, objectives, level 1 requirements, and performance characteristics are described. Also included in the paper are descriptions of the test articles and mathematical modeling used for the technology validation. An MLHP breadboard was built and tested in the laboratory and thermal vacuum environments for TRL 4 and TRL 5 validations, and an MLHP proto-flight unit was built and tested in a thermal vacuum chamber for the TRL 6 validation. In addition, an analytical model was developed to simulate the steady-state and transient behaviors of the MLHP during various validation tests. Capabilities and limitations of the analytical model are also addressed.

  18. Early Detection of Increased Intracranial Pressure Episodes in Traumatic Brain Injury: External Validation in an Adult and in a Pediatric Cohort.

    PubMed

    Güiza, Fabian; Depreitere, Bart; Piper, Ian; Citerio, Giuseppe; Jorens, Philippe G; Maas, Andrew; Schuhmann, Martin U; Lo, Tsz-Yan Milly; Donald, Rob; Jones, Patricia; Maier, Gottlieb; Van den Berghe, Greet; Meyfroidt, Geert

    2017-03-01

    A model for early detection of episodes of increased intracranial pressure in traumatic brain injury patients has been previously developed and validated based on retrospective adult patient data from the multicenter Brain-IT database. The purpose of the present study is to validate this early detection model in different cohorts of recently treated adult and pediatric traumatic brain injury patients. Prognostic modeling. Noninterventional, observational, retrospective study. The adult validation cohort comprised recent traumatic brain injury patients from San Gerardo Hospital in Monza (n = 50), Leuven University Hospital (n = 26), Antwerp University Hospital (n = 19), Tübingen University Hospital (n = 18), and Southern General Hospital in Glasgow (n = 8). The pediatric validation cohort comprised patients from neurosurgical and intensive care centers in Edinburgh and Newcastle (n = 79). None. The model's performance was evaluated with respect to discrimination, calibration, overall performance, and clinical usefulness. In the recent adult validation cohort, the model retained excellent performance as in the original study. In the pediatric validation cohort, the model retained good discrimination and a positive net benefit, albeit with a performance drop in the remaining criteria. The obtained external validation results confirm the robustness of the model to predict future increased intracranial pressure events 30 minutes in advance, in adult and pediatric traumatic brain injury patients. These results are a large step toward an early warning system for increased intracranial pressure that can be generally applied. Furthermore, the sparseness of this model that uses only two routinely monitored signals as inputs (intracranial pressure and mean arterial blood pressure) is an additional asset.

  19. Novel Directional Protection Scheme for the FREEDM Smart Grid System

    NASA Astrophysics Data System (ADS)

    Sharma, Nitish

    This research primarily deals with the design and validation of the protection system for a large-scale meshed distribution system. The large scale system simulation (LSSS) is a system-level PSCAD model used to validate component models for different time-scale platforms, providing a virtual testing platform for the Future Renewable Electric Energy Delivery and Management (FREEDM) system. It is also used to validate cases of power system protection, renewable energy integration and storage, and load profiles. Protecting the FREEDM system against any abnormal condition is one of the important tasks. The addition of distributed generation and power-electronic-based solid state transformers adds to the complexity of the protection. The FREEDM loop system has a fault current limiter, and in addition the Solid State Transformer (SST) limits the fault current at 2.0 per unit. Former students at ASU developed a protection scheme using fiber-optic cable; however, during the NSF-FREEDM site visit, the National Science Foundation (NSF) team regarded the system as unsuitable for long distances. Hence, a new protection scheme based on wireless communication is presented in this thesis. The wireless scheme is extended to protect the large-scale meshed distributed generation from any fault. The trip signal generated by the pilot protection system is used to trigger the FID (fault isolation device), an electronic circuit breaker, to switch off (open) the FIDs. The trip signal must also be received and accepted by the SST, which must block its operation immediately. A comprehensive protection system for the large-scale meshed distribution system has been developed in PSCAD with the ability to quickly detect faults, and it is validated with a hardware model built using commercial relays at the ASU power laboratory.

  1. Face, content, and construct validity of human placenta as a haptic training tool in neurointerventional surgery.

    PubMed

    Ribeiro de Oliveira, Marcelo Magaldi; Nicolato, Arthur; Santos, Marcilea; Godinho, Joao Victor; Brito, Rafael; Alvarenga, Alexandre; Martins, Ana Luiza Valle; Prosdocimi, André; Trivelato, Felipe Padovani; Sabbagh, Abdulrahman J; Reis, Augusto Barbosa; Maestro, Rolando Del

    2016-05-01

    OBJECT The development of neurointerventional treatments of central nervous system disorders has resulted in the need for adequate training environments for novice interventionalists. Virtual simulators offer anatomical definition but lack adequate tactile feedback. Animal models, which provide more lifelike training, require an appropriate infrastructure base. The authors describe a training model for neurointerventional procedures using the human placenta (HP), which affords haptic training with significantly fewer resource requirements, and discuss its validation. METHODS Twelve HPs were prepared for simulated endovascular procedures. Training exercises performed by interventional neuroradiologists and novice fellows were placental angiography, stent placement, aneurysm coiling, and intravascular liquid embolic agent injection. RESULTS The endovascular training exercises proposed can be easily reproduced in the HP. Face, content, and construct validity were assessed by 6 neurointerventional radiologists and 6 novice fellows in interventional radiology. CONCLUSIONS The use of HP provides an inexpensive training model for the training of neurointerventionalists. Preliminary validation results show that this simulation model has face and content validity and has demonstrated construct validity for the interventions assessed in this study.

  2. Determining passive cooling limits in CPV using an analytical thermal model

    NASA Astrophysics Data System (ADS)

    Gualdi, Federico; Arenas, Osvaldo; Vossier, Alexis; Dollet, Alain; Aimez, Vincent; Arès, Richard

    2013-09-01

    We propose an original thermal analytical model aiming to predict the practical limits of passive cooling systems for high concentration photovoltaic modules. The analytical model is described and validated by comparison with a commercial 3D finite element model. The limiting performances of flat plate cooling systems in natural convection are then derived and discussed.
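
    As a back-of-the-envelope illustration of what such an analytical model balances, the sketch below estimates the cell-to-ambient temperature rise of a passively cooled CPV receiver from a simple thermal-resistance chain. Every number is an assumed placeholder, not a value from the paper.

        # Heat to reject per unit cell area: concentrated irradiance that
        # is not converted to electricity (all values hypothetical).
        DNI = 900.0          # W/m^2 direct normal irradiance
        C = 500.0            # geometric concentration ratio
        eta = 0.35           # cell conversion efficiency
        q_cell = DNI * C * (1.0 - eta)      # W per m^2 of cell area

        h = 8.0              # W/(m^2 K) natural-convection coefficient
        area_ratio = 100.0   # heat-spreader area per unit cell area
        dT = q_cell / (h * area_ratio)      # cell-to-ambient rise, K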

  3. Development, Validation, and Application of OSSEs at NASA/GMAO

    NASA Technical Reports Server (NTRS)

    Errico, Ronald; Prive, Nikki

    2015-01-01

    During the past several years, NASA Goddard's Global Modeling and Assimilation Office (GMAO) has been developing a framework for conducting Observing System Simulation Experiments (OSSEs). The motivation and design of that framework will be described and a sample of validation results presented. Fundamental issues will be highlighted, particularly the critical importance of appropriately simulating system errors. Some problems that have just arisen in the newest experimental system will also be mentioned.

  4. Modelling of polymer photodegradation for solar cell modules

    NASA Technical Reports Server (NTRS)

    Somersall, A. C.; Guillet, J. E.

    1981-01-01

    A computer program developed to model and calculate, by numerical integration, the varying concentrations of chemical species formed during photooxidation of a polymeric material over time is evaluated; its inputs are a chosen set of elementary reactions, the corresponding rate constants, and a convenient set of starting conditions. Attempts were made to validate the proposed mechanism by experimentally monitoring the photooxidation products of small liquid alkanes, which are useful starting models for the ethylene segments of polymers like EVA. The model system proved inappropriate for the intended purposes. Another validation model is recommended.
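
    The numerical-integration approach described (a set of elementary reactions turned into coupled rate equations) can be sketched for a toy three-step photooxidation scheme; the species, reactions, and rate constants below are invented stand-ins, not the paper's mechanism.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Toy scheme (hypothetical rate constants):
        #   RH  --hv-->  R*              (initiation, k1)
        #   R*  + O2 ->  RO2*            (k2)
        #   RO2* + RH -> ROOH + R*       (propagation, k3)
        k1, k2, k3 = 1e-6, 1e3, 1e-2
        O2 = 1e-3                        # dissolved oxygen, held constant

        def rhs(t, y):
            RH, R, RO2, ROOH = y
            return [-k1 * RH - k3 * RO2 * RH,
                    k1 * RH - k2 * R * O2 + k3 * RO2 * RH,
                    k2 * R * O2 - k3 * RO2 * RH,
                    k3 * RO2 * RH]

        # Stiff kinetic ODE system integrated numerically (LSODA switches
        # between stiff and non-stiff methods automatically).
        sol = solve_ivp(rhs, (0.0, 1e6), [1.0, 0.0, 0.0, 0.0], method="LSODA")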

  5. Quantification of Dynamic Model Validation Metrics Using Uncertainty Propagation from Requirements

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Peck, Jeffrey A.; Stewart, Eric C.

    2018-01-01

    The Space Launch System, NASA's new large launch vehicle for long range space exploration, is presently in the final design and construction phases, with the first launch scheduled for 2019. A dynamic model of the system has been created and is critical for calculation of interface loads and natural frequencies and mode shapes for guidance, navigation, and control (GNC). Because of the program and schedule constraints, a single modal test of the SLS will be performed while bolted down to the Mobile Launch Pad just before the first launch. A Monte Carlo and optimization scheme will be performed to create thousands of possible models based on given dispersions in model properties and to determine which model best fits the natural frequencies and mode shapes from modal test. However, the question still remains as to whether this model is acceptable for the loads and GNC requirements. An uncertainty propagation and quantification (UP and UQ) technique to develop a quantitative set of validation metrics that is based on the flight requirements has therefore been developed and is discussed in this paper. There has been considerable research on UQ and UP and validation in the literature, but very little on propagating the uncertainties from requirements, so most validation metrics are "rules-of-thumb;" this research seeks to come up with more reason-based metrics. One of the main assumptions used to achieve this task is that the uncertainty in the modeling of the fixed boundary condition is accurate, so therefore that same uncertainty can be used in propagating the fixed-test configuration to the free-free actual configuration. The second main technique applied here is the usage of the limit-state formulation to quantify the final probabilistic parameters and to compare them with the requirements. These techniques are explored with a simple lumped spring-mass system and a simplified SLS model. When completed, it is anticipated that this requirements-based validation metric will provide a quantified confidence and probability of success for the final SLS dynamics model, which will be critical for a successful launch program, and can be applied in the many other industries where an accurate dynamic model is required.
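
    A minimal sketch of the limit-state idea follows, using a toy lumped spring-mass model (the kind of system the paper itself uses for exploration) and an invented 5%-of-target frequency requirement: propagate parameter uncertainty through the model, form g = tolerance - |error|, and estimate P(g < 0). The dispersions and requirement are placeholders, not SLS values.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 100_000

        # Propagate parameter dispersions to a natural frequency, Hz:
        # f = sqrt(k/m) / (2*pi) for a lumped spring-mass system.
        k = rng.normal(1.0e6, 5.0e4, n)      # stiffness, N/m (assumed)
        m = rng.normal(250.0, 10.0, n)       # mass, kg (assumed)
        f = np.sqrt(k / m) / (2.0 * np.pi)

        # Requirement: frequency within 5% of a 10 Hz target (assumed).
        target = 10.0
        g = 0.05 * target - np.abs(f - target)   # limit-state function
        p_fail = np.mean(g < 0.0)                # probability of violation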

  6. Validating the ACE Model for Evaluating Student Performance Using a Teaching-Learning Process Based on Computational Modeling Systems

    ERIC Educational Resources Information Center

    Louzada, Alexandre Neves; Elia, Marcos da Fonseca; Sampaio, Fábio Ferrentini; Vidal, Andre Luiz Pestana

    2014-01-01

    The aim of this work is to adapt and test, in a Brazilian public school, the ACE model proposed by Borkulo for evaluating student performance as a teaching-learning process based on computational modeling systems. The ACE model is based on different types of reasoning involving three dimensions. In addition to adapting the model and introducing…

  7. Modelling and simulation of a heat exchanger

    NASA Technical Reports Server (NTRS)

    Xia, Lei; Deabreu-Garcia, J. Alex; Hartley, Tom T.

    1991-01-01

    Two models, for two different control systems, are developed for a parallel heat exchanger. First, by spatially lumping a heat exchanger model, a good approximate model of high system order is produced. Model reduction techniques are then applied to obtain low-order models that are suitable for dynamic analysis and control design. The simulation method is discussed to ensure a valid simulation result.

  8. The Error Prone Model and the Basic Grants Validation Selection System. Draft Final Report.

    ERIC Educational Resources Information Center

    System Development Corp., Falls Church, VA.

    An evaluation of existing and proposed mechanisms to ensure data accuracy for the Pell Grant program is reported, and recommendations for efficient detection of fraud and error in the program are offered. One study objective was to examine the existing system of pre-established criteria (PEC), which are validation criteria that select students on…

  9. Intelligent model-based diagnostics for vehicle health management

    NASA Astrophysics Data System (ADS)

    Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki

    2003-08-01

    The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.

  10. Utilisation of real-scale renewable energy test facility for validation of generic wind turbine and wind power plant controller models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeni, Lorenzo; Hesselbæk, Bo; Bech, John

    This article presents an example of the application of a modern test facility conceived for experiments on the integration of renewable energy in the power system. The capabilities of the test facility are used to validate dynamic simulation models of wind power plants and their controllers. The models are based on standard and generic blocks. The successful validation of events related to the control of active power (control phenomena in the <10 Hz range, including frequency control and power oscillation damping) is described, demonstrating the capabilities of the test facility and laying out the track for future work and improvements.

  11. Sewer solids separation by sedimentation--the problem of modeling, validation and transferability.

    PubMed

    Kutzner, R; Brombach, H; Geiger, W F

    2007-01-01

    Sedimentation of sewer solids in tanks, ponds and similar devices is the most relevant process for the treatment of stormwater and combined sewer overflows in urban collecting systems. In the past, much research was done to develop deterministic models describing this separation process, but these modern models are still not commonly accepted in Germany today. Water authorities are sceptical with regard to model validation and transferability. This paper examines whether that scepticism is reasonable. A framework proposal is presented for the validation of mathematical models with zero- or one-dimensional spatial resolution for particle separation processes in stormwater and combined sewer overflow treatment. The proposal was applied to reputable publications on sewer solids separation by sedimentation. The result was that none of the investigated models described in the literature passed the validation entirely. There is an urgent need for future research in sewer solids sedimentation and remobilization!

  12. Reinforced Carbon-Carbon Subcomponent Flat Plate Impact Testing for Space Shuttle Orbiter Return to Flight

    NASA Technical Reports Server (NTRS)

    Melis, Matthew E.; Brand, Jeremy H.; Pereira, J. Michael; Revilock, Duane M.

    2007-01-01

    Following the tragedy of the Space Shuttle Columbia on February 1, 2003, a major effort commenced to develop a better understanding of debris impacts and their effect on the Space Shuttle subsystems. An initiative to develop and validate physics-based computer models to predict damage from such impacts was a fundamental component of this effort. To develop the models it was necessary to physically characterize Reinforced Carbon-Carbon (RCC) and various debris materials which could potentially shed on ascent and impact the Orbiter RCC leading edges. The validated models enabled the launch system community to use the impact analysis software LS-DYNA to predict damage by potential and actual impact events on the Orbiter leading edge and nose cap thermal protection systems. Validation of the material models was done through a three-level approach: fundamental tests to obtain independent static and dynamic material model properties of materials of interest, sub-component impact tests to provide highly controlled impact test data for the correlation and validation of the models, and full-scale impact tests to establish the final level of confidence for the analysis methodology. This paper discusses the second-level subcomponent test program in detail and its application to the LS-DYNA model validation process. The level two testing consisted of over one hundred impact tests in the NASA Glenn Research Center Ballistic Impact Lab on 6 by 6 in. and 6 by 12 in. flat plates of RCC and evaluated three types of debris projectiles: BX-265 External Tank foam, ice, and PDL-1034 External Tank foam. These impact tests helped determine the level of damage generated in the RCC flat plates by each projectile. The information obtained from this testing validated the LS-DYNA damage prediction models and provided a certain level of confidence to begin performing analysis for full-size RCC test articles for returning NASA to flight with STS-114 and beyond.

  13. Validation Database Based Thermal Analysis of an Advanced RPS Concept

    NASA Technical Reports Server (NTRS)

    Balint, Tibor S.; Emis, Nickolas D.

    2006-01-01

    Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.

  14. An assessment of a North American Multi-Model Ensemble (NMME) based global drought early warning forecast system

    NASA Astrophysics Data System (ADS)

    Wood, E. F.; Yuan, X.; Sheffield, J.; Pan, M.; Roundy, J.

    2013-12-01

    One of the key recommendations of the WCRP Global Drought Information System (GDIS) workshop is to develop an experimental real-time global monitoring and prediction system. While great advances have been made in global drought monitoring based on satellite observations and model reanalysis data, global drought forecasting has stalled, in part because of limited skill both in climate forecast models and in global hydrologic predictions. Having worked on drought monitoring and forecasting over the USA for more than a decade, the Princeton land surface hydrology group is now developing an experimental global drought early warning system based on multiple climate forecast models and a calibrated global hydrologic model. In this presentation, we test its capability in seasonal forecasting of meteorological, agricultural and hydrologic droughts over global major river basins, using precipitation, soil moisture and streamflow forecasts respectively. Based on the joint probability distribution between observations from Princeton's global drought monitoring system and model hindcasts and real-time forecasts from the North American Multi-Model Ensemble (NMME) project, we (i) bias correct the monthly precipitation and temperature forecasts from multiple climate forecast models, (ii) downscale them to a daily time scale, and (iii) use them to drive the calibrated VIC model to produce global drought forecasts at a 1-degree resolution. A parallel run using the ESP forecast method, which is based on resampling historical forcings, is also carried out for comparison. Analysis is conducted over global major river basins, with multiple drought indices that have different time scales and characteristics. The meteorological drought forecast carries no uncertainty from hydrologic models and can be validated directly against observations, making the validation an 'apples-to-apples' comparison. Preliminary results for the evaluation of meteorological drought onset hindcasts indicate that climate models increase drought detectability over ESP by 31%-81%. However, less than 30% of global drought onsets can be detected by climate models. The missed drought events are associated with weak ENSO signals and lower potential predictability. Because of the high false-alarm rate of the climate models, reliability is more important than sharpness for a skillful probabilistic drought onset forecast. Validations and skill assessments for agricultural and hydrologic drought forecasts are carried out using soil moisture and streamflow output from the VIC land surface model (LSM) forced by a global forcing data set. Given our previous drought forecasting experience over the USA and Africa, validating the hydrologic drought forecasts remains a significant challenge for a global drought early warning system.
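
    Step (i), bias correction against a joint observation-hindcast distribution, is often implemented as empirical quantile mapping. A minimal sketch, assuming synthetic climatologies rather than the NMME or Princeton data:

        import numpy as np

        def quantile_map(fcst, fcst_clim, obs_clim):
            """Map each forecast value to the observed-climatology value
            at the same empirical quantile (empirical CDF matching)."""
            # Quantile of each forecast within the hindcast climatology.
            q = np.searchsorted(np.sort(fcst_clim), fcst) / len(fcst_clim)
            q = np.clip(q, 0.0, 1.0)
            # Corresponding value in the observed climatology.
            return np.quantile(obs_clim, q)

        rng = np.random.default_rng(4)
        obs_clim = rng.gamma(2.0, 50.0, 600)         # observed precip, mm
        fcst_clim = 0.7 * rng.gamma(2.0, 50.0, 600)  # dry-biased hindcasts
        corrected = quantile_map(np.array([60.0, 120.0]),
                                 fcst_clim, obs_clim)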

  15. Integrated multiscale biomaterials experiment and modelling: a perspective

    PubMed Central

    Buehler, Markus J.; Genin, Guy M.

    2016-01-01

    Advances in multiscale models and computational power have enabled a broad toolset to predict how molecules, cells, tissues and organs behave and develop. A key theme in biological systems is the emergence of macroscale behaviour from collective behaviours across a range of length and timescales, and a key element of these models is therefore hierarchical simulation. However, this predictive capacity has far outstripped our ability to validate predictions experimentally, particularly when multiple hierarchical levels are involved. The state of the art represents careful integration of multiscale experiment and modelling, and yields not only validation, but also insights into deformation and relaxation mechanisms across scales. We present here a sampling of key results that highlight both challenges and opportunities for integrated multiscale experiment and modelling in biological systems. PMID:28981126

  16. [Nursing on the Web: the creation and validation process of a web site on coronary artery disease].

    PubMed

    Marques, Isaac Rosa; Marin, Heimar de Fátima

    2002-01-01

    The World Wide Web is an important source of health information. A challenge for the Brazilian Nursing Informatics area is to use its potential to promote health education. This paper presents the model used to develop and validate an educational Web site, named CardioSite, on coronary heart disease. Its creation followed a method with phases of conceptual modeling, development, implementation, and evaluation. In the evaluation phase, validation was performed by an online panel of informatics and health experts. The results demonstrated that the information was reliable and valid. Considering that no national official systems are available for this purpose, this model demonstrated effectiveness in assessing the quality of the Web site content.

  17. A Supervised Learning Process to Validate Online Disease Reports for Use in Predictive Models.

    PubMed

    Patching, Helena M M; Hudson, Laurence M; Cooke, Warrick; Garcia, Andres J; Hay, Simon I; Roberts, Mark; Moyes, Catherine L

    2015-12-01

    Pathogen distribution models that predict spatial variation in disease occurrence require data from a large number of geographic locations to generate disease risk maps. Traditionally, this process has used data from public health reporting systems; however, using online reports of new infections could speed up the process dramatically. Data from both public health systems and online sources must be validated before they can be used, but no mechanisms exist to validate data from online media reports. We have developed a supervised learning process to validate geolocated disease outbreak data in a timely manner. The process uses three input features, the data source and two metrics derived from the location of each disease occurrence. The location of disease occurrence provides information on the probability of disease occurrence at that location based on environmental and socioeconomic factors and the distance within or outside the current known disease extent. The process also uses validation scores, generated by disease experts who review a subset of the data, to build a training data set. The aim of the supervised learning process is to generate validation scores that can be used as weights going into the pathogen distribution model. After analyzing the three input features and testing the performance of alternative processes, we selected a cascade of ensembles comprising logistic regressors. Parameter values for the training data subset size, number of predictors, and number of layers in the cascade were tested before the process was deployed. The final configuration was tested using data for two contrasting diseases (dengue and cholera), and 66%-79% of data points were assigned a validation score. The remaining data points are scored by the experts, and the results inform the training data set for the next set of predictors, as well as going to the pathogen distribution model. The new supervised learning process has been implemented within our live site and is being used to validate the data that our system uses to produce updated predictive disease maps on a weekly basis.
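
    The described architecture (a cascade of ensembles of logistic regressors over three input features, with unresolved reports falling through to expert review) can be sketched as follows. The features, thresholds, layer count, and synthetic labels are hypothetical stand-ins; only the overall shape follows the abstract.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(5)
        X = rng.standard_normal((500, 3))   # stand-ins for: data source,
        y = (X @ np.array([0.8, 1.5, -1.0])                 # env. suitability,
             + rng.standard_normal(500) > 0).astype(int)    # distance metric

        def fit_bagged(Xs, ys, n_members=5):
            """Bootstrap-bagged ensemble of logistic regressors."""
            members = []
            for _ in range(n_members):
                idx = rng.integers(0, len(Xs), len(Xs))
                members.append(LogisticRegression().fit(Xs[idx], ys[idx]))
            return members

        scores = np.full(len(X), np.nan)
        unresolved = np.arange(len(X))
        for layer in range(3):                   # 3-layer cascade (assumed)
            if unresolved.size < 50:             # too few points to refit
                break
            members = fit_bagged(X[unresolved], y[unresolved])
            p = np.mean([m.predict_proba(X[unresolved])[:, 1]
                         for m in members], axis=0)
            confident = (p < 0.1) | (p > 0.9)    # hypothetical exit rule
            scores[unresolved[confident]] = p[confident]
            unresolved = unresolved[~confident]
        # Reports never scored by the cascade go to the disease experts.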

  18. Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gougar, Hans

    2015-02-01

    The Department of Energy (DOE) has made significant progress developing simulation tools to predict the behavior of nuclear systems with greater accuracy and increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics and multiple length and time scales. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide the users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users; an unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the needs of an actual user). To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center (NEKVaC or the 'Center') to address the challenges of modern code validation and to manage the knowledge from past, current, and future experimental campaigns. By pulling together the best minds involved in code development, experiment design, and validation to establish and disseminate best practices and new techniques, the Center will be a resource for industry, DOE programs, and academia validation efforts.

  19. Longitudinal train dynamics model for a rail transit simulation system

    DOE PAGES

    Wang, Jinghui; Rakha, Hesham A.

    2018-01-01

    The paper develops a longitudinal train dynamics model in support of microscopic railway transportation simulation. The model can be calibrated without any mechanical data, making it ideal for implementation in transportation simulators. The calibration and validation work is based on data collected from the Portland light rail train fleet. The calibration procedure is mathematically formulated as a constrained non-linear optimization problem. The validity of the model is assessed by comparing instantaneous model predictions against field observations, and also evaluated in the domains of acceleration/deceleration versus speed and acceleration/deceleration versus distance. A test is conducted to investigate the adequacy of the model in simulation implementation. The results demonstrate that the proposed model can adequately capture instantaneous train dynamics and provides good performance in the simulation test. Thus, the model provides a simple theoretical foundation for microscopic simulators and will significantly support the planning, management, and control of railway transportation systems.
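
    The calibration step described above can be illustrated with a minimal sketch: a hypothetical mechanics-free acceleration model a(v) = p0 + p1*v + p2*v^2 (not the paper's actual formulation) is fit to observed speed/acceleration pairs as a constrained nonlinear optimization with SciPy.

        # Sketch only: calibrate an assumed polynomial acceleration-vs-speed
        # model against invented field data, with plausibility constraints.
        import numpy as np
        from scipy.optimize import minimize

        v_obs = np.array([0.0, 5.0, 10.0, 15.0, 20.0])   # m/s, observed speeds
        a_obs = np.array([1.2, 1.0, 0.8, 0.5, 0.3])      # m/s^2, observed accelerations

        def sse(p):
            a_pred = p[0] + p[1] * v_obs + p[2] * v_obs**2
            return np.sum((a_pred - a_obs) ** 2)

        # Constraints keep predictions physically plausible: predicted
        # acceleration must stay non-negative over the observed speed range.
        cons = [{"type": "ineq", "fun": lambda p, v=v: p[0] + p[1]*v + p[2]*v**2}
                for v in v_obs]
        res = minimize(sse, x0=np.zeros(3), constraints=cons)
        print(res.x)   # calibrated parameters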

  1. Force Project Technology Presentation to the NRCC

    DTIC Science & Technology

    2014-02-04

    Functional Bridge components Smart Odometer Adv Pretreatment Smart Bridge Multi-functional Gap Crossing Fuel Automated Tracking System Adv...comprehensive matrix of candidate composite material systems and textile reinforcement architectures via modeling/analyses and testing. Product(s...Validated Dynamic Modeling tool based on parametric study using material models to reliably predict the textile mechanics of the hose

  2. Can MODIS Data Calibrate and Validate Coastal Sediment Transport Models? Rapid Prototyping Using 250 m Data and the ECOMSED Model for Lake Pontchartrain, LA USA

    NASA Technical Reports Server (NTRS)

    Miller, Richard L.; Georgiou, Ioannis; Glorioso, Mark V.; McCorquodale, J. Alex; Crowder, Keely

    2006-01-01

    Field measurements from small boats and sparse arrays of instrumented buoys often do not provide sufficient data to capture the dynamic nature of biogeophysical parameters in many coastal aquatic environments. Several investigators have shown that MODIS 250 m images can provide daily synoptic views of suspended sediment concentration in coastal waters to determine sediment transport and fate. However, the use of MODIS for coastal environments can be limited by a lack of cloud-free images. Sediment transport models are not constrained by sky conditions but often suffer from a lack of in situ observations for model calibration or validation. We demonstrate here the utility of MODIS 250 m data to calibrate (set model parameters), validate output, and set or reset initial conditions of a hydrodynamic and sediment transport model (ECOMSED) developed for Lake Pontchartrain, LA, USA. We present our approach in the context of how to quickly assess, or 'prototype', an application of NASA data to support environmental managers and decision makers. The combination of daily MODIS imagery and model simulations offers a more robust system for monitoring and predicting suspended sediments than either source alone.

  3. Mining for Data

    NASA Technical Reports Server (NTRS)

    1998-01-01

    AbTech Corporation used an F-18 HARV (High Alpha Research Vehicle) simulation developed by NASA to create an interactive computer-based prototype of the MQ (ModelQuest) SV (System Validator) tool. Dryden Flight Research Center provided support to develop, test, and rapidly reprogram the validation function. AbTech's ModelQuest Enterprise is highly automated and outperforms other modeling techniques in quickly discovering meaningful relationships, patterns, and trends in databases. Its users include technical and business professionals in finance, marketing, banking, retail, healthcare, and aerospace.

  4. An Overview of Prognosis Health Management Research at Glenn Research Center for Gas Turbine Engine Structures With Special Emphasis on Deformation and Damage Modeling

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Goldberg, Robert K.; Lerch, Bradley A.; Saleeb, Atef F.

    2009-01-01

    Herein a general, multimechanism, physics-based viscoelastoplastic model is presented in the context of an integrated diagnosis and prognosis methodology which is proposed for structural health monitoring, with particular applicability to gas turbine engine structures. In this methodology, diagnostics and prognostics will be linked through state awareness variable(s). Key technologies which comprise the proposed integrated approach include (1) diagnostic/detection methodology, (2) prognosis/lifing methodology, (3) diagnostic/prognosis linkage, (4) experimental validation, and (5) material data information management system. A specific prognosis lifing methodology, experimental characterization and validation and data information management are the focal point of current activities being pursued within this integrated approach. The prognostic lifing methodology is based on an advanced multimechanism viscoelastoplastic model which accounts for both stiffness and/or strength reduction damage variables. Methods to characterize both the reversible and irreversible portions of the model are discussed. Once the multiscale model is validated the intent is to link it to appropriate diagnostic methods to provide a full-featured structural health monitoring system.

  6. Development and Validation of High Precision Thermal, Mechanical, and Optical Models for the Space Interferometry Mission

    NASA Technical Reports Server (NTRS)

    Lindensmith, Chris A.; Briggs, H. Clark; Beregovski, Yuri; Feria, V. Alfonso; Goullioud, Renaud; Gursel, Yekta; Hahn, Inseob; Kinsella, Gary; Orzewalla, Matthew; Phillips, Charles

    2006-01-01

    SIM PlanetQuest (SIM) is a large optical interferometer for making microarcsecond measurements of the positions of stars and for detecting Earth-sized planets around nearby stars. To achieve this precision, SIM requires stability of optical components to tens of picometers per hour. The combination of SIM's large size (9 m baseline) and the high stability requirement makes it difficult and costly to measure all aspects of system performance on the ground. To reduce risk and cost, and to allow for a design with fewer intermediate testing stages, the SIM project is developing an integrated thermal, mechanical, and optical modeling process that allows predictions of system performance to be made at the required high precision. This modeling process uses commercial, off-the-shelf tools and has been validated against experimental results at the precision of the SIM performance requirements. This paper presents a description of the model development, some of the models, and their validation in the Thermo-Opto-Mechanical (TOM3) testbed, which includes full-scale brassboard optical components and the metrology to test them at the SIM performance requirement levels.

  8. Validation of the 'full reconnection model' of the sawtooth instability in KSTAR

    DOE PAGES

    Nam, Y. B.; Ko, J. S.; Choe, G. H.; ...

    2018-03-26

    In this paper, the central safety factor (q0) during sawtooth oscillation is measured with great accuracy with the motional Stark effect (MSE) system on KSTAR. This measurement alone, however, cannot definitively validate the disputed full and partial reconnection models due to a non-trivial offset error (~0.05). A supplemental experiment on the excited m = 2, m = 3 modes, which are extremely sensitive to the background q0 and core magnetic shear, definitively validates the 'full reconnection model'. The radial position of the excited modes right after the crash, and their time evolution into the 1/1 kink mode before the crash in a sawtoothing plasma, constrain the behavior of q0 in the MHD quiescent period after the crash and before the crash. Finally, an additional measurement of the long-lived m = 3, m = 5 modes in a non-sawtoothing discharge further validates the 'full reconnection model'.

  9. Validation of a "Kane's Dynamics" Model for the Active Rack Isolation System

    NASA Technical Reports Server (NTRS)

    Beech, Geoffrey S.; Hampton, R. David

    2000-01-01

    Many microgravity space-science experiments require vibratory acceleration levels unachievable without active isolation. The Boeing Corporation's Active Rack Isolation System (ARIS) employs a novel combination of magnetic actuation and mechanical linkages to address these isolation requirements on the International Space Station (ISS). ARIS provides isolation at the rack (International Standard Payload Rack, or ISPR) level. Effective model-based vibration isolation requires (1) an isolation device, (2) an adequate dynamic (i.e., mathematical) model of that isolator, and (3) a suitable, corresponding controller. ARIS provides the ISS response to the first requirement. In November 1999, the authors presented a response to the second ("A 'Kane's Dynamics' Model for the Active Rack Isolation System", Hampton and Beech) intended to facilitate an optimal-controls approach to the third. This paper documents the validation of that high-fidelity dynamic model of ARIS. As before, this model contains the full actuator dynamics; however, the umbilical models are not included in this presentation. The validation of this dynamics model was achieved by utilizing two commercial off-the-shelf (COTS) software tools: Deneb's ENVISION and Online Dynamics' AUTOLEV. ENVISION is a robotics software package developed for the automotive industry that employs three-dimensional (3D) Computer Aided Design (CAD) models to facilitate both forward and inverse kinematics analyses. AUTOLEV is a DOS-based interpreter that is designed, in general, to solve vector-based mathematical problems and, specifically, to solve dynamics problems using Kane's method.

  10. Water quality modeling in the systems impact assessment model for the Klamath River basin - Keno, Oregon to Seiad Valley, California

    USGS Publications Warehouse

    Hanna, R. Blair; Campbell, Sharon G.

    2000-01-01

    This report describes the water quality model developed for the Klamath River System Impact Assessment Model (SIAM). The Klamath River SIAM is a decision support system developed by the authors and other US Geological Survey (USGS) Midcontinent Ecological Science Center staff to study the effects of basin-wide water management decisions on anadromous fish in the Klamath River. The Army Corps of Engineers' HEC5Q water quality modeling software was used to simulate water temperature, dissolved oxygen, and conductivity in 100 miles of the Klamath River Basin in Oregon and California. The water quality model simulated three reservoirs and the mainstem Klamath River influenced by the Shasta and Scott River tributaries. Model development, calibration, and two validation exercises are described, as well as the integration of the water quality model into the SIAM decision support system software. Within SIAM, data are exchanged between the water quantity model (MODSIM), the water quality model (HEC5Q), the salmon population model (SALMOD), and methods for evaluating ecosystem health. The overall predictive ability of the water quality model is described in the context of calibration and validation error statistics. Applications of SIAM and the water quality model are described.

  11. Recommended Research Directions for Improving the Validation of Complex Systems Models.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vugrin, Eric D.; Trucano, Timothy G.; Swiler, Laura Painton

    Improved validation for models of complex systems has been a primary focus over the past year for the Resilience in Complex Systems Research Challenge. This document describes a set of research directions that are the result of distilling those ideas into three categories of research -- epistemic uncertainty, strong tests, and value of information. The content of this document can be used to transmit valuable information to future research activities, update the Resilience in Complex Systems Research Challenge's roadmap, inform the upcoming FY18 Laboratory Directed Research and Development (LDRD) call and research proposals, and facilitate collaborations between Sandia and external organizations. The recommended research directions can provide topics for collaborative research, development of proposals, workshops, and other opportunities.

  12. Development and Validation of a Slurry Model for Chemical Hydrogen Storage in Fuel Cell Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brooks, Kriston P.; Pires, Richard P.; Simmons, Kevin L.

    2014-07-25

    The US Department of Energy's (DOE) Hydrogen Storage Engineering Center of Excellence (HSECoE) is developing models for hydrogen storage systems for fuel cell-based light duty vehicle applications for a variety of promising materials. These transient models simulate the performance of the storage system for comparison to the DOE's Technical Targets and a set of four drive cycles. The purpose of this research is to describe the models developed for slurry-based chemical hydrogen storage materials. The storage systems of both a representative exothermic system based on ammonia borane and an endothermic system based on alane were developed and modeled in Simulink®. Once complete, the reactor and radiator components of the model were validated with experimental data. The model was then run using a highway cycle, an aggressive cycle, a cold-start cycle, and a hot drive cycle. The system design was adjusted to meet these drive cycles. A sensitivity analysis was then performed to identify the range of material properties where these DOE targets and drive cycles could be met. Materials with a heat of reaction greater than 11 kJ/mol H2 generated and a slurry hydrogen capacity of greater than 11.4% will meet the on-board efficiency and gravimetric capacity targets, respectively.

  13. Validation (not just verification) of Deep Space Missions

    NASA Technical Reports Server (NTRS)

    Duren, Riley M.

    2006-01-01

    Verification & Validation (V&V) is a widely recognized and critical systems engineering function. However, the often-used definition 'Verification proves the design is right; validation proves it is the right design' is rather vague. And while Verification is a reasonably well standardized systems engineering process, Validation is a far more abstract concept, and the rigor and scope applied to it vary widely between organizations and individuals. This is reflected in the findings of recent Mishap Reports for several NASA missions, in which shortfalls in Validation (not just Verification) were cited as root or contributing factors in catastrophic mission loss. Furthermore, although there is strong agreement in the community that Test is the preferred method for V&V, many people equate 'V&V' with 'Test', such that Analysis and Modeling are not given comparable attention. Another strong motivator is the realization that the rapid growth in complexity of deep-space missions (particularly planetary landers and space observatories, given their inherent unknowns) is placing greater demands on systems engineers to 'get it right' with Validation.

  14. Spiral computed tomography phase-space source model in the BEAMnrc/EGSnrc Monte Carlo system: implementation and validation.

    PubMed

    Kim, Sangroh; Yoshizumi, Terry T; Yin, Fang-Fang; Chetty, Indrin J

    2013-04-21

    Currently, the BEAMnrc/EGSnrc Monte Carlo (MC) system does not provide a spiral CT source model for the simulation of spiral CT scanning. We developed and validated a spiral CT phase-space source model in the BEAMnrc/EGSnrc system. The spiral phase-space source model was implemented in the DOSXYZnrc user code of the BEAMnrc/EGSnrc system by analyzing the geometry of the spiral CT scan: scan range, initial angle, rotational direction, pitch, slice thickness, etc. Table movement was simulated by changing the coordinates of the isocenter as a function of beam angles. Some parameters such as pitch, slice thickness and translation per rotation were also incorporated into the model to make the new phase-space source model, designed specifically for spiral CT scan simulations. The source model was hard-coded by modifying the 'ISource = 8: Phase-Space Source Incident from Multiple Directions' in the srcxyznrc.mortran and dosxyznrc.mortran files in the DOSXYZnrc user code. In order to verify the implementation, spiral CT scans were simulated in a CT dose index phantom using the validated x-ray tube model of a commercial CT simulator for both the original multi-direction source (ISOURCE = 8) and the new phase-space source model in the DOSXYZnrc system. Then the acquired 2D and 3D dose distributions were analyzed with respect to the input parameters for various pitch values. In addition, surface-dose profiles were also measured for a patient CT scan protocol using radiochromic film and were compared with the MC simulations. The new phase-space source model was found to simulate the spiral CT scanning in a single simulation run accurately. It also produced the equivalent dose distribution of the ISOURCE = 8 model for the same CT scan parameters. The MC-simulated surface profiles were well matched to the film measurement overall within 10%. The new spiral CT phase-space source model was implemented in the BEAMnrc/EGSnrc system. This work will be beneficial in estimating the spiral CT scan dose in the BEAMnrc/EGSnrc system.
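
    The table-movement idea described above (isocenter coordinates updated as a function of beam angle) can be sketched as follows; the geometry and parameter names are illustrative, not the authors' Mortran code.

        # Sketch only: the isocenter z-coordinate advances with gantry angle
        # according to pitch and nominal beam (slice) width, so table
        # translation per full rotation = pitch * beam width.
        import numpy as np

        def isocenter_z(angle_deg, z0, pitch, beam_width_cm,
                        initial_angle_deg=0.0, direction=+1):
            rotations = direction * (angle_deg - initial_angle_deg) / 360.0
            return z0 + rotations * pitch * beam_width_cm

        # One rotation sampled every 10 degrees for a pitch-1.0, 1 cm beam scan:
        angles = np.arange(0.0, 360.0 + 1.0, 10.0)
        z = isocenter_z(angles, z0=0.0, pitch=1.0, beam_width_cm=1.0)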

  16. Control Oriented Modeling and Validation of Aeroservoelastic Systems

    NASA Technical Reports Server (NTRS)

    Crowder, Marianne; deCallafon, Raymond (Principal Investigator)

    2002-01-01

    Lightweight aircraft design emphasizes the reduction of structural weight to maximize aircraft efficiency and agility, at the cost of increasing the likelihood of structural dynamic instabilities. To ensure flight safety, extensive flight testing and active structural servo control strategies are required to explore and expand the boundary of the flight envelope. Aeroservoelastic (ASE) models can provide online flight monitoring of dynamic instabilities to reduce flight test time and increase flight safety. The success of ASE models is determined by the ability to take varying flight conditions into account and the possibility of performing flight monitoring in the presence of active structural servo control strategies. In this continued study, these aspects are addressed by developing specific methodologies and algorithms for control-relevant robust identification and model validation of aeroservoelastic structures. The closed-loop robust model identification and model validation are based on a fractional model approach where the model uncertainties are characterized in a closed-loop-relevant way.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhinefrank, Kenneth E.; Lenee-Bluhm, Pukha; Prudell, Joseph H.

    The most prudent path to a full-scale design, build, and deployment of a wave energy conversion (WEC) system involves establishing validated numerical models through physical experiments in a methodical scaling program. This project provides essential additional rounds of wave tank testing at 1:33 scale and ocean/bay testing at 1:7 scale, necessary to validate the numerical modeling that is essential to a utility-scale WEC design and associated certification.

  18. Simulation of Wave-Current Interaction Using a Three-Dimensional Hydrodynamic Model Coupled With a Phase Averaged Wave Model

    NASA Astrophysics Data System (ADS)

    Marsooli, R.; Orton, P. M.; Georgas, N.; Blumberg, A. F.

    2016-02-01

    The Stevens Institute of Technology Estuarine and Coastal Ocean Model (sECOM) has been coupled with a more advanced surface wave model to simulate wave-current interaction, and results have been validated in estuarine and nearshore waters. sECOM is a three-dimensional, hydrostatic, free-surface, primitive-equation model. It solves the Navier-Stokes equations and the conservation equations for temperature and salinity using a finite-difference method on an Arakawa C-grid with a terrain-following (sigma) vertical coordinate and an orthogonal curvilinear horizontal coordinate system. The model is coupled with the surface wave model developed by Mellor et al. (2008), which solves the spectral equation and takes into account depth and current refraction in both deep and shallow water. The wave model parameterizes the energy distribution in frequency space and the wave-wave interaction process by using a specified spectrum shape. The coupled wave-hydrodynamic model considers wave-current interaction through wave-induced bottom stress, depth-dependent radiation stress, and wave effects on wind-induced surface stress. The model is validated using data collected at a natural sandy beach at Duck, North Carolina, during the DUCK94 experiment. This test case reveals the capability of the model to simulate wave-current interaction in nearshore coastal systems. The model is further validated using data collected in Jamaica Bay, a semi-enclosed body of water in the New York City region. This test reveals the applicability of the model to estuarine systems. These validations of the model, and comparisons to its prior wave model, the Great Lakes Environmental Research Laboratory (GLERL) wave model (Donelan 1977), are presented and discussed. References: Mellor, G.L., M.A. Donelan, and L.-Y. Oey, 2008: A Surface Wave Model for Coupling with Numerical Ocean Circulation Models. J. Atmos. Oceanic Technol., 25, 1785-1807. Donelan, M.A., 1977: A simple numerical model for wave and wind stress application. Report, National Water Research Institute, Burlington, Ontario, Canada, 28 pp.

  19. Ensemble assimilation of ARGO temperature profile, sea surface temperature and Altimetric satellite data into an eddy permitting primitive equation model of the North Atlantic ocean

    NASA Astrophysics Data System (ADS)

    Yan, Yajing; Barth, Alexander; Beckers, Jean-Marie; Candille, Guillem; Brankart, Jean-Michel; Brasseur, Pierre

    2015-04-01

    Sea surface height, sea surface temperature, and temperature profiles at depth collected between January and December 2005 are assimilated into a realistic eddy-permitting primitive equation model of the North Atlantic Ocean using the Ensemble Kalman Filter. Sixty ensemble members are generated by adding realistic noise to the forcing parameters related to the temperature. The ensemble is diagnosed and validated by comparison between the ensemble spread and the model/observation difference, as well as by rank histogram, before the assimilation experiments. An incremental analysis update scheme is applied in order to reduce spurious oscillations due to the model state correction. The results of the assimilation are assessed according to both deterministic and probabilistic metrics with observations used in the assimilation experiments and with independent observations, which goes further than most previous studies and constitutes one of the original points of this paper. Regarding the deterministic validation, the ensemble means, together with the ensemble spreads, are compared to the observations in order to diagnose the ensemble distribution properties in a deterministic way. Regarding the probabilistic validation, the continuous ranked probability score (CRPS) is used to evaluate the ensemble forecast system according to reliability and resolution. The reliability is further decomposed into bias and dispersion by the reduced centred random variable (RCRV) score in order to investigate the reliability properties of the ensemble forecast system. The improvement from the assimilation is demonstrated using these validation metrics. Finally, the deterministic and probabilistic validations are analysed jointly; their consistency and complementarity are highlighted. Highly reliable situations, in which the RMS error and the CRPS give the same information, are identified for the first time in this paper.
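
    For reference, the two probabilistic diagnostics named above have compact sample estimators. The following minimal sketch uses the standard formulas, not the authors' implementation, for a single ensemble/observation pair.

        # Sketch only: CRPS sample estimator and RCRV diagnostic. For a
        # reliable ensemble, RCRV values have mean ~0 (no bias) and
        # standard deviation ~1 (correct dispersion) over many cases.
        import numpy as np

        def crps(ensemble, obs):
            ens = np.asarray(ensemble, dtype=float)
            term1 = np.mean(np.abs(ens - obs))
            term2 = 0.5 * np.mean(np.abs(ens[:, None] - ens[None, :]))
            return term1 - term2

        def rcrv(ensemble, obs, obs_err=0.0):
            ens = np.asarray(ensemble, dtype=float)
            return (obs - ens.mean()) / np.sqrt(ens.var(ddof=1) + obs_err**2)

        print(crps([14.8, 15.1, 15.4, 15.0], obs=15.2))
        print(rcrv([14.8, 15.1, 15.4, 15.0], obs=15.2))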

  20. Analysis of Aurora's Performance Simulation Engine for Three Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeman, Janine; Simon, Joseph

    2015-07-07

    Aurora Solar Inc. is building a cloud-based optimization platform to automate the design, engineering, and permit generation process of solar photovoltaic (PV) installations. They requested that the National Renewable Energy Laboratory (NREL) validate the performance of the PV system performance simulation engine of Aurora Solar's solar design platform, Aurora. In previous work, NREL performed a validation of multiple other PV modeling tools, so this study builds upon that work by examining all of the same fixed-tilt systems with available module datasheets that NREL selected and used in the aforementioned study. Aurora Solar set up these three operating PV systems in their modeling platform using NREL-provided system specifications and concurrent weather data. NREL then verified the setup of these systems, ran the simulations, and compared the Aurora-predicted performance data to measured performance data for those three systems, as well as to performance data predicted by other PV modeling tools.

  1. Development and Validation of Decision Forest Model for Estrogen Receptor Binding Prediction of Chemicals Using Large Data Sets.

    PubMed

    Ng, Hui Wen; Doughty, Stephen W; Luo, Heng; Ye, Hao; Ge, Weigong; Tong, Weida; Hong, Huixiao

    2015-12-21

    Some chemicals in the environment possess the potential to interact with the endocrine system in the human body. Multiple receptors are involved in the endocrine system; estrogen receptor α (ERα) plays very important roles in endocrine activity and is the most studied receptor. Understanding and predicting the estrogenic activity of chemicals facilitates the evaluation of their endocrine activity. Hence, we have developed a decision forest classification model to predict chemical binding to ERα using a large training data set of 3308 chemicals obtained from the U.S. Food and Drug Administration's Estrogenic Activity Database. We tested the model using cross-validations and external data sets of 1641 chemicals obtained from the U.S. Environmental Protection Agency's ToxCast project. The model showed good performance in both internal (92% accuracy) and external validations (~70-89% relative balanced accuracies), where the latter involved validating the model across different ER pathway-related assays in ToxCast. The important features that contribute to the prediction ability of the model were identified through informative descriptor analysis and were related to current knowledge of ER binding. Prediction confidence analysis revealed that the model had both high prediction confidence and accuracy for most predicted chemicals. The results demonstrated that the model, constructed based on the large training data set, is more accurate and robust for predicting ER binding of chemicals than published models developed using much smaller data sets. The model could be useful for the evaluation of ERα-mediated endocrine activity potential of environmental chemicals.
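
    A minimal sketch of the train/validate workflow follows, with a scikit-learn random forest standing in for the decision forest algorithm (which aggregates heterogeneous decision trees); the data files named here are hypothetical placeholders, not the study's resources.

        # Sketch only: internal cross-validation plus external validation and
        # a simple prediction-confidence measure. File names are invented.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        X_train = np.load("eadb_descriptors.npy")   # hypothetical descriptor matrix
        y_train = np.load("eadb_er_labels.npy")     # 1 = ER binder, 0 = non-binder

        clf = RandomForestClassifier(n_estimators=500, random_state=0)
        # Internal validation: 5-fold cross-validated balanced accuracy.
        print(cross_val_score(clf, X_train, y_train,
                              scoring="balanced_accuracy", cv=5).mean())

        # External validation on an independent set (e.g. ToxCast-derived labels).
        clf.fit(X_train, y_train)
        X_ext, y_ext = np.load("toxcast_X.npy"), np.load("toxcast_y.npy")
        proba = clf.predict_proba(X_ext)[:, 1]
        confidence = np.abs(proba - 0.5) * 2   # distance of probability from 0.5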

  3. A Metric-Based Validation Process to Assess the Realism of Synthetic Power Grids

    DOE PAGES

    Birchfield, Adam; Schweitzer, Eran; Athari, Mir; ...

    2017-08-19

    Public power system test cases that are of high quality benefit the power systems research community with expanded resources for testing, demonstrating, and cross-validating new innovations. Building synthetic grid models for this purpose is a relatively new problem, for which a challenge is to show that created cases are sufficiently realistic. This paper puts forth a validation process based on a set of metrics observed from actual power system cases. These metrics follow the structure, proportions, and parameters of key power system elements, which can be used in assessing and validating the quality of synthetic power grids. Though wide diversity exists in the characteristics of power systems, the paper focuses on an initial set of common quantitative metrics to capture the distribution of typical values from real power systems. The process is applied to two new public test cases, which are shown to meet the criteria specified in the metrics of this paper.
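
    A minimal sketch of a metric-based realism check: compute a few structural statistics of a synthetic case and test them against acceptance ranges derived from actual systems. The metrics and interval values below are placeholders, not the paper's validated ranges.

        # Sketch only: invented metrics and intervals for illustration.
        import numpy as np

        def validate_case(bus_degrees, line_reactances_pu, gen_capacities_mw):
            metrics = {
                "mean_degree":    np.mean(bus_degrees),
                "median_x_pu":    np.median(line_reactances_pu),
                "gen_cap_p90_mw": np.percentile(gen_capacities_mw, 90),
            }
            expected = {                      # placeholder acceptance intervals
                "mean_degree":    (2.0, 3.5),
                "median_x_pu":    (0.01, 0.30),
                "gen_cap_p90_mw": (100.0, 1500.0),
            }
            passed = {k: expected[k][0] <= v <= expected[k][1]
                      for k, v in metrics.items()}
            return passed, metrics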

  4. Development and Validation of an NPSS Model of a Small Turbojet Engine

    NASA Astrophysics Data System (ADS)

    Vannoy, Stephen Michael

    Recent studies have shown that integrated gas turbine engine (GT)/solid oxide fuel cell (SOFC) systems for combined propulsion and power on aircraft offer a promising method for more efficient onboard electrical power generation. However, it appears that nobody has actually attempted to construct a hybrid GT/SOFC prototype for combined propulsion and electrical power generation. This thesis contributes to this ambition by developing an experimentally validated thermodynamic model of a small gas turbine (~230 N thrust) platform for a bench-scale GT/SOFC system. The thermodynamic model is implemented in a NASA-developed software environment called Numerical Propulsion System Simulation (NPSS). An indoor test facility was constructed to measure the engine's performance parameters: thrust, air flow rate, fuel flow rate, engine speed (RPM), and all axial stage stagnation temperatures and pressures. The NPSS model predictions are compared to the measured performance parameters for steady-state engine operation.

  5. GPS Auto-Navigation Design for Unmanned Air Vehicles

    NASA Technical Reports Server (NTRS)

    Nilsson, Caroline C. A.; Heinzen, Stearns N.; Hall, Charles E., Jr.; Chokani, Ndaona

    2003-01-01

    A GPS auto-navigation system is designed for Unmanned Air Vehicles. The objective is to enable the air vehicle to be used as a test-bed for novel flow control concepts. The navigation system uses pre-programmed GPS waypoints. The actual GPS position, heading, and velocity are collected by the flight computer, a PC104 system running Real-Time Linux, and compared with the desired waypoint. The navigator then determines the necessity of a heading correction and outputs the correction in the form of a commanded bank angle, for a level coordinated turn, to the controller system. This controller system consists of five controllers (pitch rate PID, yaw damper, bank angle PID, velocity hold, and altitude hold) designed for a closed-loop nonlinear aircraft model with linear aerodynamic coefficients. The ability and accuracy of using GPS data is validated by a GPS flight, and the autopilots are also validated in flight. The autopilot flight validations show that the designed autopilots function as designed. The aircraft model, generated in MATLAB Simulink, is also enhanced by the flight data to accurately represent the actual aircraft.
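
    The navigator logic summarized above (heading error mapped to a commanded bank angle for a level coordinated turn) can be sketched as follows. The proportional gain, bank limit, and flat-earth bearing approximation are assumptions for illustration, not the thesis design values.

        # Sketch only: level coordinated turn uses tan(phi) = omega * V / g.
        import math

        G = 9.81   # m/s^2

        def bank_command(lat, lon, hdg_deg, v_mps, wp_lat, wp_lon,
                         k_hdg=0.5, bank_limit_deg=30.0):
            # Bearing to waypoint (flat-earth approximation, short distances).
            dx = (wp_lon - lon) * math.cos(math.radians(lat))
            dy = wp_lat - lat
            bearing = math.degrees(math.atan2(dx, dy)) % 360.0
            err = (bearing - hdg_deg + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)
            omega = math.radians(k_hdg * err)                   # desired turn rate, rad/s
            phi = math.degrees(math.atan2(omega * v_mps, G))    # commanded bank angle
            return max(-bank_limit_deg, min(bank_limit_deg, phi))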

  6. An Overview of NASA's IM&S Verification and Validation Process Plan and Specification for Space Exploration

    NASA Technical Reports Server (NTRS)

    Gravitz, Robert M.; Hale, Joseph

    2006-01-01

    NASA's Exploration Systems Mission Directorate (ESMD) is implementing a management approach for modeling and simulation (M&S) that will provide decision-makers with information on a model's fidelity, credibility, and quality. This information will allow the decision-maker to understand the risks involved in using a model's results in the decision-making process. This presentation discusses NASA's approach to verification and validation (V&V) of its models and simulations supporting space exploration, describing NASA's V&V process and the associated M&S V&V activities required to support the decision-making process. The M&S V&V Plan and V&V Report templates for ESMD are also illustrated.

  7. Distributed Trust Management for Validating SLA Choreographies

    NASA Astrophysics Data System (ADS)

    Haq, Irfan Ul; Alnemr, Rehab; Paschke, Adrian; Schikuta, Erich; Boley, Harold; Meinel, Christoph

    For business workflow automation in a service-enriched environment such as a grid or a cloud, services scattered across heterogeneous Virtual Organizations (VOs) can be aggregated in a producer-consumer manner, building hierarchical structures of added value. In order to preserve the supply chain, the Service Level Agreements (SLAs) corresponding to the underlying choreography of services should also be incrementally aggregated. This cross-VO hierarchical SLA aggregation requires validation, for which a distributed trust system becomes a prerequisite. Elaborating on our previous work on rule-based SLA validation, we propose a hybrid distributed trust model. This new model is based on Public Key Infrastructure (PKI) and reputation-based trust systems. It helps prevent SLA violations by identifying violation-prone services at the service selection stage and actively contributes to breach management at the time of penalty enforcement.

  8. Simultaneous Observation of Hybrid States for Cyber-Physical Systems: A Case Study of Electric Vehicle Powertrain.

    PubMed

    Lv, Chen; Liu, Yahui; Hu, Xiaosong; Guo, Hongyan; Cao, Dongpu; Wang, Fei-Yue

    2017-08-22

    As a typical cyber-physical system (CPS), the electrified vehicle has become a hot research topic due to its high efficiency and low emissions. In order to develop advanced electric powertrains, accurate estimation of the unmeasurable hybrid states, including the discrete backlash nonlinearity and the continuous half-shaft torque, is of great importance. In this paper, a novel estimation algorithm for simultaneously identifying the backlash position and half-shaft torque of an electric powertrain is proposed using a hybrid system approach. System models, including the electric powertrain and vehicle dynamics models, are established considering the drivetrain backlash and flexibility, and are calibrated and validated using vehicle road testing data. Based on the developed system models, the powertrain behavior is represented using hybrid automata according to the piecewise affine property of the backlash dynamics. A hybrid-state observer, comprising a discrete-state observer and a continuous-state observer, is designed for the simultaneous estimation of the backlash position and half-shaft torque. In order to guarantee stability and reachability, the convergence property of the proposed observer is investigated. The proposed observer is validated under highly dynamic transitions of vehicle states, and the validation results demonstrate its feasibility and effectiveness.
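
    A minimal sketch of the hybrid-state idea: a discrete mode classifier for the piecewise affine backlash, plus a continuous torque estimate in the contact modes. All parameters are invented placeholders, and the paper's observer design is considerably more involved than this.

        # Sketch only: mode detection from relative displacement, with a
        # stiffness/damping torque law active only in the contact modes.
        BACKLASH_HALF = 0.02   # rad, half of total backlash gap (assumed)
        K_SHAFT = 8000.0       # Nm/rad, half-shaft stiffness (assumed)
        C_SHAFT = 50.0         # Nm*s/rad, half-shaft damping (assumed)

        def hybrid_observe(delta, delta_dot):
            """delta: relative motor/wheel displacement (rad)."""
            if delta > BACKLASH_HALF:
                mode, torsion = "contact+", delta - BACKLASH_HALF
            elif delta < -BACKLASH_HALF:
                mode, torsion = "contact-", delta + BACKLASH_HALF
            else:
                return "backlash", 0.0      # teeth traverse the gap: no torque
            return mode, K_SHAFT * torsion + C_SHAFT * delta_dot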

  9. Predicting Pilot Behavior in Medium Scale Scenarios Using Game Theory and Reinforcement Learning

    NASA Technical Reports Server (NTRS)

    Yildiz, Yildiray; Agogino, Adrian; Brat, Guillaume

    2013-01-01

    Effective automation is critical in achieving the capacity and safety goals of the Next Generation Air Traffic System. Unfortunately, creating integration and validation tools for such automation is difficult, as the interactions between automation and its human counterparts are complex and unpredictable. This validation becomes even more difficult as we integrate wide-reaching technologies that affect the behavior of different decision makers in the system such as pilots, controllers, and airlines. While overt short-term behavior changes can be explicitly modeled with traditional agent modeling systems, subtle behavior changes caused by the integration of new technologies may snowball into larger problems and be very hard to detect. To overcome these obstacles, we show how integration of new technologies can be validated by learning behavior models based on goals. In this framework, human participants are not modeled explicitly. Instead, their goals are modeled and, through reinforcement learning, their actions are predicted. The main advantage of this approach is that modeling is done within the context of the entire system, allowing for accurate modeling of all participants as they interact as a whole. In addition, such an approach allows for efficient trade studies and feasibility testing on a wide range of automation scenarios. The goal of this paper is to test whether such an approach is feasible. To do this, we implement this approach using a simple discrete-state learning system on a scenario where 50 aircraft need to self-navigate using Automatic Dependent Surveillance-Broadcast (ADS-B) information. In this scenario, we show how the approach can be used to predict the ability of pilots to adequately balance aircraft separation and fly efficient paths. We present results with several levels of complexity and airspace congestion.
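
    A minimal sketch of the goal-based learning idea on a toy discrete state space: tabular Q-learning drives an agent toward a state representing adequate separation and an efficient path. The state encoding and reward are invented for illustration, not the paper's scenario.

        # Sketch only: one agent, lateral-offset bins, actions left/hold/right.
        import numpy as np

        rng = np.random.default_rng(0)
        n_states, n_actions = 20, 3
        Q = np.zeros((n_states, n_actions))
        alpha, gamma, eps = 0.1, 0.95, 0.1
        GOAL = 10   # invented: bin with good separation and a direct path

        def reward(s):
            # Goal-based reward: penalize distance from the safe, efficient track.
            return -abs(s - GOAL)

        for episode in range(2000):
            s = int(rng.integers(n_states))
            for _ in range(50):
                greedy = int(np.argmax(Q[s]))
                a = int(rng.integers(n_actions)) if rng.random() < eps else greedy
                s2 = int(np.clip(s + (a - 1), 0, n_states - 1))   # apply action
                Q[s, a] += alpha * (reward(s2) + gamma * Q[s2].max() - Q[s, a])
                s = s2

        predicted_actions = Q.argmax(axis=1)   # predicted pilot action per state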

  10. Formal Specification and Design Techniques for Wireless Sensor and Actuator Networks

    PubMed Central

    Martínez, Diego; González, Apolinar; Blanes, Francisco; Aquino, Raúl; Simo, José; Crespo, Alfons

    2011-01-01

    A current trend in the development and implementation of industrial applications is to use wireless networks to connect the system nodes, mainly to increase application flexibility, reliability, and portability, as well as to reduce implementation cost. However, the nondeterministic and concurrent behavior of distributed systems makes their analysis and design complex, often resulting in less than satisfactory performance in simulation and test-bed scenarios, which is caused by using imprecise models to analyze, validate, and design these systems. Moreover, some simulation platforms do not support these models. This paper presents a design and validation method for Wireless Sensor and Actuator Networks (WSAN) that is supported by a minimal set of wireless components represented in Colored Petri Nets (CPN). In summary, the model presented allows users to verify the design properties and structural behavior of the system. PMID:22344203

  11. Graph-based real-time fault diagnostics

    NASA Technical Reports Server (NTRS)

    Padalkar, S.; Karsai, G.; Sztipanovits, J.

    1988-01-01

    A real-time fault detection and diagnosis capability is absolutely crucial in the design of large-scale space systems. Some of the existing AI-based fault diagnostic techniques, like expert systems and qualitative modelling, are frequently ill-suited for this purpose. Expert systems are often inadequately structured, difficult to validate, and suffer from knowledge acquisition bottlenecks. Qualitative modelling techniques sometimes generate a large number of failure source alternatives, thus hampering speedy diagnosis. In this paper we present a graph-based technique which is well suited for real-time fault diagnosis, structured knowledge representation and acquisition, and testing and validation. A Hierarchical Fault Model of the system to be diagnosed is developed. At each level of the hierarchy, there exist fault propagation digraphs denoting causal relations between failure modes of subsystems. The edges of such a digraph are weighted with fault propagation time intervals. Efficient and restartable graph algorithms are used for online, speedy identification of failure source components.
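
    The diagnosis idea can be sketched compactly: given a fault propagation digraph whose edges carry propagation-time intervals, a candidate failure source must explain every observed alarm within the time bounds implied by the edge weights. The graph, node names, and alarm times below are invented for illustration.

        # Sketch only: node -> [(successor, t_min, t_max)] in seconds.
        graph = {
            "pump":         [("pressure_low", 1, 3)],
            "valve":        [("pressure_low", 2, 5)],
            "pressure_low": [("flow_alarm", 1, 2)],
        }

        def interval_reach(src):
            """Earliest/latest propagation times from src to reachable nodes."""
            bounds, frontier = {src: (0, 0)}, [src]
            while frontier:
                u = frontier.pop()
                for v, tmin, tmax in graph.get(u, []):
                    lo, hi = bounds[u][0] + tmin, bounds[u][1] + tmax
                    old = bounds.get(v)
                    if old is None or lo < old[0] or hi > old[1]:
                        bounds[v] = ((lo, hi) if old is None
                                     else (min(lo, old[0]), max(hi, old[1])))
                        frontier.append(v)
            return bounds

        def candidate_sources(alarms):
            """alarms: {node: observed time}; sources consistent with all alarms."""
            out = []
            for src in graph:
                b = interval_reach(src)
                if all(n in b and b[n][0] <= t <= b[n][1] for n, t in alarms.items()):
                    out.append(src)
            return out

        print(candidate_sources({"pressure_low": 2.5, "flow_alarm": 4.0}))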

  12. Multiphysics Simulation of Welding-Arc and Nozzle-Arc System: Mathematical-Model, Solution-Methodology and Validation

    NASA Astrophysics Data System (ADS)

    Pawar, Sumedh; Sharma, Atul

    2018-01-01

    This work presents a mathematical model and solution methodology for a multiphysics engineering problem: arc formation during welding and inside a nozzle. A general-purpose commercial CFD solver, ANSYS FLUENT 13.0.0, is used in this work. Arc formation involves strongly coupled gas dynamics and electrodynamics, simulated by solution of the coupled Navier-Stokes equations, Maxwell's equations, and the radiation heat-transfer equation. Validation of the present numerical methodology is demonstrated by excellent agreement with published results. The developed mathematical model and the user-defined functions (UDFs) are independent of the geometry and are applicable to any system that involves arc formation in a 2D axisymmetric coordinate system. The high-pressure flow of SF6 gas in the nozzle-arc system resembles the arc chamber of an SF6 gas circuit breaker; thus, this methodology can be extended to simulate the arcing phenomenon during current interruption.

  13. NARMAX model identification of a palm oil biodiesel engine using multi-objective optimization differential evolution

    NASA Astrophysics Data System (ADS)

    Mansor, Zakwan; Zakaria, Mohd Zakimi; Nor, Azuwir Mohd; Saad, Mohd Sazli; Ahmad, Robiah; Jamaluddin, Hishamuddin

    2017-09-01

    This paper presents black-box modelling of a palm oil biodiesel engine (POB) using the multi-objective optimization differential evolution (MOODE) algorithm. Two objective functions are considered in the optimization: minimizing the number of terms in a model structure and minimizing the mean square error between actual and predicted outputs. The mathematical model used in this study to represent the POB system is the nonlinear auto-regressive moving average with exogenous input (NARMAX) model. Finally, model validity tests are applied in order to validate the possible models obtained from the MOODE algorithm and to select an optimal model.
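
    The two objective functions can be illustrated without the full differential-evolution machinery: the sketch below enumerates term subsets from a small, invented regressor library, fits each by least squares, and keeps the Pareto-optimal (number of terms, MSE) trade-offs. This is a stand-in for the MOODE search, not the paper's algorithm.

        # Sketch only: toy data and term library for illustration.
        import itertools
        import numpy as np

        rng = np.random.default_rng(1)
        u = rng.uniform(-1, 1, 200)                                  # input signal
        y = 0.8 * u + 0.3 * u**2 + 0.01 * rng.standard_normal(200)   # toy output

        library = {"u": u, "u^2": u**2, "u^3": u**3, "|u|": np.abs(u)}

        results = []
        for r in range(1, len(library) + 1):
            for terms in itertools.combinations(library, r):
                X = np.column_stack([library[t] for t in terms])
                theta, *_ = np.linalg.lstsq(X, y, rcond=None)
                results.append((len(terms), np.mean((y - X @ theta) ** 2), terms))

        # Keep the non-dominated (fewest terms, lowest MSE) structures.
        pareto = [c for c in results
                  if not any(o[0] <= c[0] and o[1] < c[1] for o in results)]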

  14. Validation of Multiple Tools for Flat Plate Photovoltaic Modeling Against Measured Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeman, J.; Whitmore, J.; Blair, N.

    2014-08-01

    This report expands upon previous work by the same authors, published in the 40th IEEE Photovoltaic Specialists Conference. In this validation study, comprehensive analysis is performed on nine photovoltaic systems for which NREL could obtain detailed performance data and specifications, including three utility-scale systems and six commercial-scale systems. Multiple photovoltaic performance modeling tools were used to model these nine systems, and the error of each tool was analyzed against quality-controlled measured performance data. This study shows that, excluding identified outliers, all tools achieve annual errors within +/-8% and hourly root mean squared errors less than 7% for all systems. It is further shown using SAM that module model and irradiance input choices can change the annual error with respect to measured data by as much as 6.6% for these nine systems, although all combinations examined still fall within an annual error range of +/-8.5%. Additionally, a seasonal variation in monthly error is shown for all tools. Finally, the effects of irradiance data uncertainty and of default loss assumptions on annual error are explored, and two approaches to reduce the error inherent in photovoltaic modeling are proposed.
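
    For reference, a minimal sketch of the two error metrics reported above, computed from hourly modeled and measured energy series. The normalization of the hourly RMSE here (by mean measured energy) is one plausible convention, not necessarily the report's.

        # Sketch only: inputs are numpy arrays of hourly energy in kWh.
        import numpy as np

        def annual_error_pct(modeled_kwh, measured_kwh):
            """Signed annual energy error as a percent of the measured total."""
            return 100.0 * (modeled_kwh.sum() - measured_kwh.sum()) / measured_kwh.sum()

        def hourly_rmse_pct(modeled_kwh, measured_kwh):
            """Hourly RMSE normalized by mean measured energy (assumed convention)."""
            rmse = np.sqrt(np.mean((modeled_kwh - measured_kwh) ** 2))
            return 100.0 * rmse / measured_kwh.mean()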

  15. Assessing eGovernment Systems Success: A Validation of the DeLone and McLean Model of Information Systems Success

    ERIC Educational Resources Information Center

    Wang, Yi-Shun; Liao, Yi-Wen

    2008-01-01

    With the proliferation of the Internet and World Wide Web applications, people are increasingly interacting with government to citizen (G2C) eGovernment systems. It is therefore important to measure the success of G2C eGovernment systems from the citizen's perspective. While general information systems (IS) success models have received much…

  16. Validation and Improvement of Reliability Methods for Air Force Building Systems

    DTIC Science & Technology

    focusing primarily on HVAC systems . This research used contingency analysis to assess the performance of each model for HVAC systems at six Air Force...probabilistic model produced inflated reliability calculations for HVAC systems . In light of these findings, this research employed a stochastic method, a...Nonhomogeneous Poisson Process (NHPP), in an attempt to produce accurate HVAC system reliability calculations. This effort ultimately concluded that

  17. Progress & Frontiers in PV Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deline, Chris; DiOrio, Nick; Jordan, Dirk

    2016-09-12

    PowerPoint slides for a presentation given at Solar Power International 2016. Presentation includes System Advisor Model (SAM) introduction and battery modeling, bifacial PV modules and modeling, shade modeling and module level power electronics (MLPE), degradation rates, and PVWatts updates and validation.

  18. Mathematical modeling of a single stage ultrasonically assisted distillation process.

    PubMed

    Mahdi, Taha; Ahmad, Arshad; Ripin, Adnan; Abdullah, Tuan Amran Tuan; Nasef, Mohamed M; Ali, Mohamad W

    2015-05-01

    The ability of sonication to facilitate the separation of azeotropic mixtures presents a promising approach to developing distillation systems that are more intensified and efficient than conventional ones. To expedite this much-needed development, a mathematical model of the system based on conservation principles, vapor-liquid equilibrium, and sonochemistry was developed in this study. The model, founded on a single-stage vapor-liquid equilibrium system enhanced with ultrasonic waves, was coded in MATLAB and validated with experimental data for an ethanol-ethyl acetate mixture. The effects of both ultrasonic frequency and intensity on the relative volatility and azeotropic point were examined, and the optimal conditions were obtained using a genetic algorithm. The experimental data validated the model with reasonable accuracy. The results of this study revealed that the azeotropic point of the mixture can be eliminated entirely with the right combination of sonication parameters, which can be exploited in design efforts towards a workable ultrasonically intensified distillation system.
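
    The role of sonication can be illustrated with a single-stage, constant-relative-volatility sketch; the enhancement law below is an invented placeholder for the paper's sonochemistry relations.

        # Sketch only: constant-alpha VLE with an assumed multiplicative
        # enhancement of relative volatility by ultrasonic intensity.
        import numpy as np

        def vapor_fraction(x, alpha):
            """VLE relation: y = alpha*x / (1 + (alpha - 1)*x)."""
            return alpha * x / (1.0 + (alpha - 1.0) * x)

        def enhanced_alpha(alpha0, intensity, k=0.05):
            # Invented enhancement law standing in for the sonochemistry model.
            return alpha0 * (1.0 + k * intensity)

        x = np.linspace(0.0, 1.0, 101)
        y_silent = vapor_fraction(x, 1.02)                             # near-pinch mixture
        y_sonic = vapor_fraction(x, enhanced_alpha(1.02, intensity=4.0))
        # Larger alpha widens the y - x separation driving force, which is
        # the mechanism by which sonication can remove an azeotropic pinch.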

  19. Modelling and validation of electromechanical shock absorbers

    NASA Astrophysics Data System (ADS)

    Tonoli, Andrea; Amati, Nicola; Girardello Detoni, Joaquim; Galluzzi, Renato; Gasparin, Enrico

    2013-08-01

    Electromechanical vehicle suspension systems represent a promising substitute to conventional hydraulic solutions. However, the design of electromechanical devices that are able to supply high damping forces without exceeding geometric dimension and mass constraints is a difficult task. All these challenges meet in off-road vehicle suspension systems, where the power density of the dampers is a crucial parameter. In this context, the present paper outlines a particular shock absorber configuration where a suitable electric machine and a transmission mechanism are utilised to meet off-road vehicle requirements. A dynamic model is used to represent the device. Subsequently, experimental tests are performed on an actual prototype to verify the functionality of the damper and validate the proposed model.

  20. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

    Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as these systems close control loops and arbitrate resources on board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis on model-based autonomy, model-checking techniques, and concrete experiments at NASA.

  2. Linear Multivariable Regression Models for Prediction of Eddy Dissipation Rate from Available Meteorological Data

    NASA Technical Reports Server (NTRS)

    McKissick, Burnell T. (Technical Monitor); Plassman, Gerald E.; Mall, Gerald H.; Quagliano, John R.

    2005-01-01

    Linear multivariable regression models for predicting day and night Eddy Dissipation Rate (EDR) from available meteorological data sources are defined and validated. Model definition is based on a combination of 1997-2000 Dallas/Fort Worth (DFW) data sources, EDR from Aircraft Vortex Spacing System (AVOSS) deployment data, and regression variables primarily from corresponding Automated Surface Observation System (ASOS) data. Model validation is accomplished through EDR predictions on a similar combination of 1994-1995 Memphis (MEM) AVOSS and ASOS data. Model forms include an intercept plus a single term of fixed optimal power for each of these regression variables: 30-minute forward averaged mean and variance of near-surface wind speed and temperature, variance of wind direction, and a discrete cloud cover metric. Distinct day and night models, regressing on EDR and the natural log of EDR respectively, yield the best performance and avoid model discontinuity over day/night data boundaries.
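
    The stated model form, an intercept plus a single fixed-power term per regressor, reduces to ordinary least squares once the powers are chosen. A hedged sketch on synthetic data follows; the powers and coefficients are illustrative placeholders, not the DFW/MEM fits.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 200
      X_raw = rng.uniform(0.5, 5.0, size=(n, 3))   # e.g., wind mean/variance, temperature variance
      p = np.array([1.0, 0.5, 2.0])                # assumed fixed optimal powers
      y = 0.02 + X_raw ** p @ np.array([0.01, 0.03, 0.005]) + rng.normal(0, 0.005, n)

      # A day model regresses on EDR directly; a night model would use log(EDR).
      A = np.column_stack([np.ones(n), X_raw ** p])
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)
      print(coef)  # [intercept, b1, b2, b3]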

  3. Development and Validation of an Empiric Tool to Predict Favorable Neurologic Outcomes Among PICU Patients.

    PubMed

    Gupta, Punkaj; Rettiganti, Mallikarjuna; Gossett, Jeffrey M; Daufeldt, Jennifer; Rice, Tom B; Wetzel, Randall C

    2018-01-01

    To create a novel tool to predict favorable neurologic outcomes during ICU stay among children with critical illness. Logistic regression models using adaptive lasso methodology were used to identify independent factors associated with favorable neurologic outcomes. A mixed effects logistic regression model was used to create the final prediction model including all predictors selected from the lasso model. Model validation was performed using a 10-fold internal cross-validation approach. Virtual Pediatric Systems (VPS, LLC, Los Angeles, CA) database. Patients less than 18 years old admitted to one of the participating ICUs in the Virtual Pediatric Systems database were included (2009-2015). A total of 160,570 patients from 90 hospitals qualified for inclusion. Of these, 1,675 patients (1.04%) had a decline in Pediatric Cerebral Performance Category scale of at least 2 between ICU admission and ICU discharge (unfavorable neurologic outcome). The independent factors associated with unfavorable neurologic outcome included higher weight at ICU admission, higher Pediatric Index of Mortality-2 score at ICU admission, cardiac arrest, stroke, seizures, head/nonhead trauma, use of conventional mechanical ventilation and high-frequency oscillatory ventilation, prolonged ICU length of stay, and prolonged use of mechanical ventilation. The presence of chromosomal anomaly, cardiac surgery, and utilization of nitric oxide were associated with favorable neurologic outcome. The final online prediction tool can be accessed at https://soipredictiontool.shinyapps.io/GNOScore/. Our model predicted favorable neurologic outcomes for 139,688 patients in an internal validation sample, against an observed 139,591 patients with favorable neurologic outcomes. The area under the receiver operating characteristic curve for the validation model was 0.90. This proposed prediction tool encompasses 20 risk factors in one probability to predict favorable neurologic outcome during ICU stay among children with critical illness. Future studies should seek external validation and improved discrimination of this prediction tool.
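
    As a hedged sketch of the pipeline described above, an L1-penalized (lasso-style) logistic regression for predictor selection checked with 10-fold cross-validation can be set up as follows; the synthetic features, penalty strength, and scikit-learn toolchain are illustrative assumptions, not the study's actual code.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      X = rng.normal(size=(5000, 20))                  # 20 candidate risk factors
      logit = 0.8 * X[:, 0] - 0.5 * X[:, 1] - 4.0      # only two truly matter
      y = (rng.uniform(size=5000) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

      # L1 penalty drives irrelevant coefficients to zero (variable selection).
      lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.05)
      print(cross_val_score(lasso, X, y, cv=10, scoring="roc_auc").mean())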

  4. V-SUIT Model Validation Using PLSS 1.0 Test Results

    NASA Technical Reports Server (NTRS)

    Olthoff, Claas

    2015-01-01

    The dynamic portable life support system (PLSS) simulation software Virtual Space Suit (V-SUIT) has been under development at the Technische Universität München since 2011 as a spin-off from the Virtual Habitat (V-HAB) project. The MATLAB-based V-SUIT simulates space suit portable life support systems and their interaction with a detailed and also dynamic human model, as well as the dynamic external environment of a space suit moving on a planetary surface. To demonstrate the feasibility of a large, system-level simulation like V-SUIT, a model of NASA's PLSS 1.0 prototype was created. This prototype was run through an extensive series of tests in 2011. Since the test setup was heavily instrumented, it produced a wealth of data, making it ideal for model validation. The implemented model includes all components of the PLSS in both the ventilation and thermal loops. The major components are modeled in greater detail, while smaller and ancillary components are low-fidelity black box models. The major components include the Rapid Cycle Amine (RCA) CO2 removal system, the Primary and Secondary Oxygen Assembly (POS/SOA), the Pressure Garment System Volume Simulator (PGSVS), the Human Metabolic Simulator (HMS), the heat exchanger between the ventilation and thermal loops, the Space Suit Water Membrane Evaporator (SWME) and finally the Liquid Cooling Garment Simulator (LCGS). Using the created model, dynamic simulations were performed using the same test points used during PLSS 1.0 testing. The results of the simulation were then compared to the test data, with special focus on absolute values during the steady state phases and dynamic behavior during the transition between test points. Quantified simulation results are presented that demonstrate which areas of the V-SUIT model are in need of further refinement and which are sufficiently close to the test results. Finally, lessons learned from the modeling and validation process are given in combination with implications for the future development of other PLSS models in V-SUIT.

  5. A Wildfire Behavior Modeling System at Los Alamos National Laboratory for Operational Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S.W. Koch; R.G. Balice

    2004-11-01

    To support efforts to protect facilities and property at Los Alamos National Laboratory from damages caused by wildfire, we completed a multiyear project to develop a system for modeling the behavior of wildfires in the Los Alamos region. This was accomplished by parameterizing the FARSITE wildfire behavior model with locally gathered data representing topography, fuels, and weather conditions from throughout the Los Alamos region. Detailed parameterization was made possible by an extensive monitoring network of permanent plots, weather towers, and other data collection facilities. We also incorporated a database of lightning strikes that can be used individually as repeatable ignition points or can be used as a group in Monte Carlo simulation exercises and in other randomization procedures. The assembled modeling system was subjected to sensitivity analyses and was validated against documented fires, including the Cerro Grande Fire. The resulting modeling system is a valuable tool for research and management. It also complements knowledge based on professional expertise and information gathered from other modeling technologies. However, the modeling system requires frequent updates of the input data layers to produce currently valid results, to adapt to changes in environmental conditions within the Los Alamos region, and to allow for the quick production of model outputs during emergency operations.

  6. EDMS Multi-year Validation Plan

    DOT National Transportation Integrated Search

    2001-06-01

    The Emissions and Dispersion Modeling System (EDMS) is the air quality model required for use on airport projects by the Federal Aviation Administration (FAA). This model has continued to be improved and recently has included several important enhanc...

  7. A European benchmarking system to evaluate in-hospital mortality rates in acute coronary syndrome: the EURHOBOP project.

    PubMed

    Dégano, Irene R; Subirana, Isaac; Torre, Marina; Grau, María; Vila, Joan; Fusco, Danilo; Kirchberger, Inge; Ferrières, Jean; Malmivaara, Antti; Azevedo, Ana; Meisinger, Christa; Bongard, Vanina; Farmakis, Dimitros; Davoli, Marina; Häkkinen, Unto; Araújo, Carla; Lekakis, John; Elosua, Roberto; Marrugat, Jaume

    2015-03-01

    Hospital performance models in acute myocardial infarction (AMI) are useful to assess patient management. While models are available for individual countries, mainly the US, cross-European performance models are lacking. Thus, we aimed to develop a system to benchmark European hospitals in AMI and percutaneous coronary intervention (PCI), based on predicted in-hospital mortality. We used the EURopean HOspital Benchmarking by Outcomes in ACS Processes (EURHOBOP) cohort to develop the models, which included 11,631 AMI patients and 8276 acute coronary syndrome (ACS) patients who underwent PCI. Models were validated with a cohort of 55,955 European ACS patients. Multilevel logistic regression was used to predict in-hospital mortality in European hospitals for AMI and PCI. Administrative and clinical models were constructed with patient- and hospital-level covariates, as well as hospital- and country-based random effects. Internal cross-validation and external validation showed good discrimination at the patient level and good calibration at the hospital level, based on the C-index (0.736-0.819) and the concordance correlation coefficient (55.4%-80.3%). Mortality ratios (MRs) showed excellent concordance between administrative and clinical models (97.5% for AMI and 91.6% for PCI). Exclusion of transfers and hospital stays ≤1 day did not affect in-hospital mortality prediction in sensitivity analyses, as shown by MR concordance (80.9%-85.4%). The models were used to develop a benchmarking system to compare in-hospital mortality rates of European hospitals with similar characteristics. The developed system, based on the EURHOBOP models, is a simple and reliable tool to compare in-hospital mortality rates between European hospitals in AMI and PCI.
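
    The multilevel structure described above (patient-level covariates with hospital- and country-level random intercepts) can be written in a generic hedged form, not the exact EURHOBOP specification, as

      \mathrm{logit}\,\Pr(y_{ijk}=1) = \mathbf{x}_{ijk}^{\top}\boldsymbol{\beta} + u_j + v_k,
      \qquad u_j \sim \mathcal{N}(0,\sigma_u^2), \quad v_k \sim \mathcal{N}(0,\sigma_v^2),

    where i indexes patients, j hospitals, and k countries; the estimated hospital intercepts u_j are what allow mortality comparisons between hospitals with similar characteristics.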

  8. Modeling demand for public transit services in rural areas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Attaluri, P.; Seneviratne, P.N.; Javid, M.

    1997-05-01

    Accurate estimates of demand are critical for planning, designing, and operating public transit systems. Previous research has demonstrated that the expected demand in rural areas is a function of both demographic and transit system variables. Numerous models have been proposed to describe the relationship between the aforementioned variables. However, most of them are site specific and their validity over time and space is not reported or perhaps has not been tested. Moreover, input variables in some cases are extremely difficult to quantify. In this article, the estimation of demand using the generalized linear modeling technique is discussed. Two separate models, one for fixed-route and another for demand-responsive services, are presented. These models, calibrated with data from systems in nine different states, are used to demonstrate the appropriateness and validity of generalized linear models compared to the regression models. They explain over 70% of the variation in expected demand for fixed-route services and 60% of the variation in expected demand for demand-responsive services. It was found that the models are spatially transferable and that data for calibration are easily obtainable.
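
    As a hedged sketch of the generalized linear modeling technique mentioned, the following fits a Poisson GLM with log link, one plausible GLM choice for count-valued ridership; the regressors and synthetic data are illustrative only, not the nine-state calibration data.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      population = rng.uniform(1, 20, 50)        # thousands served (illustrative)
      service_hours = rng.uniform(10, 80, 50)    # weekly revenue hours (illustrative)
      X = sm.add_constant(np.column_stack([np.log(population), np.log(service_hours)]))
      ridership = rng.poisson(np.exp(X @ np.array([2.0, 0.6, 0.4])))

      # Log link means coefficients act as elasticities of expected demand.
      model = sm.GLM(ridership, X, family=sm.families.Poisson()).fit()
      print(model.params)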

  9. Recovery Act. Development and Validation of an Advanced Stimulation Prediction Model for Enhanced Geothermal System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutierrez, Marte

    The research project aims to develop and validate an advanced computer model that can be used in the planning and design of stimulation techniques to create engineered reservoirs for Enhanced Geothermal Systems. The specific objectives of the proposal are to: 1) Develop a true three-dimensional hydro-thermal fracturing simulator that is particularly suited for EGS reservoir creation. 2) Perform laboratory scale model tests of hydraulic fracturing and proppant flow/transport using a polyaxial loading device, and use the laboratory results to test and validate the 3D simulator. 3) Perform discrete element/particulate modeling of proppant transport in hydraulic fractures, and use the results to improve understanding of proppant flow and transport. 4) Test and validate the 3D hydro-thermal fracturing simulator against case histories of EGS energy production. 5) Develop a plan to commercialize the 3D fracturing and proppant flow/transport simulator. The project is expected to yield several specific results and benefits. Major technical products from the proposal include: 1) A true-3D hydro-thermal fracturing computer code that is particularly suited to EGS, 2) Documented results of scale model tests on hydro-thermal fracturing and fracture propping in an analogue crystalline rock, 3) Documented procedures and results of discrete element/particulate modeling of flow and transport of proppants for EGS applications, and 4) Database of monitoring data, with focus on Acoustic Emissions (AE) from lab scale modeling and field case histories of EGS reservoir creation.

  10. Recovery Act. Development and Validation of an Advanced Stimulation Prediction Model for Enhanced Geothermal Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutierrez, Marte

    2013-12-31

    This research project aims to develop and validate an advanced computer model that can be used in the planning and design of stimulation techniques to create engineered reservoirs for Enhanced Geothermal Systems. The specific objectives of the proposal are to: develop a true three-dimensional hydro-thermal fracturing simulator that is particularly suited for EGS reservoir creation; perform laboratory scale model tests of hydraulic fracturing and proppant flow/transport using a polyaxial loading device, and use the laboratory results to test and validate the 3D simulator; perform discrete element/particulate modeling of proppant transport in hydraulic fractures, and use the results to improve understanding of proppant flow and transport; test and validate the 3D hydro-thermal fracturing simulator against case histories of EGS energy production; and develop a plan to commercialize the 3D fracturing and proppant flow/transport simulator. The project is expected to yield several specific results and benefits. Major technical products from the proposal include: a true-3D hydro-thermal fracturing computer code that is particularly suited to EGS; documented results of scale model tests on hydro-thermal fracturing and fracture propping in an analogue crystalline rock; documented procedures and results of discrete element/particulate modeling of flow and transport of proppants for EGS applications; and a database of monitoring data, with focus on Acoustic Emissions (AE) from lab scale modeling and field case histories of EGS reservoir creation.

  11. A new class of enhanced kinetic sampling methods for building Markov state models

    NASA Astrophysics Data System (ADS)

    Bhoutekar, Arti; Ghosh, Susmita; Bhattacharya, Swati; Chatterjee, Abhijit

    2017-10-01

    Markov state models (MSMs) and other related kinetic network models are frequently used to study the long-timescale dynamical behavior of biomolecular and materials systems. MSMs are often constructed bottom-up using brute-force molecular dynamics (MD) simulations when the model contains a large number of states and kinetic pathways that are not known a priori. However, the resulting network generally encompasses only parts of the configurational space, and regardless of any additional MD performed, several states and pathways will still remain missing. This implies that the duration for which the MSM can faithfully capture the true dynamics, which we term as the validity time for the MSM, is always finite and unfortunately much shorter than the MD time invested to construct the model. A general framework that relates the kinetic uncertainty in the model to the validity time, missing states and pathways, network topology, and statistical sampling is presented. Performing additional calculations for frequently-sampled states/pathways may not alter the MSM validity time. A new class of enhanced kinetic sampling techniques is introduced that aims at targeting rare states/pathways that contribute most to the uncertainty so that the validity time is boosted in an effective manner. Examples including straightforward 1D energy landscapes, lattice models, and biomolecular systems are provided to illustrate the application of the method. Developments presented here will be of interest to the kinetic Monte Carlo community as well.
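
    As a hedged illustration of the bottom-up MSM construction the abstract refers to, the sketch below estimates a transition matrix by row-normalized transition counts from a discretized trajectory; the four-state toy system is invented, and rarely visited states show why undersampled rows (and missing states) bound the model's validity time.

      import numpy as np

      rng = np.random.default_rng(3)
      n_states = 4
      true_T = np.array([[0.90, 0.10, 0.00, 0.00],
                         [0.05, 0.90, 0.05, 0.00],
                         [0.00, 0.10, 0.89, 0.01],
                         [0.00, 0.00, 0.05, 0.95]])

      # Generate a "brute-force MD" surrogate: a discrete trajectory.
      traj = [0]
      for _ in range(5000):
          traj.append(rng.choice(n_states, p=true_T[traj[-1]]))

      # MSM estimation: count transitions, then row-normalize.
      counts = np.zeros((n_states, n_states))
      for a, b in zip(traj[:-1], traj[1:]):
          counts[a, b] += 1
      row_sums = counts.sum(axis=1, keepdims=True)
      T_hat = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

      print(np.round(T_hat, 2))  # the rarely visited state has the noisiest row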

  12. Use case driven approach to develop simulation model for PCS of APR1400 simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong Wook, Kim; Hong Soo, Kim; Hyeon Tae, Kang

    2006-07-01

    The full-scope simulator is being developed to evaluate specific design features and to support the iterative design and validation in the Man-Machine Interface System (MMIS) design of the Advanced Power Reactor (APR) 1400. The simulator consists of the process model, control logic model, and MMI for the APR1400, as well as the Power Control System (PCS). In this paper, a use case driven approach is proposed to develop a simulation model for the PCS. In this approach, a system is considered from the point of view of its users. The user's view of the system is based on interactions with the system and the resultant responses. In the use case driven approach, we initially consider the system as a black box and look at its interactions with the users. From these interactions, use cases of the system are identified. Then the system is modeled using these use cases as functions. Lower levels expand the functionalities of each of these use cases. Hence, starting from the topmost level view of the system, we proceeded down to the lowest level (the internal view of the system). The model of the system thus developed is use case driven. This paper will introduce the functionality of the PCS simulation model, including a requirement analysis based on use cases and the validation results from the development of the PCS model. The PCS simulation model using use cases will first be used during full-scope simulator development for a nuclear power plant and will be supplied to the Shin-Kori 3 and 4 plants. Use case based simulation model development can be useful for the design and implementation of simulation models. (authors)

  13. Fire Detection Organizing Questions

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Verified models of fire precursor transport in low and partial gravity: a. Development of models for large-scale transport in reduced gravity. b. Validated CFD simulations of transport of fire precursors. c. Evaluation of the effect of scale on transport and reduced gravity fires. Advanced fire detection system for gaseous and particulate pre-fire and fire signatures: a. Quantification of pre-fire pyrolysis products in microgravity. b. Suite of gas and particulate sensors. c. Reduced gravity evaluation of candidate detector technologies. d. Reduced gravity verification of advanced fire detection system. e. Validated database of fire and pre-fire signatures in low and partial gravity.

  14. Investigation of Advanced Counterrotation Blade Configuration Concepts for High Speed Turboprop Systems. Task 8: Cooling Flow/heat Transfer Analysis

    NASA Technical Reports Server (NTRS)

    Hall, Edward J.; Topp, David A.; Heidegger, Nathan J.; Delaney, Robert A.

    1994-01-01

    The focus of this task was to validate the ADPAC code for heat transfer calculations. To accomplish this goal, the ADPAC code was modified to allow for a Cartesian coordinate system capability and to add boundary conditions to handle spanwise periodicity and transpiration boundaries. The primary validation case was the film-cooled C3X vane. The cooling hole modeling included both a porous region and a grid in each discrete hole. Predictions for these models, as well as for the smooth wall, compared well with the experimental data.

  15. Human performance measurement: Validation procedures applicable to advanced manned telescience systems

    NASA Technical Reports Server (NTRS)

    Haines, Richard F.

    1990-01-01

    As telescience systems become more and more complex, autonomous, and opaque to their operators, it becomes increasingly difficult to determine whether the total system is performing as it should. Some of the complex and interrelated human performance measurement issues are addressed as they relate to total system validation. The assumption is made that human interaction with the automated system will be required well into the Space Station Freedom era. Candidate human performance measurement-validation techniques are discussed for selected ground-to-space-to-ground and space-to-space situations. Most of these measures may be used in conjunction with an information throughput model presented elsewhere (Haines, 1990). Teleoperations, teleanalysis, teleplanning, teledesign, and teledocumentation are considered, as are selected illustrative examples of space-related telescience activities.

  16. Validation of the 1/12 degrees Arctic Cap Nowcast/Forecast System (ACNFS)

    DTIC Science & Technology

    2010-11-04

    IBM Power 6 (Davinci) at NAVOCEANO with a 2 hr time step for the ice model and a 30 min time step for the ocean model. All model boundaries are... run using 320 processors on the Navy DSRC IBM Power 6 (Davinci) at NAVOCEANO. A typical one-day hindcast takes approximately 1.0 wall clock hour... meter. As more observations become available, further studies of ice draft will be used as a validation tool. The IABP program archived 102 Argos

  18. A Model-based Approach to Controlling the ST-5 Constellation Lights-Out Using the GMSEC Message Bus and Simulink

    NASA Technical Reports Server (NTRS)

    Witt, Kenneth J.; Stanley, Jason; Shendock, Robert; Mandl, Daniel

    2005-01-01

    Space Technology 5 (ST-5) is a three-satellite constellation, technology validation mission under the New Millennium Program at NASA to be launched in March 2006. One of the key technologies to be validated is a lights-out, model-based operations approach to be used for one week to control the ST-5 constellation with no manual intervention. The ground architecture features the GSFC Mission Services Evolution Center (GMSEC) middleware, which allows easy plugging in of software components and a standardized messaging protocol over a software bus. A predictive modeling tool built on MATLAB's Simulink software package makes use of the GMSEC standard messaging protocol to interface to the Advanced Mission Planning System (AMPS) Scenario Scheduler, which controls all activities, resource allocation, and real-time re-profiling of constellation resources when non-nominal events occur. The key features of this system, which we refer to as the ST-5 Simulink system, are as follows: The original daily plan is checked to make sure that the predicted resources needed are available by comparing the plan against the model. As the plan is run in real time, the system re-profiles future activities if planned activities do not occur in the predicted timeframe or fashion. Alert messages are sent out on the GMSEC bus by the system if future predicted problems are detected, allowing the Scenario Scheduler to correct the situation before the problem happens. The predictive model is evolved automatically over time via telemetry updates, thus reducing the cost of implementing and maintaining the models by an order of magnitude from previous efforts at GSFC, such as the model-based system built for MAP in the mid-1990s. This paper will describe the key features, lessons learned, and implications for future missions once this system is successfully validated on-orbit in 2006.

  19. The Motivational Value Systems Questionnaire (MVSQ): Psychometric Analysis Using a Forced Choice Thurstonian IRT Model

    PubMed Central

    Merk, Josef; Schlotz, Wolff; Falter, Thomas

    2017-01-01

    This study presents a new measure of value systems, the Motivational Value Systems Questionnaire (MVSQ), which is based on a theory of value systems by psychologist Clare W. Graves. The purpose of the instrument is to help people identify their personal hierarchies of value systems and thus become more aware of what motivates and demotivates them in work-related contexts. The MVSQ is a forced-choice (FC) measure, making it quicker to complete and more difficult to intentionally distort, but also more difficult to assess its psychometric properties due to ipsativity of FC data compared to rating scales. To overcome limitations of ipsative data, a Thurstonian IRT (TIRT) model was fitted to the questionnaire data, based on a broad sample of N = 1,217 professionals and students. Comparison of normative (IRT) scale scores and ipsative scores suggested that MVSQ IRT scores are largely freed from restrictions due to ipsativity and thus allow interindividual comparison of scale scores. Empirical reliability was estimated using a sample-based simulation approach which showed acceptable and good estimates and, on average, slightly higher test-retest reliabilities. Further, validation studies provided evidence on both construct validity and criterion-related validity. Scale score correlations and associations of scores with both age and gender were largely in line with theoretically- and empirically-based expectations, and the results of a multitrait-multimethod analysis support convergent and discriminant construct validity. Criterion validity was assessed by examining the relation of value system preferences to departmental affiliation, which revealed significant relations in line with prior hypothesizing. These findings demonstrate the good psychometric properties of the MVSQ and support its application in the assessment of value systems in work-related contexts. PMID:28979228

  1. Convergent, discriminant, and criterion validity of DSM-5 traits.

    PubMed

    Yalch, Matthew M; Hopwood, Christopher J

    2016-10-01

    Section III of the Diagnostic and Statistical Manual of Mental Disorders (5th ed.; DSM-5; American Psychiatric Association, 2013) contains a system for diagnosing personality disorder based in part on assessing 25 maladaptive traits. Initial research suggests that this aspect of the system improves the validity and clinical utility of the Section II model. The Computer Adaptive Test of Personality Disorder (CAT-PD; Simms et al., 2011) contains many of the same traits as the DSM-5, as well as several additional traits seemingly not covered in the DSM-5. In this study we evaluate the convergent and discriminant validity between the DSM-5 traits, as assessed by the Personality Inventory for DSM-5 (PID-5; Krueger et al., 2012), and the CAT-PD in an undergraduate sample, and test whether traits included in the CAT-PD but not the DSM-5 provide incremental validity in association with clinically relevant criterion variables. Results supported the convergent and discriminant validity of the PID-5 and CAT-PD scales in their assessment of 23 out of 25 DSM-5 traits. DSM-5 traits were consistently associated with 11 criterion variables, despite our having intentionally selected clinically relevant criterion constructs not directly assessed by DSM-5 traits. However, the additional CAT-PD traits provided incremental information above and beyond the DSM-5 traits for all criterion variables examined. These findings support the validity of pathological trait models in general and the DSM-5 and CAT-PD models in particular, while also suggesting that the CAT-PD may include additional traits for consideration in future iterations of the DSM-5 system.

  2. OFMspert: An architecture for an operator's associate that evolves to an intelligent tutor

    NASA Technical Reports Server (NTRS)

    Mitchell, Christine M.

    1991-01-01

    With the emergence of new technology for both human-computer interaction and knowledge-based systems, a range of opportunities exists to enhance the effectiveness and efficiency of controllers of high-risk engineering systems. The design of an architecture for an operator's associate is described. This associate is a stand-alone model-based system designed to interact with operators of complex dynamic systems, such as airplanes, manned space systems, and satellite ground control systems, in ways comparable to those of a human assistant. The operator function model expert system (OFMspert) architecture and the design and empirical validation of OFMspert's understanding component are described. The design and validation of OFMspert's interactive and control components are also described. A description is provided of current work in which OFMspert serves as the foundation for an intelligent tutor that evolves into an assistant as operator expertise grows from novice to expert.

  3. A clinical prognostic model compared to the newly adopted UICC staging in an independent validation cohort of P16 negative/positive head and neck cancer patients.

    PubMed

    Rasmussen, Jacob H; Håkansson, Katrin; Rasmussen, Gregers B; Vogelius, Ivan R; Friborg, Jeppe; Fischer, Barbara M; Bentzen, Søren M; Specht, Lena

    2018-06-01

    A previously published prognostic model for patients with head and neck squamous cell carcinoma (HNSCC) was validated in both a p16-negative and a p16-positive independent patient cohort, and its performance was compared with the newly adopted 8th edition of the UICC staging system. Consecutive patients with HNSCC treated at a single institution from 2005 to 2012 were included. The cohort was divided into three: 1) a training cohort of patients treated from 2005 to 2009, excluding patients with p16-positive oropharyngeal squamous cell carcinomas (OPSCC); 2) a p16-negative validation cohort; and 3) a p16-positive validation cohort. A previously published prognostic model (clinical model) with the significant covariates (smoking status, FDG uptake, and tumor volume) was refitted in the training cohort and validated in the two validation cohorts. The clinical model was used to generate four risk groups based on the predicted risk of disease recurrence after 2 years, and its performance was compared with UICC staging, 8th edition, using the concordance index. Overall, 568 patients were included. Compared to UICC, the clinical model had a significantly better concordance index in the p16-negative validation cohort (AUC = 0.63 for UICC and AUC = 0.73 for the clinical model; p = 0.003) and a borderline significantly better concordance index in the p16-positive cohort (AUC = 0.63 for UICC and 0.72 for the clinical model; p = 0.088). The validated clinical model provided better prognostication of the risk of disease recurrence than UICC stage in the p16-negative validation cohort, and prognostication similar to the newly adopted 8th edition of the UICC staging in the p16-positive patient cohort.

  4. Development of a model for occipital fixation--validation of an analogue bone material.

    PubMed

    Mullett, H; O'Donnell, T; Felle, P; O'Rourke, K; FitzPatrick, D

    2002-01-01

    Several implant systems may be used to fuse the skull to the upper cervical spine (occipitocervical fusion). Current biomechanical evaluation is restricted by the limitations of human cadaveric specimens. This paper describes the design and validation of a synthetic testing model of the occipital bone. Data from thickness measurement and pull-out strength testing of a series of human cadaveric skulls was used in the design of a high-density rigid polyurethane foam model. The synthetic occipital model demonstrated repeatable and consistent morphological and biomechanical properties. The model provides a standardized environment for evaluation of occipital implants.

  5. Software risk management through independent verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Zhou, Tong C.; Wood, Ralph

    1995-01-01

    Software project managers need tools to estimate and track project goals in a continuous fashion before, during, and after development of a system. In addition, they need an ability to compare the current project status with past project profiles to validate management intuition, identify problems, and then direct appropriate resources to the sources of problems. This paper describes a measurement-based approach to calculating the risk inherent in meeting project goals that leverages past project metrics and existing estimation and tracking models. We introduce the IV&V Goal/Questions/Metrics model, explain its use in the software development life cycle, and describe our attempts to validate the model through the reverse engineering of existing projects.

  6. Validation of behave fire behavior predictions in oak savannas

    USGS Publications Warehouse

    Grabner, Keith W.; Dwyer, John; Cutter, Bruce E.

    1997-01-01

    Prescribed fire is a valuable tool in the restoration and management of oak savannas. BEHAVE, a fire behavior prediction system developed by the United States Forest Service, can be a useful tool when managing oak savannas with prescribed fire. BEHAVE predictions of fire rate-of-spread and flame length were validated using four standardized fuel models: Fuel Model 1 (short grass), Fuel Model 2 (timber and grass), Fuel Model 3 (tall grass), and Fuel Model 9 (hardwood litter). Also, a customized oak savanna fuel model (COSFM) was created and validated. Results indicate that standardized fuel model 2 and the COSFM reliably estimate mean rate-of-spread (MROS). The COSFM did not appreciably reduce MROS variation when compared to fuel model 2. Fuel models 1, 3, and 9 did not reliably predict MROS. Neither the standardized fuel models nor the COSFM adequately predicted flame lengths. We concluded that standardized fuel model 2 should be used with BEHAVE when predicting fire rates-of-spread in established oak savannas.

  7. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    NASA Astrophysics Data System (ADS)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  8. Development and validation of chemistry agnostic flow battery cost performance model and application to nonaqueous electrolyte systems: Chemistry agnostic flow battery cost performance model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, Alasdair; Thomsen, Edwin; Reed, David

    2016-04-20

    A chemistry agnostic cost performance model is described for a nonaqueous flow battery. The model predicts flow battery performance by estimating the active reaction zone thickness at each electrode as a function of current density, state of charge, and flow rate using measured data for electrode kinetics, electrolyte conductivity, and electrode-specific surface area. Validation of the model is conducted using data from a 4 kW stack at various current densities and flow rates. This model is used to estimate the performance of a nonaqueous flow battery with electrode and electrolyte properties taken from the literature. The optimized cost for this system is estimated for various power and energy levels using component costs provided by vendors. The model allows optimization of design parameters such as electrode thickness, area, and flow path design, and operating parameters such as power density, flow rate, and operating SOC range for various application duty cycles. A parametric analysis is done to identify the components and electrode/electrolyte properties with the highest impact on system cost for various application durations. A pathway to $100 kWh-1 for the storage system is identified.
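
    The cost roll-up behind such a model can be sketched generically: in a flow battery, stack and balance-of-plant costs scale with power while electrolyte cost scales with energy. The component prices below are placeholder assumptions, not the vendor data used in the study.

      # Hedged sketch of a chemistry-agnostic flow battery cost roll-up.
      def system_cost_per_kwh(power_kw, energy_kwh,
                              stack_cost_per_kw=900.0,        # assumed $/kW
                              bop_cost_per_kw=350.0,          # assumed $/kW
                              electrolyte_cost_per_kwh=80.0): # assumed $/kWh
          power_cost = (stack_cost_per_kw + bop_cost_per_kw) * power_kw
          energy_cost = electrolyte_cost_per_kwh * energy_kwh
          return (power_cost + energy_cost) / energy_kwh

      # Longer durations amortize power-related cost over more energy,
      # which is why cost per kWh falls with application duration.
      for hours in (2, 4, 10):
          print(hours, round(system_cost_per_kwh(1000.0, 1000.0 * hours), 1))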

  9. Surrogates for numerical simulations; optimization of eddy-promoter heat exchangers

    NASA Technical Reports Server (NTRS)

    Patera, Anthony T.

    1993-01-01

    Although the advent of fast and inexpensive parallel computers has rendered numerous previously intractable calculations feasible, many numerical simulations remain too resource-intensive to be directly inserted in engineering optimization efforts. An attractive alternative to direct insertion considers models for computational systems: the expensive simulation is evoked only to construct and validate a simplified input-output model; this simplified input-output model then serves as a simulation surrogate in subsequent engineering optimization studies. A simple 'Bayesian-validated' statistical framework for the construction, validation, and purposive application of static computer simulation surrogates is presented. As an example, dissipation-transport optimization of laminar-flow eddy-promoter heat exchangers is considered: parallel spectral element Navier-Stokes calculations serve to construct and validate surrogates for the flowrate and Nusselt number; these surrogates then represent the originating Navier-Stokes equations in the ensuing design process.

  10. Validation of Heat Transfer Thermal Decomposition and Container Pressurization of Polyurethane Foam.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, Sarah Nicole; Dodd, Amanda B.; Larsen, Marvin E.

    Polymer foam encapsulants provide mechanical, electrical, and thermal isolation in engineered systems. In fire environments, gas pressure from thermal decomposition of polymers can cause mechanical failure of sealed systems. In this work, a detailed uncertainty quantification study of PMDI-based polyurethane foam is presented to assess the validity of the computational model. Both experimental measurement uncertainty and model prediction uncertainty are examined and compared. Both the mean value method and the Latin hypercube sampling approach are used to propagate the uncertainty through the model. In addition to comparing computational and experimental results, the importance of each input parameter on the simulation result is also investigated. These results show that further development in the physics model of the foam and appropriate associated material testing are necessary to improve model accuracy.
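
    As a hedged sketch of one of the two propagation approaches named above, Latin hypercube sampling, the following uses SciPy's quasi-Monte Carlo module; the two-parameter "model" and the parameter ranges are stand-ins for the actual coupled heat-transfer/decomposition solver.

      import numpy as np
      from scipy.stats import qmc

      def model(conductivity, heat_of_reaction):
          # Placeholder for the foam decomposition/pressurization code.
          return 300.0 + 40.0 * conductivity - 0.002 * heat_of_reaction

      # Latin hypercube design over the unit square, scaled to assumed ranges.
      sampler = qmc.LatinHypercube(d=2, seed=4)
      unit = sampler.random(n=256)
      lo, hi = np.array([0.1, 1e4]), np.array([0.5, 5e4])
      samples = qmc.scale(unit, lo, hi)

      outputs = np.array([model(*s) for s in samples])
      # Output spread is the propagated model-input uncertainty, to be
      # compared against experimental measurement uncertainty.
      print(outputs.mean(), outputs.std())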

  11. Modeling and characterization of multipath in global navigation satellite system ranging signals

    NASA Astrophysics Data System (ADS)

    Weiss, Jan Peter

    The Global Positioning System (GPS) provides position, velocity, and time information to users anywhere near the Earth in real time and regardless of weather conditions. Since the system became operational, improvements in many areas have reduced systematic errors affecting GPS measurements such that multipath, defined as any signal taking a path other than the direct one, has become a significant, if not dominant, error source for many applications. This dissertation utilizes several approaches to characterize and model multipath errors in GPS measurements. Multipath errors in GPS ranging signals are characterized for several receiver systems and environments. Experimental P(Y) code multipath data are analyzed for ground stations with multipath levels ranging from minimal to severe, a C-12 turboprop, an F-18 jet, and an aircraft carrier. Comparisons between receivers utilizing single patch antennas and multi-element arrays are also made. In general, the results show significant reductions in multipath with antenna array processing, although large errors can occur even with this kind of equipment. Analysis of airborne platform multipath shows that the errors tend to be small in magnitude because the size of the aircraft limits the geometric delay of multipath signals, and high in frequency because aircraft dynamics cause rapid variations in geometric delay. A comprehensive multipath model is developed and validated. The model integrates 3D structure models, satellite ephemerides, electromagnetic ray-tracing algorithms, and detailed antenna and receiver models to predict multipath errors. Validation is performed by comparing experimental and simulated multipath via overall error statistics, per satellite time histories, and frequency content analysis. The validation environments include two urban buildings, an F-18, an aircraft carrier, and a rural area where terrain multipath dominates. The validated models are used to identify multipath sources, characterize signal properties, evaluate additional antenna and receiver tracking configurations, and estimate the reflection coefficients of multipath-producing surfaces. Dynamic models for an F-18 landing on an aircraft carrier correlate aircraft dynamics to multipath frequency content; the model also characterizes the separate contributions of multipath due to the aircraft, ship, and ocean to the overall error statistics. Finally, reflection coefficients for multipath produced by terrain are estimated via a least-squares algorithm.
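
    One standard piece of geometry behind these observations (a generic textbook relation, not a result specific to the dissertation): for an antenna at height h above a horizontal reflector and a satellite at elevation angle e, the reflected ray travels an extra path

      \delta = 2h\sin e,

    so the multipath error oscillates at a rate set by \dot{\delta} = 2h\dot{e}\cos e plus any platform motion. A small effective h on an aircraft keeps the geometric delay (and hence the error magnitude) small, while high platform dynamics make \delta vary quickly, consistent with the small-magnitude, high-frequency airborne multipath reported above.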

  12. Modeling and validation of photometric characteristics of space targets oriented to space-based observation.

    PubMed

    Wang, Hongyuan; Zhang, Wei; Dong, Aotuo

    2012-11-10

    A modeling and validation method for the photometric characteristics of space targets was presented in order to track and identify different satellites effectively. The background radiation characteristics models of the target were built based on blackbody radiation theory. The geometry characteristics of the target were illustrated by surface equations based on its body coordinate system. The material characteristics of the target surface were described by a bidirectional reflectance distribution function model, which accounts for Gaussian surface statistics and microscale self-shadowing and is obtained by measurement and modeling in advance. The contributing surfaces of the target to the observation system were determined by coordinate transformation according to the relative positions of the space-based target, the background radiation sources, and the observation platform. Then a mathematical model of the photometric characteristics of the space target was built by summing the reflection components of all the surfaces. Photometric characteristics simulation of the space-based target was achieved according to its given geometrical dimensions, physical parameters, and orbital parameters. Experimental validation was performed on a scale model of the satellite. The calculated results fit well with the measured results, which indicates that the modeling method for the photometric characteristics of the space target is correct.
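
    The final summation step can be written generically (a hedged facet-based form, not the paper's exact expression): with solar irradiance E_sun, facet areas A_k, BRDF f_r, and observer distance d, the irradiance received at the sensor is

      E = \frac{E_{\mathrm{sun}}}{d^{2}} \sum_{k \in \mathrm{visible}}
          f_r^{(k)}(\omega_i, \omega_o)\, \cos\theta_i^{(k)} \cos\theta_o^{(k)} A_k,

    where the sum runs over the contributing surfaces found by the coordinate transformation, and \theta_i^{(k)}, \theta_o^{(k)} are the local solar-incidence and observation angles of facet k.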

  13. Advanced simulation model for IPM motor drive with considering phase voltage and stator inductance

    NASA Astrophysics Data System (ADS)

    Lee, Dong-Myung; Park, Hyun-Jong; Lee, Ju

    2016-10-01

    This paper proposes an advanced simulation model of a driving system for Interior Permanent Magnet (IPM) BrushLess Direct Current (BLDC) motors driven by the 120-degree conduction method (two-phase conduction method, TPCM), which is widely used for sensorless control of BLDC motors. BLDC motors can be classified as SPM (Surface mounted Permanent Magnet) and IPM motors. The simulation model of a driving system with SPM motors is simple, because the stator inductance is constant regardless of the rotor position. Simulation models of SPM motor driving systems have been proposed in many studies. On the other hand, simulation models for IPM driving systems built with graphic-based simulation tools such as Matlab/Simulink have not been proposed. Simulation study of IPM driving systems with TPCM is complex because the stator inductances of an IPM vary with the rotor position, as the permanent magnets are embedded in the rotor. To develop sensorless schemes or improve control performance, development of control algorithms through simulation study is essential, and a simulation model that accurately reflects the characteristics of the IPM is required. Therefore, this paper presents an advanced simulation model of an IPM driving system, which takes into account the unique characteristic of the IPM due to its position-dependent inductances. The proposed simulation model is confirmed by comparison with experimental and simulation results using an IPM with the TPCM control scheme.
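
    The position dependence that complicates the IPM model enters through the stator inductances. In standard two-pole machine notation (a textbook form, not necessarily the paper's), the phase-a self-inductance is

      L_{aa}(\theta_r) = L_{\ell s} + L_A - L_B \cos 2\theta_r,

    with the b- and c-phase expressions shifted by \pm 2\pi/3. The saliency term L_B vanishes for an SPM rotor, which is why an SPM simulation model can use constant inductances while an IPM model must re-evaluate the inductance matrix at each rotor position.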

  14. International System and Foreign Policy Approaches: Implications for Conflict Modelling and Management

    DTIC Science & Technology

    tool for conflict management, a preliminary version of which is the Computer Aided Conflict Information System. Using expert judgments to describe... 1961. The combined model is more relevant during the crisis phase. The results have implications for conflict modelling. With respect to conflict... management, there is an important implication. Since the organizational processes model may be more valid than an event interaction model, then conflict

  15. Liquid Oxygen/Liquid Methane Integrated Propulsion System Test Bed

    NASA Technical Reports Server (NTRS)

    Flynn, Howard; Lusby, Brian; Villemarette, Mark

    2011-01-01

    In support of NASA's Propulsion and Cryogenic Advanced Development (PCAD) project, a liquid oxygen (LO2)/liquid methane (LCH4) Integrated Propulsion System Test Bed (IPSTB) was designed and advanced to the Critical Design Review (CDR) stage at the Johnson Space Center. The IPSTB's primary objectives are to study LO2/LCH4 propulsion system steady state and transient performance and operational characteristics, and to validate fluid and thermal models of a LO2/LCH4 propulsion system for use in future flight design work. Two-phase thermal and dynamic fluid flow models of the IPSTB were built to predict the system performance characteristics under a variety of operating modes and to aid in the overall system design work. While at ambient temperature and simulated altitude conditions at the White Sands Test Facility, the IPSTB and its approximately 600 channels of system instrumentation would be operated to perform a variety of integrated main engine and reaction control engine hot fire tests. The pressure, temperature, and flow rate data collected during this testing would then be used to validate the analytical models of the IPSTB's thermal and dynamic fluid flow performance. An overview of the IPSTB design and analytical model development will be presented.

  16. Validation of the BASALT model for simulating off-axis hydrothermal circulation in oceanic crust

    NASA Astrophysics Data System (ADS)

    Farahat, Navah X.; Archer, David; Abbot, Dorian S.

    2017-08-01

    Fluid recharge and discharge between the deep ocean and the porous upper layer of off-axis oceanic crust tends to concentrate in small volumes of rock, such as seamounts and fractures, that are unimpeded by low-permeability sediments. Basement structure, sediment burial, heat flow, and other regional characteristics of off-axis hydrothermal systems appear to produce considerable diversity of circulation behaviors. Circulation of seawater and seawater-derived fluids controls the extent of fluid-rock interaction, resulting in significant geochemical impacts. However, the primary regional characteristics that control how seawater is distributed within upper oceanic crust are still poorly understood. In this paper we present the details of the two-dimensional (2-D) BASALT (Basement Activity Simulated At Low Temperatures) numerical model of heat and fluid transport in an off-axis hydrothermal system. This model is designed to simulate a wide range of conditions in order to explore the dominant controls on circulation. We validate the BASALT model's ability to reproduce observations by configuring it to represent a thoroughly studied transect of the Juan de Fuca Ridge eastern flank. The results demonstrate that including series of narrow, ridge-parallel fractures as subgrid features produces a realistic circulation scenario at the validation site. In future projects, a full reactive transport version of the validated BASALT model will be used to explore geochemical fluxes in a variety of off-axis hydrothermal environments.

  17. Improving Water Level and Soil Moisture Over Peatlands in a Global Land Modeling System

    NASA Technical Reports Server (NTRS)

    Bechtold, M.; De Lannoy, G. J. M.; Roose, D.; Reichle, R. H.; Koster, R. D.; Mahanama, S. P.

    2017-01-01

    A new model structure for peatlands results in improved skill metrics, without any parameter calibration. Simulated surface soil moisture is strongly affected by the new model, but reliable soil moisture data are lacking for validation.

  18. Development of a Twin-spool Turbofan Engine Simulation Using the Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS)

    NASA Technical Reports Server (NTRS)

    Zinnecker, Alicia M.; Chapman, Jeffryes W.; Lavelle, Thomas M.; Litt, Jonathan S.

    2014-01-01

    The Toolbox for Modeling and Analysis of Thermodynamic Systems (T-MATS) is a tool that has been developed to allow a user to build custom models of systems governed by thermodynamic principles using a template to model each basic process. Validation of this tool in an engine model application was performed through reconstruction of the Commercial Modular Aero-Propulsion System Simulation (C-MAPSS) (v2) using the building blocks from the T-MATS (v1) library. In order to match the two engine models, it was necessary to address differences in several assumptions made in the two modeling approaches. After these modifications were made, validation of the engine model continued by integrating both a steady-state and dynamic iterative solver with the engine plant and comparing results from steady-state and transient simulation of the T-MATS and C-MAPSS models. The results show that the T-MATS engine model was accurate to within 3 percent of the C-MAPSS model, with the inaccuracy attributed to the increased dimension of the iterative solver solution space required by the engine model constructed using the T-MATS library. This demonstrates that, given an understanding of the modeling assumptions made in T-MATS and a baseline model, the T-MATS tool provides a viable option for constructing a computational model of a twin-spool turbofan engine that may be used in simulation studies.
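
    T-MATS couples component blocks to an iterative solver that drives flow and work mismatches to zero. As a hedged illustration of the underlying technique, the sketch below runs a generic Newton-Raphson on a toy two-residual "engine"; the residual equations are invented stand-ins, not actual T-MATS or C-MAPSS relations.

      import numpy as np

      def residuals(u):
          shaft_speed, pressure_ratio = u
          r1 = 0.9 * shaft_speed**2 - pressure_ratio - 1.0   # toy flow match
          r2 = pressure_ratio**1.5 - shaft_speed - 0.5       # toy work balance
          return np.array([r1, r2])

      def jacobian(u, eps=1e-6):
          # Finite-difference Jacobian, as a generic solver might form it.
          J = np.zeros((2, 2))
          base = residuals(u)
          for j in range(2):
              du = u.copy()
              du[j] += eps
              J[:, j] = (residuals(du) - base) / eps
          return J

      u = np.array([1.0, 1.0])   # initial guess for the solver states
      for _ in range(20):
          step = np.linalg.solve(jacobian(u), -residuals(u))
          u += step
          if np.linalg.norm(step) < 1e-10:
              break
      print(u, residuals(u))     # converged states, residuals near zero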

  19. Development of a Three-Dimensional, Unstructured Material Response Design Tool

    NASA Technical Reports Server (NTRS)

    Schulz, Joseph C.; Stern, Eric C.; Muppidi, Suman; Palmer, Grant E.; Schroeder, Olivia

    2017-01-01

    A preliminary verification and validation of a new material response model is presented. This model, Icarus, is intended to serve as a design tool for the thermal protection systems of re-entry vehicles. Currently, the capability of the model is limited to simulating the pyrolysis of a material as a result of the radiative and convective surface heating imposed on the material from the surrounding high enthalpy gas. Since the major focus behind the development of Icarus has been model extensibility, the hope is that additional physics can be quickly added. This extensibility is critical since thermal protection systems are becoming increasingly complex, e.g., woven carbon polymers. Additionally, as a three-dimensional, unstructured, finite-volume model, Icarus is capable of modeling complex geometries. In this paper, the mathematical and numerical formulation is presented, followed by a discussion of the software architecture and some preliminary verification and validation studies.

  20. Formal Methods for Automated Diagnosis of Autosub 6000

    NASA Technical Reports Server (NTRS)

    Ernits, Juhan; Dearden, Richard; Pebody, Miles

    2009-01-01

    This is a progress report on applying formal methods in the context of building an automated diagnosis and recovery system for Autosub 6000, an Autonomous Underwater Vehicle (AUV). The diagnosis task involves building abstract models of the control system of the AUV. The diagnosis engine is based on Livingstone 2, a model-based diagnoser originally built for aerospace applications. Large parts of the diagnosis model can be built without concrete knowledge about each mission, but actual mission scripts and configuration parameters that carry important information for diagnosis are changed for every mission. Thus we use formal methods for generating the mission control part of the diagnosis model automatically from the mission script and perform a number of invariant checks to validate the configuration. After the diagnosis model is augmented with the generated mission control component model, it needs to be validated using verification techniques.

  1. Variational Iterative Refinement Source Term Estimation Algorithm Assessment for Rural and Urban Environments

    NASA Astrophysics Data System (ADS)

    Delle Monache, L.; Rodriguez, L. M.; Meech, S.; Hahn, D.; Betancourt, T.; Steinhoff, D.

    2016-12-01

    It is necessary to accurately estimate the initial source characteristics in the event of an accidental or intentional release of a Chemical, Biological, Radiological, or Nuclear (CBRN) agent into the atmosphere. Accurate estimation of the source characteristics is important because they are often unknown, and Atmospheric Transport and Dispersion (AT&D) models rely heavily on these estimates to create hazard assessments. To correctly assess the source characteristics in an operational environment where time is critical, the National Center for Atmospheric Research (NCAR) has developed a Source Term Estimation (STE) method known as the Variational Iterative Refinement STE algorithm (VIRSA). VIRSA consists of a combination of modeling systems: an AT&D model, its corresponding STE model, a Hybrid Lagrangian-Eulerian Plume Model (H-LEPM), and its mathematical adjoint model. In an operational scenario where we have information regarding the infrastructure of a city, the AT&D model used is the Urban Dispersion Model (UDM), and when using this model in VIRSA we refer to the system as uVIRSA. In all other scenarios, where the city infrastructure information is not readily available, the AT&D model used is the Second-order Closure Integrated PUFF model (SCIPUFF) and the system is referred to as sVIRSA. VIRSA was originally developed using SCIPUFF 2.4 for the Defense Threat Reduction Agency and integrated into the Hazard Prediction and Assessment Capability and Joint Program for Information Systems Joint Effects Model. The results discussed here are the verification and validation of the upgraded system with SCIPUFF 3.0 and the newly implemented UDM capability. To verify uVIRSA and sVIRSA, synthetic concentration observation scenarios were created in urban and rural environments, and the results of this verification are shown. Finally, we validate the STE performance of uVIRSA using scenarios from the Joint Urban 2003 (JU03) experiment held in Oklahoma City, and the performance of sVIRSA using scenarios from the FUsing Sensor Integrated Observing Network (FUSION) Field Trial 2007 (FFT07), held at Dugway Proving Ground in rural Utah.
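
    To make the variational idea concrete, the sketch below inverts a toy source term by minimizing a least-squares misfit between modeled and observed concentrations. It is not VIRSA: the Gaussian-plume forward model, receptor layout, and all parameter values are illustrative assumptions, but it shows why STE is tractable when the forward model is linear in source strength.

    ```python
    """Toy illustration of variational source-term estimation (STE).

    This is NOT VIRSA; it is a minimal sketch of the underlying idea:
    a forward dispersion model is evaluated iteratively and a cost
    function measuring model/observation misfit is minimized. The
    Gaussian-plume forward model and all values are assumptions.
    """
    import numpy as np

    def plume(src_xy, q, rec, u=5.0):
        """Ground-level Gaussian plume concentration at receptor points."""
        dx = rec[:, 0] - src_xy[0]
        dy = rec[:, 1] - src_xy[1]
        dx = np.maximum(dx, 1.0)              # receptors downwind only
        sig_y, sig_z = 0.08 * dx, 0.06 * dx   # crude dispersion coefficients
        return q / (np.pi * u * sig_y * sig_z) * np.exp(-dy**2 / (2 * sig_y**2))

    # Synthetic observations from a "true" source, mimicking the paper's
    # synthetic-scenario verification step.
    rng = np.random.default_rng(0)
    receptors = rng.uniform([200, -300], [2000, 300], size=(40, 2))
    c_obs = plume((0.0, 50.0), 2.0, receptors) * rng.lognormal(0, 0.1, 40)

    # Grid search over location; strength enters linearly, so its
    # least-squares estimate is closed-form at each candidate location.
    best = None
    for y0 in np.linspace(-100, 100, 41):
        a = plume((0.0, y0), 1.0, receptors)      # response per unit strength
        q_hat = a @ c_obs / (a @ a)               # linear least squares
        cost = np.sum((q_hat * a - c_obs) ** 2)   # misfit to be minimized
        if best is None or cost < best[0]:
            best = (cost, y0, q_hat)

    print(f"estimated y0={best[1]:.1f} m, q={best[2]:.2f} (truth: 50 m, 2.0)")
    ```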

  2. Modeling of GIC Impacts in Different Time Scales, and Validation with Measurement Data

    NASA Astrophysics Data System (ADS)

    Shetye, K.; Birchfield, A.; Overbye, T. J.; Gannon, J. L.

    2016-12-01

    Geomagnetically induced currents (GICs) have mostly been associated with geomagnetic disturbances (GMDs) originating from natural events such as solar coronal mass ejections. There is another, man-made, phenomenon that can induce GICs in the bulk power grid. Detonation of nuclear devices at high altitudes can give rise to electromagnetic pulses (EMPs) that induce electric fields at the earth's surface. EMPs cause three types of waves on different time scales, the slowest of which, E3, can induce GICs similar to the way GMDs do. The key difference between GMDs and EMPs is the rise time of the associated electric field. E3 electric fields are in the millisecond-to-second range, whereas GMD electric fields are slower (seconds to minutes). Similarly, the power grid and its components also operate and respond to disturbances in various time frames, from electromagnetic transients (e.g., lightning propagation) in the microsecond range to steady-state power flow (hours). Hence, different power system component models need to be used to analyze the impacts of GICs caused by GMDs and EMPs. For instance, for the slower GMD-based GICs, a steady-state (static) analysis of the system is sufficient. That is, one does not need to model the dynamic components of a power system, such as the rotating machine of a generator, or generator controls such as exciters. The latter become important in the case of an E3 EMP wave, which falls in the power system transient stability time frame of milliseconds to seconds. This talk will first give an overview of the different time scales and models associated with power system operations, and where GMDs and EMPs fit in. This is helpful for developing appropriate system models and test systems for analyzing impacts of GICs from various sources, and for developing mitigation measures. Example test systems developed for GMD and EMP analysis, and their key modeling and analysis differences, will be presented. After the modeling is discussed, results of validating simulated GICs against GIC measurements from a utility for a recent moderate GMD event will be shown, using NSF EarthScope-derived electric fields. The end goal is to validate 1) the power system models used for GICs, and 2) the ground models, to see whether 3D ground models provide better results than the hitherto-used 1D ground models.
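
    As a minimal illustration of the quasi-static treatment that suffices for the slower GMD-driven GICs, the sketch below integrates a uniform geoelectric field along a single transmission line and divides the resulting dc voltage by the loop resistance; the field, line geometry, and resistances are assumed values, not data from the talk.

    ```python
    """Minimal sketch of quasi-dc GIC estimation for one transmission line.

    The geoelectric field E is integrated along the line to get a dc
    driving voltage, which is divided by the dc loop resistance.
    All numbers are illustrative assumptions.
    """
    import numpy as np

    E = np.array([1.0, 0.5])          # geoelectric field, V/km (east, north)
    line = np.array([100.0, 20.0])    # line displacement, km (east, north)

    V = E @ line                      # induced series voltage, volts
    R_loop = 3.0 + 0.5 + 0.5          # line + two grounding resistances, ohms
    gic = V / R_loop                  # quasi-dc GIC around the loop, amperes
    print(f"V = {V:.1f} V, GIC = {gic:.1f} A")
    ```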

  3. Modeling effectiveness of management practices for flood mitigation using GIS spatial analysis functions in Upper Ciliwung watershed

    NASA Astrophysics Data System (ADS)

    Darma Tarigan, Suria

    2016-01-01

    Flooding is caused by excessive rainfall flowing downstream as cumulative surface runoff. A flooding event is the result of complex interaction of natural system components such as rainfall events, land use, soil, topography, and channel characteristics. Modeling a flooding event as the result of the interaction of those components is a central theme in watershed management, and such models are usually used to test the performance of various management practices in flood mitigation, both vegetative and structural. Existing hydrological models such as SWAT and HEC-HMS have limited ability to accommodate discrete management practices such as infiltration wells, small farm reservoirs, and silt pits, due to their lumped structure. The aim of this research is to use the raster spatial analysis functions of a Geo-Information System (RGIS-HM) to model flooding events in the Ciliwung watershed and to simulate the impact of discrete management practices on surface runoff reduction. The model was validated against the flood event of 29 January 2004 in the Ciliwung watershed, for which hourly hydrograph and rainfall data were available. The validation gave a good result, with a Nash-Sutcliffe efficiency of 0.8. We also compared the RGIS-HM with the Netlogo Hydrological Model (NL-HM); the RGIS-HM has similar capability to the NL-HM in simulating discrete management practices at the watershed scale.
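
    The Nash-Sutcliffe efficiency cited above is a standard goodness-of-fit measure for hydrological models; a minimal sketch of its computation, with a hypothetical hourly hydrograph, follows.

    ```python
    import numpy as np

    def nash_sutcliffe(obs, sim):
        """Nash-Sutcliffe model efficiency: 1 is a perfect fit, 0 means the
        model predicts no better than the mean of the observations."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    # Hypothetical hourly discharge (m^3/s) for a single flood event
    observed  = [10, 35, 120, 210, 180, 110, 60, 30]
    simulated = [12, 40, 100, 220, 170, 120, 55, 28]
    print(f"NSE = {nash_sutcliffe(observed, simulated):.2f}")
    ```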

  4. Enhanced TCAS 2/CDTI traffic Sensor digital simulation model and program description

    NASA Technical Reports Server (NTRS)

    Goka, T.

    1984-01-01

    Digital simulation models of enhanced TCAS 2/CDTI traffic sensors are developed, based on actual or projected operational and performance characteristics. Two enhanced Traffic (or Threat) Alert and Collision Avoidance Systems are considered. A digital simulation program is developed in FORTRAN. The program contains an executive with a semireal-time batch processing capability, and can be interfaced with other modules with minimal requirements. Both the traffic sensor and CAS logic modules are validated by means of extensive simulation runs. Selected validation cases are discussed in detail, and capabilities and limitations of the actual and simulated systems are noted. The TCAS systems are not specifically intended for Cockpit Display of Traffic Information (CDTI) applications, but they are sufficiently general to allow implementation of CDTI functions within the real systems' constraints.

  5. Comparison of LIDAR system performance for alternative single-mode receiver architectures: modeling and experimental validation

    NASA Astrophysics Data System (ADS)

    Toliver, Paul; Ozdur, Ibrahim; Agarwal, Anjali; Woodward, T. K.

    2013-05-01

    In this paper, we describe a detailed performance comparison of alternative single-pixel, single-mode LIDAR architectures including (i) linear-mode APD-based direct detection, (ii) an optically preamplified PIN receiver, (iii) PIN-based coherent detection, and (iv) Geiger-mode single-photon-APD counting. Such a comparison is useful when considering next-generation LIDAR on a chip, which would allow one to leverage extensive waveguide-based structures and processing elements developed for telecom and apply them to small form-factor sensing applications. Models of four LIDAR transmit and receive systems are described in detail, which include not only the dominant sources of receiver noise commonly assumed in each of the four detection limits, but also additional noise terms present in realistic implementations. These receiver models are validated through the analysis of detection statistics collected from an experimental LIDAR testbed. The receiver is reconfigurable into four modes of operation, while transmit waveforms and channel characteristics are held constant. The use of a diffuse hard target highlights the importance of including speckle noise terms in the overall system analysis. All measurements are done at 1550 nm, which offers multiple system advantages including less stringent eye safety requirements and compatibility with available telecom components, optical amplification, and photonic integration. Ultimately, the experimentally validated detection statistics can be used as part of an end-to-end system model for projecting the rate, range, and resolution performance limits and tradeoffs of alternative integrated LIDAR architectures.

  6. Towards Verification and Validation for Increased Autonomy

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra

    2017-01-01

    This presentation goes over the work we have performed over the last few years on verification and validation of the next-generation onboard collision avoidance system, ACAS X, for commercial aircraft. It describes our work on probabilistic verification and synthesis of the model that ACAS X is based on, and goes on to the validation of that model with respect to actual simulation and flight data. The presentation then identifies the characteristics of ACAS X that are related to autonomy and discusses the challenges that autonomy poses for V&V. All work presented has already been published.

  7. NASA GPM GV Science Implementation

    NASA Technical Reports Server (NTRS)

    Petersen, W. A.

    2009-01-01

    Pre-launch algorithm development & post-launch product evaluation: The GPM GV paradigm moves beyond traditional direct validation/comparison activities by incorporating improved algorithm physics & model applications (end-to-end validation) in the validation process. Three approaches: 1) National Network (surface): Operational networks to identify and resolve first-order discrepancies (e.g., bias) between satellite and ground-based precipitation estimates. 2) Physical Process (vertical column): Cloud system and microphysical studies geared toward testing and refinement of physically-based retrieval algorithms. 3) Integrated (4-dimensional): Integration of satellite precipitation products into coupled prediction models to evaluate strengths/limitations of satellite precipitation products.

  8. Assessing the performance of community-available global MHD models using key system parameters and empirical relationships

    NASA Astrophysics Data System (ADS)

    Gordeev, E.; Sergeev, V.; Honkonen, I.; Kuznetsova, M.; Rastätter, L.; Palmroth, M.; Janhunen, P.; Tóth, G.; Lyon, J.; Wiltberger, M.

    2015-12-01

    Global magnetohydrodynamic (MHD) modeling is a powerful tool in space weather research and predictions. There are several advanced and still developing global MHD (GMHD) models that are publicly available via the Community Coordinated Modeling Center's (CCMC) Run on Request system, which allows users to simulate the magnetospheric response to different solar wind conditions, including extraordinary events like geomagnetic storms. Systematic validation of GMHD models against observations continues to be a challenge, as does comparative benchmarking of different models against each other. In this paper we describe and test a new approach in which (i) a set of critical large-scale system parameters is explored/tested, which are produced by (ii) a specially designed set of computer runs that simulate realistic statistical distributions of critical solar wind parameters and are compared to (iii) observation-based empirical relationships for these parameters. Being tested in approximately similar conditions (similar inputs, comparable grid resolution, etc.), the four models publicly available at the CCMC predict rather well the absolute values and variations of those key parameters (magnetospheric size, magnetic field, and pressure) that are directly related to the large-scale magnetospheric equilibrium in the outer magnetosphere, for which MHD is supposed to be a valid approach. At the same time, the models have systematic differences in other parameters, being especially different in predicting the global convection rate, total field-aligned current, and magnetic flux loading into the magnetotail after a north-south interplanetary magnetic field turning. According to the validation results, none of the models emerges as an absolute leader. The new approach suggested for evaluating model performance against reality may be used by model users while planning their investigations, as well as by model developers and those interested in quantitatively evaluating progress in magnetospheric modeling.
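
    As one example of the kind of observation-based relationship used for such benchmarking, the sketch below computes the magnetopause standoff distance from pressure balance between a compressed dipole field and solar-wind ram pressure. This is a simplified textbook relation chosen for illustration, not necessarily one of the paper's empirical relationships.

    ```python
    """Pressure-balance estimate of magnetopause standoff distance, one of
    the large-scale 'magnetospheric size' parameters compared between GMHD
    runs and empirical relations. Simplified textbook form."""
    import numpy as np

    MU0, B0 = 4e-7 * np.pi, 3.1e-5     # SI; B0 = equatorial dipole field, T

    def standoff_re(n_cm3, v_kms, f=2.0):
        """Standoff r (in Earth radii) where the compressed dipole pressure
        (f*B0*(RE/r)^3)^2 / (2*mu0) balances ram pressure rho*v^2."""
        rho = n_cm3 * 1e6 * 1.67e-27    # proton mass density, kg/m^3
        pdyn = rho * (v_kms * 1e3) ** 2
        return (f**2 * B0**2 / (2 * MU0 * pdyn)) ** (1 / 6)

    print(f"r_mp ~ {standoff_re(5.0, 400.0):.1f} RE for typical solar wind")
    ```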

  9. Calibration and Validation of the COCOMO II.1997.0 Cost/Schedule Estimating Model to the Space and Missile Systems Center Database

    DTIC Science & Technology

    1997-09-01

    Daly chose five models (REVIC, PRICE-S, SEER, System-4, and SPQR/20) to estimate schedule for 21 separate projects from the Electronic System Division... PRICE-S, two variants of COCOMO, System-3, SPQR/20, SASET, SoftCost-Ada) to eight Ada-specific programs. Ada was specifically designed for and is

  10. Using Maxwell's Demon to Tame the "Devil in the Details" that are Encountered During System Development

    NASA Technical Reports Server (NTRS)

    Richardson, David

    2018-01-01

    Model-Based Systems Engineering (MBSE) is the formalized application of modeling to support system requirements, design, analysis, verification and validation activities beginning in the conceptual design phase and continuing throughout development and later life cycle phases. This presentation will discuss the value proposition that MBSE has for Systems Engineering, and the associated culture change needed to adopt it.

  11. A quantitative dynamic systems model of health-related quality of life among older adults

    PubMed Central

    Roppolo, Mattia; Kunnen, E Saskia; van Geert, Paul L; Mulasso, Anna; Rabaglietti, Emanuela

    2015-01-01

    Health-related quality of life (HRQOL) is a person-centered concept. The analysis of HRQOL is highly relevant in the aged population, which is generally suffering from health decline. Starting from a conceptual dynamic systems model that describes the development of HRQOL in individuals over time, this study aims to develop and test a quantitative dynamic systems model, in order to reveal the possible dynamic trends of HRQOL among older adults. The model is tested in different ways: first, with a calibration procedure to test whether the model produces theoretically plausible results, and second, with a preliminary validation procedure using empirical data of 194 older adults. This first validation tested the prediction that given a particular starting point (first empirical data point), the model will generate dynamic trajectories that lead to the observed endpoint (second empirical data point). The analyses reveal that the quantitative model produces theoretically plausible trajectories, thus providing support for the calibration procedure. Furthermore, the analyses of validation show a good fit between empirical and simulated data. In fact, no differences were found in the comparison between empirical and simulated final data for the same subgroup of participants, whereas the comparison between different subgroups of people resulted in significant differences. These data provide an initial basis of evidence for the dynamic nature of HRQOL during the aging process. Therefore, these data may give new theoretical and applied insights into the study of HRQOL and its development with time in the aging population. PMID:26604722

  12. Validation of periodontitis screening model using sociodemographic, systemic, and molecular information in a Korean population.

    PubMed

    Kim, Hyun-Duck; Sukhbaatar, Munkhzaya; Shin, Myungseop; Ahn, Yoo-Been; Yoo, Wook-Sung

    2014-12-01

    This study aims to evaluate and validate a periodontitis screening model that includes sociodemographic, metabolic syndrome (MetS), and molecular information, including gingival crevicular fluid (GCF), matrix metalloproteinase (MMP), and blood cytokines. The authors selected 506 participants from the Shiwha-Banwol cohort: 322 participants from the 2005 cohort for deriving the screening model and 184 participants from the 2007 cohort for its validation. Periodontitis was assessed by dentists using the community periodontal index. Interleukin (IL)-6, IL-8, and tumor necrosis factor-α in blood and MMP-8, -9, and -13 in GCF were assayed using enzyme-linked immunosorbent assay. MetS was assessed by physicians using physical examination and blood laboratory data. Information about age, sex, income, smoking, and drinking was obtained by interview. Logistic regression analysis was applied to finalize the best-fitting model and validate the model using sensitivity, specificity, and c-statistics. The derived model for periodontitis screening had a sensitivity of 0.73, specificity of 0.85, and c-statistic of 0.86 (P <0.001); those of the validated model were 0.64, 0.91, and 0.83 (P <0.001), respectively. The model that included age, sex, income, smoking, drinking, and blood and GCF biomarkers could be useful in screening for periodontitis. A future prospective study is indicated for evaluating this model's ability to predict the occurrence of periodontitis.
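
    A minimal sketch of the derive-then-validate workflow described above, using scikit-learn; the cohort sizes match the abstract, but the features and outcome below are synthetic placeholders, not the study's data.

    ```python
    """Sketch of deriving a screening model on one cohort and validating it
    on another, reporting sensitivity, specificity, and the c-statistic."""
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    X_2005, X_2007 = rng.normal(size=(322, 6)), rng.normal(size=(184, 6))
    y_2005 = (X_2005[:, :3].sum(1) + rng.normal(size=322)) > 0  # synthetic outcome
    y_2007 = (X_2007[:, :3].sum(1) + rng.normal(size=184)) > 0

    model = LogisticRegression().fit(X_2005, y_2005)            # derivation cohort
    p = model.predict_proba(X_2007)[:, 1]                       # validation cohort
    pred = p >= 0.5
    sens = (pred & y_2007).sum() / y_2007.sum()
    spec = (~pred & ~y_2007).sum() / (~y_2007).sum()
    print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, "
          f"c-statistic={roc_auc_score(y_2007, p):.2f}")
    ```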

  13. Validating an Air Traffic Management Concept of Operation Using Statistical Modeling

    NASA Technical Reports Server (NTRS)

    He, Yuning; Davies, Misty Dawn

    2013-01-01

    Validating a concept of operation for a complex, safety-critical system (like the National Airspace System) is challenging because of the high dimensionality of the controllable parameters and the infinite number of states of the system. In this paper, we use statistical modeling techniques to explore the behavior of a conflict detection and resolution algorithm designed for the terminal airspace. These techniques predict the robustness of the system simulation to both nominal and off-nominal behaviors within the overall airspace. They also can be used to evaluate the output of the simulation against recorded airspace data. Additionally, the techniques carry with them a mathematical measure of the worth of each prediction: a statistical uncertainty for any robustness estimate. Uncertainty Quantification (UQ) is the process of quantitative characterization and, ultimately, reduction of uncertainties in complex systems. UQ is important for understanding the influence of uncertainties on the behavior of a system and therefore is valuable for design, analysis, and verification and validation. In this paper, we apply advanced statistical modeling methodologies and techniques to an advanced air traffic management system, namely the Terminal Tactical Separation Assured Flight Environment (T-TSAFE). We show initial results for a parameter analysis and safety boundary (envelope) detection in the high-dimensional parameter space. For our boundary analysis, we developed a new sequential approach based upon the design of computer experiments, allowing us to incorporate knowledge from domain experts into our modeling and to determine the most likely boundary shapes and their parameters. We carried out the analysis on system parameters and describe an initial approach that will allow us to include time-series inputs, such as the radar track data, into the analysis.

  14. Validation of the Activities of Community Transportation model for individuals with cognitive impairments.

    PubMed

    Sohlberg, McKay Moore; Fickas, Stephen; Lemoncello, Rik; Hung, Pei-Fang

    2009-01-01

    To develop a theoretical, functional model of community navigation for individuals with cognitive impairments: the Activities of Community Transportation (ACTs). Iterative design using qualitative methods (i.e. document review, focus groups and observations). Four agencies providing travel training to adults with cognitive impairments in the USA participated in the validation study. A thorough document review and series of focus groups led to the development of a comprehensive model (ACTs Wheels) delineating the requisite steps and skills for community navigation. The model was validated and updated based on observations of 395 actual trips by travellers with navigational challenges from the four participating agencies. Results revealed that the 'ACTs Wheel' models were complete and comprehensive. The 'ACTs Wheels' represent a comprehensive model of the steps needed to navigate to destinations using paratransit and fixed-route public transportation systems for travellers with cognitive impairments. Suggestions are made for future investigations of community transportation for this population.

  15. Development and validation of a predictive model for the influences of selected product and process variables on ascorbic acid degradation in simulated fruit juice.

    PubMed

    Gabriel, Alonzo A; Cayabyab, Jochelle Elysse C; Tan, Athalie Kaye L; Corook, Mark Lester F; Ables, Errol John O; Tiangson-Bayaga, Cecile Leah P

    2015-06-15

    A predictive response surface model for the influences of product (soluble solids and titratable acidity) and process (temperature and heating time) parameters on the degradation of ascorbic acid (AA) in heated simulated fruit juices (SFJs) was established. Physicochemical property ranges of freshly squeezed and processed juices, and previously established decimal reduction times of Escherichia coli O157:H7 at different heating temperatures, were used in establishing a Central Composite Design of Experiment that determined the combinations of product and process variables used in the model building. Only the individual linear effects of temperature and heating time significantly (P<0.05) affected AA reduction (%AAr). Validating systems either over- or underestimated actual %AAr, with bias factors of 0.80-1.20. However, all validating systems still resulted in acceptable predictive efficacy, with accuracy factors of 1.00-1.26. The model may be useful in establishing unique process schedules for specific products, for the simultaneous control and improvement of food safety and quality. Copyright © 2015 Elsevier Ltd. All rights reserved.
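
    A minimal sketch of fitting a full quadratic response surface to a four-factor design and inspecting the linear effects, as the study does; the design points and %AAr responses below are synthetic placeholders, not the paper's data.

    ```python
    """Quadratic response surface fit for %AA reduction over a four-factor
    design (coded -1..1 factors). Data are synthetic stand-ins."""
    import numpy as np
    from itertools import combinations

    def quad_design(X):
        """Columns: intercept, linear terms, pairwise interactions, squares."""
        cols = [np.ones(len(X))] + [X[:, j] for j in range(X.shape[1])]
        cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
        cols += [X[:, j] ** 2 for j in range(X.shape[1])]
        return np.column_stack(cols)

    rng = np.random.default_rng(2)
    # 4 coded factors: soluble solids, titratable acidity, temperature, time
    X = rng.uniform(-1, 1, size=(30, 4))
    y = 5 + 8 * X[:, 2] + 6 * X[:, 3] + rng.normal(0, 1, 30)  # %AAr response

    beta, *_ = np.linalg.lstsq(quad_design(X), y, rcond=None)
    print("linear effects (SS, TA, T, t):", np.round(beta[1:5], 2))
    ```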

  16. Development of speed models for improving travel forecasting and highway performance evaluation : [technical summary].

    DOT National Transportation Integrated Search

    2013-12-01

    Travel forecasting models predict travel demand based on the present transportation system and its use. Transportation modelers must develop, validate, and calibrate models to ensure that predicted travel demand is as close to reality as possible. Mo...

  17. 6DOF Testing of the SLS Inertial Navigation Unit

    NASA Technical Reports Server (NTRS)

    Geohagan, Kevin; Bernard, Bill; Oliver, T. Emerson; Leggett, Jared; Strickland, Dennis

    2018-01-01

    The Navigation System on the NASA Space Launch System (SLS) Block 1 vehicle performs initial alignment of the Inertial Navigation System (INS) navigation frame through gyrocompass alignment (GCA). Because the navigation architecture for the SLS Block 1 vehicle is a purely inertial system, the accuracy of the achieved orbit relative to mission requirements is very sensitive to initial alignment accuracy. The assessment of this sensitivity and many others via simulation is a part of the SLS Model-Based Design and Model-Based Requirements approach. As a part of the aforementioned, 6DOF Monte Carlo simulation is used in large part to develop and demonstrate verification of program requirements. To facilitate this and the GN&C flight software design process, an SLS-Program-controlled Design Math Model (DMM) of the SLS INS was developed by the SLS Navigation Team. The SLS INS model implements all of the key functions of the hardware, namely GCA, inertial navigation, and FDIR (Fault Detection, Isolation, and Recovery), in support of SLS GN&C design requirements verification. Despite the strong sensitivity to initial alignment, GCA accuracy requirements were not verified by test due to program cost and schedule constraints. Instead, the system relies upon assessments performed using the SLS INS model. In order to verify SLS program requirements by analysis, the SLS INS model is verified and validated against flight hardware. In lieu of direct testing of GCA accuracy in support of requirement verification, the SLS Navigation Team proposed and conducted an engineering test to, among other things, validate the GCA performance and overall behavior of the SLS INS model through comparison with test data. This paper will detail dynamic hardware testing of the SLS INS, conducted by the SLS Navigation Team at Marshall Space Flight Center's 6DOF Table Facility, in support of GCA performance characterization and INS model validation. A 6DOF motion platform was used to produce 6DOF pad twist and sway dynamics while a simulated SLS flight computer communicated with the INS. Tests conducted include an evaluation of GCA algorithm robustness to increasingly dynamic pad environments, an examination of GCA algorithm stability and accuracy over long durations, and a long-duration static test to gather enough data for Allan variance analysis. Test setup, execution, and data analysis will be discussed, including analysis performed in support of SLS INS model validation.

  18. Recognition of Atypical Symptoms of Acute Myocardial Infarction: Development and Validation of a Risk Scoring System.

    PubMed

    Li, Polly W C; Yu, Doris S F

    Atypical symptom presentation in patients with acute myocardial infarction (AMI) is associated with longer delay in care seeking and poorer prognosis. Symptom recognition in these patients is a challenging task. Our purpose in this risk prediction model development study was to develop and validate a risk scoring system for estimating cumulative risk for atypical AMI presentation. A consecutive sample was recruited for the developmental (n = 300) and validation (n = 97) cohorts. Symptom experience was measured with the validated Chinese version of the Symptoms of Acute Coronary Syndromes Inventory. Potential predictors were identified from the literature. Multivariable logistic regression was performed to identify significant predictors. A risk scoring system was then constructed by assigning weights to each significant predictor according to their b coefficients. Five independent predictors for atypical symptom presentation were older age (≥75 years), female gender, diabetes mellitus, history of AMI, and absence of hyperlipidemia. The Hosmer and Lemeshow test (χ²(6) = 4.47, P = .62) indicated that this predictive model was adequate to predict the outcome. Acceptable discrimination was demonstrated, with an area under the receiver operating characteristic curve of 0.74 (95% confidence interval, 0.67-0.82) (P < .001). The predictive power of this risk scoring system was confirmed in the validation cohort. Atypical AMI presentation is common. A simple risk scoring system developed on the basis of the 5 identified predictors can raise awareness of atypical AMI presentation and promote symptom recognition by estimating the cumulative risk for an individual to present with atypical AMI symptoms.
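
    The generic construction behind such scoring systems is to scale each regression coefficient by the smallest one and round to integer points; the sketch below illustrates this with made-up coefficients for the five predictors, not the study's actual weights.

    ```python
    """Turning logistic-regression coefficients into an integer risk score.
    The coefficient values are hypothetical, for illustration only."""
    import numpy as np

    predictors = ["age >= 75", "female", "diabetes", "prior AMI",
                  "no hyperlipidemia"]
    b = np.array([0.9, 0.6, 0.7, 1.1, 0.5])   # hypothetical b coefficients

    points = np.round(b / b.min()).astype(int)  # weight by smallest coefficient
    for name, pt in zip(predictors, points):
        print(f"{name:>18s}: {pt} point(s)")

    patient = np.array([1, 1, 0, 0, 1])         # indicators for one patient
    print("total score:", patient @ points)
    ```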

  19. Population pharmacokinetics of tacrolimus in paediatric systemic lupus erythematosus based on real-world study.

    PubMed

    Wang, D-D; Lu, J-M; Li, Q; Li, Z-P

    2018-05-15

    Different population pharmacokinetics (PPK) models of tacrolimus have been established in various populations. However, a tacrolimus PPK model for paediatric systemic lupus erythematosus (PSLE) is still undefined. This study aimed to establish a tacrolimus PPK model in Chinese PSLE. A total of nineteen Chinese patients with PSLE from a real-world study were characterized with nonlinear mixed-effects modelling (NONMEM). The impact of demographic features, biological characteristics, and concomitant medications was evaluated. Model validation was assessed by bootstrap and prediction-corrected visual predictive check (VPC). A one-compartment model with first-order absorption and elimination was determined to be the most suitable model in PSLE. The typical values of apparent oral clearance (CL/F) and apparent volume of distribution (V/F) in the final model were 2.05 L/h and 309 L, respectively. Methylprednisolone and simvastatin were identified as significant covariates. The first validated tacrolimus PPK model in patients with PSLE is presented. © 2018 John Wiley & Sons Ltd.
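
    For reference, the concentration-time profile implied by a one-compartment model with first-order absorption and elimination has a closed form; the sketch below uses the typical values reported above (CL/F = 2.05 L/h, V/F = 309 L), while the absorption rate constant and dose are assumptions for illustration.

    ```python
    """One-compartment oral-dosing profile:
    C(t) = D*ka / (V*(ka - ke)) * (exp(-ke*t) - exp(-ka*t)), ke = CL/V.
    CL/F and V/F are from the abstract; ka and the dose are assumed."""
    import numpy as np

    CL_F, V_F, ka = 2.05, 309.0, 4.48   # L/h, L, 1/h (ka assumed)
    ke = CL_F / V_F                      # elimination rate constant, 1/h

    def conc(t, dose_mg):
        """C(t) in mg/L after a single oral dose (F folded into CL/F, V/F)."""
        return (dose_mg * ka / (V_F * (ka - ke))) * (np.exp(-ke * t)
                                                     - np.exp(-ka * t))

    t = np.array([1, 2, 4, 8, 12.0])    # hours post-dose
    print(np.round(conc(t, 2.0) * 1000, 2), "ug/L after a 2 mg dose")
    ```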

  20. Validation of a reduced-order jet model for subsonic and underexpanded hydrogen jets

    DOE PAGES

    Li, Xuefang; Hecht, Ethan S.; Christopher, David M.

    2016-01-01

    Much effort has been made to model hydrogen releases from leaks during potential failures of hydrogen storage systems. A reduced-order jet model can be used to quickly characterize these flows, with low computational cost. Notional nozzle models are often used to avoid modeling the complex shock structures produced by the underexpanded jets by determining an "effective" source to produce the observed downstream trends. In our work, the mean hydrogen concentration fields were measured in a series of subsonic and underexpanded jets using a planar laser Rayleigh scattering system. Furthermore, we compared the experimental data to a reduced-order jet model for subsonic flows and a notional nozzle model coupled to the jet model for underexpanded jets. The values of some key model parameters were determined by comparisons with the experimental data. Finally, the coupled model was also validated against hydrogen concentration measurements for 100 and 200 bar hydrogen jets, with the predictions agreeing well with data in the literature.

  1. Priming nanoparticle-guided diagnostics and therapeutics towards human organs-on-chips microphysiological system

    NASA Astrophysics Data System (ADS)

    Choi, Jin-Ha; Lee, Jaewon; Shin, Woojung; Choi, Jeong-Woo; Kim, Hyun Jung

    2016-10-01

    Nanotechnology and bioengineering have converged over the past decades, through which the application of multi-functional nanoparticles (NPs) has emerged in clinical and biomedical fields. NPs primed to detect disease-specific biomarkers or to deliver biopharmaceutical compounds have been validated in conventional in vitro culture models, including two-dimensional (2D) cell cultures and 3D organoid models. However, a lack of experimental models with strong human physiological relevance has hampered accurate validation of the safety and functionality of NPs. Alternatively, biomimetic human "Organs-on-Chips" microphysiological systems have recapitulated the mechanically dynamic 3D tissue interface of the human organ microenvironment, in which the transport, cytotoxicity, biocompatibility, and therapeutic efficacy of NPs and their conjugates may be more accurately validated. Integration of NP-guided diagnostic detection and targeted nanotherapeutics in conjunction with human organs-on-chips can provide a novel avenue to accelerate the NP-based drug development process as well as the rapid detection of cellular secretomes associated with pathophysiological processes.

  2. PIV Measurements of the CEV Hot Abort Motor Plume for CFD Validation

    NASA Technical Reports Server (NTRS)

    Wernet, Mark; Wolter, John D.; Locke, Randy; Wroblewski, Adam; Childs, Robert; Nelson, Andrea

    2010-01-01

    NASA's next manned launch platforms for missions to the Moon and Mars are the Orion and Ares systems. Many critical aspects of the launch system performance are being verified using computational fluid dynamics (CFD) predictions. The Orion Launch Abort Vehicle (LAV) consists of a tower-mounted tractor rocket tasked with carrying the Crew Module (CM) safely away from the launch vehicle in the event of a catastrophic failure during the vehicle's ascent. Some of the predictions involving the launch abort system flow fields produced conflicting results, which required further investigation through ground test experiments. Ground tests were performed to acquire data from a hot supersonic jet in cross-flow for the purpose of validating CFD turbulence modeling relevant to the Orion LAV. Both 2-component axial-plane Particle Image Velocimetry (PIV) and 3-component cross-stream Stereo Particle Image Velocimetry (SPIV) measurements were obtained on a model of an Abort Motor (AM). Actual flight conditions could not be simulated on the ground, so the highest temperature and pressure conditions that could be safely used in the test facility (a nozzle pressure ratio of 28.5 and a nozzle temperature ratio of 3) were used for the validation tests. These conditions are significantly different from those of the flight vehicle, but were sufficiently high to begin addressing the turbulence modeling issues that predicated the need for the validation tests.

  3. IEEE 1982. Proceedings of the international conference on cybernetics and society

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1982-01-01

    The following topics were dealt with: knowledge-based systems; risk analysis; man-machine interactions; human information processing; metaphor, analogy and problem-solving; manual control modelling; transportation systems; simulation; adaptive and learning systems; biocybernetics; cybernetics; mathematical programming; robotics; decision support systems; analysis, design and validation of models; computer vision; systems science; energy systems; environmental modelling and policy; pattern recognition; nuclear warfare; technological forecasting; artificial intelligence; the Turin shroud; optimisation; workloads. Abstracts of individual papers can be found under the relevant classification codes in this or future issues.

  4. A scoring system for ascertainment of incident stroke; the Risk Index Score (RISc).

    PubMed

    Kass-Hout, T A; Moyé, L A; Smith, M A; Morgenstern, L B

    2006-01-01

    The main objective of this study was to develop and validate a computer-based statistical algorithm that could be translated into a simple scoring system in order to ascertain incident stroke cases using hospital admission medical records data. The Risk Index Score (RISc) algorithm was developed using data collected prospectively by the Brain Attack Surveillance in Corpus Christi (BASIC) project, 2000. The validity of RISc was evaluated by estimating the concordance of scoring system stroke ascertainment to stroke ascertainment by physician and/or abstractor review of hospital admission records. RISc was developed on 1718 randomly selected patients (training set) and then statistically validated on an independent sample of 858 patients (validation set). A multivariable logistic model was used to develop RISc and subsequently evaluated by goodness-of-fit and receiver operating characteristic (ROC) analyses. The higher the value of RISc, the higher the patient's risk of potential stroke. The study showed RISc was well calibrated and discriminated those who had potential stroke from those that did not on initial screening. In this study we developed and validated a rapid, easy, efficient, and accurate method to ascertain incident stroke cases from routine hospital admission records for epidemiologic investigations. Validation of this scoring system was achieved statistically; however, clinical validation in a community hospital setting is warranted.

  5. An empirical model of diagnostic x-ray attenuation under narrow-beam geometry.

    PubMed

    Mathieu, Kelsey B; Kappadath, S Cheenu; White, R Allen; Atkinson, E Neely; Cody, Dianna D

    2011-08-01

    The purpose of this study was to develop and validate a mathematical model to describe narrow-beam attenuation of kilovoltage x-ray beams for the intended applications of half-value layer (HVL) and quarter-value layer (QVL) estimations, patient organ shielding, and computer modeling. An empirical model, which uses the Lambert W function and represents a generalized Lambert-Beer law, was developed. To validate this model, transmission of diagnostic energy x-ray beams was measured over a wide range of attenuator thicknesses [0.49-33.03 mm Al on a computed tomography (CT) scanner, 0.09-1.93 mm Al on two mammography systems, and 0.1-0.45 mm Cu and 0.49-14.87 mm Al using general radiography]. Exposure measurements were acquired under narrow-beam geometry using standard methods, including the appropriate ionization chamber, for each radiographic system. Nonlinear regression was used to find the best-fit curve of the proposed Lambert W model to each measured transmission versus attenuator thickness data set. In addition to validating the Lambert W model, we also assessed the performance of two-point Lambert W interpolation compared to traditional methods for estimating the HVL and QVL [i.e., semi-logarithmic (exponential) and linear interpolation]. The Lambert W model was validated for modeling attenuation versus attenuator thickness with respect to the data collected in this study (R2 > 0.99). Furthermore, Lambert W interpolation was more accurate and less sensitive to the choice of interpolation points used to estimate the HVL and/or QVL than the traditional methods of semilogarithmic and linear interpolation. The proposed Lambert W model accurately describes attenuation of both monoenergetic radiation and (kilovoltage) polyenergetic beams (under narrow-beam geometry).
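
    The abstract does not reproduce the model's exact parameterization, but one self-consistent way a Lambert W "generalized Lambert-Beer law" can arise is to let transmission satisfy T = exp(-a*x + b*(T - 1)), which reduces to the ordinary exponential law as b approaches 0. The sketch below fits that assumed form, not necessarily the paper's, to synthetic transmission data.

    ```python
    """Fit of an assumed Lambert W transmission model to synthetic
    narrow-beam data. Solving T = exp(-a*x)*exp(b*(T-1)) for T gives
    T(x) = -W(-b * exp(-a*x - b)) / b, a generalized Lambert-Beer form."""
    import numpy as np
    from scipy.special import lambertw
    from scipy.optimize import curve_fit

    def transmission(x, a, b):
        return np.real(-lambertw(-b * np.exp(-a * x - b)) / b)

    # Synthetic data with mild beam-hardening-like curvature
    rng = np.random.default_rng(3)
    thick = np.linspace(0.5, 33.0, 20)                    # mm Al
    T_meas = transmission(thick, 0.12, 0.4) * rng.lognormal(0, 0.01, 20)

    (a, b), _ = curve_fit(transmission, thick, T_meas, p0=(0.1, 0.3))
    hvl = thick[np.argmin(np.abs(transmission(thick, a, b) - 0.5))]
    print(f"fitted a={a:.3f}/mm, b={b:.3f}; HVL ~ {hvl:.1f} mm Al")
    ```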

  6. An empirical model of diagnostic x-ray attenuation under narrow-beam geometry

    PubMed Central

    Mathieu, Kelsey B.; Kappadath, S. Cheenu; White, R. Allen; Atkinson, E. Neely; Cody, Dianna D.

    2011-01-01

    Purpose: The purpose of this study was to develop and validate a mathematical model to describe narrow-beam attenuation of kilovoltage x-ray beams for the intended applications of half-value layer (HVL) and quarter-value layer (QVL) estimations, patient organ shielding, and computer modeling. Methods: An empirical model, which uses the Lambert W function and represents a generalized Lambert-Beer law, was developed. To validate this model, transmission of diagnostic energy x-ray beams was measured over a wide range of attenuator thicknesses [0.49–33.03 mm Al on a computed tomography (CT) scanner, 0.09–1.93 mm Al on two mammography systems, and 0.1–0.45 mm Cu and 0.49–14.87 mm Al using general radiography]. Exposure measurements were acquired under narrow-beam geometry using standard methods, including the appropriate ionization chamber, for each radiographic system. Nonlinear regression was used to find the best-fit curve of the proposed Lambert W model to each measured transmission versus attenuator thickness data set. In addition to validating the Lambert W model, we also assessed the performance of two-point Lambert W interpolation compared to traditional methods for estimating the HVL and QVL [i.e., semilogarithmic (exponential) and linear interpolation]. Results: The Lambert W model was validated for modeling attenuation versus attenuator thickness with respect to the data collected in this study (R2 > 0.99). Furthermore, Lambert W interpolation was more accurate and less sensitive to the choice of interpolation points used to estimate the HVL and/or QVL than the traditional methods of semilogarithmic and linear interpolation. Conclusions: The proposed Lambert W model accurately describes attenuation of both monoenergetic radiation and (kilovoltage) polyenergetic beams (under narrow-beam geometry). PMID:21928626

  7. An empirical model of diagnostic x-ray attenuation under narrow-beam geometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mathieu, Kelsey B.; Kappadath, S. Cheenu; White, R. Allen

    2011-08-15

    Purpose: The purpose of this study was to develop and validate a mathematical model to describe narrow-beam attenuation of kilovoltage x-ray beams for the intended applications of half-value layer (HVL) and quarter-value layer (QVL) estimations, patient organ shielding, and computer modeling. Methods: An empirical model, which uses the Lambert W function and represents a generalized Lambert-Beer law, was developed. To validate this model, transmission of diagnostic energy x-ray beams was measured over a wide range of attenuator thicknesses [0.49-33.03 mm Al on a computed tomography (CT) scanner, 0.09-1.93 mm Al on two mammography systems, and 0.1-0.45 mm Cu and 0.49-14.87 mm Al using general radiography]. Exposure measurements were acquired under narrow-beam geometry using standard methods, including the appropriate ionization chamber, for each radiographic system. Nonlinear regression was used to find the best-fit curve of the proposed Lambert W model to each measured transmission versus attenuator thickness data set. In addition to validating the Lambert W model, we also assessed the performance of two-point Lambert W interpolation compared to traditional methods for estimating the HVL and QVL [i.e., semilogarithmic (exponential) and linear interpolation]. Results: The Lambert W model was validated for modeling attenuation versus attenuator thickness with respect to the data collected in this study (R² > 0.99). Furthermore, Lambert W interpolation was more accurate and less sensitive to the choice of interpolation points used to estimate the HVL and/or QVL than the traditional methods of semilogarithmic and linear interpolation. Conclusions: The proposed Lambert W model accurately describes attenuation of both monoenergetic radiation and (kilovoltage) polyenergetic beams (under narrow-beam geometry).

  8. Context-Aware Mobile Collaborative Systems: Conceptual Modeling and Case Study

    PubMed Central

    Benítez-Guerrero, Edgard; Mezura-Godoy, Carmen; Montané-Jiménez, Luis G.

    2012-01-01

    A Mobile Collaborative System (MCOS) enables the cooperation of the members of a team to achieve a common goal by using a combination of mobile and fixed technologies. MCOS can be enhanced if the context of the group of users is considered in the execution of activities. This paper proposes a novel model for Context-Aware Mobile COllaborative Systems (CAMCOS) and a functional architecture based on that model. In order to validate both the model and the architecture, a prototype system in the tourism domain was implemented and evaluated. PMID:23202007

  9. Validation of a Monte Carlo simulation of the Inveon PET scanner using GATE

    NASA Astrophysics Data System (ADS)

    Lu, Lijun; Zhang, Houjin; Bian, Zhaoying; Ma, Jianhua; Feng, Qiangjin; Chen, Wufan

    2016-08-01

    The purpose of this study is to validate the application of the GATE (Geant4 Application for Tomographic Emission) Monte Carlo simulation toolkit for modeling the performance characteristics of the Siemens Inveon small animal PET system. The simulation results were validated against experimental/published data in accordance with the NEMA NU-4 2008 protocol for standardized evaluation of spatial resolution, sensitivity, scatter fraction (SF) and noise equivalent count rate (NECR) of a preclinical PET system. Agreement to within 18% was obtained between the radial, tangential, and axial spatial resolutions of the simulated and experimental results. The simulated peak NECR of the mouse-size phantom agreed with the experimental result, while for the rat-size phantom the simulated value was higher than the experimental result. The simulated and experimental SFs of the mouse- and rat-size phantoms agreed to within 2%. These results demonstrate the feasibility of our GATE model for accurately simulating, within certain limits, all major performance characteristics of the Inveon PET system.
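
    The scatter fraction and noise equivalent count rate evaluated above follow the standard NEMA definitions; a minimal sketch with illustrative count rates (not the Inveon measurements):

    ```python
    """NEMA-style figures of merit: SF = S / (S + T) and
    NECR = T^2 / (T + S + R), with trues T, scatter S, randoms R."""
    def necr(trues, scatter, randoms):
        return trues**2 / (trues + scatter + randoms)

    T, S, R = 250e3, 40e3, 60e3        # counts/s at some activity level
    print(f"SF = {S / (T + S):.3f}, NECR = {necr(T, S, R) / 1e3:.0f} kcps")
    ```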

  10. Model selection and assessment for multi­-species occupancy models

    USGS Publications Warehouse

    Broms, Kristin M.; Hooten, Mevin B.; Fitzpatrick, Ryan M.

    2016-01-01

    While multi-species occupancy models (MSOMs) are emerging as a popular method for analyzing biodiversity data, formal checking and validation approaches for this class of models have lagged behind. Concurrent with the rise in application of MSOMs among ecologists, a quiet regime shift is occurring in Bayesian statistics where predictive model comparison approaches are experiencing a resurgence. Unlike single-species occupancy models that use integrated likelihoods, MSOMs are usually couched in a Bayesian framework and contain multiple levels. Standard model checking and selection methods are often unreliable in this setting and there is only limited guidance in the ecological literature for this class of models. We examined several different contemporary Bayesian hierarchical approaches for checking and validating MSOMs and applied these methods to a freshwater aquatic study system in Colorado, USA, to better understand the diversity and distributions of plains fishes. Our findings indicated distinct differences among model selection approaches, with cross-validation techniques performing the best in terms of prediction.

  11. Discrimination of fish populations using parasites: Random Forests on a 'predictable' host-parasite system.

    PubMed

    Pérez-Del-Olmo, A; Montero, F E; Fernández, M; Barrett, J; Raga, J A; Kostadinova, A

    2010-10-01

    We address the effect of spatial scale and temporal variation on model generality when forming predictive models for fish assignment, using a new data mining approach, Random Forests (RF), applied to variable biological markers (parasite community data). Models were implemented for a fish host-parasite system sampled along the Mediterranean and Atlantic coasts of Spain and were validated using independent datasets. We considered 2 basic classification problems in evaluating the importance of variations in parasite infracommunities for assignment of individual fish to their populations of origin: a multiclass task (2-5 population models, using 2 seasonal replicates from each of the populations) and a 2-class task (using 4 seasonal replicates from 1 Atlantic and 1 Mediterranean population each). The main results are that (i) RF are well suited for multiclass population assignment using parasite communities in non-migratory fish; (ii) RF provide an efficient means for model cross-validation on the baseline data, and this allows sample size limitations in parasite tag studies to be tackled effectively; (iii) the performance of RF is dependent on the complexity and spatial extent/configuration of the problem; and (iv) the development of predictive models is strongly influenced by seasonal change, which stresses the importance of both temporal replication and model validation in parasite tagging studies.
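
    A minimal sketch of the multiclass assignment task with scikit-learn, using synthetic parasite-abundance counts in place of the real infracommunity data; it also shows the out-of-bag score that makes RF convenient for cross-validation on baseline data.

    ```python
    """Random Forest assignment of individual fish to population of origin
    from parasite abundance counts. Data are synthetic stand-ins."""
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(4)
    n_fish, n_parasite_taxa = 200, 12
    X = rng.poisson(2.0, size=(n_fish, n_parasite_taxa))  # abundance counts
    pop = rng.integers(0, 5, n_fish)                       # 5 populations
    X[np.arange(n_fish), pop] += 4                         # population signal

    rf = RandomForestClassifier(n_estimators=500, oob_score=True,
                                random_state=0)
    print("5-fold CV accuracy:",
          cross_val_score(rf, X, pop, cv=5).mean().round(2))
    print("OOB accuracy:", rf.fit(X, pop).oob_score_.round(3))
    ```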

  12. Frequency Response Function Based Damage Identification for Aerospace Structures

    NASA Astrophysics Data System (ADS)

    Oliver, Joseph Acton

    Structural health monitoring technologies continue to be pursued for aerospace structures in the interests of increased safety and, when combined with health prognosis, efficiency in life-cycle management. The current dissertation develops and validates damage identification technology as a critical component for structural health monitoring of aerospace structures and, in particular, composite unmanned aerial vehicles. The primary innovation is a statistical least-squares damage identification algorithm based on concepts of parameter estimation and model update. The algorithm uses frequency response function based residual force vectors derived from distributed vibration measurements to update a structural finite element model through statistically weighted least-squares minimization, producing location and quantification of the damage, estimation uncertainty, and an updated model. Advantages compared to other approaches include robust applicability to systems that are heavily damped, large, and noisy, with a relatively low number of distributed measurement points compared to the number of analytical degrees of freedom of an associated analytical structural model (e.g., a modal finite element model). Motivation, research objectives, and a dissertation summary are discussed in Chapter 1, followed by a literature review in Chapter 2. Chapter 3 gives background theory and the damage identification algorithm derivation, followed by a study of fundamental algorithm behavior on a two degree-of-freedom mass-spring system with generalized damping. Chapter 4 investigates the impact of noise, then successfully proves the algorithm against competing methods using an analytical eight degree-of-freedom mass-spring system with non-proportional structural damping. Chapter 5 extends use of the algorithm to finite element models, including solutions for numerical issues, approaches for modeling damping approximately in reduced coordinates, and analytical validation using a composite sandwich plate model. Chapter 6 presents the final extension to experimental systems, including methods for initial baseline correlation and data reduction, and validates the algorithm on an experimental composite plate with impact damage. The final chapter deviates from development and validation of the primary algorithm to discuss development of an experimental scaled-wing test bed as part of a collaborative effort for developing structural health monitoring and prognosis technology. The dissertation concludes with an overview of technical conclusions and recommendations for future work.
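
    The core statistical step described above, a statistically weighted least-squares model update, follows the standard Gauss-Markov form; the sketch below is a generic illustration with synthetic sensitivities, not the dissertation's code.

    ```python
    """Weighted least-squares model update: given residual vector r and
    sensitivity matrix S of residuals w.r.t. damage parameters, the
    estimate and its covariance follow the Gauss-Markov form."""
    import numpy as np

    rng = np.random.default_rng(5)
    S = rng.normal(size=(40, 3))           # 40 residuals, 3 damage parameters
    dp_true = np.array([0.0, -0.3, 0.0])   # 30% stiffness loss in element 2
    W = np.diag(1.0 / np.full(40, 0.05) ** 2)   # inverse measurement variance
    r = S @ dp_true + rng.normal(0, 0.05, 40)   # noisy residual force vector

    A = S.T @ W @ S
    dp_hat = np.linalg.solve(A, S.T @ W @ r)    # weighted least squares
    cov = np.linalg.inv(A)                      # estimation uncertainty
    print("estimate:", dp_hat.round(3),
          "sigma:", np.sqrt(np.diag(cov)).round(3))
    ```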

  13. Acoustic-Structure Interaction in Rocket Engines: Validation Testing

    NASA Technical Reports Server (NTRS)

    Davis, R. Benjamin; Joji, Scott S.; Parks, Russel A.; Brown, Andrew M.

    2009-01-01

    While analyzing a rocket engine component, it is often necessary to account for any effects that adjacent fluids (e.g., liquid fuels or oxidizers) might have on the structural dynamics of the component. To better characterize the fully coupled fluid-structure system responses, an analytical approach that models the system as a coupled expansion of rigid wall acoustic modes and in vacuo structural modes has been proposed. The present work seeks to experimentally validate this approach. To experimentally observe well-coupled system modes, the test article and fluid cavities are designed such that the uncoupled structural frequencies are comparable to the uncoupled acoustic frequencies. The test measures the natural frequencies, mode shapes, and forced response of cylindrical test articles in contact with fluid-filled cylindrical and/or annular cavities. The test article is excited with a stinger and the fluid-loaded response is acquired using a laser Doppler vibrometer. The experimentally determined fluid-loaded natural frequencies are compared directly to the results of the analytical model. Due to the geometric configuration of the test article, the analytical model is found to be valid for natural modes with circumferential wave numbers greater than four. In the case of these modes, the natural frequencies predicted by the analytical model demonstrate excellent agreement with the experimentally determined natural frequencies.

  14. International Space Station Modal Correction Analysis

    NASA Technical Reports Server (NTRS)

    Fitzpatrick, Kristin; Grygier, Michael; Laible, Michael; Sugavanam, Sujatha

    2012-01-01

    This paper summarizes the on-orbit modal test and the related modal analysis, model validation and correlation performed for the ISS Stage ULF4, DTF S4-1A, October 11, 2010, GMT 284/06:13:00.00. The objective of this analysis is to validate and correlate analytical models with the intent to verify the ISS critical interface dynamic loads and improve fatigue life prediction. For the ISS configurations under consideration, on-orbit dynamic responses were collected with Russian vehicles attached and without the Orbiter attached to the ISS. ISS instrumentation systems that were used to collect the dynamic responses during the DTF S4-1A included the Internal Wireless Instrumentation System (IWIS), External Wireless Instrumentation System (EWIS), Structural Dynamic Measurement System (SDMS), Space Acceleration Measurement System (SAMS), Inertial Measurement Unit (IMU) and ISS External Cameras. Experimental modal analyses were performed on the measured data to extract modal parameters including frequency, damping and mode shape information. Correlation and comparisons between test and analytical modal parameters were performed to assess the accuracy of models for the ISS configuration under consideration. Based on the frequency comparisons, the accuracy of the mathematical models is assessed and model refinement recommendations are given. Section 2.0 of this report presents the math model used in the analysis. This section also describes the ISS configuration under consideration and summarizes the associated primary modes of interest along with the fundamental appendage modes. Section 3.0 discusses the details of the ISS Stage ULF4 DTF S4-1A test. Section 4.0 discusses the on-orbit instrumentation systems that were used in the collection of the data analyzed in this paper. The modal analysis approach and results used in the analysis of the collected data are summarized in Section 5.0. The model correlation and validation effort is reported in Section 6.0. Conclusions and recommendations drawn from this analysis are included in Section 7.0.

  15. Assessing Online Textual Feedback to Support Student Intrinsic Motivation Using a Collaborative Text-Based Dialogue System: A Qualitative Study

    ERIC Educational Resources Information Center

    Shroff, Ronnie H.; Deneen, Christopher

    2011-01-01

    This paper assesses textual feedback to support student intrinsic motivation using a collaborative text-based dialogue system. A research model is presented based on research into intrinsic motivation, and the specific construct of feedback provides a framework for the model. A qualitative research methodology is used to validate the model.…

  16. Calibration and validation of a voxel phantom for use in the Monte Carlo modeling and optimization of x-ray imaging systems

    NASA Astrophysics Data System (ADS)

    Dance, David R.; McVey, Graham; Sandborg, Michael P.; Persliden, Jan; Carlsson, Gudrun A.

    1999-05-01

    A Monte Carlo program has been developed to model X-ray imaging systems. It incorporates an adult voxel phantom and includes anti-scatter grid, radiographic screen and film. The program can calculate contrast and noise for a series of anatomical details. The use of measured H and D curves allows the absolute calculation of the patient entrance air kerma for a given film optical density (or vice versa). Effective dose can also be estimated. In an initial validation, the program was used to predict the optical density for exposures with plastic slabs of various thicknesses. The agreement between measurement and calculation was on average within 5%. In a second validation, a comparison was made between computer simulations and measurements for chest and lumbar spine patient radiographs. The predictions of entrance air kerma mostly fell within the range of measured values (e.g. chest PA calculated 0.15 mGy, measured 0.12 - 0.17 mGy). Good agreement was also obtained for the calculated and measured contrasts for selected anatomical details and acceptable agreement for dynamic range. It is concluded that the program provides a realistic model of the patient and imaging system. It can thus form the basis of a detailed study and optimization of X-ray imaging systems.

  17. Model-Based Method for Sensor Validation

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. As a result, these methods can only predict the most probable faulty sensors, and their predictions are subject to the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and provides the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). It is also better suited to systems for which it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundancy relations (ARRs).
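
    As a toy illustration of the ARR idea (not the paper's algorithm; the three-sensor model and its relations are invented), each redundancy relation couples a subset of sensors, so a sensor implicated by every violated relation and exonerated by every satisfied one can be logically inferred to be faulty:

      # Toy analytical-redundancy-relation (ARR) check. Three sensors a, b, c
      # nominally measure x, 2x, and 3x, giving three pairwise relations.
      def diagnose(a, b, c, tol=1e-3):
          arrs = {
              ("a", "b"): 2 * a - b,
              ("a", "c"): 3 * a - c,
              ("b", "c"): 3 * b - 2 * c,
          }
          failed = [set(s) for s, r in arrs.items() if abs(r) > tol]
          passed = [set(s) for s, r in arrs.items() if abs(r) <= tol]
          if not failed:
              return set()                       # all sensors validated
          suspects = set.intersection(*failed)   # common to every violated ARR
          for ok in passed:
              suspects -= ok                     # exonerated by a satisfied ARR
          return suspects

      print(diagnose(1.0, 2.0, 3.0))             # set(): readings are consistent
      print(diagnose(1.0, 2.5, 3.0))             # {'b'}: b is logically inferred faulty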

  18. [Support of the nursing process through electronic nursing documentation systems (UEPD) – Initial validation of an instrument].

    PubMed

    Hediger, Hannele; Müller-Staub, Maria; Petry, Heidi

    2016-01-01

    Electronic nursing documentation systems with standardized nursing terminology are IT-based systems for recording the nursing process. These systems have the potential to improve documentation of the nursing process and to support nurses in care delivery. This article describes the development and initial validation of an instrument (known by its German acronym UEPD) to measure the subjectively perceived benefits of an electronic nursing documentation system in care delivery. The validity of the UEPD was examined in an evaluation study carried out in an acute care hospital (n = 94 nurses) in German-speaking Switzerland. Construct validity was analyzed by principal components analysis. Initial evidence of the validity of the UEPD could be verified. The analysis showed a stable four-factor model (FS = 0.89) comprising 25 items. All factors loaded ≥ 0.50 and the scales demonstrated high internal consistency (Cronbach's α = 0.73 – 0.90). Principal components analysis revealed four dimensions of support: establishing nursing diagnoses and goals; recording a case history/assessment and documenting the nursing process; implementation and evaluation; and information exchange. Further testing with larger samples and with different electronic documentation systems is needed. Another potential direction would be to employ the UEPD in a comparison of various electronic documentation systems.
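
    A minimal sketch of the construct-validity analysis reported above (principal components on item responses), using simulated data that merely mirrors the study's sample size and item count; the factor structure, loadings, and noise level are invented:

      # PCA on simulated questionnaire data (n = 94 nurses, 25 items, 4 factors).
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(0)
      latent = rng.normal(size=(94, 4))                  # four support dimensions
      loadings = rng.uniform(0.5, 0.9, (4, 25)) * (rng.random((4, 25)) < 0.3)
      responses = latent @ loadings + rng.normal(scale=0.7, size=(94, 25))

      pca = PCA(n_components=4).fit(responses)
      print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 2))
      print("items loading strongest on PC1:", np.argsort(-np.abs(pca.components_[0]))[:5])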

  19. Dragon pulse information management system (DPIMS): A unique model-based approach to implementing domain agnostic system of systems and behaviors

    NASA Astrophysics Data System (ADS)

    Anderson, Thomas S.

    2016-05-01

    The Global Information Network Architecture is an information technology based on Vector Relational Data Modeling, a unique computational paradigm, certified for DoD networks by the U.S. Army as the Dragon Pulse Information Management System. It is a network-available environment for modeling models: models are configured using domain-relevant semantics, use network-available systems, sensors, databases and services as loosely coupled component objects, and run as executable applications. Solutions are based on mission tactics, techniques, and procedures, and on subject-matter-expert input. Three recent Army use cases are discussed: (a) an ISR system of systems; (b) modeling and simulation behavior validation; and (c) a networked digital library with behaviors.

  20. Validation, Optimization and Simulation of a Solar Thermoelectric Generator Model

    NASA Astrophysics Data System (ADS)

    Madkhali, Hadi Ali; Hamil, Ali; Lee, HoSung

    2017-12-01

    This study explores thermoelectrics as a viable option for small-scale solar thermal applications. Thermoelectric technology is based on the Seebeck effect, in which a voltage is induced when a temperature gradient is applied across the junctions of two differing materials. This research proposes to analyze, validate, simulate, and optimize a prototype solar thermoelectric generator (STEG) model in order to increase efficiency. The intent is to further develop STEGs as a viable and productive energy source that limits pollution and reduces the cost of energy production. An empirical study (Kraemer et al. in Nat Mater 10:532, 2011) on the solar thermoelectric generator reported a high efficiency of 4.6%. That system had a vacuum glass enclosure, a flat-panel absorber, a thermoelectric generator, and water circulation for the cold side. The theoretical and numerical approach of the current study reproduced the experimental results from Kraemer's study to a high degree of accuracy. The numerical simulation uses a two-stage approach in ANSYS software, combining Fluent and the Thermal-Electric Systems module. The solar load model technique uses solar radiation under AM 1.5G conditions in Fluent. The analytical model applies Dr. Ho Sung Lee's theory of optimal design, expressed in dimensionless parameters, to improve the performance of the STEG system. Applying this theory, with two cover glasses and radiation shields, the STEG model can achieve a maximum efficiency of 7%.
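
    To make the underlying physics concrete, here is a minimal ideal-generator sketch: the open-circuit Seebeck voltage V = S·ΔT and the matched-load power V²/(4R). The module parameters below are hypothetical and are not taken from the study:

      # Ideal thermoelectric generator (TEG) output from the Seebeck effect.
      def teg_output(seebeck_v_per_k, delta_t_k, r_internal_ohm):
          """Return (open-circuit voltage, matched-load power) for an ideal TEG."""
          v_oc = seebeck_v_per_k * delta_t_k          # V = S * dT
          p_max = v_oc**2 / (4.0 * r_internal_ohm)    # load resistance = internal resistance
          return v_oc, p_max

      v, p = teg_output(seebeck_v_per_k=0.05, delta_t_k=150.0, r_internal_ohm=2.0)
      print(f"V_oc = {v:.2f} V, P_max = {p:.2f} W")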

  1. Modelling the Success of Learning Management Systems: Application of Latent Class Segmentation Using FIMIX-PLS

    ERIC Educational Resources Information Center

    Arenas-Gaitán, Jorge; Rondán-Cataluña, Francisco Javier; Ramírez-Correa, Patricio E.

    2018-01-01

    There is no single attitude towards the implementation of digital technology in educational settings. This paper aims to validate an adaptation of the DeLone and McLean information systems success model in the context of a learning management system. Furthermore, this study sets out to demonstrate (1) the necessity of segmenting students in order to…

  2. Development and Validation of EPH Material Model for Engineered Roadway Soil

    DTIC Science & Technology

    2014-08-01

    Hsieh, Ching, PhD (Altair Engineering, Troy, MI); Sheng, Jianping, PhD; Ramalingam, Jai (System Engineering). Journal article, 11 August 2014. No abstract is available; the remainder of the record consists of Report Documentation Page (SF 298) boilerplate.

  3. ARMAX-Based Transfer Function Model Identification Using Wide-Area Measurement for Adaptive and Coordinated Damping Control

    DOE PAGES

    Liu, Hesen; Zhu, Lin; Pan, Zhuohong; ...

    2015-09-14

    One of the main drawbacks of existing oscillation damping controllers designed from offline dynamic models is their limited adaptivity to the power system operating condition. With the increasing availability of wide-area measurements and the rapid development of system identification techniques, it is possible to identify a measurement-based transfer function model online that can be used to tune the oscillation damping controller. Such a model can capture all dominant oscillation modes for adaptive and coordinated oscillation damping control. This paper describes a comprehensive approach to identifying a low-order transfer function model of a power system using a multi-input multi-output (MIMO) autoregressive moving average exogenous (ARMAX) model. The methodology consists of five steps: 1) input selection; 2) output selection; 3) identification trigger; 4) model estimation; and 5) model validation. The proposed method is validated using ambient data and ring-down data in the 16-machine, 68-bus Northeast Power Coordinating Council system. The results demonstrate that the measurement-based MIMO ARMAX model can capture all the dominant oscillation modes. Compared with a MIMO subspace state-space model, the MIMO ARMAX model has equivalent accuracy but lower order and improved computational efficiency. The proposed model can be applied for adaptive and coordinated oscillation damping control.
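
    For flavor, a hedged single-input, single-output identification sketch on synthetic data follows. It uses statsmodels' SARIMAX with an exogenous regressor (a regression-with-ARMA-errors form, a common stand-in for ARMAX); the paper's MIMO formulation and five-step workflow are not reproduced here:

      # SISO "ARMAX-style" identification sketch; illustrative only.
      import numpy as np
      from statsmodels.tsa.statespace.sarimax import SARIMAX

      rng = np.random.default_rng(0)
      n = 500
      u = rng.standard_normal(n)                   # exogenous input (e.g., probing signal)
      y = np.zeros(n)
      for t in range(2, n):                        # synthetic second-order oscillatory plant
          y[t] = 1.5 * y[t-1] - 0.7 * y[t-2] + 0.5 * u[t-1] + 0.1 * rng.standard_normal()

      model = SARIMAX(y, exog=u, order=(2, 0, 1))  # AR order 2, MA order 1
      result = model.fit(disp=False)
      print(result.params)                         # estimated AR, exog, and MA coefficients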

  4. Evaluating Emulation-based Models of Distributed Computing Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Stephen T.; Gabert, Kasimir G.; Tarman, Thomas D.

    Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The variety of uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.

  5. Development and validation of a regional coupled forecasting system for S2S forecasts

    NASA Astrophysics Data System (ADS)

    Sun, R.; Subramanian, A. C.; Hoteit, I.; Miller, A. J.; Ralph, M.; Cornuelle, B. D.

    2017-12-01

    Accurate and efficient forecasting of oceanic and atmospheric circulation is essential for a wide variety of high-impact societal needs, including weather extremes; environmental protection and coastal management; management of fisheries; marine conservation; water resources; and renewable energy. Effective forecasting relies on high model fidelity and accurate initialization of the models with the observed state of the coupled ocean-atmosphere-land system. A regional coupled ocean-atmosphere model, with the Weather Research and Forecasting (WRF) model and the MITgcm ocean model coupled through the ESMF (Earth System Modeling Framework), is developed to resolve mesoscale air-sea feedbacks. The regional coupled model allows oceanic mixed-layer heat and momentum to interact with atmospheric boundary layer dynamics at mesoscale and submesoscale spatiotemporal regimes, thus capturing feedbacks that are not resolved in coarse-resolution global coupled forecasting systems or in regional uncoupled forecasting systems. The model is tested in two settings: the eddy-rich Red Sea and Western Indian Ocean region, and the mesoscale eddies and fronts of the California Current System. Recent studies show evidence for air-sea interactions involving the oceanic mesoscale in these two regions which can enhance predictability on subseasonal timescales. We will present results from this newly developed regional coupled ocean-atmosphere model for forecasts over the Red Sea region as well as the California Current region. The forecasts will be validated against in situ observations in the region as well as reanalysis fields.

  6. Dynamic modelling of a double-pendulum gantry crane system incorporating payload

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ismail, R. M. T. Raja; Ahmad, M. A.; Ramli, M. S.

    The natural sway of crane payloads is detrimental to safe and efficient operation. Under certain conditions, the problem is complicated when the payloads create a double pendulum effect. This paper presents dynamic modelling of a double-pendulum gantry crane system based on closed-form equations of motion. The Lagrangian method is used to derive the dynamic model of the system. A dynamic model of the system incorporating payload is developed and the effects of payload on the response of the system are discussed. Extensive results that validate the theoretical derivation are presented in the time and frequency domains.
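
    The double-pendulum dynamics at the core of such models can be sketched generically. The following integrates the standard point-mass double-pendulum equations of motion (gravity only; no trolley input or other crane-specific terms, so this is not the paper's full model), with invented hook and payload parameters:

      # Free double pendulum via scipy; th1/th2 are hook and payload swing angles.
      import numpy as np
      from scipy.integrate import solve_ivp

      L1, L2, M1, M2, G = 1.0, 0.5, 5.0, 2.0, 9.81   # lengths (m), masses (kg), gravity

      def rhs(t, y):
          th1, w1, th2, w2 = y
          d = th1 - th2
          den = M1 + M2 * np.sin(d) ** 2
          a1 = (-M2 * L1 * w1**2 * np.sin(d) * np.cos(d) - M2 * L2 * w2**2 * np.sin(d)
                - (M1 + M2) * G * np.sin(th1) + M2 * G * np.sin(th2) * np.cos(d)) / (L1 * den)
          a2 = ((M1 + M2) * (L1 * w1**2 * np.sin(d) - G * np.sin(th2) + G * np.sin(th1) * np.cos(d))
                + M2 * L2 * w2**2 * np.sin(d) * np.cos(d)) / (L2 * den)
          return [w1, a1, w2, a2]

      sol = solve_ivp(rhs, (0.0, 10.0), [0.2, 0.0, 0.1, 0.0], max_step=0.01)
      print("payload swing angle after 10 s:", round(sol.y[2, -1], 3), "rad")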

  7. Developing a theoretical model and questionnaire survey instrument to measure the success of electronic health records in residential aged care

    PubMed Central

    Yu, Ping; Qian, Siyu

    2018-01-01

    Electronic health records (EHR) are introduced into healthcare organizations worldwide to improve patient safety, healthcare quality and efficiency. A rigorous evaluation of this technology is important to reduce potential negative effects on patients and staff, to provide decision makers with accurate information for system improvement and to ensure a return on investment. Therefore, this study develops a theoretical model and questionnaire survey instrument to assess the success of organizational EHR in routine use from the viewpoint of nursing staff in residential aged care homes. The proposed research model incorporates six variables from the reformulated DeLone and McLean information systems success model: system quality, information quality, service quality, use, user satisfaction and net benefits. Two further variables, training and self-efficacy, were also incorporated into the model. A questionnaire survey instrument was designed to measure the eight variables in the model. After a pilot test, the measurement scale was used to collect data from 243 nursing staff members in 10 residential aged care homes belonging to three management groups in Australia. Partial least squares path modeling was conducted to validate the model. The validated EHR systems success model predicts the impact of the four antecedent variables (training, self-efficacy, system quality and information quality) on the net benefits, the indicator of EHR systems success, through the mediating variables use and user satisfaction. A 24-item measurement scale was developed to quantitatively evaluate the performance of an EHR system. The parsimonious EHR systems success model and the measurement scale can be used to benchmark EHR systems success across organizations and units and over time. PMID:29315323

  8. Developing a theoretical model and questionnaire survey instrument to measure the success of electronic health records in residential aged care.

    PubMed

    Yu, Ping; Qian, Siyu

    2018-01-01

    Electronic health records (EHR) are introduced into healthcare organizations worldwide to improve patient safety, healthcare quality and efficiency. A rigorous evaluation of this technology is important to reduce potential negative effects on patients and staff, to provide decision makers with accurate information for system improvement and to ensure a return on investment. Therefore, this study develops a theoretical model and questionnaire survey instrument to assess the success of organizational EHR in routine use from the viewpoint of nursing staff in residential aged care homes. The proposed research model incorporates six variables from the reformulated DeLone and McLean information systems success model: system quality, information quality, service quality, use, user satisfaction and net benefits. Two further variables, training and self-efficacy, were also incorporated into the model. A questionnaire survey instrument was designed to measure the eight variables in the model. After a pilot test, the measurement scale was used to collect data from 243 nursing staff members in 10 residential aged care homes belonging to three management groups in Australia. Partial least squares path modeling was conducted to validate the model. The validated EHR systems success model predicts the impact of the four antecedent variables (training, self-efficacy, system quality and information quality) on the net benefits, the indicator of EHR systems success, through the mediating variables use and user satisfaction. A 24-item measurement scale was developed to quantitatively evaluate the performance of an EHR system. The parsimonious EHR systems success model and the measurement scale can be used to benchmark EHR systems success across organizations and units and over time.

  9. Software Validation via Model Animation

    NASA Technical Reports Server (NTRS)

    Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.

    2015-01-01

    This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
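
    A minimal sketch of the comparison step follows. Both functions are hypothetical stand-ins (here, a kinematic turn-radius formula) for the animated formal model and the software under test; they are exercised on random inputs and checked for agreement within a tolerance:

      # Output-comparison harness in the spirit of the approach described above.
      import math
      import random

      def reference_model(speed, bank_deg):        # stand-in for the animated PVS model
          return speed**2 / (9.81 * math.tan(math.radians(bank_deg)))

      def implementation(speed, bank_deg):         # stand-in for the fielded software
          return speed * speed / (9.81 * math.tan(bank_deg * math.pi / 180.0))

      random.seed(1)
      for _ in range(10_000):
          v = random.uniform(50.0, 300.0)          # airspeed, m/s
          phi = random.uniform(5.0, 45.0)          # bank angle, deg
          r_ref, r_impl = reference_model(v, phi), implementation(v, phi)
          assert math.isclose(r_ref, r_impl, rel_tol=1e-9), (v, phi)
      print("all sampled cases agree within tolerance")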

  10. Validation of Slosh Modeling Approach Using STAR-CCM+

    NASA Technical Reports Server (NTRS)

    Benson, David J.; Ng, Wanyi

    2018-01-01

    Without an adequate understanding of propellant slosh, the spacecraft attitude control system may be inadequate to control the spacecraft, or there may be an unexpected loss of science observation time due to longer slosh settling times. Computational fluid dynamics (CFD) is used to model propellant slosh. STAR-CCM+ is a commercially available CFD code. This paper seeks to validate the CFD modeling approach via a comparison between STAR-CCM+ liquid slosh modeling results and experimentally, empirically, and analytically derived results. The geometries examined are a bare right-cylinder tank and a right cylinder with a single ring baffle.

  11. Current progress in patient-specific modeling

    PubMed Central

    2010-01-01

    We present a survey of recent advancements in the emerging field of patient-specific modeling (PSM). Researchers in this field are currently simulating a wide variety of tissue and organ dynamics to address challenges in various clinical domains. The majority of this research employs three-dimensional, image-based modeling techniques. Recent PSM publications mostly represent feasibility or preliminary validation studies on modeling technologies, and these systems will require further clinical validation and usability testing before they can become a standard of care. We anticipate that with further testing and research, PSM-derived technologies will eventually become valuable, versatile clinical tools. PMID:19955236

  12. Challenges of NDE Simulation Tool

    NASA Technical Reports Server (NTRS)

    Leckey, Cara A. C.; Juarez, Peter D.; Seebo, Jeffrey P.; Frank, Ashley L.

    2015-01-01

    Realistic nondestructive evaluation (NDE) simulation tools enable inspection optimization and predictions of inspectability for new aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of advanced aerospace components, potentially shortening the time from material development to implementation by industry and government. Furthermore, modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided-wave-based structural health monitoring (SHM) systems. The current state of the art in ultrasonic NDE/SHM simulation cannot rapidly simulate damage detection techniques for large-scale, complex-geometry composite components/vehicles with realistic damage types. This paper discusses some of the challenges of model development and validation for composites, such as the level of realism and scale of simulation needed for NASA's applications. Ongoing model development work is described along with examples of model validation studies. The paper also discusses examples of the use of simulation tools at NASA to develop new damage characterization methods, and the associated challenges of validating those methods.

  13. Modeling Framework and Validation of a Smart Grid and Demand Response System for Wind Power Integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Broeer, Torsten; Fuller, Jason C.; Tuffner, Francis K.

    2014-01-31

    Electricity generation from wind power and other renewable energy sources is increasing, and their variability introduces new challenges to the power system. The emergence of smart grid technologies in recent years has seen a paradigm shift in redefining the electrical system of the future, in which controlled response of the demand side is used to balance fluctuations and intermittencies from the generation side. This paper presents a modeling framework for an integrated electricity system where loads become an additional resource. The agent-based model represents a smart grid power system integrating generators, transmission, distribution, loads and market. The model incorporates generator and load controllers, allowing suppliers and demanders to bid into a Real-Time Pricing (RTP) electricity market. The modeling framework is applied to represent a physical demonstration project conducted on the Olympic Peninsula, Washington, USA, and validation simulations are performed using actual dynamic data. Wind power is then introduced into the power generation mix, illustrating the potential of demand response to mitigate the impact of wind power variability, primarily through thermostatically controlled loads. The results also indicate that effective implementation of Demand Response (DR) to assist integration of variable renewable energy resources requires a diversity of loads to ensure functionality of the overall system.
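
    A toy sketch of the kind of demand-side behavior the framework models: a thermostatically controlled load bids into a real-time price market, with the bid rising as the temperature drifts from its setpoint. All names and parameter values here are invented for illustration:

      # Thermostatic load bidding into a real-time pricing (RTP) market (toy version).
      def hvac_bid(temp_c, setpoint_c, deadband_c, p_avg, p_stdev, k=3.0):
          """Bid price ($/MWh): higher when cooling is needed more urgently."""
          deviation = (temp_c - setpoint_c) / (deadband_c / 2.0)
          return p_avg + k * p_stdev * deviation

      def cleared(bid_price, clearing_price):
          return bid_price >= clearing_price       # the unit runs if its bid clears

      bid = hvac_bid(temp_c=24.5, setpoint_c=23.0, deadband_c=2.0, p_avg=60.0, p_stdev=10.0)
      print(f"bid {bid:.1f} $/MWh -> run: {cleared(bid, clearing_price=75.0)}")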

  14. Modelling a hydropower plant with reservoir with the micropower optimisation model (HOMER)

    NASA Astrophysics Data System (ADS)

    Canales, Fausto A.; Beluco, Alexandre; Mendes, Carlos André B.

    2017-08-01

    Hydropower with water accumulation is an interesting option to consider in hybrid systems because it helps deal with the intermittency of renewable energy resources. The software HOMER (Legacy version) is extensively used in research on such systems, but it does not include a specific option for modelling hydro with a reservoir. This paper describes a method for modelling a hydropower plant with reservoir in HOMER by adapting an existing procedure used for modelling pumped storage. An example with two scenarios in southern Brazil is presented to illustrate and validate the method. The results validate the method by showing a direct correspondence between an equivalent battery and the reservoir. The refilling of the reservoir, its power output as a function of flow rate, and the installed hydropower capacity are effectively simulated, indicating that an adequate representation of a hydropower plant with reservoir is possible with HOMER.
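
    The reservoir-battery equivalence at the heart of the method can be illustrated with a back-of-envelope energy calculation (all values invented): the recoverable energy of a full reservoir, E = ρgVhη, is the capacity the equivalent battery must represent:

      # Reservoir modelled as a battery: recoverable energy in kWh.
      RHO, G = 1000.0, 9.81                        # water density (kg/m^3), gravity (m/s^2)

      def reservoir_energy_kwh(volume_m3, head_m, efficiency):
          joules = RHO * G * volume_m3 * head_m * efficiency
          return joules / 3.6e6                    # 1 kWh = 3.6e6 J

      print(f"{reservoir_energy_kwh(volume_m3=2.0e5, head_m=35.0, efficiency=0.85):,.0f} kWh")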

  15. Load Composition Model Workflow (BPA TIP-371 Deliverable 1A)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chassin, David P.; Cezar, Gustavo V.

    This project is funded under Bonneville Power Administration (BPA) Strategic Partnership Project (SPP) 17-005 between BPA and SLAC National Accelerator Laboratory. The project is a BPA Technology Improvement Project (TIP) that builds on and validates the Composite Load Model developed by the Western Electricity Coordinating Council's (WECC) Load Modeling Task Force (LMTF). The composite load model is used by the WECC Modeling and Validation Work Group to study the stability and security of the western electricity interconnection. The work includes development of load composition data sets, collection of load disturbance data, and model development and validation. This work supports reliable and economic operation of the power system. This report was produced for Deliverable 1A of the BPA TIP-371 project entitled "TIP 371: Advancing the Load Composition Model". The deliverable documents the proposed workflow for the Composite Load Model, which provides the basis for the instrumentation, data acquisition, analysis and data dissemination activities addressed by later phases of the project.

  16. Comparison of Biophysical Characteristics and Predicted Thermophysiological Responses of Three Prototype Body Armor Systems Versus Baseline U.S. Army Body Armor Systems

    DTIC Science & Technology

    2015-06-19

    Predictive modeling provides an effective and scientifically valid method of making comparisons of clothing and equipment changes prior to conducting human research. Three different body armor (BA) plus clothing ensembles were…

  17. WFIRST: Coronagraph Systems Engineering and Performance Budgets

    NASA Astrophysics Data System (ADS)

    Poberezhskiy, Ilya; cady, eric; Frerking, Margaret A.; Kern, Brian; Nemati, Bijan; Noecker, Martin; Seo, Byoung-Joon; Zhao, Feng; Zhou, Hanying

    2018-01-01

    The WFIRST coronagraph instrument (CGI) will be the first in-space coronagraph using active wavefront control to directly image and characterize mature exoplanets and zodiacal disks in reflected starlight. For CGI systems engineering, including requirements development, CGI performance is predicted using a hierarchy of performance budgets to estimate various noise components (spatial and temporal flux variations) that obscure exoplanet signals in direct imaging and spectroscopy configurations. These performance budgets are validated through robust integrated modeling and testbed model validation efforts. We present the performance budgeting framework used by WFIRST for the flow-down of coronagraph science requirements, mission constraints, and observatory interfaces to measurable instrument engineering parameters.

  18. A relationship between peak temperature drop and velocity differential in a microburst

    NASA Technical Reports Server (NTRS)

    Proctor, Fred H.

    1989-01-01

    Results from numerical microburst simulations using the Terminal Area Simulation System (Proctor, 1987) are used to develop a relationship between wind velocity differential and peak temperature drop. The numerical model and the relationships derived from the model are described. The relationship between peak temperature drop and differential wind velocity is shown to be valid during microburst development, for all precipitation shaft intensities and diameters. It is found that the relationship is not valid for low-reflectivity microburst events or in the presence of ground-based stable layers. The use of the relationship in IR wind shear detection systems is considered.

  19. 6DOF Testing of the SLS Inertial Navigation Unit

    NASA Technical Reports Server (NTRS)

    Geohagan, Kevin W.; Bernard, William P.; Oliver, T. Emerson; Strickland, Dennis J.; Leggett, Jared O.

    2018-01-01

    The Navigation System on the NASA Space Launch System (SLS) Block 1 vehicle performs initial alignment of the Inertial Navigation System (INS) navigation frame through gyrocompass alignment (GCA). In lieu of direct testing of GCA accuracy in support of requirement verification, the SLS Navigation Team proposed and conducted an engineering test to, among other things, validate the GCA performance and overall behavior of the SLS INS model through comparison with test data. This paper will detail dynamic hardware testing of the SLS INS, conducted by the SLS Navigation Team at Marshall Space Flight Center's 6DOF Table Facility, in support of GCA performance characterization and INS model validation. A 6-DOF motion platform was used to produce 6DOF pad twist and sway dynamics while a simulated SLS flight computer communicated with the INS. Tests conducted include an evaluation of GCA algorithm robustness to increasingly dynamic pad environments, an examination of GCA algorithm stability and accuracy over long durations, and a long-duration static test to gather enough data for Allan Variance analysis. Test setup, execution, and data analysis will be discussed, including analysis performed in support of SLS INS model validation.
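
    As a flavor of the Allan variance analysis named for the long-duration static test, here is a hedged sketch that computes the overlapping Allan deviation of a simulated, white-noise-only gyro rate record; the sample rate and noise level are invented:

      # Overlapping Allan variance of a (simulated) static gyro record.
      import numpy as np

      def allan_variance(rate, fs, m):
          """Allan variance at cluster size m samples (tau = m / fs)."""
          theta = np.cumsum(rate) / fs             # integrate rate to angle
          d = theta[2 * m:] - 2.0 * theta[m:-m] + theta[:-2 * m]
          tau = m / fs
          return np.sum(d**2) / (2.0 * tau**2 * d.size)

      fs = 100.0                                   # sample rate, Hz
      rate = 1e-3 * np.random.default_rng(0).standard_normal(200_000)   # deg/s
      for m in (10, 100, 1000):
          adev = allan_variance(rate, fs, m) ** 0.5
          print(f"tau = {m / fs:6.1f} s -> Allan deviation = {adev:.3e} deg/s")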

  20. SBMLeditor: effective creation of models in the Systems Biology Markup Language (SBML)

    PubMed Central

    Rodriguez, Nicolas; Donizelli, Marco; Le Novère, Nicolas

    2007-01-01

    Background The need to build a tool to facilitate the quick creation and editing of models encoded in the Systems Biology Markup Language (SBML) has been growing with the number of users and the increased complexity of the language. SBMLeditor tries to answer this need by providing a very simple, low-level editor of SBML files. Users can create and remove all the necessary bits and pieces of SBML in a controlled way that maintains the validity of the final SBML file. Results SBMLeditor is written in Java using JCompneur, a library providing interfaces to easily display an XML document as a tree, which dramatically decreases the development time for a new XML editor. The possibility to include custom dialogs for different tags allows a lot of freedom in the editing and validation of the document. In addition to Xerces, SBMLeditor uses libSBML to check the validity and consistency of SBML files. A graphical equation editor allows easy manipulation of MathML. SBMLeditor can be used as a module of the Systems Biology Workbench. Conclusion SBMLeditor contains many improvements compared to a generic XML editor and allows users to create an SBML model quickly and without syntactic errors. PMID:17341299

  1. SBMLeditor: effective creation of models in the Systems Biology Markup language (SBML).

    PubMed

    Rodriguez, Nicolas; Donizelli, Marco; Le Novère, Nicolas

    2007-03-06

    The need to build a tool to facilitate the quick creation and editing of models encoded in the Systems Biology Markup Language (SBML) has been growing with the number of users and the increased complexity of the language. SBMLeditor tries to answer this need by providing a very simple, low-level editor of SBML files. Users can create and remove all the necessary bits and pieces of SBML in a controlled way that maintains the validity of the final SBML file. SBMLeditor is written in Java using JCompneur, a library providing interfaces to easily display an XML document as a tree, which dramatically decreases the development time for a new XML editor. The possibility to include custom dialogs for different tags allows a lot of freedom in the editing and validation of the document. In addition to Xerces, SBMLeditor uses libSBML to check the validity and consistency of SBML files. A graphical equation editor allows easy manipulation of MathML. SBMLeditor can be used as a module of the Systems Biology Workbench. SBMLeditor contains many improvements compared to a generic XML editor and allows users to create an SBML model quickly and without syntactic errors.
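
    To make the libSBML role concrete, a minimal python-libsbml sketch (pip package python-libsbml) builds a tiny SBML Level 3 model and runs the same kind of consistency check that SBMLeditor delegates to libSBML; the model content is arbitrary:

      # Build and validate a minimal SBML model with libSBML's Python bindings.
      import libsbml

      doc = libsbml.SBMLDocument(3, 1)             # SBML Level 3 Version 1
      model = doc.createModel()
      model.setId("toy_model")

      c = model.createCompartment()
      c.setId("cell"); c.setSize(1.0); c.setConstant(True)

      s = model.createSpecies()
      s.setId("S1"); s.setCompartment("cell"); s.setInitialAmount(10.0)
      s.setHasOnlySubstanceUnits(False); s.setBoundaryCondition(False); s.setConstant(False)

      issues = doc.checkConsistency()              # libSBML validity/consistency check
      print(f"{issues} consistency issue(s) reported")
      print(libsbml.writeSBMLToString(doc)[:120])  # serialized SBML, truncated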

  2. Validation of a Clinical Scoring System for Outcome Prediction in Dogs with Acute Kidney Injury Managed by Hemodialysis.

    PubMed

    Segev, G; Langston, C; Takada, K; Kass, P H; Cowgill, L D

    2016-05-01

    A scoring system for outcome prediction in dogs with acute kidney injury (AKI) has recently been developed but has not been validated. We hypothesized that the previously developed scoring system would accurately predict outcome in a validation cohort of dogs with AKI managed with hemodialysis. The study included 115 client-owned dogs with AKI. Medical records of dogs with AKI treated by hemodialysis between 2011 and 2015 were reviewed. Dogs were included only if all variables required to calculate the final predictive score were available and the 30-day outcome was known. A predictive score for each of 3 models was calculated for each dog. Logistic regression was used to evaluate the association of the final predictive score with each model's outcome. Receiver operating characteristic (ROC) analyses were performed to determine sensitivity and specificity for each model based on previously established cut-off values. Higher scores for each model were associated with decreased survival probability (P < .001). Based on previously established cut-off values, the 3 models (models A, B, C) were associated with sensitivities/specificities of 73/75%, 71/80%, and 75/86%, respectively, and correctly classified 74-80% of the dogs. All models were simple to apply and allowed outcome prediction that closely corresponded with actual outcome in an independent cohort. As expected, accuracies were slightly lower than those from the previously reported cohort used initially to develop the models. Copyright © 2016 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.
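
    A generic sketch of this develop-then-validate recipe follows (simulated data and invented predictors; not the authors' score or coefficients): fit a logistic model on a development cohort, then assess discrimination in an independent cohort with ROC analysis and a single cut-off:

      # Develop on one cohort, validate discrimination on another.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score, roc_curve

      rng = np.random.default_rng(0)
      beta = np.array([1.2, 0.8, -0.5, 0.3])       # hypothetical predictor effects
      X_dev = rng.normal(size=(300, 4))
      y_dev = (X_dev @ beta + rng.normal(size=300) > 0.5).astype(int)
      X_val = rng.normal(size=(115, 4))            # independent validation cohort
      y_val = (X_val @ beta + rng.normal(size=115) > 0.5).astype(int)

      clf = LogisticRegression().fit(X_dev, y_dev)
      p_val = clf.predict_proba(X_val)[:, 1]
      print("validation AUC:", round(roc_auc_score(y_val, p_val), 2))

      fpr, tpr, thresholds = roc_curve(y_val, p_val)
      best = np.argmax(tpr - fpr)                  # Youden's J picks one cut-off
      print(f"cut-off {thresholds[best]:.2f}: sensitivity {tpr[best]:.0%}, specificity {1 - fpr[best]:.0%}")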

  3. A mobile, high-throughput semi-automated system for testing cognition in large non-primate animal models of Huntington disease.

    PubMed

    McBride, Sebastian D; Perentos, Nicholas; Morton, A Jennifer

    2016-05-30

    For reasons of cost and ethical concerns, models of neurodegenerative disorders such as Huntington disease (HD) are currently being developed in farm animals as an alternative to non-human primates. Developing reliable methods of testing cognitive function is essential to determining the usefulness of such models. Nevertheless, cognitive testing of farm animal species presents a unique set of challenges. The primary aims of this study were to develop and validate a mobile operant system suitable for high-throughput cognitive testing of sheep. We designed a semi-automated testing system with the capability of presenting stimuli (visual, auditory) and reward at six spatial locations. Fourteen normal sheep were used to validate the system using a two-choice visual discrimination task (2CVDT). Four stages of training devised to acclimatise animals to the system are also presented. All sheep progressed rapidly through the training stages, over eight sessions. All sheep learned the 2CVDT and performed at least one reversal stage. The mean number of trials the sheep took to reach criterion was 13.9±1.5 for the first acquisition learning and 19.1±1.8 for the reversal learning. This is the first mobile semi-automated operant system developed for testing cognitive function in sheep. We have designed and validated an automated operant behavioural testing system suitable for high-throughput cognitive testing in sheep and other medium-sized quadrupeds, such as pigs and dogs. Sheep performance in the two-choice visual discrimination task was very similar to that reported for non-human primates and strongly supports the use of farm animals as pre-clinical models for the study of neurodegenerative diseases. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Validation of the SURE Program, phase 1

    NASA Technical Reports Server (NTRS)

    Dotson, Kelly J.

    1987-01-01

    Presented are the results of the first phase in the validation of the SURE (Semi-Markov Unreliability Range Evaluator) program. The SURE program gives lower and upper bounds on the death-state probabilities of a semi-Markov model. With these bounds, the reliability of a semi-Markov model of a fault-tolerant computer system can be analyzed. For the first phase of the validation, fifteen semi-Markov models were solved analytically for the exact death-state probabilities, and these solutions were compared to the corresponding bounds given by SURE. In every case, the SURE bounds covered the exact solution. The bounds, however, had a tendency to separate in cases where the recovery rate was slow or the fault arrival rate was fast.
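
    For intuition, in the pure-Markov special case the death-state probabilities that SURE bounds can be computed exactly from the generator matrix as p(T) = p(0)·e^{QT}. Below is a small invented three-state fault/recovery model, in the fast-recovery regime where the SURE bounds stay tight:

      # Exact death-state probability of a small Markov reliability model.
      import numpy as np
      from scipy.linalg import expm

      LAM = 1e-4                                   # fault arrival rate (per hour)
      MU = 1e2                                     # recovery rate (per hour)

      # States: 0 = nominal, 1 = recovering from a fault, 2 = failed (death state)
      Q = np.array([[-LAM,  LAM,          0.0],
                    [ MU,  -(MU + LAM),   LAM],    # a second fault before recovery kills
                    [ 0.0,  0.0,          0.0]])

      p0 = np.array([1.0, 0.0, 0.0])
      T = 10.0                                     # mission time, hours
      pT = p0 @ expm(Q * T)
      print(f"death-state probability at T = {T} h: {pT[2]:.3e}")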

  5. Model-Driven Test Generation of Distributed Systems

    NASA Technical Reports Server (NTRS)

    Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin

    2012-01-01

    This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.

  6. A global multiscale mathematical model for the human circulation with emphasis on the venous system.

    PubMed

    Müller, Lucas O; Toro, Eleuterio F

    2014-07-01

    We present a global, closed-loop, multiscale mathematical model for the human circulation including the arterial system, the venous system, the heart, the pulmonary circulation and the microcirculation. A distinctive feature of our model is the detailed description of the venous system, particularly for intracranial and extracranial veins. Medium to large vessels are described by one-dimensional hyperbolic systems while the rest of the components are described by zero-dimensional models represented by differential-algebraic equations. Robust, high-order accurate numerical methodology is implemented for solving the hyperbolic equations, which are adopted from a recent reformulation that includes variable material properties. Because of the large intersubject variability of the venous system, we perform a patient-specific characterization of major veins of the head and neck using MRI data. Computational results are carefully validated using published data for the arterial system and most regions of the venous system. For head and neck veins, validation is carried out through a detailed comparison of simulation results against patient-specific phase-contrast MRI flow quantification data. A merit of our model is its global, closed-loop character; the imposition of highly artificial boundary conditions is avoided. Envisioned applications include a vast range of medical conditions. Of particular interest is the study of some neurodegenerative diseases, whose venous haemodynamic connection has recently been identified by medical researchers. Copyright © 2014 John Wiley & Sons, Ltd.

  7. Validity and Reliability of the 8-Item Work Limitations Questionnaire.

    PubMed

    Walker, Timothy J; Tullar, Jessica M; Diamond, Pamela M; Kohl, Harold W; Amick, Benjamin C

    2017-12-01

    Purpose To evaluate the factorial validity, scale reliability, test-retest reliability, convergent validity, and discriminant validity of the 8-item Work Limitations Questionnaire (WLQ) among employees from a public university system. Methods A secondary analysis of de-identified data from employees who completed an annual Health Assessment between 2009 and 2015 tested the research aims. Confirmatory factor analysis (CFA) (n = 10,165) tested the latent structure of the 8-item WLQ. Scale reliability was determined using a CFA-based approach, while test-retest reliability was determined using the intraclass correlation coefficient. Convergent/discriminant validity was tested by evaluating relations between the 8-item WLQ and health/performance variables for convergent validity (health-related work performance, number of chronic conditions, and general health) and demographic variables for discriminant validity (gender and institution type). Results A 1-factor model with three correlated residuals demonstrated excellent model fit (CFI = 0.99, TLI = 0.99, RMSEA = 0.03, and SRMR = 0.01). The scale reliability was acceptable (0.69, 95% CI 0.68-0.70) and the test-retest reliability was very good (ICC = 0.78). Low-to-moderate associations were observed between the 8-item WLQ and the health/performance variables, while weak associations were observed with the demographic variables. Conclusions The 8-item WLQ demonstrated sufficient reliability and validity among employees from a public university system. Results suggest the 8-item WLQ is a usable alternative for studies when the more comprehensive 25-item WLQ is not available.
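
    Two of the reported psychometrics are easy to sketch on simulated data: Cronbach's alpha for internal consistency and a simple sum-score test-retest correlation (the paper itself used a CFA-based reliability estimate and the ICC); all data below are simulated:

      # Internal consistency and test-retest sketch for an 8-item scale.
      import numpy as np

      def cronbach_alpha(items):
          """items: (n_respondents, k_items) array of item scores."""
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_vars / total_var)

      rng = np.random.default_rng(0)
      trait = rng.normal(size=(500, 1))                    # stable latent work limitation
      wave1 = trait + 0.8 * rng.normal(size=(500, 8))      # 8 items at time 1
      wave2 = trait + 0.8 * rng.normal(size=(500, 8))      # same respondents at time 2

      print("Cronbach's alpha:", round(cronbach_alpha(wave1), 2))
      print("test-retest r:", round(np.corrcoef(wave1.sum(1), wave2.sum(1))[0, 1], 2))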

  8. A holistic approach to SIM platform and its application to early-warning satellite system

    NASA Astrophysics Data System (ADS)

    Sun, Fuyu; Zhou, Jianping; Xu, Zheyao

    2018-01-01

    This study proposes a new simulation platform named Simulation Integrated Management (SIM) for the analysis of parallel and distributed systems. The platform eases the process of designing and testing both applications and architectures. The main characteristics of SIM are flexibility, scalability, and expandability. To improve the efficiency of project development, new models of an early-warning satellite system were designed based on the SIM platform. Finally, through a series of experiments, the correctness of the SIM platform and the early-warning satellite models was validated, and systematic analyses of the orbit determination precision for a ballistic missile over its entire flight, as well as of the launch/landing point deviation, are presented. Furthermore, the causes of the deviation and methods for its prevention are fully explained. The simulation platform and the models lay the foundations for further validation of autonomy technology in space attack-defense architecture research.

  9. Performance Evaluation and Modeling of Erosion Resistant Turbine Engine Thermal Barrier Coatings

    NASA Technical Reports Server (NTRS)

    Miller, Robert A.; Zhu, Dongming; Kuczmarski, Maria

    2008-01-01

    The erosion-resistant turbine thermal barrier coating system is critical to rotorcraft engine performance and durability. The objective of this work was to determine the erosion resistance of advanced thermal barrier coating systems under simulated engine erosion and thermal gradient environments, thus validating a new thermal barrier coating turbine blade technology for future rotorcraft applications. A high-velocity burner-rig-based erosion test approach was established, and a new series of rare-earth-oxide- and TiO2/Ta2O5-alloyed, ZrO2-based low-conductivity thermal barrier coatings were designed and processed. The low-conductivity thermal barrier coating systems demonstrated significant improvements in erosion resistance. A comprehensive model based on accumulated strain damage low-cycle fatigue is formulated for blade erosion life prediction. The work is currently aimed at simulated engine erosion testing of advanced thermal-barrier-coated turbine blades to establish and validate the coating life prediction models.
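
    The life model named above is based on accumulated strain-damage low-cycle fatigue; as a generic stand-in (not the authors' formulation), the sketch below combines a Coffin-Manson cycles-to-failure law with Miner's linear damage accumulation, using placeholder constants:

      # Coffin-Manson + Miner's rule sketch for strain-based fatigue life.
      def cycles_to_failure(strain_amplitude, eps_f=0.5, c=-0.6):
          """Coffin-Manson: eps_a = eps_f * (2N)^c, solved for N."""
          return 0.5 * (strain_amplitude / eps_f) ** (1.0 / c)

      def miner_damage(strain_blocks):
          """strain_blocks: iterable of (strain_amplitude, applied_cycles)."""
          return sum(n / cycles_to_failure(ea) for ea, n in strain_blocks)

      damage = miner_damage([(0.002, 2000), (0.003, 500)])
      print(f"accumulated damage fraction: {damage:.2f} (failure predicted at 1.0)")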

  10. Microscopic simulation model calibration and validation handbook.

    DOT National Transportation Integrated Search

    2006-01-01

    Microscopic traffic simulation models are widely used in the transportation engineering field. Because of their cost-effectiveness, risk-free nature, and high-speed benefits, areas of use include transportation system design, traffic operations, and ...

  11. Validation of GEMACS (General Electromagnetic Model for the Analysis of Complex Systems) for Modeling Lightning-Induced Electromagnetic Fields.

    DTIC Science & Technology

    1987-12-01

    Mabee, David S., B.S., Captain, USAF. Thesis, Air Force Institute of Technology, AFIT/GE/ENG/87D-39, December 1987. Approved for public release; distribution unlimited.

  12. Optimal control model predictions of system performance and attention allocation and their experimental validation in a display design study

    NASA Technical Reports Server (NTRS)

    Johannsen, G.; Govindaraj, T.

    1980-01-01

    The influence of different types of predictor displays in a longitudinal vertical takeoff and landing (VTOL) hover task is analyzed in a theoretical study. Several cases with differing amounts of predictive and rate information are compared. The optimal control model of the human operator is used to estimate human and system performance in terms of root-mean-square (rms) values and to compute optimized attention allocation. The only part of the model which is varied to predict these data is the observation matrix. Typical cases are selected for a subsequent experimental validation. The rms values as well as eye-movement data are recorded. The results agree favorably with those of the theoretical study in terms of relative differences. Better matching is achieved by revised model input data.

  13. Advanced Reactors-Intermediate Heat Exchanger (IHX) Coupling: Theoretical Modeling and Experimental Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Utgikar, Vivek; Sun, Xiaodong; Christensen, Richard

    2016-12-29

    The overall goal of the research project was to model the behavior of the advanced reactor-intermediate heat exchanger (IHX) system and to develop advanced control techniques for off-normal conditions. The specific objectives defined for the project were: 1) to develop the steady-state thermal-hydraulic design of the IHX; 2) to develop mathematical models describing the coupling of the advanced nuclear reactor, IHX, and chemical process/power generation plant during normal and off-normal operation, and to simulate the models using multiphysics software; 3) to develop control strategies using genetic algorithm or neural network techniques and to couple these techniques with the multiphysics software; and 4) to validate the models experimentally. The project objectives were accomplished by defining and executing four tasks corresponding to these specific objectives. The first task involved selecting IHX candidates and developing steady-state designs for them. The second task involved modeling the transient and off-normal operation of the reactor-IHX system. The third task dealt with the development of control strategies and involved algorithm development and simulation. The last task involved experimental validation of the thermal-hydraulic performance of the two prototype heat exchangers designed and fabricated for the project, at steady-state and transient conditions, to simulate the coupling of the reactor-IHX-process plant system. The experimental work utilized two test facilities at The Ohio State University (OSU): the existing High-Temperature Helium Test Facility (HTHF) and a newly developed high-temperature molten salt facility.

  14. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  15. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE PAGES

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; ...

    2017-03-23

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  16. Evaluating model structure adequacy: The case of the Maggia Valley groundwater system, southern Switzerland

    USGS Publications Warehouse

    Hill, Mary C.; L. Foglia,; S. W. Mehl,; P. Burlando,

    2013-01-01

    Model adequacy is evaluated with alternative models rated using model selection criteria (AICc, BIC, and KIC) and three other statistics. The model selection criteria are tested with cross-validation experiments, and insights for using alternative models to evaluate model structural adequacy are provided. The study is conducted using the computer codes UCODE_2005 and MMA (MultiModel Analysis). One recharge alternative is simulated using the TOPKAPI hydrological model. The predictions evaluated include eight heads and three flows located where ecological consequences and model precision are of concern. Cross-validation is used to obtain measures of prediction accuracy. Sixty-four models were designed deterministically; they differ in their representation of the river, recharge, bedrock topography, and hydraulic conductivity. Results include: (1) What may seem like inconsequential choices in model construction may be important to predictions; analysis of predictions from alternative models is advised. (2) None of the model selection criteria consistently identified models with more accurate predictions, a disturbing result that suggests reconsidering the utility of model selection criteria and/or of the cross-validation measures used in this work to assess model accuracy. (3) KIC displayed poor performance for the present regression problems; theoretical considerations suggest the difficulties are associated with wide variations in the sensitivity term of KIC, resulting from the models being nonlinear and the problems being ill-posed due to parameter correlations and insensitivity. The other criteria performed somewhat better, and similarly to each other. (4) Quantities with high leverage are more difficult to predict. The results are expected to be generally applicable to models of environmental systems.
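
    For reference, the selection criteria compared above trade goodness of fit against model complexity. For a model with k estimated parameters, n observations, and maximized likelihood \hat{L}, the standard forms are given below; KIC augments a BIC-like expression with a Fisher-information (sensitivity) term, the term whose wide variation the authors identify as the source of KIC's poor performance here.

      \mathrm{AIC}  = -2\ln\hat{L} + 2k, \qquad
      \mathrm{AICc} = \mathrm{AIC} + \frac{2k(k+1)}{n - k - 1}, \qquad
      \mathrm{BIC}  = -2\ln\hat{L} + k\ln n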

  17. Dynamic modelling and experimental validation of three wheeled tilting vehicles

    NASA Astrophysics Data System (ADS)

    Amati, Nicola; Festini, Andrea; Pelizza, Luigi; Tonoli, Andrea

    2011-06-01

    The present paper describes the study of the straight-running stability of a three-wheeled tilting vehicle for urban and sub-urban mobility. The analysis was carried out by developing a multibody model in the Matlab/Simulink SimMechanics environment. An Adams-Motorcycle model and an equivalent analytical model were developed for cross-validation and for highlighting the similarities with the lateral dynamics of motorcycles. Field tests were carried out to validate the model and identify some critical parameters, such as the damping of the steering system. The stability analysis demonstrates that the lateral dynamics are characterised by vibration modes similar to those of a motorcycle. Additionally, it shows that the wobble mode is significantly affected by the castor trail, whereas it is only slightly affected by the dynamics of the front suspension. For the present case study, the frame compliance also has no influence on the weave and wobble.

  18. Disturbance Reduction Control Design for the ST7 Flight Validation Experiment

    NASA Technical Reports Server (NTRS)

    Maghami, P. G.; Hsu, O. C.; Markley, F. L.; Houghton, M. B.

    2003-01-01

    The Space Technology 7 experiment will perform an on-orbit system-level validation of two specific Disturbance Reduction System technologies: a gravitational reference sensor employing a free-floating test mass, and a set of micro-Newton colloidal thrusters. The ST7 Disturbance Reduction System is designed to maintain the spacecraft's position with respect to a free-floating test mass to less than 10 nm/√Hz over the frequency range of 1 to 30 mHz. This paper presents the design and analysis of the coupled drag-free and attitude control systems that close the loop between the gravitational reference sensor and the micro-Newton thrusters, while incorporating star tracker data at low frequencies. A full 18-degree-of-freedom model, which incorporates rigid-body models of the spacecraft and two test masses, is used to evaluate the effects of actuation and measurement noise and disturbances on the performance of the drag-free system.

  19. Prognostic score to predict mortality during TB treatment in TB/HIV co-infected patients.

    PubMed

    Nguyen, Duc T; Jenkins, Helen E; Graviss, Edward A

    2018-01-01

    Estimating mortality risk during TB treatment in HIV co-infected patients is challenging for health professionals, especially in a low-TB-prevalence population, due to the lack of a standardized prognostic system. The current study aimed to develop and validate a simple mortality prognostic scoring system for TB/HIV co-infected patients. Using data from the CDC's Tuberculosis Genotyping Information Management System of TB patients in Texas reported from 01/2010 through 12/2016, age ≥15 years, HIV(+), and outcome being "completed" or "died", we developed and internally validated a mortality prognostic score using multiple logistic regression. Model discrimination was determined by the area under the receiver operating characteristic (ROC) curve (AUC); calibration was assessed with the Hosmer-Lemeshow goodness-of-fit test, where a non-significant result indicates good calibration. Among the 450 patients included in the analysis, 57 (12.7%) died during TB treatment. The final prognostic score used six characteristics (age, residence in a long-term care facility, meningeal TB, chest x-ray, culture positive, and culture not converted/unknown), which are routinely collected by TB programs. Prognostic scores were categorized into three groups that predicted mortality: low-risk (<20 points), medium-risk (20-25 points) and high-risk (>25 points). The model had good discrimination and calibration (AUC = 0.82; 0.80 in bootstrap validation; Hosmer-Lemeshow p = 0.71). Our simple validated mortality prognostic scoring system can be a practical tool for health professionals in identifying TB/HIV co-infected patients with high mortality risk.
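
    The score-building step described above can be sketched in a few lines: fit a logistic regression, scale the coefficients to integer points, and check discrimination with the AUC. The data below are synthetic, and the integer-scaling rule is a common convention, not necessarily the one used in the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Synthetic stand-ins for the six predictors named in the abstract
# (age group, long-term-care residence, meningeal TB, chest x-ray,
# culture positive, culture not converted/unknown).
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(450, 6)).astype(float)
y = rng.integers(0, 2, size=450)  # 1 = died during TB treatment

model = LogisticRegression().fit(X, y)

# One common convention: scale fitted log-odds coefficients to integers.
points = np.round(10 * model.coef_[0] / np.abs(model.coef_[0]).max())
print("points per characteristic:", points)
print("AUC:", round(roc_auc_score(y, model.predict_proba(X)[:, 1]), 3))
```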

  20. ENEL overall PWR plant models and neutronic integrated computing systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pedroni, G.; Pollachini, L.; Vimercati, G.

    1987-01-01

    To support the design activity of the Italian nuclear energy program for the construction of pressurized water reactors, the Italian Electricity Board (ENEL) needs to verify the design as a whole (that is, the nuclear steam supply system and balance of plant), both in steady-state operation and in transients. The ENEL has therefore developed two computer models to analyze both operational and incidental transients. The models, named STRIP and SFINCS, perform the analysis of the nuclear as well as the conventional part of the plant (the control system being properly taken into account). The STRIP model has been developed by means of the French (Electricite de France) modular code SICLE, while SFINCS is based on the Italian (ENEL) modular code LEGO. STRIP validation was performed with respect to Fessenheim French power plant experimental data. Two significant transients were chosen: load step and total load rejection. SFINCS validation was performed with respect to Saint-Laurent French power plant experimental data and also by comparing the SFINCS-STRIP responses.

  1. Model Verification and Validation Concepts for a Probabilistic Fracture Assessment Model to Predict Cracking of Knife Edge Seals in the Space Shuttle Main Engine High Pressure Oxidizer

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Riha, David S.

    2013-01-01

    Physics-based models are routinely used to predict the performance of engineered systems to make decisions such as when to retire system components, how to extend the life of an aging system, or if a new design will be safe or available. Model verification and validation (V&V) is a process to establish credibility in model predictions. Ideally, carefully controlled validation experiments will be designed and performed to validate models or submodels. In reality, time and cost constraints limit experiments and even model development. This paper describes elements of model V&V during the development and application of a probabilistic fracture assessment model to predict cracking in space shuttle main engine high-pressure oxidizer turbopump knife-edge seals. The objective of this effort was to assess the probability of initiating and growing a crack to a specified failure length in specific flight units for different usage and inspection scenarios. The probabilistic fracture assessment model developed in this investigation combined a series of submodels describing the usage, temperature history, flutter tendencies, tooth stresses and numbers of cycles, fatigue cracking, nondestructive inspection, and finally the probability of failure. The analysis accounted for unit-to-unit variations in temperature, flutter limit state, flutter stress magnitude, and fatigue life properties. The investigation focused on the calculation of relative risk rather than absolute risk between the usage scenarios. Verification predictions were first performed for three units with known usage and cracking histories to establish credibility in the model predictions. Then, numerous predictions were performed for an assortment of operating units that had flown recently or that were projected for future flights. Calculations were performed using two NASA-developed software tools: NESSUS(Registered Trademark) for the probabilistic analysis, and NASGRO(Registered Trademark) for the fracture mechanics analysis. The goal of these predictions was to provide additional information to guide decisions on the potential of reusing existing and installed units prior to the new design certification.

  2. Observational, Laboratory, and Modeling Studies of the Impacts of Climate Change on Allergic Airway Disease

    EPA Science Inventory

    Based on these data and preliminary studies, this proposal will be composed of a multiscale source-to-dose analysis approach for assessing the exposure interactions of environmental and biological systems. Once the entire modeling system is validated, it will run f...

  3. 3-D and quasi-2-D discrete element modeling of grain commingling in a bucket elevator boot system

    USDA-ARS?s Scientific Manuscript database

    Unwanted grain commingling impedes new quality-based grain handling systems and has proven to be an expensive and time consuming issue to study experimentally. Experimentally validated models may reduce the time and expense of studying grain commingling while providing additional insight into detail...

  4. Model Verification and Validation Using Graphical Information Systems Tools

    DTIC Science & Technology

    2013-07-31

    ...Ocean flows, which are organized current systems, transport heat and salinity and cause water to pile up as a water surface...

  5. Model Transformation for a System of Systems Dependability Safety Case

    NASA Technical Reports Server (NTRS)

    Murphy, Judy; Driskell, Steve

    2011-01-01

    The presentation reviews the dependability and safety effort of NASA's Independent Verification and Validation Facility. Topics include: safety engineering process, applications to non-space environment, Phase I overview, process creation, sample SRM artifact, Phase I end result, Phase II model transformation, fault management, and applying Phase II to individual projects.

  6. Donabedian's structure-process-outcome quality of care model: Validation in an integrated trauma system.

    PubMed

    Moore, Lynne; Lavoie, André; Bourgeois, Gilles; Lapointe, Jean

    2015-06-01

    According to Donabedian's health care quality model, improvements in the structure of care should lead to improvements in clinical processes that should in turn improve patient outcome. This model has been widely adopted by the trauma community but has not yet been validated in a trauma system. The objective of this study was to assess the performance of an integrated trauma system in terms of structure, process, and outcome and evaluate the correlation between quality domains. Quality of care was evaluated for patients treated in a Canadian provincial trauma system (2005-2010; 57 centers, n = 63,971) using quality indicators (QIs) developed and validated previously. Structural performance was measured by transposing on-site accreditation visit reports onto an evaluation grid according to American College of Surgeons criteria. The composite process QI was calculated as the average sum of proportions of conformity to 15 process QIs derived from literature review and expert opinion. Outcome performance was measured using risk-adjusted rates of mortality, complications, and readmission as well as hospital length of stay (LOS). Correlation was assessed with Pearson's correlation coefficients. Statistically significant correlations were observed between structure and process QIs (r = 0.33), and between process and outcome QIs (r = -0.33 for readmission, r = -0.27 for LOS). Significant positive correlations were also observed between outcome QIs (r = 0.37 for mortality-readmission; r = 0.39 for mortality-LOS and readmission-LOS; r = 0.45 for mortality-complications; r = 0.34 for readmission-complications; r = 0.63 for complications-LOS). Significant correlations between quality domains observed in this study suggest that Donabedian's structure-process-outcome model is a valid model for evaluating trauma care. Trauma centers that perform well in terms of structure also tend to perform well in terms of clinical processes, which in turn has a favorable influence on patient outcomes. Prognostic study, level III.
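
    The correlation analysis itself is simple to reproduce; the sketch below computes Pearson coefficients across synthetic center-level indicators whose structure (57 centers, structure-process-outcome links) mirrors the study, while the values are random and purely illustrative.

```python
import numpy as np
from scipy.stats import pearsonr

# Synthetic center-level quality indicators for 57 hypothetical centers.
rng = np.random.default_rng(1)
structure = rng.normal(size=57)                    # accreditation-grid score
process = 0.33 * structure + rng.normal(size=57)   # conformity to process QIs
readmission = -0.33 * process + rng.normal(size=57)

for a, b, label in [(structure, process, "structure-process"),
                    (process, readmission, "process-readmission")]:
    r, p = pearsonr(a, b)
    print(f"{label}: r={r:.2f}, p={p:.3f}")
```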

  7. Modeling and Validation of the Three Dimensional Deflection of an MRI-Compatible Magnetically-Actuated Steerable Catheter

    PubMed Central

    Liu, Taoming; Poirot, Nate Lombard; Franson, Dominique; Seiberlich, Nicole; Griswold, Mark A.; Çavuşoğlu, M. Cenk

    2016-01-01

    Objective: This paper presents the three-dimensional kinematic modeling of a novel steerable robotic ablation catheter system. The catheter, embedded with a set of current-carrying micro-coils, is actuated by the magnetic forces generated by the magnetic field of the magnetic resonance imaging (MRI) scanner. Methods: This paper develops a 3D model of the MRI-actuated steerable catheter system using a finite-difference approach. For each finite segment, a quasi-static torque-deflection equilibrium equation is calculated using beam theory. By using the deflection displacements and torsion angles, the kinematic model of the catheter system is derived. Results: The proposed models are validated by comparing the simulation results of the proposed model with the experimental results of a hardware prototype of the catheter design. The maximum tip deflection error is 4.70 mm and the maximum root-mean-square (RMS) error of the shape estimation is 3.48 mm. Conclusion: The results demonstrate that the proposed model can successfully estimate the deflection motion of the catheter. Significance: The presented three-dimensional deflection model of the magnetically controlled catheter design paves the way to efficient control of the robotic catheter for treatment of atrial fibrillation. PMID:26731519
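
    A planar toy version conveys the finite-segment idea: take the curvature of each segment from beam theory under an applied tip moment, then integrate the bending angle and centerline position segment by segment. All values below are hypothetical, not the paper's prototype data, and the full model also handles torsion and 3D coupling.

```python
import numpy as np

# Planar, finite-segment sketch: a magnetic torque applied at the tip
# bends an elastic rod of N segments. E, I, L, tau are illustrative.
E = 3e9            # Young's modulus, Pa
I = 1e-14          # area moment of inertia, m^4
L = 0.10           # catheter length, m
tau = 5e-4         # magnetic torque at the tip, N*m
N = 100
ds = L / N

theta, x, y = 0.0, 0.0, 0.0
for _ in range(N):
    kappa = tau / (E * I)     # constant moment -> constant curvature
    theta += kappa * ds       # integrate bending angle along the rod
    x += np.cos(theta) * ds   # integrate the deflected centerline
    y += np.sin(theta) * ds

print(f"tip position: x={x*1000:.1f} mm, y={y*1000:.1f} mm")
```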

  8. Link performance model for filter bank based multicarrier systems

    NASA Astrophysics Data System (ADS)

    Petrov, Dmitry; Oborina, Alexandra; Giupponi, Lorenza; Stitz, Tobias Hidalgo

    2014-12-01

    This paper presents a complete link-level abstraction model for link quality estimation at the system level of filter bank multicarrier (FBMC)-based networks. The application of the mean mutual information per coded bit (MMIB) approach is validated for FBMC systems. The considered quality measure of the resource element for the FBMC transmission is the received signal-to-noise-plus-distortion ratio (SNDR). Simulation results of the proposed link abstraction model show that the proposed approach is capable of estimating the block error rate (BLER) accurately, even when the signal is propagated through channels with deep and frequent fades, as is the case for the 3GPP Hilly Terrain (3GPP-HT) and Enhanced Typical Urban (ETU) models. The FBMC-related results of link-level simulations are compared with cyclic prefix orthogonal frequency division multiplexing (CP-OFDM) analogs. Simulation results are also validated through comparison with publicly available reference results. Finally, the steps of the link-level abstraction algorithm for FBMC are formulated and its application to system-level simulation of a professional mobile radio (PMR) network is discussed.
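
    The core of such an abstraction is a compress-and-invert mapping: average a per-resource-element information measure, then invert the average back to a single effective SNR for a BLER lookup. The sketch below uses Shannon capacity as a simplified stand-in for the modulation-specific MMIB curves, so it illustrates the structure rather than the paper's exact mapping.

```python
import numpy as np

# Per-resource-element SNDR values (dB) across a fading channel; illustrative.
sndr_db = np.array([12.0, 3.5, -2.0, 8.0, 15.0])
sndr = 10 ** (sndr_db / 10)

mi = np.log2(1 + sndr)          # information per resource element (stand-in)
snr_eff = 2 ** mi.mean() - 1    # invert the mean back to an effective SNR
print(f"effective SNR = {10 * np.log10(snr_eff):.2f} dB")  # feeds BLER table
```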

  9. Modeling of a lot scale rainwater tank system in XP-SWMM: a case study in Western Sydney, Australia.

    PubMed

    van der Sterren, Marlène; Rahman, Ataur; Ryan, Garry

    2014-08-01

    Lot scale rainwater tank system modeling is often used in sustainable urban storm water management, particularly to estimate the reduction in storm water run-off and pollutant wash-off at the lot scale. These rainwater tank models often cannot be adequately calibrated and validated due to the limited availability of observed rainwater tank quantity and quality data. This paper presents the calibration and validation of a lot scale rainwater tank system model in XP-SWMM, utilizing data collected from two rainwater tank systems located in Western Sydney, Australia. The modeling considers run-off peak and volume in and out of the rainwater tank system and also a number of water quality parameters (Total Phosphorus (TP), Total Nitrogen (TN) and Total Solids (TS)). It has been found that XP-SWMM can be used successfully to develop a lot scale rainwater system model within an acceptable error margin. It has been shown that TP and TS can be predicted more accurately than TN using the developed model. In addition, it was found that the rainwater tank achieves a significant reduction in storm water run-off discharge for rainfall events up to about a one-year average recurrence interval. The model parameter set assembled in this study can be used for developing lot scale rainwater tank system models at other locations in the Western Sydney region and in other parts of Australia, with necessary adjustments for the local site characteristics.

  10. A Greenhouse-Gas Information System: Monitoring and Validating Emissions Reporting and Mitigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jonietz, Karl K.; Dimotakis, Paul E.; Rotman, Douglas A.

    2011-09-26

    This study and report focus on attributes of a greenhouse-gas information system (GHGIS) needed to support MRV&V needs. These needs set the function of such a system apart from scientific/research monitoring of GHGs and carbon-cycle systems, and include (not exclusively): the need for a GHGIS that is operational, as required for decision-support; the need for a system that meets specifications derived from imposed requirements; the need for rigorous calibration, verification, and validation (CV&V) standards, processes, and records for all measurement and modeling/data-inversion data; the need to develop and adopt an uncertainty-quantification (UQ) regimen for all measurement and modeling data; and the requirement that GHGIS products can be subjected to third-party questioning and scientific scrutiny. This report examines and assesses presently available capabilities that could contribute to a future GHGIS. These capabilities include sensors and measurement technologies; data analysis and data uncertainty quantification (UQ) practices and methods; and model-based data-inversion practices, methods, and their associated UQ. The report further examines the need for traceable calibration, verification, and validation processes and attached metadata; differences between present science-/research-oriented needs and those that would be required for an operational GHGIS; the development, operation, and maintenance of a GHGIS missions-operations center (GMOC); and the complex systems engineering and integration that would be required to develop, operate, and evolve a future GHGIS.

  11. Developing Capture Mechanisms and High-Fidelity Dynamic Models for the MXER Tether System

    NASA Technical Reports Server (NTRS)

    Canfield, Steven L.

    2007-01-01

    A team consisting of collaborators from Tennessee Technological University (TTU), Marshall Space Flight Center, BD Systems, and the University of Delaware (herein called the TTU team) conducted specific research and development activities in MXER tether systems during the base period of May 15, 2004 through September 30, 2006 under contract number NNM04AB13C. The team addressed two primary topics related to the MXER tether system: 1) development of validated high-fidelity dynamic models of an elastic rotating tether and 2) development of feasible mechanisms to enable reliable rendezvous and capture. This contractor report describes in detail the activities performed during the base period of this cycle-2 MXER tether activity and summarizes the results of this funded activity. The primary deliverables of this project were the quad trap, a robust capture mechanism that was proposed, developed, tested, and demonstrated with a high degree of feasibility, and a validated high-fidelity elastic tether dynamic model developed through multiple formulations.

  12. Use of a vision model to quantify the significance of factors affecting target conspicuity

    NASA Astrophysics Data System (ADS)

    Gilmore, M. A.; Jones, C. K.; Haynes, A. W.; Tolhurst, D. J.; To, M.; Troscianko, T.; Lovell, P. G.; Parraga, C. A.; Pickavance, K.

    2006-05-01

    When designing camouflage it is important to understand how the human visual system processes the information to discriminate the target from the background scene. A vision model has been developed to compare two images and detect differences in local contrast in each spatial frequency channel. Observer experiments are being undertaken to validate this vision model so that the model can be used to quantify the relative significance of different factors affecting target conspicuity. Synthetic imagery can be used to design improved camouflage systems. The vision model is being used to compare different synthetic images to understand what features in the image are important to reproduce accurately and to identify the optimum way to render synthetic imagery for camouflage effectiveness assessment. This paper will describe the vision model and summarise the results obtained from the initial validation tests. The paper will also show how the model is being used to compare different synthetic images and discuss future work plans.

  13. Experimental validation of a 0-D numerical model for phase change thermal management systems in lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Schweitzer, Ben; Wilke, Stephen; Khateeb, Siddique; Al-Hallaj, Said

    2015-08-01

    A lumped (0-D) numerical model has been developed for simulating the thermal response of a lithium-ion battery pack with a phase-change composite (PCC™) thermal management system. A small 10s4p battery pack utilizing PCC material was constructed and subjected to discharge at various C-rates in order to validate the lumped model. The 18650 size Li-ion cells used in the pack were electrically characterized to determine their heat generation, and various PCC materials were thermally characterized to determine their apparent specific heat as a function of temperature. Additionally, a 2-D FEA thermal model was constructed to help understand the magnitude of spatial temperature variation in the pack, and to understand the limitations of the lumped model. Overall, good agreement is seen between experimentally measured pack temperatures and the 0-D model, and the 2-D FEA model predicts minimal spatial temperature variation for PCC-based packs at C-rates of 1C and below.
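
    The lumped model described above reduces to a single energy-balance ODE for pack temperature, with the phase change folded in through an apparent (temperature-dependent) specific heat of the kind the PCC characterization provides. The sketch below uses that structure with purely illustrative numbers, not the paper's measured PCC or 18650 data.

```python
import numpy as np

# Minimal 0-D pack thermal model with an apparent-heat-capacity phase change.
m, h, A, T_amb = 1.2, 8.0, 0.05, 25.0       # kg, W/m^2K, m^2, degC
Q_gen = 15.0                                # W, heat generation at this C-rate
T_melt, dT_melt, latent = 42.0, 2.0, 180e3  # degC, degC, J/kg

def cp_apparent(T):
    """Baseline cp plus a Gaussian bump that absorbs the latent heat."""
    cp0 = 1500.0
    bump = latent / (dT_melt * np.sqrt(2 * np.pi)) \
        * np.exp(-0.5 * ((T - T_melt) / dT_melt) ** 2)
    return cp0 + bump

T, dt = 25.0, 1.0
for _ in range(3600):                       # one hour of discharge, 1 s steps
    dTdt = (Q_gen - h * A * (T - T_amb)) / (m * cp_apparent(T))
    T += dTdt * dt
print(f"pack temperature after 1 h: {T:.1f} degC")  # plateaus near T_melt
```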

  14. Development of an Unstructured, Three-Dimensional Material Response Design Tool

    NASA Technical Reports Server (NTRS)

    Schulz, Joseph; Stern, Eric; Palmer, Grant; Muppidi, Suman; Schroeder, Olivia

    2017-01-01

    A preliminary verification and validation of a new material response model is presented. This model, Icarus, is intended to serve as a design tool for the thermal protection systems of re-entry vehicles. Currently, the capability of the model is limited to simulating the pyrolysis of a material as a result of the radiative and convective surface heating imposed on the material by the surrounding high-enthalpy gas. Since the major focus behind the development of Icarus has been model extensibility, the hope is that additional physics can be quickly added. The extensibility is critical since thermal protection systems are becoming increasingly complex, e.g., woven carbon polymers. Additionally, as a three-dimensional, unstructured, finite-volume model, Icarus is capable of modeling complex geometries as well as multi-dimensional physics, which have been shown to be important in some scenarios and are not captured by one-dimensional models. In this paper, the mathematical and numerical formulation is presented, followed by a discussion of the software architecture and some preliminary verification and validation studies.

  15. HyPEP FY06 Report: Models and Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DOE report

    2006-09-01

    The Department of Energy envisions the next generation very high-temperature gas-cooled reactor (VHTR) as a single-purpose or dual-purpose facility that produces hydrogen and electricity. The Ministry of Science and Technology (MOST) of the Republic of Korea also selected the VHTR for the Nuclear Hydrogen Development and Demonstration (NHDD) Project. This research project aims at developing a user-friendly program for evaluating and optimizing cycle efficiencies of producing hydrogen and electricity in a Very-High-Temperature Reactor (VHTR). Systems for producing electricity and hydrogen are complex, and the calculations associated with optimizing these systems are intensive, involving a large number of operating parameter variations and many different system configurations. This research project will produce the HyPEP computer model, which is specifically designed to be an easy-to-use and fast-running tool for evaluating nuclear hydrogen and electricity production facilities. The model accommodates flexible system layouts, and its cost models will enable HyPEP to be well-suited for system optimization. Specific activities of this research are designed to develop the HyPEP model into a working tool, including (a) identifying major systems and components for modeling, (b) establishing system operating parameters and calculation scope, (c) establishing the overall calculation scheme, (d) developing component models, (e) developing cost and optimization models, and (f) verifying and validating the program. Once the HyPEP model is fully developed and validated, it will be used to execute calculations on candidate system configurations. The FY-06 report includes a description of reference designs, methods used in this study, and models and computational strategies developed for the first-year effort. Results from computer codes such as HYSYS and GASS/PASS-H, used by Idaho National Laboratory and Argonne National Laboratory, respectively, will be benchmarked with HyPEP results in the following years.

  16. A Performance Prediction Model for a Fault-Tolerant Computer During Recovery and Restoration. Ph.D. Thesis Report, 1 Jan. - 31 Dec. 1992

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Obando, Rodrigo A.

    1993-01-01

    The modeling and design of a fault-tolerant multiprocessor system is addressed. In particular, the behavior of the system during recovery and restoration after a fault has occurred is investigated. Given that a multicomputer system is designed using the Algorithm to Architecture to Mapping Model (ATAMM), and that a fault (death of a computing resource) occurs during its normal steady-state operation, a model is presented as a viable research tool for predicting the performance bounds of the system during its recovery and restoration phases. Furthermore, the bounds of the performance behavior of the system during this transient mode can be assessed. These bounds include: time to recover from the fault (t(sub rec)), time to restore the system (t(sub res)), and whether there is a permanent delay in the system's Time Between Input and Output (TBIO) after the system has reached a steady state. An implementation of an ATAMM-based computer was developed with the Generic VHSIC Spaceborne Computer (GVSC) as the target system. A simulation of the GVSC was also written, based on the code used in the ATAMM Multicomputer Operating System (AMOS). The simulation is in turn used to validate the new model's usefulness and accuracy in tracking the propagation of the delay through the system and predicting the behavior in the transient state of recovery and restoration. The model is validated as an accurate method to predict the transient behavior of an ATAMM-based multicomputer during recovery and restoration.

  17. A cell-based assay for aggregation inhibitors as therapeutics of polyglutamine-repeat disease and validation in Drosophila

    NASA Astrophysics Data System (ADS)

    Apostol, Barbara L.; Kazantsev, Alexsey; Raffioni, Simona; Illes, Katalin; Pallos, Judit; Bodai, Laszlo; Slepko, Natalia; Bear, James E.; Gertler, Frank B.; Hersch, Steven; Housman, David E.; Marsh, J. Lawrence; Michels Thompson, Leslie

    2003-05-01

    The formation of polyglutamine-containing aggregates and inclusions are hallmarks of pathogenesis in Huntington's disease that can be recapitulated in model systems. Although the contribution of inclusions to pathogenesis is unclear, cell-based assays can be used to screen for chemical compounds that affect aggregation and may provide therapeutic benefit. We have developed inducible PC12 cell-culture models to screen for loss of visible aggregates. To test the validity of this approach, compounds that inhibit aggregation in the PC12 cell-based screen were tested in a Drosophila model of polyglutamine-repeat disease. The disruption of aggregation in PC12 cells strongly correlates with suppression of neuronal degeneration in Drosophila. Thus, the engineered PC12 cells coupled with the Drosophila model provide a rapid and effective method to screen and validate compounds.

  18. The ALADIN System and its canonical model configurations AROME CY41T1 and ALARO CY40T1

    NASA Astrophysics Data System (ADS)

    Termonia, Piet; Fischer, Claude; Bazile, Eric; Bouyssel, François; Brožková, Radmila; Bénard, Pierre; Bochenek, Bogdan; Degrauwe, Daan; Derková, Mariá; El Khatib, Ryad; Hamdi, Rafiq; Mašek, Ján; Pottier, Patricia; Pristov, Neva; Seity, Yann; Smolíková, Petra; Španiel, Oldřich; Tudor, Martina; Wang, Yong; Wittmann, Christoph; Joly, Alain

    2018-01-01

    The ALADIN System is a numerical weather prediction (NWP) system developed by the international ALADIN consortium for operational weather forecasting and research purposes. It is based on a code that is shared with the global model IFS of the ECMWF and the ARPEGE model of Météo-France. Today, this system can be used to provide a multitude of high-resolution limited-area model (LAM) configurations. A few configurations are thoroughly validated and prepared to be used for the operational weather forecasting in the 16 partner institutes of this consortium. These configurations are called the ALADIN canonical model configurations (CMCs). There are currently three CMCs: the ALADIN baseline CMC, the AROME CMC and the ALARO CMC. Other configurations are possible for research, such as process studies and climate simulations. The purpose of this paper is (i) to define the ALADIN System in relation to the global counterparts IFS and ARPEGE, (ii) to explain the notion of the CMCs, (iii) to document their most recent versions, and (iv) to illustrate the process of the validation and the porting of these configurations to the operational forecast suites of the partner institutes of the ALADIN consortium. This paper is restricted to the forecast model only; data assimilation techniques and postprocessing techniques are part of the ALADIN System but they are not discussed here.

  19. Use of the Ames Check Standard Model for the Validation of Wall Interference Corrections

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.; Amaya, M.; Flach, R.

    2018-01-01

    The new check standard model of the NASA Ames 11-ft Transonic Wind Tunnel was chosen for a future validation of the facility's wall interference correction system. The chosen validation approach takes advantage of the fact that test conditions experienced by a large model in the slotted part of the tunnel's test section will change significantly if a subset of the slots is temporarily sealed. Therefore, the model's aerodynamic coefficients have to be recorded, corrected, and compared for two different test section configurations in order to perform the validation. Test section configurations with highly accurate Mach number and dynamic pressure calibrations were selected for the validation. First, the model is tested with all test section slots in open configuration while keeping the model's center of rotation on the tunnel centerline. In the next step, slots on the test section floor are sealed and the model is moved to a new center of rotation that is 33 inches below the tunnel centerline. Then, the original angle of attack sweeps are repeated. Afterwards, wall interference corrections are applied to both test data sets and response surface models of the resulting aerodynamic coefficients in interference-free flow are generated. Finally, the response surface models are used to predict the aerodynamic coefficients for a family of angles of attack while keeping dynamic pressure, Mach number, and Reynolds number constant. The validation is considered successful if the corrected aerodynamic coefficients obtained from the related response surface model pair show good agreement. Residual differences between the corrected coefficient sets will be analyzed as well because they are an indicator of the overall accuracy of the facility's wall interference correction process.

  20. Development of a Twin-Spool Turbofan Engine Simulation Using the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS)

    NASA Technical Reports Server (NTRS)

    Zinnecker, Alicia M.; Chapman, Jeffryes W.; Lavelle, Thomas M.; Litt, Jonathan S.

    2014-01-01

    The Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS) is a tool that has been developed to allow a user to build custom models of systems governed by thermodynamic principles using a template to model each basic process. Validation of this tool in an engine model application was performed through reconstruction of the Commercial Modular Aero-Propulsion System Simulation (C-MAPSS) (v2) using the building blocks from the T-MATS (v1) library. In order to match the two engine models, it was necessary to address differences in several assumptions made in the two modeling approaches. After these modifications were made, validation of the engine model continued by integrating both a steady-state and dynamic iterative solver with the engine plant and comparing results from steady-state and transient simulation of the T-MATS and C-MAPSS models. The results show that the T-MATS engine model was accurate within 3% of the C-MAPSS model, with inaccuracy attributed to the increased dimension of the iterative solver solution space required by the engine model constructed using the T-MATS library. This demonstrates that, given an understanding of the modeling assumptions made in T-MATS and a baseline model, the T-MATS tool provides a viable option for constructing a computational model of a twin-spool turbofan engine that may be used in simulation studies.

  1. Understanding Dynamic Model Validation of a Wind Turbine Generator and a Wind Power Plant: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muljadi, Eduard; Zhang, Ying Chen; Gevorgian, Vahan

    Regional reliability organizations require power plants to validate the dynamic models that represent them to ensure that power system studies are performed with the best representation of the components installed. In the process of validating a wind power plant (WPP), one must be cognizant of the parameter settings of the wind turbine generators (WTGs) and the operational settings of the WPP. Validation of the dynamic model of a WPP must be performed periodically, because the control parameters of the WTGs and the other supporting components within a WPP may be modified to comply with new grid codes or upgrades to the WTG controller with new capabilities developed by the turbine manufacturers or requested by the plant owners or operators. The diversity within a WPP affects the way we represent it in a model. Diversity within a WPP may be found in the way the WTGs are controlled, the wind resource, the layout of the WPP (electrical diversity), and the type of WTGs used. Each group of WTGs constitutes a significant portion of the output power of the WPP, and their unique and salient behaviors should be represented individually. The objective of this paper is to illustrate the process of dynamic model validation of WTGs and WPPs, the available recorded data that must be screened before being used for the dynamic validations, and the assumptions made in the dynamic models of the WTG and WPP that must be understood. Without understanding the correct process, the validations may lead to wrong representations of the WTG and WPP modeled.

  2. Revalidation of the NASA Ames 11-by 11-Foot Transonic Wind Tunnel with a Commercial Airplane Model

    NASA Technical Reports Server (NTRS)

    Kmak, Frank J.; Hudgins, M.; Hergert, D.; George, Michael W. (Technical Monitor)

    2001-01-01

    The 11-By 11-Foot Transonic leg of the Unitary Plan Wind Tunnel (UPWT) was modernized to improve tunnel performance, capability, productivity, and reliability. Wind tunnel tests to demonstrate the readiness of the tunnel for a return to production operations included an Integrated Systems Test (IST), calibration tests, and airplane validation tests. One of the two validation tests used a 0.037-scale Boeing 777 model that was previously tested in the 11-By 11-Foot tunnel in 1991. The objective of the validation tests was to compare pre-modernization and post-modernization results from the same airplane model in order to substantiate the operational readiness of the facility. Evaluations of within-test, test-to-test, and tunnel-to-tunnel data repeatability were made to study the effects of the tunnel modifications. Tunnel productivity was also evaluated to determine the readiness of the facility for production operations. The operation of the facility, including model installation, tunnel operations, and the performance of tunnel systems, was observed and facility deficiency findings were generated. The data repeatability studies and tunnel-to-tunnel comparisons demonstrated outstanding data repeatability and a high overall level of data quality. Despite some operational and facility problems, the validation test was successful in demonstrating the readiness of the facility to perform production airplane wind tunnel tests.

  3. Experimental validation of finite element model analysis of a steel frame in simulated post-earthquake fire environments

    NASA Astrophysics Data System (ADS)

    Huang, Ying; Bevans, W. J.; Xiao, Hai; Zhou, Zhi; Chen, Genda

    2012-04-01

    During or after an earthquake event, building systems often experience large strains due to shaking effects, as observed during recent earthquakes, causing permanent inelastic deformation. In addition to the inelastic deformation induced by the earthquake itself, post-earthquake fires associated with short circuits in electrical systems and leakage from gas devices can further strain the already damaged structures, potentially leading to a progressive collapse of buildings. Under these harsh environments, measurements on the involved building by various sensors can provide only limited structural health information. Finite element model analysis, on the other hand, if validated by predesigned experiments, can provide detailed structural behavior information for the entire structure. In this paper, a temperature-dependent nonlinear 3-D finite element model (FEM) of a one-story steel frame is set up in ABAQUS, based on the cited material properties of steel from EN 1993-1.2 and the AISC manuals. The FEM is validated by testing the modeled steel frame in simulated post-earthquake environments. Comparisons between the FEM analysis and the experimental results show that the FEM predicts the structural behavior of the steel frame in post-earthquake fire conditions reasonably well. With experimental validation, FEM analysis could be used to predict the behavior of critical structures in these harsh environments, to better assist fire fighters in their rescue efforts and save fire victims.

  4. Protocol and Demonstrations of Probabilistic Reliability Assessment for Structural Health Monitoring Systems (Preprint)

    DTIC Science & Technology

    2011-11-01

    ...assessment to quality of localization/characterization estimates. This protocol includes four critical components: (1) a procedure to identify the... critical factors impacting SHM system performance; (2) a multistage or hierarchical approach to SHM system validation; (3) a model-assisted evaluation... Lindgren, E. A., Buynak, C. F., Steffes, G., Derriso, M., "Model-assisted Probabilistic Reliability Assessment for Structural Health Monitoring"

  5. Modelling, validation and analysis of a three-dimensional railway vehicle-track system model with linear and nonlinear track properties in the presence of wheel flats

    NASA Astrophysics Data System (ADS)

    Uzzal, R. U. A.; Ahmed, A. K. W.; Bhat, R. B.

    2013-11-01

    This paper presents dynamic contact loads at the wheel-rail contact point in a three-dimensional railway vehicle-track model, as well as dynamic responses at the vehicle-track component level, in the presence of wheel flats. The 17-degree-of-freedom lumped-mass vehicle is modelled as a full car body, two bogies and four wheelsets, whereas the railway track is modelled as two parallel Timoshenko beams periodically supported by lumped masses representing the sleepers. The rail beam is also supported by nonlinear spring and damper elements representing the railpad and ballast. In order to ensure the interactions between the railpads, a shear parameter beneath the rail beams has also been considered in the model. The wheel-rail contact is modelled using nonlinear Hertzian contact theory. In order to solve the coupled partial and ordinary differential equations of the vehicle-track system, the modal analysis method is employed. Idealised Haversine wheel flats with rounded corners are included in the wheel-rail contact model. The developed model is validated against existing measured and analytical data available in the literature. The nonlinear model is then employed to investigate the wheel-rail impact forces that arise at the wheel-rail interface due to the presence of wheel flats. The validated model is further employed to investigate the dynamic responses of vehicle and track components in terms of displacement, velocity, and acceleration in the presence of a single wheel flat.

  6. A performance improvement case study in aircraft maintenance and its implications for hazard identification.

    PubMed

    Ward, Marie; McDonald, Nick; Morrison, Rabea; Gaynor, Des; Nugent, Tony

    2010-02-01

    Aircraft maintenance is a highly regulated, safety critical, complex and competitive industry. There is a need to develop innovative solutions to address process efficiency without compromising safety and quality. This paper presents the case that in order to improve a highly complex system such as aircraft maintenance, it is necessary to develop a comprehensive and ecologically valid model of the operational system, which represents not just what is meant to happen, but what normally happens. This model then provides the backdrop against which to change or improve the system. A performance report, the Blocker Report, specific to aircraft maintenance and related to the model, was developed to gather data on anything that 'blocks' task or check performance. A Blocker Resolution Process was designed to resolve blockers and improve the current check system. Significant results were obtained for the company in the first trial, and implications for safety management systems and hazard identification are discussed. Statement of Relevance: Aircraft maintenance is a safety critical, complex, competitive industry with a need for innovative solutions that address process efficiency without compromising safety. This research addresses this need through the development of a comprehensive and ecologically valid model of the system, linked with a performance reporting and resolution system.

  7. General model and control of an n rotor helicopter

    NASA Astrophysics Data System (ADS)

    Sidea, A. G.; Yding Brogaard, R.; Andersen, N. A.; Ravn, O.

    2014-12-01

    The purpose of this study was to create a dynamic, nonlinear mathematical model of a multirotor that would be valid for different numbers of rotors. Furthermore, a set of Single Input Single Output (SISO) controllers were implemented for attitude control. Both model and controllers were tested experimentally on a quadcopter. Using the combined model and controllers, simple system simulation and control is possible, by replacing the physical values for the individual systems.
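
    As a flavor of the SISO attitude loops described above, the sketch below closes a PD loop around a single-axis rigid-body roll model; the inertia and gains are hypothetical, not the values identified for the paper's quadcopter.

```python
# One SISO attitude loop: PD control of roll on a rigid single-axis model.
J = 0.02            # roll inertia, kg*m^2 (hypothetical)
kp, kd = 4.0, 0.8   # PD gains (hypothetical)
phi, phi_dot = 0.3, 0.0   # initial roll error, rad
dt = 0.002
for _ in range(2000):     # 4 s of simulated flight
    torque = -kp * phi - kd * phi_dot   # PD control law
    phi_dot += (torque / J) * dt        # integrate angular acceleration
    phi += phi_dot * dt                 # integrate roll angle
print(f"roll after 4 s: {phi:.4f} rad")  # converges toward zero
```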

  8. Computational Modeling of Space Physiology

    NASA Technical Reports Server (NTRS)

    Lewandowski, Beth E.; Griffin, Devon W.

    2016-01-01

    The Digital Astronaut Project (DAP), within NASA's Human Research Program, develops and implements computational modeling for use in the mitigation of human health and performance risks associated with long duration spaceflight. Over the past decade, DAP developed models to provide insights into spaceflight-related changes to the central nervous system, cardiovascular system and the musculoskeletal system. Examples of the models and their applications include biomechanical models applied to advanced exercise device development, bone fracture risk quantification for mission planning, accident investigation, bone health standards development, and occupant protection. The International Space Station (ISS), in its role as a testing ground for long duration spaceflight, has been an important platform for obtaining human spaceflight data. DAP has used preflight, in-flight and post-flight data from short and long duration astronauts for computational model development and validation. Examples include preflight and post-flight bone mineral density data, muscle cross-sectional area, and muscle strength measurements. Results from computational modeling supplement space physiology research by informing experimental design. Using these computational models, DAP personnel can readily identify both the important factors associated with a phenomenon and areas where data are lacking. This presentation will provide examples of DAP computational models, the data used in model development and validation, and applications of the models.

  9. Estimation of dynamic rotor loads for the rotor systems research aircraft: Methodology development and validation

    NASA Technical Reports Server (NTRS)

    Duval, R. W.; Bahrami, M.

    1985-01-01

    The Rotor Systems Research Aircraft uses load cells to isolate the rotor/transmission system from the fuselage. A mathematical model relating applied rotor loads and inertial loads of the rotor/transmission system to the load cell response is required to allow the load cells to be used to estimate rotor loads from flight data. Such a model is derived analytically by applying a force and moment balance to the isolated rotor/transmission system. The model is tested by comparing its estimated values of applied rotor loads with measured values obtained from a ground-based shake test. Discrepancies in the comparison are used to isolate sources of unmodeled external loads. Once the structure of the mathematical model has been validated by comparison with experimental data, the parameters must be identified. Since the parameters may vary with flight condition, it is desirable to identify them directly from the flight data. A maximum likelihood identification algorithm is derived for this purpose and tested using a computer simulation of load cell data. The identification is found to converge within 10 samples. The rapid convergence facilitates tracking of time-varying parameters of the load cell model in flight.
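
    The identification step can be sketched compactly: for a linear measurement model with Gaussian noise, the maximum likelihood estimate coincides with the least-squares solution. The regressor matrix, parameter values, and noise level below are illustrative, not taken from the RSRA work.

```python
import numpy as np

# ML estimation of a linear load-cell model y = H @ theta + v with
# Gaussian noise reduces to least squares; 10 samples echo the abstract.
rng = np.random.default_rng(2)
theta_true = np.array([1.05, -0.4, 2.3])   # hypothetical model parameters
H = rng.normal(size=(10, 3))               # 10 samples of regressors
y = H @ theta_true + 0.01 * rng.normal(size=10)

theta_hat, *_ = np.linalg.lstsq(H, y, rcond=None)
print("estimate after 10 samples:", np.round(theta_hat, 3))
```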

  10. Calibrating Parameters of Power System Stability Models using Advanced Ensemble Kalman Filter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Diao, Ruisheng; Li, Yuanyuan

    With the ever-increasing penetration of renewable energy, smart loads, energy storage, and new market behavior, today's power grid is becoming more dynamic and stochastic, which may invalidate traditional study assumptions and pose great operational challenges. Thus, it is of critical importance to maintain good-quality models for secure and economic planning and real-time operation. Following the 1996 Western Systems Coordinating Council (WSCC) system blackout, the North American Electric Reliability Corporation (NERC) and the Western Electricity Coordinating Council (WECC) enforced a number of policies and standards to guide the power industry to periodically validate power grid models and calibrate poor parameters, with the goal of building sufficient confidence in model quality. The PMU-based approach, using online measurements without interfering with the operation of generators, provides a low-cost alternative to meet NERC standards. This paper presents an innovative procedure and tool suite to validate and calibrate models based on a trajectory sensitivity analysis method and an advanced ensemble Kalman filter algorithm. The developed prototype demonstrates excellent performance in identifying and calibrating bad parameters of a realistic hydro power plant against multiple system events.
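
    A minimal ensemble-Kalman-style parameter update conveys the idea behind the calibration step: push an ensemble of candidate parameters toward an observed response using sample covariances. The scalar forward model g() and all numbers below are hypothetical, not the tool's internals.

```python
import numpy as np

rng = np.random.default_rng(3)

def g(theta):
    """Hypothetical forward model: parameter -> simulated PMU response."""
    return 2.0 * theta + 0.5

ens = rng.normal(1.0, 0.3, size=50)     # prior parameter ensemble
y_obs, r = 3.3, 0.05                    # measurement and its noise variance

pred = g(ens)                           # ensemble of predicted responses
cov_tp = np.cov(ens, pred)[0, 1]        # parameter-prediction covariance
gain = cov_tp / (pred.var(ddof=1) + r)  # Kalman gain
perturbed = y_obs + np.sqrt(r) * rng.normal(size=ens.size)
ens = ens + gain * (perturbed - pred)   # analysis (update) step
print(f"calibrated parameter: {ens.mean():.3f} (true value 1.4)")
```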

  11. Identification of nonlinear modes using phase-locked-loop experimental continuation and normal form

    NASA Astrophysics Data System (ADS)

    Denis, V.; Jossic, M.; Giraud-Audine, C.; Chomette, B.; Renault, A.; Thomas, O.

    2018-06-01

    In this article, we address the model identification of nonlinear vibratory systems, with a specific focus on systems modeled with distributed nonlinearities, such as geometrically nonlinear mechanical structures. The proposed strategy theoretically relies on the concept of nonlinear modes of the underlying conservative unforced system and the use of normal forms. Within this framework, it is shown that, without internal resonance, a valid reduced order model for a nonlinear mode is a single Duffing oscillator. We then propose an efficient experimental strategy to measure the backbone curve of a particular nonlinear mode and we use it to identify the free parameters of the reduced order model. The experimental part relies on a Phase-Locked Loop (PLL) and enables a robust and automatic measurement of backbone curves as well as forced responses. It is theoretically and experimentally shown that the PLL is able to stabilize the unstable part of Duffing-like frequency responses, thus enabling its robust experimental measurement. Finally, the whole procedure is tested on three experimental systems: a circular plate, a Chinese gong and a piezoelectric cantilever beam. This enables validation of the procedure by comparison with available theoretical models as well as with other experimental identification methods.
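
    The reduced order model named above, a single Duffing oscillator, has a closed-form first-order backbone: for x'' + w0^2*x + gamma*x^3 = 0, the frequency grows with amplitude as w(a) ≈ w0 + 3*gamma*a^2/(8*w0). A minimal sketch with hypothetical identified parameters:

```python
import numpy as np

# Backbone curve of a single Duffing oscillator (hardening nonlinearity).
w0 = 2 * np.pi * 110.0      # linear natural frequency, rad/s (hypothetical)
gamma = 5e10                # cubic coefficient (hypothetical)

amplitudes = np.linspace(0.0, 2e-3, 5)            # modal amplitude, m
backbone = w0 + 3 * gamma * amplitudes**2 / (8 * w0)
for a, w in zip(amplitudes, backbone):
    print(f"a = {a*1000:.2f} mm -> f = {w/(2*np.pi):.2f} Hz")
```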

  12. A cross-validation package driving Netica with python

    USGS Publications Warehouse

    Fienen, Michael N.; Plant, Nathaniel G.

    2014-01-01

    Bayesian networks (BNs) are powerful tools for probabilistically simulating natural systems and emulating process models. Cross-validation is a technique to avoid overfitting resulting from overly complex BNs. Overfitting reduces predictive skill. Cross-validation for BNs is known but rarely implemented, due partly to a lack of software tools designed to work with available BN packages. CVNetica is open-source, written in Python, and extends the Netica software package to perform cross-validation and read, rebuild, and learn BNs from data. Insights gained from cross-validation, and implications for prediction versus description, are illustrated with a data-driven oceanographic application and a model-emulation application. These examples show that overfitting occurs when BNs become more complex than allowed by supporting data, and that overfitting incurs computational costs as well as causing a reduction in prediction skill. CVNetica evaluates overfitting using several complexity metrics (we used level of discretization) and its impact on performance metrics (we used skill).
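
    The overfitting check that CVNetica automates can be sketched generically with k-fold cross-validation; here polynomial fits of increasing degree stand in for BNs of increasing complexity, and the rise in held-out error for the over-complex model mirrors the reduction in prediction skill described above.

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(0, 1, 40)
y = np.sin(2 * np.pi * x) + 0.3 * rng.normal(size=x.size)

def cv_error(degree, k=5):
    """Mean held-out squared error over k folds for a polynomial fit."""
    idx = rng.permutation(x.size)
    errs = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        coef = np.polyfit(x[train], y[train], degree)   # calibrate
        resid = y[fold] - np.polyval(coef, x[fold])     # validate
        errs.append(np.mean(resid**2))
    return np.mean(errs)

for degree in (1, 3, 9):   # too simple, about right, over-complex
    print(f"degree {degree}: CV error = {cv_error(degree):.3f}")
```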

  13. Modeling of a thermally integrated 10 kWe planar solid oxide fuel cell system with anode offgas recycling and internal reforming by discretization in flow direction

    NASA Astrophysics Data System (ADS)

    Wahl, Stefanie; Segarra, Ana Gallet; Horstmann, Peter; Carré, Maxime; Bessler, Wolfgang G.; Lapicque, François; Friedrich, K. Andreas

    2015-04-01

    Combined heat and power production (CHP) based on solid oxide fuel cells (SOFC) is a very promising technology for achieving high electrical efficiency to cover power demand by decentralized production. This paper presents a dynamic quasi-2D model of an SOFC system which consists of the stack and balance of plant and includes thermal coupling between the individual components. The model is implemented in Modelica® and validated with experimental data for the stack U-I characteristic and the thermal behavior. The good agreement between experimental and simulation results demonstrates the validity of the model. Different operating conditions and system configurations are tested, increasing the net electrical efficiency to 57% by implementing an anode offgas recycle rate of 65%. A sensitivity analysis of characteristic values of the system, such as fuel utilization, oxygen-to-carbon ratio and electrical efficiency, for different natural gas compositions is carried out. The result shows that a control strategy adapted to variable natural gas composition and its energy content should be developed in order to optimize the operation of the system.
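
    As a back-of-envelope check on the efficiency figure quoted above, net electrical efficiency is net AC power divided by the fuel's chemical power input; the fuel flow below is a hypothetical value chosen to reproduce 57%, not data from the paper.

```python
# eta = P_el,net / (m_dot_fuel * LHV); all numbers are illustrative.
P_el_net = 10.0e3      # W, net AC output of a 10 kWe system
m_dot_fuel = 3.5e-4    # kg/s of natural gas (hypothetical)
LHV = 50.0e6           # J/kg, lower heating value of methane
eta = P_el_net / (m_dot_fuel * LHV)
print(f"net electrical efficiency: {eta:.1%}")  # about 57%
```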

  14. Simulation of morphodynamic processes in small coastal systems: application to the Aljezur coastal stream (Portugal)

    NASA Astrophysics Data System (ADS)

    Guerreiro, Martha; Fortunato, André B.; Oliveira, Anabela; Bertin, Xavier; Bruneau, Nicolas; Rodrigues, Marta

    2010-05-01

    In small and shallow coastal streams, morphological changes may have a dramatic effect on tidal propagation and distortion, on hydrodynamics and, ultimately, on the transport and fate of water-borne material. Hence, the ability to simulate the morphodynamic evolution of these dynamic and complex systems can be required for water quality studies. This work aimed at implementing, validating and exploring the morphodynamic modelling system MORSYS2D (Fortunato and Oliveira, 2004; Bertin et al., 2009) in the Aljezur stream, a small and dynamic coastal system located in south-west Portugal. Four extensive field campaigns were carried out in 2008 and 2009 to measure bathymetry, water levels, waves and currents, in both the estuary and the adjoining beach. Between the two 2009 campaigns, bathymetry was measured on a monthly basis. Data revealed significant morphological changes, including channel migration and the formation of sandbars. The morphodynamic modelling system MORSYS2D consists of a wave model (SWAN; Booij et al., 1999), a circulation model (ELCIRC; Zhang et al., 2004) and a sediment transport and bottom update model (SAND2D; Fortunato and Oliveira, 2004), and is controlled by a script that runs the models, manages the transfer of information between them and performs control checks. The model was shown to reproduce successfully the waves, the water levels and the velocities. Preliminary morphodynamic simulations revealed that the model is highly sensitive to small changes in the initial conditions, the parameterization of friction and the sediment transport formulation. This presentation will describe the calibration and validation of the morphodynamic modelling system and will investigate the circumstances that can lead to inlet closure (including wave action and river flow). Acknowledgements: This work was sponsored by the Portuguese Science and Technology Foundation (FCT), project MADyCOS (PTDC/ECM/66484/2006). The authors thank the developers of the models ELCIRC and SWAN for making their source codes available and Guillaume Dodet for providing the time-series of wave spectra. The first author is grateful to Prof. João Dias for the orientation provided during this work. This research would not have been possible without the participants in the field campaigns: R. Taborda, C. Andrade, C. Freitas, A.M. Silva, C. Antunes (Faculdade de Ciências de Lisboa); L. David, P. Freire, R. Capitão, C.J.E.M. Fortes, L.S. Pedro, J. Vale, A. Nahon, D. Neves, C. Zózimo, L. Pinheiro (LNEC); A. Cravo, M. Rosa, C. Monteiro, S. Cardeira and C. Loureiro (Universidade do Algarve). The authors are grateful for all the effort and support. References: Bertin, X., Oliveira, A. and Fortunato, A.B. 2009. Simulating morphodynamics with unstructured grids: description and validation of a modeling system for coastal applications, Ocean Modelling, 28/1-3: 75-87. Booij, N., Ris, R.C. and Holthuijsen, L.H. 1999. A third-generation wave model for coastal regions; Part I: model description and validation, Journal of Geophysical Research, 104: 7649-7666. Dodet, G., Bertin, X. and Taborda, R. 2010. Wave climate variability in the North-East Atlantic Ocean over the last six decades, Ocean Modelling, 31: 120-131. Fortunato, A.B. and Oliveira, A. 2004. A modeling system for tidally driven long-term morphodynamics, Journal of Hydraulic Research, 42/4: 426-434. Zhang, Y.-L., Baptista, A.M. and Myers, E.P. 2004. A cross-scale model for 3D baroclinic circulation in estuary-plume-shelf systems: I. Formulation and skill assessment, Continental Shelf Research, 24/18: 2187-2214.

  15. Impact Testing on Reinforced Carbon-Carbon Flat Panels with Ice Projectiles for the Space Shuttle Return to Flight Program

    NASA Technical Reports Server (NTRS)

    Melis, Matthew E.; Revilock, Duane M.; Pereira, Michael J.; Lyle, Karen H.

    2009-01-01

    Following the tragedy of the Orbiter Columbia (STS-107) on February 1, 2003, a major effort commenced to develop a better understanding of debris impacts and their effect on the space shuttle subsystems. An initiative to develop and validate physics-based computer models to predict damage from such impacts was a fundamental component of this effort. To develop the models it was necessary to physically characterize reinforced carbon-carbon (RCC) along with ice and foam debris materials, which could shed on ascent and impact the orbiter RCC leading edges. The validated models enabled the launch system community to use the impact analysis software LS-DYNA (Livermore Software Technology Corp.) to predict damage by potential and actual impact events on the orbiter leading edge and nose cap thermal protection systems. Validation of the material models was done through a three-level approach: Level 1--fundamental tests to obtain independent static and dynamic constitutive model properties of materials of interest, Level 2--subcomponent impact tests to provide highly controlled impact test data for the correlation and validation of the models, and Level 3--full-scale orbiter leading-edge impact tests to establish the final level of confidence for the analysis methodology. This report discusses the Level 2 test program conducted in the NASA Glenn Research Center (GRC) Ballistic Impact Laboratory with ice projectile impact tests on flat RCC panels, and presents the data observed. The Level 2 testing consisted of 54 impact tests in the NASA GRC Ballistic Impact Laboratory on 6- by 6-in. and 6- by 12-in. flat plates of RCC and evaluated three types of debris projectiles: Single-crystal, polycrystal, and "soft" ice. These impact tests helped determine the level of damage generated in the RCC flat plates by each projectile and validated the use of the ice and RCC models for use in LS-DYNA.

  16. Impact Testing on Reinforced Carbon-Carbon Flat Panels With BX-265 and PDL-1034 External Tank Foam for the Space Shuttle Return to Flight Program

    NASA Technical Reports Server (NTRS)

    Melis, Matthew E.; Revilock, Duane M.; Pereira, Michael J.; Lyle, Karen H.

    2009-01-01

    Following the tragedy of the Orbiter Columbia (STS-107) on February 1, 2003, a major effort commenced to develop a better understanding of debris impacts and their effect on the space shuttle subsystems. An initiative to develop and validate physics-based computer models to predict damage from such impacts was a fundamental component of this effort. To develop the models it was necessary to physically characterize reinforced carbon-carbon (RCC) along with ice and foam debris materials, which could shed on ascent and impact the orbiter RCC leading edges. The validated models enabled the launch system community to use the impact analysis software LS-DYNA (Livermore Software Technology Corp.) to predict damage by potential and actual impact events on the orbiter leading edge and nose cap thermal protection systems. Validation of the material models was done through a three-level approach: Level 1-fundamental tests to obtain independent static and dynamic constitutive model properties of materials of interest, Level 2-subcomponent impact tests to provide highly controlled impact test data for the correlation and validation of the models, and Level 3-full-scale orbiter leading-edge impact tests to establish the final level of confidence for the analysis methodology. This report discusses the Level 2 test program conducted in the NASA Glenn Research Center (GRC) Ballistic Impact Laboratory with external tank foam impact tests on flat RCC panels, and presents the data observed. The Level 2 testing consisted of 54 impact tests in the NASA GRC Ballistic Impact Laboratory on 6- by 6-in. and 6- by 12-in. flat plates of RCC and evaluated two types of debris projectiles: BX-265 and PDL-1034 external tank foam. These impact tests helped determine the level of damage generated in the RCC flat plates by each projectile and validated the use of the foam and RCC models for use in LS-DYNA.

  17. Convergence of Health Level Seven Version 2 Messages to Semantic Web Technologies for Software-Intensive Systems in Telemedicine Trauma Care.

    PubMed

    Menezes, Pedro Monteiro; Cook, Timothy Wayne; Cavalini, Luciana Tricai

    2016-01-01

    To present the technical background and the development of a procedure that enriches the semantics of Health Level Seven version 2 (HL7v2) messages for software-intensive systems in telemedicine trauma care. This study followed a multilevel model-driven approach for the development of semantically interoperable health information systems. The Pre-Hospital Trauma Life Support (PHTLS) ABCDE protocol was adopted as the use case. A prototype application embedded the semantics into an HL7v2 message as an eXtensible Markup Language (XML) file, which was validated against an XML schema that defines constraints on a common reference model. This message was exchanged with a second prototype application, developed on the Mirth middleware, which was also used to parse and validate both the original and the hybrid messages. Both versions of the data instance (one pure XML, one embedded in the HL7v2 message) were equally validated, and the RDF-based semantics were recovered by the receiving side of the prototype from the shared XML schema. This study demonstrated the semantic enrichment of HL7v2 messages for software-intensive telemedicine systems for trauma care, by validating components of extracts generated in various computing environments. The adoption of the method proposed in this study ensures the compliance of HL7v2 messages with Semantic Web technologies.
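
    The core of the exchange pipeline is ordinary XML-schema validation. As a hedged illustration (the file names are hypothetical placeholders, and Python's lxml library stands in for the prototypes' validation components), the check might look like this:

```python
# Minimal sketch of XML-schema validation as described above.
# "reference_model.xsd" and "trauma_message.xml" are hypothetical
# placeholders; the study's actual artifacts are not reproduced here.
from lxml import etree

schema_doc = etree.parse("reference_model.xsd")
schema = etree.XMLSchema(schema_doc)

message_doc = etree.parse("trauma_message.xml")
if schema.validate(message_doc):
    print("Message conforms to the reference-model schema.")
else:
    for error in schema.error_log:
        print(f"line {error.line}: {error.message}")
```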

  18. Validating agent oriented methodology (AOM) for netlogo modelling and simulation

    NASA Astrophysics Data System (ADS)

    WaiShiang, Cheah; Nissom, Shane; YeeWai, Sim; Sharbini, Hamizan

    2017-10-01

    AOM (Agent Oriented Modeling) is a comprehensive and unified agent methodology for agent-oriented software development. The AOM methodology was proposed to aid developers by introducing techniques, terminology, notation and guidelines during agent system development. Although the AOM methodology is claimed to be capable of developing complex real-world systems, its potential is yet to be realized and recognized by the mainstream software community, and the adoption of AOM is still in its infancy. Among the reasons is that there are not many case studies or success stories for AOM. This paper presents two case studies on the adoption of AOM for individual-based modelling and simulation. It demonstrates how AOM is useful for epidemiology and ecology studies, and hence further validates AOM in a qualitative manner.
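
    As a language-neutral sketch of the individual-based modelling these case studies target (often prototyped in NetLogo), the following minimal agent-based SIR (susceptible-infected-recovered) simulation is written in Python; all parameters are illustrative assumptions, not values from the paper:

```python
# Minimal agent-based SIR sketch; population size, contact rate and
# probabilities are illustrative, not the paper's case-study values.
import random

random.seed(42)
N, P_INFECT, P_RECOVER, CONTACTS, STEPS = 500, 0.05, 0.02, 4, 200

# Each agent is a state: 'S', 'I' or 'R'. Seed one infected agent.
agents = ['I'] + ['S'] * (N - 1)

for step in range(STEPS):
    for i, state in enumerate(agents):
        if state != 'I':
            continue
        # Each infected agent meets a few random others per step
        # (sequential, in-place updating for simplicity).
        for j in random.sample(range(N), CONTACTS):
            if agents[j] == 'S' and random.random() < P_INFECT:
                agents[j] = 'I'
        if random.random() < P_RECOVER:
            agents[i] = 'R'

print({s: agents.count(s) for s in 'SIR'})
```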

  19. Infrared Imagery of Solid Rocket Exhaust Plumes

    NASA Technical Reports Server (NTRS)

    Moran, Robert P.; Houston, Janice D.

    2011-01-01

    The Ares I Scale Model Acoustic Test program consisted of a series of 18 solid rocket motor static firings, simulating the liftoff conditions of the Ares I five-segment Reusable Solid Rocket Motor Vehicle. Primary test objectives included acquiring acoustic and pressure data which will be used to validate analytical models for the prediction of Ares I liftoff acoustics and ignition overpressure environments. The test article consisted of a 5% scale Ares I vehicle and launch tower mounted on the Mobile Launch Pad. The testing also incorporated several Water Sound Suppression Systems. Infrared imagery was employed during the solid rocket testing to support the validation or improvement of analytical models, and to identify corollaries between rocket plume size or shape and the accompanying measured level of noise suppression obtained by water sound suppression systems.

  20. Coupling of Bayesian Networks with GIS for wildfire risk assessment on natural and agricultural areas of the Mediterranean

    NASA Astrophysics Data System (ADS)

    Scherb, Anke; Papakosta, Panagiota; Straub, Daniel

    2014-05-01

    Wildfires cause severe damage to ecosystems, socio-economic assets, and human lives in the Mediterranean. To facilitate coping with wildfire risks, an understanding of the factors influencing wildfire occurrence and behavior (e.g. human activity, weather conditions, topography, fuel loads) and their interactions is important, as is the implementation of this knowledge in improved wildfire hazard and risk prediction systems. In this project, a probabilistic wildfire risk prediction model is developed, integrating fire occurrence probability, fire propagation probability, and potential impact prediction on natural and cultivated areas. Bayesian Networks (BNs) are used to facilitate the probabilistic modeling. The final BN model is a spatial-temporal prediction system at the meso scale (1 km2 spatial and 1 day temporal resolution). The modeled consequences account for potential restoration costs and production losses associated with forests, agriculture, and (semi-) natural areas. BNs and a geographic information system (GIS) are coupled within this project to support semi-automated learning of the BN model parameters and the spatial-temporal risk prediction. The coupling also enables the visualization of prediction results by means of daily maps. The BN parameters are learnt for Cyprus with data from 2006-2009; data from 2010 are used as the validation data set. A special focus is put on the performance evaluation of the BN for fire occurrence, which is modeled as a binary classifier and thus could be validated by means of Receiver Operating Characteristic (ROC) curves. With the final best models, AUC values of more than 70% were achieved on the validation data, which indicates the potential for reliable prediction performance via BNs. Maps of selected days in 2010 are shown to illustrate the final prediction results. The resulting system can be easily expanded to predict additional expected damages at the meso scale (e.g. building and infrastructure damages). The system can support the planning of preventive measures (e.g. state resource allocation for wildfire prevention and preparedness) and assist recuperation plans for damaged areas.
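
    The ROC-based validation of the fire-occurrence classifier follows a standard pattern; the sketch below illustrates it with scikit-learn, with synthetic labels and predicted probabilities standing in for the 2010 validation data:

```python
# Sketch of validating a binary fire-occurrence classifier with ROC/AUC,
# as in the study; synthetic data stands in for the 2010 validation set.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)

# Hypothetical: 1 = fire occurred in a grid cell on a given day, 0 = no fire.
y_true = rng.integers(0, 2, size=1000)
# Hypothetical BN-predicted fire probabilities, loosely correlated with truth.
y_score = np.clip(0.5 * y_true + 0.5 * rng.random(1000), 0, 1)

auc = roc_auc_score(y_true, y_score)
fpr, tpr, thresholds = roc_curve(y_true, y_score)
print(f"AUC = {auc:.3f}  ({len(thresholds)} ROC thresholds)")
```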

  1. Systems Biology Markup Language (SBML) Level 2 Version 5: Structures and Facilities for Model Definitions

    PubMed Central

    Hucka, Michael; Bergmann, Frank T.; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M.; Le Novère, Nicolas; Myers, Chris J.; Olivier, Brett G.; Sahle, Sven; Schaff, James C.; Smith, Lucian P.; Waltemath, Dagmar; Wilkinson, Darren J.

    2017-01-01

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/. PMID:26528569
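
    The validation rules the specification defines are implemented in libSBML, the project's reference library; a minimal validity check (the file name is a hypothetical placeholder) might look like this:

```python
# Minimal sketch of SBML validity checking with python-libsbml;
# "model.xml" is a hypothetical file name.
import libsbml

doc = libsbml.readSBML("model.xml")
read_errors = doc.getNumErrors()               # syntax/read-time problems
consistency_failures = doc.checkConsistency()  # the spec's validation rules

print(f"{read_errors} read errors, {consistency_failures} consistency issues")
for i in range(doc.getNumErrors()):
    err = doc.getError(i)
    print(err.getSeverityAsString(), err.getMessage())
```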

  2. Systems Biology Markup Language (SBML) Level 2 Version 5: Structures and Facilities for Model Definitions.

    PubMed

    Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J

    2015-09-04

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org.

  3. Systems Biology Markup Language (SBML) Level 2 Version 5: Structures and Facilities for Model Definitions.

    PubMed

    Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J

    2015-06-01

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.

  4. Solving the problem of building models of crosslinked polymers: an example focussing on validation of the properties of crosslinked epoxy resins.

    PubMed

    Hall, Stephen A; Howlin, Brendan J; Hamerton, Ian; Baidak, Alex; Billaud, Claude; Ward, Steven

    2012-01-01

    The construction of molecular models of crosslinked polymers is an area of some difficulty and considerable interest. We report here a new method of constructing these models and validate the method by modelling three epoxy systems based on the epoxy monomers bisphenol F diglycidyl ether (BFDGE) and triglycidyl-p-amino phenol (TGAP) with the curing agent diamino diphenyl sulphone (DDS). The main emphasis of the work concerns the improvement of the techniques for the molecular simulation of these epoxies and specific attention is paid towards model construction techniques, including automated model building and prediction of glass transition temperatures (T(g)). Typical models comprise some 4200-4600 atoms (ca. 120-130 monomers). In a parallel empirical study, these systems have been cast, cured and analysed by dynamic mechanical thermal analysis (DMTA) to measure T(g). Results for the three epoxy systems yield good agreement with experimental T(g) ranges of 200-220°C, 270-285°C and 285-290°C with corresponding simulated ranges of 210-230°C, 250-300°C, and 250-300°C respectively.

  5. Solving the Problem of Building Models of Crosslinked Polymers: An Example Focussing on Validation of the Properties of Crosslinked Epoxy Resins

    PubMed Central

    Hall, Stephen A.; Howlin, Brendan J; Hamerton, Ian; Baidak, Alex; Billaud, Claude; Ward, Steven

    2012-01-01

    The construction of molecular models of crosslinked polymers is an area of some difficulty and considerable interest. We report here a new method of constructing these models and validate the method by modelling three epoxy systems based on the epoxy monomers bisphenol F diglycidyl ether (BFDGE) and triglycidyl-p-amino phenol (TGAP) with the curing agent diamino diphenyl sulphone (DDS). The main emphasis of the work concerns the improvement of the techniques for the molecular simulation of these epoxies and specific attention is paid towards model construction techniques, including automated model building and prediction of glass transition temperatures (Tg). Typical models comprise some 4200–4600 atoms (ca. 120–130 monomers). In a parallel empirical study, these systems have been cast, cured and analysed by dynamic mechanical thermal analysis (DMTA) to measure Tg. Results for the three epoxy systems yield good agreement with experimental Tg ranges of 200–220°C, 270–285°C and 285–290°C with corresponding simulated ranges of 210–230°C, 250–300°C, and 250–300°C respectively. PMID:22916182

  6. Gas treatment in trickle-bed biofilters: biomass, how much is enough?

    PubMed

    Alonso, C; Suidan, M T; Sorial, G A; Smith, F L; Biswas, P; Smith, P J; Brenner, R C

    1997-06-20

    The objective of this article is to define and validate a mathematical model that describes the physical and biological processes occurring in a trickle-bed air biofilter for waste gas treatment. This model considers a two-phase system, quasi-steady-state processes, uniform bacterial population, and one limiting substrate. The variation of the specific surface area with bacterial growth is included in the model, and its effect on the biofilter performance is analyzed. This analysis leads to the conclusion that excessive accumulation of biomass in the reactor has a negative effect on contaminant removal efficiency. To solve this problem, excess biomass is removed via full media fluidization and backwashing of the biofilter. The backwashing technique is also incorporated in the model as a process variable. Experimental data from the biodegradation of toluene in a pilot system with four packed-bed reactors are used to validate the model. Once the model is calibrated with the estimation of the unknown parameters of the system, it is used to simulate the biofilter performance for different operating conditions. Model predictions are found to be in agreement with experimental data. (c) 1997 John Wiley & Sons, Inc. Biotechnol Bioeng 54: 583-594, 1997.
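
    The abstract does not reproduce the model equations, but the kind of quasi-steady, single-limiting-substrate balance it describes can be sketched as a plug-flow column with Monod-limited removal scaled by the specific biofilm surface area; every parameter below is an illustrative assumption, not one of the authors' calibrated values:

```python
# Illustrative plug-flow biofilter sketch: one limiting substrate,
# Monod-limited removal scaled by specific biofilm area. All parameter
# values are assumed for illustration, not the paper's calibration.
from scipy.integrate import solve_ivp

K_MAX = 2.0e-2   # max volumetric uptake rate, g/(m^3 biofilm * s) -- assumed
K_S   = 5.0e-3   # Monod half-saturation constant, g/m^3 -- assumed
AREA  = 300.0    # specific biofilm surface area, m^2/m^3 bed -- assumed
DELTA = 1.0e-4   # active biofilm thickness, m -- assumed
V_GAS = 0.02     # superficial gas velocity, m/s -- assumed

def dCdz(z, c):
    """Quasi-steady substrate balance along bed height z (m)."""
    uptake = K_MAX * c / (K_S + c)          # Monod kinetics
    return -AREA * DELTA * uptake / V_GAS   # removal per metre of bed

c_in = 0.1  # inlet concentration, g/m^3 -- assumed
sol = solve_ivp(dCdz, (0.0, 1.0), [c_in])  # 1 m bed
c_out = sol.y[0, -1]
print(f"removal efficiency ~ {100 * (1 - c_out / c_in):.1f}%")
```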

  7. System for assessing Aviation's Global Emissions (SAGE). Version 1.5: validation assessment, model assumptions and uncertainties

    DOT National Transportation Integrated Search

    2005-09-01

    The United States (US) Federal Aviation Administration (FAA) Office of Environment and Energy (AEE) has developed the System for assessing Aviation's Global Emissions (SAGE) with support from the Volpe National Transportation Systems Center (Vo...

  8. A two-stage predictive model to simultaneous control of trihalomethanes in water treatment plants and distribution systems: adaptability to treatment processes.

    PubMed

    Domínguez-Tello, Antonio; Arias-Borrego, Ana; García-Barrera, Tamara; Gómez-Ariza, José Luis

    2017-10-01

    Trihalomethanes (TTHMs) and other disinfection by-products (DBPs) are formed in drinking water by the reaction of chlorine with organic precursors contained in the source water, in two consecutive and linked stages: the first starts at the treatment plant, and the second continues along the distribution system (DS) through the reaction of residual chlorine with organic precursors not removed. Following this approach, this study aimed at developing a two-stage empirical model for predicting the formation of TTHMs in the water treatment plant and subsequently their evolution along the water distribution system (WDS). The aim of the two-stage model was to improve the predictive capability for a wide range of water treatment and distribution scenarios. The two-stage model was developed using multiple regression analysis from a database (January 2007 to July 2012) covering three different treatment processes (conventional and advanced) in the water supply system of the Aljaraque area (southwest of Spain). The new model was then validated using a recent database from the same water supply system (January 2011 to May 2015). The validation results indicated no significant difference between the predicted and observed values of TTHM (R2 = 0.874, analytical variance < 17%). The new model was applied to three different supply systems with different treatment processes and different characteristics. Acceptable predictions were obtained in the three distribution systems studied, proving the adaptability of the new model to the boundary conditions. Finally, the predictive capability of the new model was compared with 17 other models selected from the literature, showing satisfactory prediction results and excellent adaptability to treatment processes.
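
    A two-stage empirical model of this kind can be sketched as chained regressions: stage 1 predicts TTHM leaving the plant from source-water and treatment variables, and stage 2 predicts TTHM at a point in the network from the stage-1 output plus distribution variables such as residence time and residual chlorine. The features and linear form below are illustrative assumptions; the paper's fitted coefficients are not reproduced:

```python
# Hedged sketch of a two-stage TTHM model as chained multiple regressions.
# Feature choices, coefficients and noise levels are illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 300

# Stage 1 inputs (hypothetical): TOC, chlorine dose, temperature, pH.
X1 = rng.random((n, 4))
tthm_plant = X1 @ np.array([20.0, 15.0, 8.0, 5.0]) + rng.normal(0, 2, n)

# Stage 2 inputs: stage-1 prediction plus residence time and residual chlorine.
stage1 = LinearRegression().fit(X1, tthm_plant)
X2 = np.column_stack([stage1.predict(X1), rng.random((n, 2))])
tthm_network = X2[:, 0] + 12.0 * X2[:, 1] - 4.0 * X2[:, 2] + rng.normal(0, 2, n)

stage2 = LinearRegression().fit(X2, tthm_network)
print(f"stage-2 R^2 on training data: {stage2.score(X2, tthm_network):.3f}")
```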

  9. The Role of Integrated Modeling in the Design and Verification of the James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Mosier, Gary E.; Howard, Joseph M.; Johnston, John D.; Parrish, Keith A.; Hyde, T. Tupper; McGinnis, Mark A.; Bluth, Marcel; Kim, Kevin; Ha, Kong Q.

    2004-01-01

    The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2011. System-level verification of critical optical performance requirements will rely on integrated modeling to a considerable degree. In turn, requirements for accuracy of the models are significant. The size of the lightweight observatory structure, coupled with the need to test at cryogenic temperatures, effectively precludes validation of the models and verification of optical performance with a single test in 1-g. Rather, a complex series of steps is planned by which the components of the end-to-end models are validated at various levels of subassembly, and the ultimate verification of optical performance is by analysis using the assembled models. This paper describes the critical optical performance requirements driving the integrated modeling activity, shows how the error budget is used to allocate and track contributions to total performance, and presents examples of integrated modeling methods and results that support the preliminary observatory design. Finally, the concepts for model validation and the role of integrated modeling in the ultimate verification of the observatory are described.
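
    Error budgets of the kind described are commonly tracked as root-sum-square (RSS) combinations of independent contributors against a top-level allocation; the sketch below shows only that bookkeeping, with made-up contributor names and values rather than JWST's actual wavefront-error budget:

```python
# Illustrative RSS error-budget bookkeeping; the contributor names and
# numbers are made up, not JWST's actual wavefront-error allocations.
import math

allocation_nm = 150.0  # hypothetical top-level allocation (nm RMS)
contributors = {       # hypothetical lower-level terms (nm RMS)
    "mirror figure": 90.0,
    "thermal distortion": 60.0,
    "alignment residual": 50.0,
    "dynamics/jitter": 40.0,
}

rss = math.sqrt(sum(v**2 for v in contributors.values()))
margin = allocation_nm - rss
print(f"RSS of contributors: {rss:.1f} nm RMS (margin {margin:+.1f} nm)")
```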

  10. A method for landing gear modeling and simulation with experimental validation

    NASA Technical Reports Server (NTRS)

    Daniels, James N.

    1996-01-01

    This document presents an approach for modeling and simulating landing gear systems. Specifically, a nonlinear model of an A-6 Intruder main gear is developed, simulated, and validated against static and dynamic test data. This model includes nonlinear effects such as a polytropic gas model, velocity-squared damping, a geometry-governed model for the discharge coefficients, stick-slip friction effects, and a nonlinear tire spring and damping model. An Adams-Moulton predictor-corrector was used to integrate the equations of motion until a discontinuity caused by the stick-slip friction model was reached, at which point a Runge-Kutta routine integrated past the discontinuity and returned the solution to the predictor-corrector. Run times of this software are around 2 minutes per 1 second of simulation under dynamic circumstances. To validate the model, engineers at the Aircraft Landing Dynamics facilities at NASA Langley Research Center installed one A-6 main gear on a drop carriage and used a hydraulic shaker table to provide simulated runway inputs to the gear. Model parameters were tuned to produce excellent agreement for many cases.
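
    The integration strategy described (march with one scheme until the stick-slip discontinuity, handle it, then resume) can be sketched with an event-capable integrator. The single-degree-of-freedom strut model and all parameters below are illustrative assumptions, not the A-6 gear model:

```python
# Sketch of integrating a strut model up to a friction (stick-slip)
# discontinuity and restarting, in the spirit of the approach described.
# The 1-DOF model and parameter values are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

M, K, C2, F_FRICTION = 50.0, 2.0e4, 400.0, 250.0  # assumed values

def rhs(t, y):
    x, v = y
    # Spring + velocity-squared damping + sliding Coulomb friction.
    f = -K * x - C2 * v * abs(v) - F_FRICTION * np.sign(v)
    return [v, f / M]

def sticks(t, y):
    return y[1]          # event: velocity crosses zero (possible stick)
sticks.terminal = True

t0, y0, t_end = 0.0, [0.05, 1.0], 2.0  # initial compression and velocity
while t0 < t_end:
    sol = solve_ivp(rhs, (t0, t_end), y0, events=sticks, max_step=1e-3)
    t0, y0 = sol.t[-1], sol.y[:, -1]
    # At v = 0, remain stuck if the spring force cannot overcome friction.
    if abs(K * y0[0]) <= F_FRICTION:
        break
    y0[1] = 1e-6 * np.sign(-K * y0[0])  # nudge into sliding and resume

print(f"stopped at t = {t0:.3f} s, x = {y0[0] * 1000:.2f} mm")
```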

  11. Building a Decision Support System for Inpatient Admission Prediction With the Manchester Triage System and Administrative Check-in Variables.

    PubMed

    Zlotnik, Alexander; Alfaro, Miguel Cuchí; Pérez, María Carmen Pérez; Gallardo-Antolín, Ascensión; Martínez, Juan Manuel Montero

    2016-05-01

    The usage of decision support tools in emergency departments, based on predictive models capable of estimating the probability of admission for patients in the emergency department, may give nursing staff the possibility of allocating resources in advance. We present a methodology for developing and building one such system for a large specialized care hospital, using a logistic regression model and an artificial neural network model with nine routinely collected variables available right at the end of the triage process. A database of 255,668 triaged nonobstetric emergency department presentations from the Ramon y Cajal University Hospital of Madrid, from January 2011 to December 2012, was used to develop and test the models, with 66% of the data used for derivation and 34% for validation, with an ordered nonrandom partition. On the validation dataset, areas under the receiver operating characteristic curve were 0.8568 (95% confidence interval, 0.8508-0.8583) for the logistic regression model and 0.8575 (95% confidence interval, 0.8540-0.8610) for the artificial neural network model. χ² values for Hosmer-Lemeshow fixed "deciles of risk" were 65.32 for the logistic regression model and 17.28 for the artificial neural network model. A nomogram was generated from the logistic regression model, and an automated software decision support system with a Web interface was built based on the artificial neural network model.
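
    A minimal version of the logistic-regression arm of such a system, trained on an ordered split and scored by AUC on the held-out portion, might look like the following; synthetic features stand in for the nine triage and check-in variables, which the abstract does not list:

```python
# Sketch of the logistic-regression arm: ordered (non-random) split,
# AUC on the validation portion. Synthetic data stands in for the
# nine triage/check-in variables.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n, n_features = 5000, 9

X = rng.random((n, n_features))
logit = X @ rng.normal(0, 2, n_features) - 4.0
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)  # admission flag

split = int(0.66 * n)  # ordered, non-random partition as in the study
model = LogisticRegression(max_iter=1000).fit(X[:split], y[:split])
auc = roc_auc_score(y[split:], model.predict_proba(X[split:])[:, 1])
print(f"validation AUC: {auc:.4f}")
```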

  12. AIRS Retrieval Validation During the EAQUATE

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Smith, William L.; Cuomo, Vincenzo; Taylor, Jonathan P.; Barnet, Christopher D.; DiGirolamo, Paolo; Pappalardo, Gelsomina; Larar, Allen M.; Liu, Xu; Newman, Stuart M.

    2006-01-01

    Atmospheric and surface thermodynamic parameters retrieved with advanced hyperspectral remote sensors of Earth observing satellites are critical for weather prediction and scientific research. The retrieval algorithms and retrieved parameters from satellite sounders must be validated to demonstrate the capability and accuracy of both observation and data processing systems. The European AQUA Thermodynamic Experiment (EAQUATE) was conducted mainly for validation of the Atmospheric InfraRed Sounder (AIRS) on the AQUA satellite, but also for assessment of validation systems of both ground-based and aircraft-based instruments which will be used for other satellite systems such as the Infrared Atmospheric Sounding Interferometer (IASI) on the European MetOp satellite, the Cross-track Infrared Sounder (CrIS) from the NPOESS Preparatory Project and the following NPOESS series of satellites. Detailed inter-comparisons were conducted and presented using different retrieval methodologies: measurements from airborne ultraspectral Fourier transform spectrometers, aircraft in-situ instruments, dedicated dropsondes and radiosondes, and ground based Raman Lidar, as well as from the European Center for Medium range Weather Forecasting (ECMWF) modeled thermal structures. The results of this study not only illustrate the quality of the measurements and retrieval products but also demonstrate the capability of these validation systems which are put in place to validate current and future hyperspectral sounding instruments and their scientific products.

  13. Critical evaluation of mechanistic two-phase flow pipeline and well simulation models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dhulesia, H.; Lopez, D.

    1996-12-31

    Mechanistic steady-state simulation models, rather than empirical correlations, are used for the design of multiphase production systems including wells, pipelines and downstream installations. Among the available models, PEPITE, WELLSIM, OLGA, TACITE and TUFFP are widely used for this purpose and, consequently, a critical evaluation of these models is needed. An extensive validation methodology is proposed which consists of two distinct steps: first to validate the hydrodynamic point model using the test loop data, and then to validate the overall simulation model using real pipeline and well data. The test loop databank used in this analysis contains about 5952 data sets originating from four different test loops, and a majority of these data are obtained at high pressures (up to 90 bars) with real hydrocarbon fluids. Before performing the model evaluation, physical analysis of the test loop data is required to eliminate non-coherent data. The evaluation of these point models demonstrates that the TACITE and OLGA models can be applied to any configuration of pipes. The TACITE model performs better than the OLGA model because it uses the most appropriate closure laws from the literature, validated on a large number of data. The comparison of predicted and measured pressure drop for various real pipelines and wells demonstrates that the TACITE model is a reliable tool.

  14. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine.

    PubMed

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-02-06

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human-machine-environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines.
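
    The motion-synthesis step rests on Gaussian process regression: given squats recorded at known values of the independent variables (e.g. barbell load), a GP predicts motion features for new settings. The sketch below shows that regression pattern with scikit-learn on made-up one-dimensional data; it is not the authors' implementation:

```python
# Sketch of Gaussian-process regression for motion synthesis: predict a
# motion feature from an independent variable (e.g. load). The data,
# kernel and variable choices are illustrative, not the authors' setup.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical training data: barbell load (kg) -> peak knee flexion (deg).
load = np.array([[20.0], [40.0], [60.0], [80.0], [100.0]])
peak_flexion = np.array([95.0, 99.0, 104.0, 110.0, 118.0])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=30.0) + WhiteKernel(1e-2),
                              normalize_y=True)
gp.fit(load, peak_flexion)

mean, std = gp.predict(np.array([[70.0]]), return_std=True)
print(f"predicted peak flexion at 70 kg: {mean[0]:.1f} deg (±{std[0]:.1f})")
```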

  15. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine

    PubMed Central

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-01-01

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human–machine–environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines. PMID:28178184

  16. Modeling Liver-Related Adverse Effects of Drugs Using kNN QSAR Method

    PubMed Central

    Rodgers, Amie D.; Zhu, Hao; Fourches, Dennis; Rusyn, Ivan; Tropsha, Alexander

    2010-01-01

    Adverse effects of drugs (AEDs) continue to be a major cause of drug withdrawals both in development and post-marketing. While liver-related AEDs are a major concern for drug safety, there are few in silico models for predicting human liver toxicity for drug candidates. We have applied the Quantitative Structure Activity Relationship (QSAR) approach to model liver AEDs. In this study, we aimed to construct a QSAR model capable of binary classification (active vs. inactive) of drugs for liver AEDs based on chemical structure. To build QSAR models, we have employed an FDA spontaneous reporting database of human liver AEDs (elevations in activity of serum liver enzymes), which contains data on approximately 500 approved drugs. Approximately 200 compounds with wide clinical data coverage, structural similarity and balanced (40/60) active/inactive ratio were selected for modeling and divided into multiple training/test and external validation sets. QSAR models were developed using the k nearest neighbor method and validated using external datasets. Models with high sensitivity (>73%) and specificity (>94%) for prediction of liver AEDs in external validation sets were developed. To test applicability of the models, three chemical databases (World Drug Index, Prestwick Chemical Library, and Biowisdom Liver Intelligence Module) were screened in silico and the validity of predictions was determined, where possible, by comparing model-based classification with assertions in publicly available literature. Validated QSAR models of liver AEDs based on the data from the FDA spontaneous reporting system can be employed as sensitive and specific predictors of AEDs in pre-clinical screening of drug candidates for potential hepatotoxicity in humans. PMID:20192250
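
    The modeling pattern described (binary kNN classification on chemical descriptors, with an external validation set scored by sensitivity and specificity) can be sketched as follows; random vectors stand in for computed chemical descriptors, and k = 5 is an assumed setting:

```python
# Sketch of a binary kNN QSAR workflow with external validation,
# scored by sensitivity and specificity. Random vectors stand in
# for chemical descriptors; k = 5 is an assumed setting.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(3)
n, n_descriptors = 200, 32

X = rng.random((n, n_descriptors))
y = (X[:, :4].sum(axis=1) > 2.0).astype(int)  # 1 = liver-AED "active"

X_tr, X_ext, y_tr, y_ext = train_test_split(X, y, test_size=0.3, random_state=0)
model = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_ext, model.predict(X_ext)).ravel()
print(f"sensitivity {tp / (tp + fn):.2f}, specificity {tn / (tn + fp):.2f}")
```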

  17. Performance modeling & simulation of complex systems (A systems engineering design & analysis approach)

    NASA Technical Reports Server (NTRS)

    Hall, Laverne

    1995-01-01

    Modeling of the Multi-mission Image Processing System (MIPS) will be described as an example of the use of a modeling tool to design a distributed system that supports multiple application scenarios. This paper examines: (a) modeling tool selection, capabilities, and operation (namely NETWORK 2.5 by CACI), (b) pointers for building or constructing a model and how the MIPS model was developed, (c) the importance of benchmarking or testing the performance of equipment/subsystems being considered for incorporation into the design/architecture, (d) the essential step of model validation and/or calibration using the benchmark results, (e) sample simulation results from the MIPS model, and (f) how modeling and simulation analysis affected the MIPS design process by having a supportive and informative impact.

  18. Anthropometric dependence of the response of a thorax FE model under high speed loading: validation and real world accident replication.

    PubMed

    Roth, Sébastien; Torres, Fabien; Feuerstein, Philippe; Thoral-Pierre, Karine

    2013-05-01

    Finite element analysis is frequently used in several fields such as automotive simulation or biomechanics. It helps researchers and engineers to understand the mechanical behaviour of complex structures. The development of computer science brought the possibility of developing realistic computational models which can behave like physical ones, avoiding the difficulties and costs of experimental tests. In the framework of biomechanics, many FE models have been developed in the last few decades, enabling the investigation of the behaviour of the human body subjected to severe loading, such as in road traffic accidents or ballistic impacts. In both cases, the thorax/abdomen/pelvis system is frequently injured, and understanding the behaviour of this complex system is of extreme importance. In order to explore the dynamic response of this system to impact loading, a finite element model of the human thorax/abdomen/pelvis system has therefore been developed, including the main organs: heart, lungs, kidneys, liver, spleen, the skeleton (with vertebrae, intervertebral discs, ribs), stomach, intestines, muscles, and skin. The FE model is based on a 3D reconstruction made from medical records of anonymous patients who had medical scans with no relation to the present study. Several scans were analyzed, and specific attention was paid to the anthropometry of the reconstructed model, which can be considered a 50th percentile male model. The biometric parameters and laws were implemented in the dynamic FE code (Radioss, Altair Hyperworks 11©) used for dynamic simulations. The 50th percentile model was then validated against experimental data available in the literature, in terms of deflection and force, whose curves must lie within experimental corridors. However, for other anthropometries (small male or large male models), questions about the validation and the results of numerical accident replications can be raised. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  19. A Tool for Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John

    2005-01-01

    Absent a general method for mathematically sound, automated transformation of customer requirements into a formal model of the desired system, developers must resort either to manual application of formal methods or to system testing (manual or automated). While formal methods have afforded numerous successes, they present serious issues, e.g., the costs of gearing up to apply them (time, expensive staff), and scalability and reproducibility when standards in the field are not settled. The testing path cannot be walked to the ultimate goal, because exhaustive testing is infeasible for all but trivial systems; so system verification remains problematic. System and requirements validation are similarly problematic. The alternatives available today depend on either having a formal model or pursuing enough testing to enable the customer to be certain that system behavior meets requirements. The testing alternative for non-trivial systems always leaves some system behaviors unconfirmed and is therefore not the answer. To ensure that a formal model is equivalent to the customer's requirements necessitates that the customer somehow fully understands the formal model, which is not realistic. The predominant view that provably correct system development depends on having a formal model of the system leads to a desire for a mathematically sound method to automate the transformation of customer requirements into a formal model. Such a method, an augmentation of requirements-based programming, is briefly described in this paper, along with a prototype tool that supports it. The method and tool enable both requirements validation and system verification for the class of systems whose behavior can be described as scenarios. An application of the tool to a prototype automated ground control system for a NASA mission is presented.

  20. Radiology Reporting System Data Exchange With the Electronic Health Record System: A Case Study in Iran.

    PubMed

    Ahmadi, Maryam; Ghazisaeidi, Marjan; Bashiri, Azadeh

    2015-03-18

    In order to better design the electronic health record system in Iran, integration of health information systems based on a common language is required, so that information can be interpreted and exchanged with this system. This study provides a conceptual model of a radiology reporting system using the Unified Modeling Language (UML). The proposed model can solve the problem of integrating this information system with the electronic health record system. By using this model and designing its services accordingly, the radiology reporting system can easily connect to the electronic health record in Iran and facilitate the transfer of radiology report data. This is a cross-sectional study that was conducted in 2013. The study population was 22 experts working at the imaging center of Imam Khomeini Hospital in Tehran, and the sample coincided with the whole population. The research tool was a questionnaire prepared by the researcher to determine the information requirements. Content validity and the test-retest method were used to measure the validity and reliability of the questionnaire, respectively. Data were analyzed with an average index using SPSS, and Visual Paradigm software was used to design the conceptual model. Based on the requirements assessment of the experts and the related literature, administrative, demographic and clinical data, radiological examination results and, if an anesthesia procedure was performed, anesthesia data were suggested as the minimum data set for the radiology report, and the class diagram was designed on this basis. Also, by identifying the radiology reporting system processes, the use case diagram was drawn. Given the application of radiology reports in the electronic health record system for diagnosing and managing the clinical problems of the patient, providing this conceptual model for the radiology reporting system, and designing it systematically, would eliminate the problem of data sharing between these systems and the electronic health record system.
