Sample records for validated simulation model

  1. Validation techniques of agent based modelling for geospatial simulations

    NASA Astrophysics Data System (ADS)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and exhibit dynamic, complex behaviour. Studying such phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturizing world phenomena within the framework of a model in order to simulate the real phenomena is a reasonable, scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in many different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling indicates the growing interest of users in the special capabilities of ABMS. Because ABMS is inherently similar to human cognition, it can be built more easily and applied to a wider range of applications than traditional simulation. A key challenge with ABMS, however, is the difficulty of validation and verification: because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify agent-based models with conventional validation methods. The search for appropriate validation techniques for ABM is therefore necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.
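
    To make the validation challenge concrete, here is a minimal agent-based sketch in Python; every name and parameter is an invented assumption, not something from the paper. It shows the typical output-level check, comparing an emergent aggregate statistic against reference data, which is exactly the kind of conventional validation the paper argues becomes fragile for ABMS.

    ```python
    import random

    class Agent:
        """A minimal mobile agent on a periodic 2D grid (illustrative only)."""
        def __init__(self, x, y):
            self.x, self.y = x, y

        def step(self, size):
            # Simple local rule (random walk) from which aggregate
            # patterns emerge at the population level.
            self.x = (self.x + random.choice((-1, 0, 1))) % size
            self.y = (self.y + random.choice((-1, 0, 1))) % size

    def run_model(n_agents=100, size=50, steps=200, seed=0):
        random.seed(seed)
        agents = [Agent(random.randrange(size), random.randrange(size))
                  for _ in range(n_agents)]
        for _ in range(steps):
            for a in agents:
                a.step(size)
        # Emergent output: mean x-position of the population, a stand-in
        # for whatever macro-level pattern the model is validated against.
        return sum(a.x for a in agents) / n_agents

    # Output-level validation: compare the emergent statistic with a
    # hypothetical observed reference value within a tolerance.
    reference, tolerance = 25.0, 2.5
    simulated = run_model()
    print("valid" if abs(simulated - reference) <= tolerance else "check model")
    ```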

  2. Flight Testing an Iced Business Jet for Flight Simulation Model Validation

    NASA Technical Reports Server (NTRS)

    Ratvasky, Thomas P.; Barnhart, Billy P.; Lee, Sam; Cooper, Jon

    2007-01-01

    A flight test of a business jet aircraft with various ice accretions was performed to obtain data to validate flight simulation models developed through wind tunnel tests. Three types of ice accretions were tested: pre-activation roughness, runback shapes that form downstream of the thermal wing ice protection system, and a wing ice protection system failure shape. The high-fidelity flight simulation models of this business jet aircraft were validated using a software tool called "Overdrive." Through comparisons of flight-extracted aerodynamic forces and moments to simulation-predicted forces and moments, the simulation models were successfully validated. Only minor adjustments in the simulation database were required to obtain an adequate match, signifying that the process used to develop the simulation models was successful. The simulation models were implemented in the NASA Ice Contamination Effects Flight Training Device (ICEFTD) to enable company pilots to evaluate the flight characteristics of the simulation models. By and large, the pilots confirmed that the flight characteristics closely matched those of the real airplane. However, pilots noted pitch-up tendencies at stall with the flaps extended that were not representative of the airplane, and identified some differences in pilot forces. The elevator hinge moment model and the implementation of the control forces on the ICEFTD were identified as drivers of the pitch-ups and control force issues, and will be an area for future work.

  3. Verifying and Validating Simulation Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemez, Francois M.

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state of knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack of knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.
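
    As a sketch of the statistical sampling mentioned for propagating variability and randomness, the following Monte Carlo example pushes assumed input distributions through a simple cantilever-beam formula; the model and all distribution parameters are illustrative assumptions, not content of the presentation.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def model(E, L, F):
        """Tip deflection of a cantilever beam, d = F*L**3 / (3*E*I);
        a stand-in for any numerical prediction (I held fixed for brevity)."""
        I = 1e-6  # second moment of area, m^4 (assumed constant)
        return F * L**3 / (3.0 * E * I)

    # Aleatoric inputs: sample variability/randomness from assumed distributions.
    n = 10_000
    E = rng.normal(200e9, 10e9, n)    # Young's modulus, Pa
    L = rng.normal(2.0, 0.01, n)      # length, m
    F = rng.normal(1_000.0, 50.0, n)  # load, N

    d = model(E, L, F)
    print(f"mean deflection = {d.mean():.4e} m, std = {d.std():.4e} m")
    ```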

  4. Modeling complex treatment strategies: construction and validation of a discrete event simulation model for glaucoma.

    PubMed

    van Gestel, Aukje; Severens, Johan L; Webers, Carroll A B; Beckers, Henny J M; Jansonius, Nomdo M; Schouten, Jan S A G

    2010-01-01

    Discrete event simulation (DES) modeling has several advantages over simpler modeling techniques in health economics, such as increased flexibility and the ability to model complex systems. Nevertheless, these benefits may come at the cost of reduced transparency, which may compromise the model's face validity and credibility. We aimed to produce a transparent report on the construction and validation of a DES model using a recently developed model of ocular hypertension and glaucoma. Current evidence on associations between prognostic factors and disease progression in ocular hypertension and glaucoma was translated into DES model elements. The model was extended to simulate treatment decisions and effects. Utility and costs were linked to disease status and treatment, and clinical and health-economic outcomes were defined. The model was validated at several levels. The soundness of the design and the plausibility of the input estimates were evaluated in interdisciplinary meetings (face validity). Individual patients were traced throughout the simulation under a multitude of model settings to debug the model, and the model was run with a variety of extreme scenarios to compare the outcomes with prior expectations (internal validity). Finally, several intermediate (clinical) outcomes of the model were compared with those observed in experimental or observational studies (external validity), and the feasibility of evaluating hypothetical treatment strategies was tested. The model performed well in all validity tests. Analyses of hypothetical treatment strategies took about 30 minutes per cohort and led to plausible health-economic outcomes. DES models add value in the evaluation of complex treatment strategies such as those for glaucoma. Achieving transparency in model structure and outcomes may require some effort in reporting and validating the model, but it is feasible.
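
    For readers unfamiliar with the technique, here is a minimal discrete event simulation in Python, a toy stand-in for the glaucoma model rather than the authors' implementation: events sit in a time-ordered queue and the patient state advances only when an event fires. The cohort run at the end mirrors the internal-validity checks described above.

    ```python
    import heapq
    import random

    def simulate_patient(years=20, rate=0.2, seed=None):
        """Minimal DES: schedule 'progression' events on a timeline and
        advance the disease stage when they fire (purely illustrative)."""
        rng = random.Random(seed)
        stage = 0
        events = []  # priority queue of (time, name)
        heapq.heappush(events, (rng.expovariate(rate), "progress"))
        while events:
            t, name = heapq.heappop(events)
            if t > years:
                break
            if name == "progress":
                stage += 1
                # schedule the next progression event
                heapq.heappush(events, (t + rng.expovariate(rate), "progress"))
        return stage

    # Internal-validity style check: simulate a cohort and compare the
    # aggregate outcome with prior expectations (rate*years = 4 here).
    cohort = [simulate_patient(seed=i) for i in range(1000)]
    print("mean stage after 20 y:", sum(cohort) / len(cohort))
    ```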

  5. Validated numerical simulation model of a dielectric elastomer generator

    NASA Astrophysics Data System (ADS)

    Foerster, Florentine; Moessinger, Holger; Schlaak, Helmut F.

    2013-04-01

    Dielectric elastomer generators (DEGs) produce electrical energy by converting mechanical energy into electrical energy. Efficient operation requires homogeneous deformation of each single layer. However, due to various internal and external influences, such as supports or the shape of the DEG, the deformation will be inhomogeneous, which negatively affects the amount of electrical energy generated. Optimizing the deformation behavior improves the efficiency of the DEG and consequently increases the energy gain. In this work a numerical simulation model of a multilayer dielectric elastomer generator is developed using the FEM software ANSYS. The analyzed multilayer DEG consists of 49 active dielectric layers with layer thicknesses of 50 μm. The elastomer is silicone (PDMS), while the compliant electrodes are made of graphite powder. The simulation must include the real material parameters of the PDMS and the graphite electrodes. Therefore, the mechanical and electrical material parameters of the PDMS are determined by experimental investigation of test samples, while the electrode parameters are determined by numerical simulation of test samples. The numerical simulation of the DEG is carried out as a coupled electro-mechanical simulation for the constant-voltage energy-harvesting cycle. Finally, the derived numerical simulation model is validated by comparison with analytical calculations and further simulated DEG configurations. The comparison of the results shows good agreement with regard to the deformation of the DEG. Based on the validated model it is now possible to optimize the DEG layout for improved deformation behavior with further simulations.
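
    A hedged analytical cross-check of the kind used to validate such FEM models: treat one layer as a planar capacitor under equibiaxial stretch and estimate the capacitance change over a constant-voltage cycle. The layer count and thickness come from the abstract; permittivity, electrode area, stretch and voltage are assumed values.

    ```python
    # Analytical cross-check: one dielectric layer as a planar capacitor.
    EPS0 = 8.854e-12   # vacuum permittivity, F/m
    EPS_R = 2.8        # relative permittivity of PDMS (typical, assumed)
    AREA0 = 25e-4      # electrode area, m^2 (assumed 50 mm x 50 mm)
    D0 = 50e-6         # layer thickness, m (from the abstract)
    N_LAYERS = 49      # active layers (from the abstract)
    V = 1000.0         # harvesting voltage, V (assumed)

    def capacitance(stretch):
        """Planar capacitance under equibiaxial stretch of an incompressible
        film: area scales with stretch**2, thickness with 1/stretch**2."""
        return EPS0 * EPS_R * (AREA0 * stretch**2) / (D0 / stretch**2)

    dC = N_LAYERS * (capacitance(1.2) - capacitance(1.0))
    print(f"capacitance change over the cycle: {dC * 1e9:.1f} nF")
    print(f"energy scale 0.5*V^2*dC: {0.5 * V**2 * dC * 1e3:.1f} mJ")
    ```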

  6. A method for landing gear modeling and simulation with experimental validation

    NASA Technical Reports Server (NTRS)

    Daniels, James N.

    1996-01-01

    This document presents an approach for modeling and simulating landing gear systems. Specifically, a nonlinear model of an A-6 Intruder main gear is developed, simulated, and validated against static and dynamic test data. This model includes nonlinear effects such as a polytropic gas model, velocity-squared damping, a geometry-governed model for the discharge coefficients, stick-slip friction effects and a nonlinear tire spring and damping model. An Adams-Moulton predictor-corrector was used to integrate the equations of motion until a discontinuity caused by the stick-slip friction model was reached, at which point a Runge-Kutta routine integrated past the discontinuity and returned the problem solution to the predictor-corrector. Run times of this software are around 2 minutes per second of simulated time under dynamic conditions. To validate the model, engineers at the Aircraft Landing Dynamics Facility at NASA Langley Research Center installed one A-6 main gear on a drop carriage and used a hydraulic shaker table to provide simulated runway inputs to the gear. Model parameters were tuned to produce excellent agreement for many cases.
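
    A minimal sketch of the hybrid integration strategy, assuming a toy one-degree-of-freedom oscillator with Coulomb friction in place of the gear model: an Adams-Bashforth predictor with an Adams-Moulton corrector advances the solution, and a self-starting Runge-Kutta step crosses the friction discontinuity.

    ```python
    import numpy as np

    MU = 0.3  # Coulomb friction magnitude (illustrative)

    def f(t, y):
        """1-DOF oscillator with Coulomb friction: x'' = -x - MU*sign(x').
        The sign() term is the stick-slip-style discontinuity; parameters
        are illustrative, not the A-6 gear model."""
        x, v = y
        return np.array([v, -x - MU * np.sign(v)])

    def rk4_step(t, y, h):
        """Self-starting single step, used at start-up and at discontinuities."""
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

    def integrate(y0, t_end=10.0, h=1e-3):
        t, y = 0.0, np.array(y0, dtype=float)
        f_prev = f(t, y)        # derivative one step back (for AB2)
        y = rk4_step(t, y, h)   # RK4 start-up step
        t += h
        while t < t_end:
            v_sign = np.sign(y[1])
            f_curr = f(t, y)
            y_pred = y + h / 2 * (3 * f_curr - f_prev)        # AB2 predictor
            y_next = y + h / 2 * (f_curr + f(t + h, y_pred))  # AM2 corrector
            if np.sign(y_next[1]) != v_sign:
                # Velocity changes sign inside this step, so the friction
                # force is discontinuous: redo the step with RK4 and
                # restart the multistep history from the new point.
                y_next = rk4_step(t, y, h)
                f_prev = f(t + h, y_next)
            else:
                f_prev = f_curr
            y, t = y_next, t + h
        return y

    print("state at t = 10 s:", integrate([1.0, 0.0]))
    ```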

  7. Ventilation tube insertion simulation: a literature review and validity assessment of five training models.

    PubMed

    Mahalingam, S; Awad, Z; Tolley, N S; Khemani, S

    2016-08-01

    The objective of this study was to identify and investigate the face and content validity of ventilation tube insertion (VTI) training models described in the literature. A review of the literature was carried out to identify articles describing VTI simulators. Feasible models were replicated and assessed by a group of experts at a postgraduate simulation centre. Experts were defined as surgeons who had performed at least 100 VTIs on patients. Seventeen experts participated, ensuring sufficient statistical power for the analysis. A standardised 18-item Likert-scale questionnaire was used, addressing face validity (realism), global and task-specific content (suitability of the model for teaching) and curriculum recommendation. The search revealed eleven models, of which only five had associated validity data. Five models were found to be feasible to replicate. None of the tested models achieved face or global content validity. Only one model achieved task-specific validity, and hence there was no agreement on curriculum recommendation. The quality of available simulation models is moderate and there is room for improvement. New models need to be developed, or existing ones refined, in order to construct a more realistic training platform for VTI simulation. © 2015 John Wiley & Sons Ltd.

  8. Helicopter simulation validation using flight data

    NASA Technical Reports Server (NTRS)

    Key, D. L.; Hansen, R. S.; Cleveland, W. B.; Abbott, W. Y.

    1982-01-01

    A joint NASA/Army effort to perform a systematic ground-based piloted simulation validation assessment is described. The best available mathematical model for the subject helicopter (UH-60A Black Hawk) was programmed for real-time operation. Flight data were obtained to validate the math model, and to develop models for the pilot control strategy while performing mission-type tasks. The validated math model is to be combined with motion and visual systems to perform ground based simulation. Comparisons of the control strategy obtained in flight with that obtained on the simulator are to be used as the basis for assessing the fidelity of the results obtained in the simulator.

  9. Validation of chemistry models employed in a particle simulation method

    NASA Technical Reports Server (NTRS)

    Haas, Brian L.; Mcdonald, Jeffrey D.

    1991-01-01

    The chemistry models employed in a statistical particle simulation method, as implemented on the Intel iPSC/860 multiprocessor computer, are validated and applied. Chemical relaxation of five-species air involves 34 simultaneous dissociation, recombination, and atomic-exchange reactions. The reaction rates employed in the analytic solutions are obtained from Arrhenius experimental correlations as functions of temperature for adiabatic gas reservoirs in thermal equilibrium. Favorable agreement with the analytic solutions validates the simulation when applied to the relaxation of O2 toward equilibrium in reservoirs dominated by dissociation and recombination, respectively, and when applied to the relaxation of air in the temperature range 5000 to 30,000 K. A flow of O2 over a circular cylinder at high Mach number is simulated to demonstrate application of the method to multidimensional reactive flows.
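
    For reference, an Arrhenius-form rate coefficient of the kind the abstract describes can be evaluated as below. The pre-exponential factor and temperature exponent are placeholders, not the paper's 34-reaction set; only the O2 dissociation energy (about 8.2e-19 J, i.e. roughly 59,400 K times kB) is a standard physical value.

    ```python
    import numpy as np

    KB = 1.380649e-23  # Boltzmann constant, J/K

    def arrhenius(T, A, n, Ea):
        """Modified Arrhenius rate coefficient k(T) = A * T**n * exp(-Ea/(kB*T)).
        A, n, Ea are illustrative placeholders."""
        return A * T**n * np.exp(-Ea / (KB * T))

    # Hypothetical O2-dissociation-like parameters (order of magnitude only).
    A, n, Ea = 2.0e-5, -1.5, 8.2e-19

    for T in (5_000, 10_000, 20_000, 30_000):
        print(f"T = {T:6d} K  ->  k = {arrhenius(T, A, n, Ea):.3e}")
    ```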

  10. Validation by simulation of a clinical trial model using the standardized mean and variance criteria.

    PubMed

    Abbas, Ismail; Rovira, Joan; Casanovas, Josep

    2006-12-01

    To develop and validate a model of a clinical trial that evaluates changes in cholesterol level as a surrogate marker for lipodystrophy in HIV subjects under alternative antiretroviral regimes, i.e., treatment with protease inhibitors vs. a combination of nevirapine and other antiretroviral drugs. Five simulation models were developed based on different assumptions about treatment variability and the pattern of cholesterol reduction over time. The endpoints considered were the last recorded cholesterol level, the difference from baseline, the average difference from baseline, and the level evolution. Specific validation criteria, based on a standardized distance in means and variances of plus or minus 10%, were used to compare the real and the simulated data. The validity criterion was met by every model for at least some of the considered endpoints; however, only two models met it when all endpoints were considered together. The model based on the assumption that within-subject variability of cholesterol levels changes over time is the one that minimizes the validity criterion, with a standardized distance of plus or minus 1% or less. Simulation is a useful technique for the calibration, estimation, and evaluation of models, which allows us to relax the often overly restrictive assumptions regarding parameters required by analytical approaches. The validity criterion can also be used to select the preferred model for design optimization, until additional data are obtained allowing an external validation of the model.
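
    One plausible reading of the validation criterion, sketched in Python; the exact formula the authors used may differ, and the data here are synthetic.

    ```python
    import numpy as np

    def standardized_distances(real, sim):
        """Standardized distance in means and variances between observed
        and simulated endpoints (one reading of the paper's criterion)."""
        d_mean = (np.mean(sim) - np.mean(real)) / np.std(real, ddof=1)
        d_var = (np.var(sim, ddof=1) - np.var(real, ddof=1)) / np.var(real, ddof=1)
        return d_mean, d_var

    rng = np.random.default_rng(1)
    real = rng.normal(200.0, 30.0, 120)  # observed cholesterol levels, mg/dL
    sim = rng.normal(203.0, 31.0, 120)   # simulated trial arm

    d_mean, d_var = standardized_distances(real, sim)
    valid = abs(d_mean) <= 0.10 and abs(d_var) <= 0.10
    print(f"d_mean = {d_mean:+.3f}, d_var = {d_var:+.3f}, valid: {valid}")
    ```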

  11. Simulation model calibration and validation : phase II : development of implementation handbook and short course.

    DOT National Transportation Integrated Search

    2006-01-01

    A previous study developed a procedure for microscopic simulation model calibration and validation and evaluated the procedure via two relatively simple case studies using three microscopic simulation models. Results showed that default parameters we...

  12. Simulation of laser beam reflection at the sea surface modeling and validation

    NASA Astrophysics Data System (ADS)

    Schwenger, Frédéric; Repasi, Endre

    2013-06-01

    A 3D simulation of the reflection of a Gaussian-shaped laser beam on the dynamic sea surface is presented. The simulation is suitable for the pre-calculation of images for cameras operating in different spectral wavebands (visible, short-wave infrared) for a bistatic configuration of laser source and receiver under different atmospheric conditions. In the visible waveband, the calculated detected total power of reflected laser light from a 660 nm laser source is compared with data collected in a field trial. Our computer simulation comprises the 3D simulation of a maritime scene (open sea/clear sky) and the simulation of the laser beam reflected at the sea surface. The basic sea surface geometry is modeled by a composition of smooth wind-driven gravity waves. To predict the view of a camera, the sea surface radiance must be calculated for the specific waveband. Additionally, the radiance of laser light specularly reflected at the wind-roughened sea surface is modeled using an analytical statistical sea surface BRDF (bidirectional reflectance distribution function). Validation of the simulation results is a prerequisite before applying the computer simulation to maritime laser applications. For validation purposes, data (images and meteorological data) were selected from field measurements, using a 660 nm cw laser diode to produce laser beam reflections at the water surface and a TV camera to record images. The validation is done by numerical comparison of the measured total laser power extracted from the recorded images with the corresponding simulation results. The results of the comparison are presented for different incident (zenith/azimuth) angles of the laser beam.

  13. Validation of the SimSET simulation package for modeling the Siemens Biograph mCT PET scanner

    NASA Astrophysics Data System (ADS)

    Poon, Jonathan K.; Dahlbom, Magnus L.; Casey, Michael E.; Qi, Jinyi; Cherry, Simon R.; Badawi, Ramsey D.

    2015-02-01

    Monte Carlo simulation provides a valuable tool in performance assessment and optimization of system design parameters for PET scanners. SimSET is a popular Monte Carlo simulation toolkit that features fast simulation time, as well as variance reduction tools to further enhance computational efficiency. However, SimSET has lacked the ability to simulate block detectors until its most recent release. Our goal is to validate new features of SimSET by developing a simulation model of the Siemens Biograph mCT PET scanner and comparing the results to a simulation model developed in the GATE simulation suite and to experimental results. We used the NEMA NU-2 2007 scatter fraction, count rates, and spatial resolution protocols to validate the SimSET simulation model and its new features. The SimSET model overestimated the experimental results of the count rate tests by 11-23% and the spatial resolution test by 13-28%, which is comparable to previous validation studies of other PET scanners in the literature. The difference between the SimSET and GATE simulation was approximately 4-8% for the count rate test and approximately 3-11% for the spatial resolution test. In terms of computational time, SimSET performed simulations approximately 11 times faster than GATE simulations. The new block detector model in SimSET offers a fast and reasonably accurate simulation toolkit for PET imaging applications.

  14. Validating clustering of molecular dynamics simulations using polymer models.

    PubMed

    Phillips, Joshua L; Colvin, Michael E; Newsam, Shawn

    2011-11-14

    Molecular dynamics (MD) simulation is a powerful technique for sampling the meta-stable and transitional conformations of proteins and other biomolecules. Computational data clustering has emerged as a useful, automated technique for extracting conformational states from MD simulation data. Despite extensive application, relatively little work has been done to determine if the clustering algorithms are actually extracting useful information. A primary goal of this paper therefore is to provide such an understanding through a detailed analysis of data clustering applied to a series of increasingly complex biopolymer models. We develop a novel series of models using basic polymer theory that have intuitive, clearly-defined dynamics and exhibit the essential properties that we are seeking to identify in MD simulations of real biomolecules. We then apply spectral clustering, an algorithm particularly well-suited for clustering polymer structures, to our models and MD simulations of several intrinsically disordered proteins. Clustering results for the polymer models provide clear evidence that the meta-stable and transitional conformations are detected by the algorithm. The results for the polymer models also help guide the analysis of the disordered protein simulations by comparing and contrasting the statistical properties of the extracted clusters. We have developed a framework for validating the performance and utility of clustering algorithms for studying molecular biopolymer simulations that utilizes several analytic and dynamic polymer models which exhibit well-behaved dynamics including: meta-stable states, transition states, helical structures, and stochastic dynamics. We show that spectral clustering is robust to anomalies introduced by structural alignment and that different structural classes of intrinsically disordered proteins can be reliably discriminated from the clustering results. To our knowledge, our framework is the first to utilize model polymers
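
    A minimal sketch of the clustering step, assuming synthetic point-cloud "conformations" in place of RMSD-aligned polymer structures: it builds a Gaussian affinity from pairwise distances and applies scikit-learn's spectral clustering.

    ```python
    import numpy as np
    from sklearn.cluster import SpectralClustering

    rng = np.random.default_rng(7)

    # Stand-in "conformations": samples around two meta-stable centers
    # plus a few transitional points (a toy substitute for MD structures).
    stable_a = rng.normal(0.0, 0.3, size=(60, 10))
    stable_b = rng.normal(2.0, 0.3, size=(60, 10))
    transitional = rng.uniform(0.0, 2.0, size=(10, 10))
    X = np.vstack([stable_a, stable_b, transitional])

    # Gaussian affinity from pairwise distances (delta is a tunable scale).
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    delta = np.median(d)
    affinity = np.exp(-d**2 / (2 * delta**2))

    labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                                random_state=0).fit_predict(affinity)
    print("cluster sizes:", np.bincount(labels))
    ```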

  15. Validating clustering of molecular dynamics simulations using polymer models

    PubMed Central

    2011-01-01

    Background Molecular dynamics (MD) simulation is a powerful technique for sampling the meta-stable and transitional conformations of proteins and other biomolecules. Computational data clustering has emerged as a useful, automated technique for extracting conformational states from MD simulation data. Despite extensive application, relatively little work has been done to determine if the clustering algorithms are actually extracting useful information. A primary goal of this paper therefore is to provide such an understanding through a detailed analysis of data clustering applied to a series of increasingly complex biopolymer models. Results We develop a novel series of models using basic polymer theory that have intuitive, clearly-defined dynamics and exhibit the essential properties that we are seeking to identify in MD simulations of real biomolecules. We then apply spectral clustering, an algorithm particularly well-suited for clustering polymer structures, to our models and MD simulations of several intrinsically disordered proteins. Clustering results for the polymer models provide clear evidence that the meta-stable and transitional conformations are detected by the algorithm. The results for the polymer models also help guide the analysis of the disordered protein simulations by comparing and contrasting the statistical properties of the extracted clusters. Conclusions We have developed a framework for validating the performance and utility of clustering algorithms for studying molecular biopolymer simulations that utilizes several analytic and dynamic polymer models which exhibit well-behaved dynamics including: meta-stable states, transition states, helical structures, and stochastic dynamics. We show that spectral clustering is robust to anomalies introduced by structural alignment and that different structural classes of intrinsically disordered proteins can be reliably discriminated from the clustering results. To our knowledge, our framework is the

  16. Simulated training in colonoscopic stenting of colonic strictures: validation of a cadaver model.

    PubMed

    Iordache, F; Bucobo, J C; Devlin, D; You, K; Bergamaschi, R

    2015-07-01

    There are currently no available simulation models for training in colonoscopic stent deployment. The aim of this study was to validate a cadaver model for simulation training in colonoscopy with stent deployment for colonic strictures. This was a prospective study enrolling surgeons at a single institution. Participants performed colonoscopic stenting on a cadaver model. Their performance was assessed by two independent observers. Measurements were performed for quantitative analysis (time to identify stenosis, time for deployment, accuracy) and a weighted score was devised for assessment. The Mann-Whitney U-test and Student's t-test were used for nonparametric and parametric data, respectively. Cohen's kappa coefficient was used for reliability. Twenty participants performed a colonoscopy with deployment of a self-expandable metallic stent in two cadavers (groups A and B) with 20 strictures overall. The median time was 206 s. The model was able to differentiate between experts and novices (P = 0.013). The results showed a good consensus estimate of reliability, with kappa = 0.571 (P < 0.0001). The cadaver model described in this study has content, construct and concurrent validity for simulation training in colonoscopic deployment of self-expandable stents for colonic strictures. Further studies are needed to evaluate the predictive validity of this model in terms of skill transfer to clinical practice. Colorectal Disease © 2014 The Association of Coloproctology of Great Britain and Ireland.
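
    The statistical workflow described above can be reproduced in outline as follows, with the Mann-Whitney U-test for expert/novice discrimination and Cohen's kappa for inter-observer agreement; all timing and rating data below are invented for illustration.

    ```python
    import numpy as np
    from scipy.stats import mannwhitneyu
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical deployment times (seconds): experts vs novices.
    experts = np.array([150, 170, 185, 190, 200, 210, 195, 205, 180, 175])
    novices = np.array([240, 260, 300, 280, 230, 320, 290, 310, 270, 250])

    # Construct validity: can the model separate the two groups?
    u, p = mannwhitneyu(experts, novices, alternative="two-sided")
    print(f"Mann-Whitney U = {u:.1f}, p = {p:.4f}")

    # Inter-rater reliability: agreement of two independent observers on a
    # pass/fail rating for each of 20 attempts (illustrative data).
    rater1 = [1, 1, 0, 1, 0, 1, 1, 1, 0, 0, 1, 1, 1, 0, 1, 0, 1, 1, 0, 1]
    rater2 = [1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1, 0, 1, 0, 0, 1]
    print(f"Cohen's kappa = {cohen_kappa_score(rater1, rater2):.3f}")
    ```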

  17. Object-oriented simulation model of a parabolic trough solar collector: Static and dynamic validation

    NASA Astrophysics Data System (ADS)

    Ubieta, Eduardo; Hoyo, Itzal del; Valenzuela, Loreto; Lopez-Martín, Rafael; Peña, Víctor de la; López, Susana

    2017-06-01

    A simulation model of a parabolic-trough solar collector developed in the Modelica® language is calibrated and validated. The calibration is performed in order to approximate the behavior of the solar collector model to that of a real one, given the uncertainty in some of the system parameters; i.e., measured data are used during the calibration process. Afterwards, the validation of this calibrated model is performed: the results obtained from the model are compared to those obtained during real operation of a collector at the Plataforma Solar de Almeria (PSA).

  18. A validation procedure for a LADAR system radiometric simulation model

    NASA Astrophysics Data System (ADS)

    Leishman, Brad; Budge, Scott; Pack, Robert

    2007-04-01

    The USU LadarSIM software package is a ladar system engineering tool that has recently been enhanced to include modeling of the radiometry of ladar beam footprints. This paper discusses our validation of the radiometric model and presents a practical approach to future validation work. In order to validate complicated and interrelated factors affecting radiometry, a systematic approach had to be developed: data for known parameters were first gathered, and unknown parameters of the system were then determined from simulation test scenarios. This was done in a way that isolated as many unknown variables as possible and then built on the previously obtained results. First, the appropriate voltage threshold levels of the discrimination electronics were set by analyzing the number of false alarms seen in actual data sets. With this threshold set, the system noise was then adjusted to achieve the appropriate number of dropouts. Once a suitable noise level was found, the range errors of the simulated and actual data sets were compared and studied. Predicted errors in range measurements were analyzed using two methods: first by examining the range error of a surface with known reflectivity, and second by examining the range errors for specific detectors with known responsivities. This provided insight into the discrimination method and receiver electronics used in the actual system.

  19. Validation, Optimization and Simulation of a Solar Thermoelectric Generator Model

    NASA Astrophysics Data System (ADS)

    Madkhali, Hadi Ali; Hamil, Ali; Lee, HoSung

    2017-12-01

    This study explores thermoelectrics as a viable option for small-scale solar thermal applications. Thermoelectric technology is based on the Seebeck effect, which states that a voltage is induced when a temperature gradient is applied across the junctions of two differing materials. This research proposes to analyze, validate, simulate, and optimize a prototype solar thermoelectric generator (STEG) model in order to increase its efficiency. The intent is to further develop STEGs as a viable and productive energy source that limits pollution and reduces the cost of energy production. An empirical study (Kraemer et al. in Nat Mater 10:532, 2011) on the solar thermoelectric generator reported a high efficiency of 4.6%. That system had a vacuum glass enclosure, a flat panel (absorber), a thermoelectric generator and water circulation for the cold side. The theoretical and numerical approach of the current study validated the experimental results from Kraemer's study to a high degree. The numerical simulation process utilizes a two-stage approach in ANSYS software, using Fluent and the Thermal-Electric system. The solar load model technique uses solar radiation under AM 1.5G conditions in Fluent. The analytical model applies Dr. Ho Sung Lee's theory of optimal design to improve the performance of the STEG system by using dimensionless parameters. Applying this theory, and using two cover glasses and radiation shields, the STEG model can achieve a maximum efficiency of 7%.

  20. A systematic review of validated sinus surgery simulators.

    PubMed

    Stew, B; Kao, S S-T; Dharmawardana, N; Ooi, E H

    2018-06-01

    Simulation provides a safe and effective opportunity to develop surgical skills. A variety of endoscopic sinus surgery (ESS) simulators has been described in the literature. Validation of these simulators allows for their effective utilisation in training. To conduct a systematic review of the published literature to analyse the evidence for validated ESS simulation. Pubmed, Embase, Cochrane and Cinahl were searched from inception of the databases to 11 January 2017. Twelve thousand five hundred and sixteen articles were retrieved, of which 10,112 were screened following the removal of duplicates. Thirty-eight full-text articles were reviewed after meeting the search criteria. Evidence of face, content, construct, discriminant and predictive validity was extracted. Twenty articles were included in the analysis, describing 12 ESS simulators. Eleven of these simulators had undergone validation: 3 virtual reality, 7 physical bench models and 1 cadaveric simulator. Seven of the simulators were shown to have face validity, 7 had construct validity and 1 had predictive validity. None of the simulators demonstrated discriminant validity. This systematic review demonstrates that a number of ESS simulators have been comprehensively validated. Many of the validation processes, however, lack standardisation in outcome reporting, thus limiting meta-analytic comparison between simulators. © 2017 John Wiley & Sons Ltd.

  1. A Framework for Validating Traffic Simulation Models at the Vehicle Trajectory Level

    DOT National Transportation Integrated Search

    2017-03-01

    Based on current practices, traffic simulation models are calibrated and validated using macroscopic measures such as 15-minute averages of traffic counts or average point-to-point travel times. For an emerging number of applications, including conne...

  2. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations.

    PubMed

    Hariharan, Prasanna; D'Souza, Gavin A; Horner, Marc; Morrison, Tina M; Malinauskas, Richard A; Myers, Matthew R

    2017-01-01

    A "credible" computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing "model credibility" is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a "threshold-based" validation approach that provides a well-defined acceptance criteria, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing the model validity. The validation criteria developed following the threshold approach is not only a function of Comparison Error, E (which is the difference between experiments and simulations) but also takes in to account the risk to patient safety because of E. The method is applicable for scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate if the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results ("S") of velocity and viscous shear stress were compared with inter-laboratory experimental measurements ("D"). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student's t-test. However, following the threshold-based approach, a Student's t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be

  3. Validation of X1 motorcycle model in industrial plant layout by using WITNESSTM simulation software

    NASA Astrophysics Data System (ADS)

    Hamzas, M. F. M. A.; Bareduan, S. A.; Zakaria, M. Z.; Tan, W. J.; Zairi, S.

    2017-09-01

    This paper presents a case study on simulation, modelling and analysis for the X1 motorcycle model. A motorcycle assembly plant was selected as the site of the research. Simulation techniques using the Witness software were applied to evaluate the performance of the existing manufacturing system. The main objective is to validate the data and determine their significant impact on the overall performance of the system for future improvement. The process of validation starts when the layout of the assembly line is identified. All components are evaluated to determine whether the data are significant for future improvement. Machine and labor statistics are among the parameters evaluated for process improvement. The average total cycle time for given workstations is used as the criterion for comparison of possible variants. The simulation shows that the data used are appropriate and meet the criteria for two-sided assembly line problems.

  4. Challenges of forest landscape modeling - simulating large landscapes and validating results

    Treesearch

    Hong S. He; Jian Yang; Stephen R. Shifley; Frank R. Thompson

    2011-01-01

    Over the last 20 years, we have seen a rapid development in the field of forest landscape modeling, fueled by both technological and theoretical advances. Two fundamental challenges have persisted since the inception of FLMs: (1) balancing realistic simulation of ecological processes at broad spatial and temporal scales with computing capacity, and (2) validating...

  5. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations

    PubMed Central

    Hariharan, Prasanna; D’Souza, Gavin A.; Horner, Marc; Morrison, Tina M.; Malinauskas, Richard A.; Myers, Matthew R.

    2017-01-01

    A “credible” computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing “model credibility” is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a “threshold-based” validation approach that provides a well-defined acceptance criterion, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing model validity. The validation criterion developed following the threshold approach is not only a function of the comparison error, E (the difference between experiments and simulations), but also takes into account the risk to patient safety posed by E. The method is applicable to scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood-contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate whether the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results (“S”) of velocity and viscous shear stress were compared with inter-laboratory experimental measurements (“D”). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student’s t-test. However, following the threshold-based approach, a Student’s t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and

  6. Nonsequential modeling of laser diode stacks using Zemax: simulation, optimization, and experimental validation.

    PubMed

    Coluccelli, Nicola

    2010-08-01

    Modeling of a real laser diode stack based on the Zemax ray-tracing software operating in nonsequential mode is reported. The implementation of the model is presented, together with the geometric and optical parameters to be adjusted to calibrate the model and match the simulated intensity irradiance profiles with the experimental profiles. The calibration of the model is based on a near-field and a far-field measurement. The validation of the model has been accomplished by comparing the simulated and experimental transverse irradiance profiles at different positions along the caustic formed by a lens. Spot sizes and waist location are predicted with a maximum error below 6%.

  7. Semi-physiologic model validation and bioequivalence trials simulation to select the best analyte for acetylsalicylic acid.

    PubMed

    Cuesta-Gragera, Ana; Navarro-Fontestad, Carmen; Mangas-Sanjuan, Victor; González-Álvarez, Isabel; García-Arieta, Alfredo; Trocóniz, Iñaki F; Casabó, Vicente G; Bermejo, Marival

    2015-07-10

    The objective of this paper is to apply a previously developed semi-physiologic pharmacokinetic model implemented in NONMEM to simulate bioequivalence (BE) trials of acetylsalicylic acid (ASA), in order to validate the model performance against ASA human experimental data. ASA is a drug with first-pass hepatic and intestinal metabolism following Michaelis-Menten kinetics that leads to the formation of two main metabolites in two generations (first- and second-generation metabolites). The first aim was to adapt the semi-physiologic model for ASA in NONMEM, using ASA pharmacokinetic parameters from the literature and representing its sequential metabolism. The second aim was to validate this model by comparing the results obtained in NONMEM simulations with published experimental data at a dose of 1000 mg. The validated model was used to simulate bioequivalence trials at 3 dose schemes (100, 1000 and 3000 mg) and with 6 test formulations with decreasing in vivo dissolution rate constants versus the reference formulation (kD 8-0.25 h^-1). Finally, the third aim was to determine which analyte (parent drug, first-generation or second-generation metabolite) was most sensitive to changes in formulation performance. The validation results showed that the concentration-time curves obtained with the simulations closely reproduced the published experimental data, confirming model performance. The parent drug (ASA) was the analyte that proved most sensitive to the decrease in pharmaceutical quality, showing the largest decrease in Cmax and AUC ratio between test and reference formulations. Copyright © 2015 Elsevier B.V. All rights reserved.
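
    To make the saturable first-pass idea concrete, here is a toy sequential-metabolism ODE model with Michaelis-Menten conversion and a variable dissolution rate constant kD; the structure and every constant are invented simplifications, far from the paper's NONMEM semi-physiologic model.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def asa_model(t, y, kd, vmax, km, ke_m):
        """Toy model: solid dose -> parent drug -> first-generation
        metabolite, with saturable (Michaelis-Menten) conversion."""
        a_solid, a_parent, a_met1 = y
        dissolution = kd * a_solid
        metabolism = vmax * a_parent / (km + a_parent)
        return [-dissolution,
                dissolution - metabolism,
                metabolism - ke_m * a_met1]

    dose = 1000.0  # mg
    for kd in (8.0, 1.0, 0.25):  # in vivo dissolution rate constants, 1/h
        sol = solve_ivp(asa_model, (0.0, 12.0), [dose, 0.0, 0.0],
                        args=(kd, 500.0, 50.0, 0.7), dense_output=True)
        t = np.linspace(0.0, 12.0, 500)
        cmax_parent = sol.sol(t)[1].max()
        print(f"kd = {kd:4.2f} 1/h -> parent Cmax proxy = {cmax_parent:.1f} mg")
    ```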

  8. Proof of Concept for the Trajectory-Level Validation Framework for Traffic Simulation Models

    DOT National Transportation Integrated Search

    2017-10-30

    Based on current practices, traffic simulation models are calibrated and validated using macroscopic measures such as 15-minute averages of traffic counts or average point-to-point travel times. For an emerging number of applications, including conne...

  9. Quantitative validation of carbon-fiber laminate low velocity impact simulations

    DOE PAGES

    English, Shawn A.; Briggs, Timothy M.; Nelson, Stacy M.

    2015-09-26

    Simulations of low velocity impact with a flat cylindrical indenter upon a carbon fiber fabric reinforced polymer laminate are rigorously validated. Comparison of the impact energy absorption between the model and experiment is used as the validation metric. Additionally, non-destructive evaluation, including ultrasonic scans and three-dimensional computed tomography, provides qualitative validation of the models. The simulations include delamination, matrix cracks and fiber breaks. An orthotropic damage and failure constitutive model, capable of predicting progressive damage and failure, is developed in conjunction with the simulations and described. An ensemble of simulations incorporating model parameter uncertainties is used to predict a response distribution, which is then compared to experimental output using appropriate statistical methods. Lastly, the model-form errors are exposed and corrected for use in an additional blind validation analysis. The result is a quantifiable confidence in material characterization and model physics when simulating low velocity impact in structures of interest.

  10. Coarse Grained Model for Biological Simulations: Recent Refinements and Validation

    PubMed Central

    Vicatos, Spyridon; Rychkova, Anna; Mukherjee, Shayantani; Warshel, Arieh

    2014-01-01

    Exploring the free energy landscape of proteins and modeling the corresponding functional aspects presents a major challenge for computer simulation approaches. This challenge is due to the complexity of the landscape and the enormous computer time needed for converging simulations. The use of various simplified coarse grained (CG) models offers an effective way of sampling the landscape, but most current models are not expected to give a reliable description of protein stability and functional aspects. The main problem is associated with insufficient focus on the electrostatic features of the model. In this respect our recent CG model offers a significant advantage, as it has been refined while focusing on its electrostatic free energy. Here we review the current state of our model, describing recent refinements, extensions and validation studies while focusing on demonstrating key applications. These include studies of protein stability, extensions of the model to include membranes, electrolytes and electrodes, as well as studies of voltage-activated proteins, protein insertion through the translocon, the action of molecular motors, and even the coupling of the stalled ribosome and the translocon. Our examples illustrate the general potential of our approach in overcoming major challenges in studies of structure-function correlation in proteins and large macromolecular complexes. PMID:25050439

  11. Simulation validation of the XV-15 tilt-rotor research aircraft

    NASA Technical Reports Server (NTRS)

    Ferguson, S. W.; Hanson, G. D.; Churchill, G. B.

    1984-01-01

    The results of a simulation validation program of the XV-15 tilt-rotor research aircraft are detailed, covering such simulation aspects as the mathematical model, visual system, motion system, cab aural system, cab control loader system, pilot perceptual fidelity, and generic tilt rotor applications. Simulation validation was performed for the hover, low-speed, and sideward flight modes, with consideration of the in-ground rotor effect. Several deficiencies of the mathematical model and the simulation systems were identified in the course of the simulation validation project, and some were corrected. It is noted that NASA's Vertical Motion Simulator used in the program is an excellent tool for tilt-rotor and rotorcraft design, development, and pilot training.

  12. MATLAB/Simulink Pulse-Echo Ultrasound System Simulator Based on Experimentally Validated Models.

    PubMed

    Kim, Taehoon; Shin, Sangmin; Lee, Hyongmin; Lee, Hyunsook; Kim, Heewon; Shin, Eunhee; Kim, Suhwan

    2016-02-01

    A flexible clinical ultrasound system must operate with different transducers, which have characteristic impulse responses and widely varying impedances. The impulse response determines the shape of the high-voltage pulse that is transmitted and the specifications of the front-end electronics that receive the echo; the impedance determines the specification of the matching network through which the transducer is connected. System-level optimization of these subsystems requires accurate modeling of pulse-echo (two-way) response, which in turn demands a unified simulation of the ultrasonics and electronics. In this paper, this is realized by combining MATLAB/Simulink models of the high-voltage transmitter, the transmission interface, the acoustic subsystem which includes wave propagation and reflection, the receiving interface, and the front-end receiver. To demonstrate the effectiveness of our simulator, the models are experimentally validated by comparing the simulation results with the measured data from a commercial ultrasound system. This simulator could be used to quickly provide system-level feedback for an optimized tuning of electronic design parameters.

  13. Computer Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that, because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  14. Iced Aircraft Flight Data for Flight Simulator Validation

    NASA Technical Reports Server (NTRS)

    Ratvasky, Thomas P.; Blankenship, Kurt; Rieke, William; Brinker, David J.

    2003-01-01

    NASA is developing and validating technology to incorporate aircraft icing effects into a flight training device concept demonstrator. Flight simulation models of a DHC-6 Twin Otter were developed from wind tunnel data using a subscale, complete aircraft model with and without simulated ice, and from previously acquired flight data. The validation of the simulation models required additional aircraft response time histories of the airplane configured with simulated ice similar to the subscale model testing. Therefore, a flight test was conducted using the NASA Twin Otter Icing Research Aircraft. Over 500 maneuvers of various types were conducted in this flight test. The validation data consisted of aircraft state parameters, pilot inputs, propulsion, weight, center of gravity, and moments of inertia with the airplane configured with different amounts of simulated ice. Emphasis was placed on acquiring data at wing stall and tailplane stall, since these events are of primary interest to model accurately in the flight training device. Analyses of several datasets are described regarding wing and tailplane stall. Key findings from these analyses are that the simulated wing ice shapes significantly reduced the maximum lift coefficient, CL,max, while the simulated tail ice caused elevator control force anomalies and tailplane stall when flaps were deflected 30 deg or greater. This effectively reduced the safe operating margins between iced-wing and iced-tail stall as flap deflection and thrust were increased. This flight test demonstrated that the critical aspects to be modeled in the icing effects flight training device include: iced wing and tail stall speeds, flap and thrust effects, control forces, and control effectiveness.

  15. Tyre tread-block friction: modelling, simulation and experimental validation

    NASA Astrophysics Data System (ADS)

    Wallaschek, Jörg; Wies, Burkard

    2013-07-01

    Pneumatic tyres have been used in vehicles since the beginning of the last century. They generate braking and steering forces for bicycles, motor cycles, cars, busses, trucks, agricultural vehicles and aircraft. These forces are generated in the usually very small contact area between tyre and road, and their performance characteristics are of eminent importance for safety and comfort. Much research has been devoted to optimising tyre design with respect to footprint pressure and friction. In this context, the development of virtual tyre prototypes, that is, simulation models of the tyre, has grown into a science in its own right. While the modelling of the structural dynamics of the tyre has reached a very advanced level, which makes it possible to account for effects like the rate-independent inelasticity of filled elastomers or the transient 3D deformations of the ply-reinforced tread, shoulder and sidewalls, little is known about the friction between tread-block elements and the road. This is particularly obvious in the case when snow, ice, water or a third-body layer is present in the tyre-road contact. In the present paper, we give a survey of the present state of knowledge in the modelling, simulation and experimental validation of tyre tread-block friction processes. We concentrate on experimental techniques.

  16. Calibration and validation of a spar-type floating offshore wind turbine model using the FAST dynamic simulation tool

    DOE PAGES

    Browning, J. R.; Jonkman, J.; Robertson, A.; ...

    2014-12-16

    In this study, high-quality computer simulations are required when designing floating wind turbines because of the complex dynamic responses that are inherent with a high number of degrees of freedom and variable metocean conditions. In 2007, the FAST wind turbine simulation tool, developed and maintained by the U.S. Department of Energy's (DOE's) National Renewable Energy Laboratory (NREL), was expanded to include capabilities that are suitable for modeling floating offshore wind turbines. In an effort to validate FAST and other offshore wind energy modeling tools, DOE funded the DeepCwind project, which tested three prototype floating wind turbines at 1/50th scale in a wave basin: a semisubmersible, a tension-leg platform, and a spar buoy. This paper describes the use of the results of the spar wave basin tests to calibrate and validate the FAST offshore floating simulation tool, and presents some initial results of simulated dynamic responses of the spar to several combinations of wind and sea states. Wave basin tests with the spar attached to a scale model of the NREL 5-megawatt reference wind turbine were performed at the Maritime Research Institute Netherlands under the DeepCwind project. This project included free-decay tests, tests with steady or turbulent wind and still water, tests with periodic and irregular waves and no wind, and combined wind/wave tests. The resulting data from the 1/50th-scale model were scaled to full size using Froude scaling and used to calibrate and validate a full-size simulated model in FAST. Results of the model calibration and validation include successes, subtleties, and limitations of both wave basin testing and FAST modeling capabilities.
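
    The Froude scaling used to bring the 1/50th-scale basin data to full size follows standard relations (length λ, time and velocity √λ, force and mass λ³, power λ^3.5, for the same fluid density); the sketch below applies them to invented example quantities, not DeepCwind data.

    ```python
    # Froude scaling from a 1/50th-scale wave-basin model to full scale.
    lam = 50.0  # geometric scale factor

    def to_full_scale(quantity, kind):
        factors = {
            "length": lam,
            "time": lam**0.5,
            "velocity": lam**0.5,
            "force": lam**3,
            "mass": lam**3,
            "power": lam**3.5,
        }
        return quantity * factors[kind]

    print(f"model wave period 1.5 s -> {to_full_scale(1.5, 'time'):.1f} s full scale")
    print(f"model thrust 2.0 N -> {to_full_scale(2.0, 'force') / 1e3:.0f} kN full scale")
    ```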

  17. Validation of a Simulation Model of Intrinsic Lutetium-176 Activity in LSO-Based Preclinical PET Systems

    NASA Astrophysics Data System (ADS)

    McIntosh, Bryan

    The LSO scintillator crystal commonly used in PET scanners contains a low level of intrinsic radioactivity due to a small amount of Lu-176. This is not usually a concern in routine scanning but can become an issue in small animal imaging, especially when imaging low tracer activity levels. Previously there had been no systematic validation of simulations of this activity; this thesis discusses the validation of a GATE model of intrinsic Lu-176 against results from a bench-top pair of detectors and a Siemens Inveon preclinical PET system. The simulation results matched those from the bench-top system very well, but did not agree as well with results from the complete Inveon system due to a drop-off in system sensitivity at low energies that was not modelled. With this validation the model can now be used with confidence to predict the effects of Lu-176 activity in future PET systems.
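
    A back-of-envelope estimate of the intrinsic activity being modelled, using standard literature constants (Lu-176 half-life about 3.76e10 yr, natural abundance about 2.6%, LSO density about 7.4 g/cm³); the crystal volume is arbitrary, and the result of roughly 300 Bq per cm³ matches commonly quoted values.

    ```python
    import math

    # Estimate of intrinsic Lu-176 activity per cm^3 of LSO (Lu2SiO5).
    N_A = 6.022e23                   # Avogadro's number, 1/mol
    half_life_s = 3.76e10 * 3.156e7  # Lu-176 half-life in seconds
    abundance = 0.026                # natural isotopic abundance of Lu-176
    density = 7.4                    # LSO density, g/cm^3
    m_lso = 2 * 175.0 + 28.1 + 5 * 16.0  # molar mass of Lu2SiO5, g/mol

    lam = math.log(2) / half_life_s           # decay constant, 1/s
    n_lu176 = density / m_lso * 2 * abundance * N_A  # Lu-176 atoms per cm^3
    activity = lam * n_lu176                  # Bq per cm^3 of crystal
    print(f"intrinsic activity ~ {activity:.0f} Bq/cm^3")
    ```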

  18. Validation of the BASALT model for simulating off-axis hydrothermal circulation in oceanic crust

    NASA Astrophysics Data System (ADS)

    Farahat, Navah X.; Archer, David; Abbot, Dorian S.

    2017-08-01

    Fluid recharge and discharge between the deep ocean and the porous upper layer of off-axis oceanic crust tends to concentrate in small volumes of rock, such as seamounts and fractures, that are unimpeded by low-permeability sediments. Basement structure, sediment burial, heat flow, and other regional characteristics of off-axis hydrothermal systems appear to produce considerable diversity of circulation behaviors. Circulation of seawater and seawater-derived fluids controls the extent of fluid-rock interaction, resulting in significant geochemical impacts. However, the primary regional characteristics that control how seawater is distributed within upper oceanic crust are still poorly understood. In this paper we present the details of the two-dimensional (2-D) BASALT (Basement Activity Simulated At Low Temperatures) numerical model of heat and fluid transport in an off-axis hydrothermal system. This model is designed to simulate a wide range of conditions in order to explore the dominant controls on circulation. We validate the BASALT model's ability to reproduce observations by configuring it to represent a thoroughly studied transect of the Juan de Fuca Ridge eastern flank. The results demonstrate that including series of narrow, ridge-parallel fractures as subgrid features produces a realistic circulation scenario at the validation site. In future projects, a full reactive transport version of the validated BASALT model will be used to explore geochemical fluxes in a variety of off-axis hydrothermal environments.

  19. Numerical Modelling of Femur Fracture and Experimental Validation Using Bone Simulant.

    PubMed

    Marco, Miguel; Giner, Eugenio; Larraínzar-Garijo, Ricardo; Caeiro, José Ramón; Miguélez, María Henar

    2017-10-01

    Bone fracture pattern prediction is still a challenge and an active field of research. The main goal of this article is to present a combined methodology (experimental and numerical) for femur fracture onset analysis. Experimental work includes the characterization of the mechanical properties and fracture testing on a bone simulant. The numerical work focuses on the development of a model whose material properties are provided by the characterization tests. The fracture location and the early stages of the crack propagation are modelled using the extended finite element method and the model is validated by fracture tests developed in the experimental work. It is shown that the accuracy of the numerical results strongly depends on a proper bone behaviour characterization.

  20. Simulator validation results and proposed reporting format from flight testing a software model of a complex, high-performance airplane.

    DOT National Transportation Integrated Search

    2008-01-01

    Computer simulations are often used in aviation studies. These simulation tools may require complex, high-fidelity aircraft models. Since many of the flight models used are third-party developed products, independent validation is desired prior to im...

  1. Impact of model development, calibration and validation decisions on hydrological simulations in West Lake Erie Basin

    USDA-ARS?s Scientific Manuscript database

    Watershed simulation models are used extensively to investigate hydrologic processes, landuse and climate change impacts, pollutant load assessments and best management practices (BMPs). Developing, calibrating and validating these models require a number of critical decisions that will influence t...

  2. PSI-Center Simulations of Validation Platform Experiments

    NASA Astrophysics Data System (ADS)

    Nelson, B. A.; Akcay, C.; Glasser, A. H.; Hansen, C. J.; Jarboe, T. R.; Marklin, G. J.; Milroy, R. D.; Morgan, K. D.; Norgaard, P. C.; Shumlak, U.; Victor, B. S.; Sovinec, C. R.; O'Bryan, J. B.; Held, E. D.; Ji, J.-Y.; Lukin, V. S.

    2013-10-01

    The Plasma Science and Innovation Center (PSI-Center - http://www.psicenter.org) supports collaborating validation platform experiments with extended MHD simulations. Collaborators include the Bellan Plasma Group (Caltech), CTH (Auburn U), FRX-L (Los Alamos National Laboratory), HIT-SI (U Wash - UW), LTX (PPPL), MAST (Culham), Pegasus (U Wisc-Madison), PHD/ELF (UW/MSNW), SSX (Swarthmore College), TCSU (UW), and ZaP/ZaP-HD (UW). Modifications have been made to the NIMROD, HiFi, and PSI-Tet codes to specifically model these experiments, including mesh generation/refinement, non-local closures, appropriate boundary conditions (external fields, insulating BCs, etc.), and kinetic and neutral particle interactions. The PSI-Center is exploring the application of validation metrics between experimental data and simulation results. Biorthogonal decomposition is proving to be a powerful method to compare global temporal and spatial structures for validation. Results from these simulation and validation studies, as well as an overview of the PSI-Center status, will be presented.
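
    Biorthogonal decomposition of a space-time data matrix is mathematically a singular value decomposition, so a minimal version of the comparison idea can be sketched as below; the probe counts, data, and cosine-alignment metric are illustrative assumptions, not PSI-Center code.

```python
import numpy as np

def biorthogonal_modes(data, n_modes=3):
    """Biorthogonal decomposition of a (space x time) data matrix via SVD.
    Returns the leading spatial modes, singular values, and temporal modes."""
    U, s, Vt = np.linalg.svd(data, full_matrices=False)
    return U[:, :n_modes], s[:n_modes], Vt[:n_modes, :]

# Hypothetical comparison: same probe layout, same time base.
rng = np.random.default_rng(0)
experiment = rng.standard_normal((64, 500))   # 64 probes, 500 time samples
simulation = experiment + 0.1 * rng.standard_normal((64, 500))

U_e, s_e, V_e = biorthogonal_modes(experiment)
U_s, s_s, V_s = biorthogonal_modes(simulation)

# One simple validation metric: alignment of dominant spatial structures
# (|cosine| near 1 means the simulation captures the same global mode).
for k in range(3):
    alignment = abs(U_e[:, k] @ U_s[:, k])
    print(f"mode {k}: spatial alignment = {alignment:.3f}")
```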

  3. Control Performance, Aerodynamic Modeling, and Validation of Coupled Simulation Techniques for Guided Projectile Roll Dynamics

    DTIC Science & Technology

    2014-11-01

    Sahu, Jubaraj; Fresconi, Frank; Heavey, Karen R.

    Roll control of guided projectiles has been explored in depth in the literature and is of particular interest for this study (Weapons and Materials Research).

  4. Virtual evaluation of stent graft deployment: a validated modeling and simulation study.

    PubMed

    De Bock, S; Iannaccone, F; De Santis, G; De Beule, M; Van Loo, D; Devos, D; Vermassen, F; Segers, P; Verhegghe, B

    2012-09-01

    The presented study details the virtual deployment of a bifurcated stent graft (Medtronic Talent) in an Abdominal Aortic Aneurysm model, using the finite element method. The entire deployment procedure is modeled, with the stent graft being crimped and bent according to the vessel geometry, and subsequently released. The finite element results are validated in vitro with placement of the device in a silicone mock aneurysm, using high resolution CT scans to evaluate the result. The presented work confirms the capability of finite element computer simulations to predict the deformed configuration after endovascular aneurysm repair (EVAR). These simulations can be used to quantify mechanical parameters, such as neck dilations, radial forces and stresses in the device, that are difficult or impossible to obtain from medical imaging. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. P-8A Poseidon strategy for modeling & simulation verification validation & accreditation (VV&A)

    NASA Astrophysics Data System (ADS)

    Kropp, Derek L.

    2009-05-01

    One of the first challenges in addressing the need for Modeling & Simulation (M&S) Verification, Validation, & Accreditation (VV&A) is to develop an approach for applying structured and formalized VV&A processes. The P-8A Poseidon Multi-Mission Maritime Aircraft (MMA) Program Modeling and Simulation Accreditation Strategy documents the P-8A program's approach to VV&A. The P-8A strategy tailors a risk-based approach and leverages existing bodies of knowledge, such as the Defense Modeling and Simulation Office Recommended Practice Guide (DMSO RPG), to make the process practical and efficient. As the program progresses, the M&S team must continue to look for ways to streamline the process, add supplemental steps to enhance the process, and identify and overcome procedural, organizational, and cultural challenges. This paper includes some of the basics of the overall strategy, examples of specific approaches that have worked well, and examples of challenges that the M&S team has faced.

  6. Experimental validation of numerical simulations on a cerebral aneurysm phantom model

    PubMed Central

    Seshadhri, Santhosh; Janiga, Gábor; Skalej, Martin; Thévenin, Dominique

    2012-01-01

    The treatment of cerebral aneurysms, found in roughly 5% of the population and associated, in case of rupture, with a high mortality rate, is a major challenge for neurosurgery and neuroradiology due to the complexity of the intervention and the resulting high hazard ratio. Improvements are possible but require a better understanding of the associated, unsteady blood flow patterns in complex 3D geometries. It would be very useful to carry out such studies using suitable numerical models, if it is proven that they reproduce the real conditions accurately enough. This validation step is classically based on comparisons with measured data. Since in vivo measurements are extremely difficult and therefore of limited accuracy, complementary model-based investigations considering realistic configurations are essential. In the present study, simulations based on computational fluid dynamics (CFD) have been compared with in situ laser-Doppler velocimetry (LDV) measurements in the phantom model of a cerebral aneurysm. The employed 1:1 model is made from transparent silicone. A liquid mixture composed of water, glycerin, xanthan gum and sodium chloride has been specifically adapted for the present investigation. It shows physical flow properties similar to real blood and leads to a refraction index perfectly matched to that of the silicone model, allowing accurate optical measurements of the flow velocity. For both experiments and simulations, complex pulsatile flow waveforms and flow rates were accounted for. This finally allows a direct, quantitative comparison between measurements and simulations. In this manner, the accuracy of the employed computational model can be checked. PMID:24265876
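
    A direct quantitative comparison of the kind described above can be as simple as an RMS error between simulated and measured velocity profiles at matched locations and phases; the sketch below uses invented sample values, not the study's LDV data.

```python
import numpy as np

# Quantitative CFD-vs-LDV comparison on one velocity profile.
# The profile values here are hypothetical placeholders (m/s).
v_ldv = np.array([0.12, 0.25, 0.31, 0.28, 0.15])   # measured
v_cfd = np.array([0.11, 0.27, 0.30, 0.26, 0.17])   # simulated

rms_error = np.sqrt(np.mean((v_cfd - v_ldv) ** 2))
rel_error = rms_error / np.sqrt(np.mean(v_ldv ** 2))  # normalized by data RMS
print(f"RMS error: {rms_error:.4f} m/s ({100 * rel_error:.1f}% of signal RMS)")
```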

  7. Implications of Simulation Conceptual Model Development for Simulation Management and Uncertainty Assessment

    NASA Technical Reports Server (NTRS)

    Pace, Dale K.

    2000-01-01

    A simulation conceptual model is a simulation developer's way of translating modeling requirements (i.e., what is to be represented by the simulation or its modification) into a detailed design framework (i.e., how it is to be done), from which the software, hardware, networks (in the case of distributed simulation), and systems/equipment that will make up the simulation can be built or modified. A conceptual model is the collection of information which describes a simulation developer's concept about the simulation and its pieces. That information consists of assumptions, algorithms, characteristics, relationships, and data. Taken together, these describe how the simulation developer understands what is to be represented by the simulation (entities, actions, tasks, processes, interactions, etc.) and how that representation will satisfy the requirements to which the simulation responds. Thus the conceptual model is the basis for judgment about simulation fidelity and validity for any condition that is not specifically tested. The more perspicuous and precise the conceptual model, the more likely it is that the simulation development will both fully satisfy requirements and allow demonstration that the requirements are satisfied (i.e., validation). Methods used in simulation conceptual model development have significant implications for simulation management and for assessment of simulation uncertainty. This paper suggests how to develop and document a simulation conceptual model so that the simulation fidelity and validity can be most effectively determined. These ideas for conceptual model development apply to all simulation varieties. The paper relates these ideas to uncertainty assessments as they relate to simulation fidelity and validity. The paper also explores implications for simulation management from conceptual model development methods, especially relative to reuse of simulation components.

  8. Developing R&D portfolio business validity simulation model and system.

    PubMed

    Yeo, Hyun Jin; Im, Kwang Hyuk

    2015-01-01

    R&D has been recognized as a critical means for both companies and nations to gain competitiveness through the value it creates, such as patent value and new products. R&D is therefore a burden on decision makers, in that it is hard to decide how much money to invest, how much time to spend, and which technology to develop, since R&D consumes resources such as budget, time, and manpower. Although there is diverse research on R&D evaluation, business factors are not sufficiently considered, because almost all previous studies are technology-oriented evaluations based on a single R&D technology. To address this, we earlier proposed an R&D business-aspect evaluation model consisting of nine business model components. In this research, we develop a simulation model and system that evaluate a company's or industry's R&D portfolio from a business model point of view, and we clarify default and control parameters to facilitate the evaluator's business validity work in each evaluation module by integrating them into one screen.

  9. Gathering Validity Evidence for Surgical Simulation: A Systematic Review.

    PubMed

    Borgersen, Nanna Jo; Naur, Therese M H; Sørensen, Stine M D; Bjerrum, Flemming; Konge, Lars; Subhi, Yousif; Thomsen, Ann Sofia S

    2018-06-01

    To identify current trends in the use of validity frameworks in surgical simulation, to provide an overview of the evidence behind the assessment of technical skills in all surgical specialties, and to present recommendations and guidelines for future validity studies. Validity evidence for assessment tools used in the evaluation of surgical performance is of paramount importance to ensure valid and reliable assessment of skills. We systematically reviewed the literature by searching 5 databases (PubMed, EMBASE, Web of Science, PsycINFO, and the Cochrane Library) for studies published from January 1, 2008, to July 10, 2017. We included original studies evaluating simulation-based assessments of health professionals in surgical specialties and extracted data on surgical specialty, simulator modality, participant characteristics, and the validity framework used. Data were synthesized qualitatively. We identified 498 studies with a total of 18,312 participants. Publications involving validity assessments in surgical simulation more than doubled from 2008 to 2010 (∼30 studies/year) to 2014 to 2016 (∼70 to 90 studies/year). Only 6.6% of the studies used the recommended contemporary validity framework (Messick). The majority of studies used outdated frameworks such as face validity. Significant differences were identified across surgical specialties. The evaluated assessment tools were mostly inanimate or virtual reality simulation models. An increasing number of studies have gathered validity evidence for simulation-based assessments in surgical specialties, but the use of outdated frameworks remains common. To address the current practice, this paper presents guidelines on how to use the contemporary validity framework when designing validity studies.

  10. Validating agent oriented methodology (AOM) for netlogo modelling and simulation

    NASA Astrophysics Data System (ADS)

    WaiShiang, Cheah; Nissom, Shane; YeeWai, Sim; Sharbini, Hamizan

    2017-10-01

    AOM (Agent Oriented Modeling) is a comprehensive and unified agent methodology for agent oriented software development. The AOM methodology was proposed to aid developers with techniques, terminology, notation, and guidelines during agent system development. Although the AOM methodology is claimed to be capable of developing complex real-world systems, its potential is yet to be realized and recognized by the mainstream software community, and the adoption of AOM is still in its infancy. Among the reasons is that there are not many case studies or success stories for AOM. This paper presents two case studies on the adoption of AOM for individual-based modelling and simulation. It demonstrates how AOM is useful for epidemiology and ecology studies, and hence further validates AOM in a qualitative manner.

  11. Validation of Multibody Program to Optimize Simulated Trajectories II Parachute Simulation with Interacting Forces

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Behzad; Queen, Eric M.; Hotchko, Nathaniel J.

    2009-01-01

    A capability to simulate trajectories of multiple interacting rigid bodies has been developed, tested and validated. This capability uses the Program to Optimize Simulated Trajectories II (POST 2). The standard version of POST 2 allows trajectory simulation of multiple bodies without force interaction. In the current implementation, the force interaction between the parachute and the suspended bodies has been modeled using flexible lines, allowing accurate trajectory simulation of the individual bodies in flight. The POST 2 multibody capability is intended to be general purpose and applicable to any parachute entry trajectory simulation. This research paper explains the motivation for multibody parachute simulation, discusses implementation methods, and presents validation of this capability.
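
    The force interaction through flexible lines can be illustrated with a minimal tension-only spring-damper element between two attachment points, as sketched below; the stiffness, damping, and geometry are hypothetical, and this is not the POST 2 implementation.

```python
import numpy as np

# Tension-only spring-damper "flexible line" between two bodies.
# Parameter values are hypothetical, not POST 2 internals.
def tether_force(x_a, x_b, v_a, v_b, rest_length, k=5.0e4, c=2.0e2):
    """Force on body A from a line to body B; zero when the line is slack."""
    d = x_b - x_a
    length = np.linalg.norm(d)
    if length <= rest_length:          # slack line carries no load
        return np.zeros(3)
    unit = d / length
    stretch_rate = np.dot(v_b - v_a, unit)
    magnitude = k * (length - rest_length) + c * stretch_rate
    return max(magnitude, 0.0) * unit  # tension only, never compression

f = tether_force(np.zeros(3), np.array([0.0, 0.0, 12.0]),
                 np.zeros(3), np.zeros(3), rest_length=10.0)
print(f)   # [0, 0, 1e5]: 2 m of stretch at k = 5e4 N/m
```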

  12. Simulators' validation study: Problem solution logic

    NASA Technical Reports Server (NTRS)

    Schoultz, M. B.

    1974-01-01

    A study was conducted to validate the ground based simulators used for aircraft environment in ride-quality research. The logic to the approach for solving this problem is developed. The overall problem solution flow chart is presented. The factors which could influence the human response to the environment on board the aircraft are analyzed. The mathematical models used in the study are explained. The steps which were followed in conducting the validation tests are outlined.

  13. Turbine-99 unsteady simulations - Validation

    NASA Astrophysics Data System (ADS)

    Cervantes, M. J.; Andersson, U.; Lövgren, H. M.

    2010-08-01

    The Turbine-99 test case, a Kaplan draft tube model, aimed to determine the state of the art within draft tube simulation. Three workshops were organized on the matter in 1999, 2001 and 2005, where the geometry and experimental data were provided as boundary conditions to the participants. Since the last workshop, computational power and flow modelling have developed, and the available data have been completed with unsteady pressure measurements and phase-resolved velocity measurements in the cone. This new set of data, together with the corresponding phase-resolved velocity boundary conditions, offers new possibilities to validate unsteady numerical simulations of the Kaplan draft tube. The present work presents simulations of the Turbine-99 test case with time-dependent, angular-resolved inlet velocity boundary conditions. Different grids and time steps are investigated. The results are compared to experimental time-dependent pressure and velocity measurements.

  14. Simulation verification techniques study. Subsystem simulation validation techniques

    NASA Technical Reports Server (NTRS)

    Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.

    1974-01-01

    Techniques for validation of software modules which simulate spacecraft onboard systems are discussed. An overview of the simulation software hierarchy for a shuttle mission simulator is provided. A set of guidelines for the identification of subsystem/module performance parameters and critical performance parameters are presented. Various sources of reference data to serve as standards of performance for simulation validation are identified. Environment, crew station, vehicle configuration, and vehicle dynamics simulation software are briefly discussed from the point of view of their interfaces with subsystem simulation modules. A detailed presentation of results in the area of vehicle subsystems simulation modules is included. A list of references, conclusions and recommendations are also given.

  15. Early Validation of Failure Detection, Isolation, and Recovery Design Using Heterogeneous Modelling and Simulation

    NASA Astrophysics Data System (ADS)

    van der Plas, Peter; Guerriero, Suzanne; Cristiano, Leorato; Rugina, Ana

    2012-08-01

    Modelling and simulation can support a number of use cases across the spacecraft development life-cycle. Given the increasing complexity of space missions, the observed general trend is for a more extensive usage of simulation already in the early phases. A major perceived advantage is that modelling and simulation can enable the validation of critical aspects of the spacecraft design before the actual development is started, as such reducing the risk in later phases. Failure Detection, Isolation, and Recovery (FDIR) is one of the areas with a high potential to benefit from early modelling and simulation. With the increasing level of required spacecraft autonomy, FDIR specifications can grow in such a way that the traditional document-based review process soon becomes inadequate. This paper shows that FDIR modelling and simulation in a system context can provide a powerful tool to support the FDIR verification process. It is highlighted that FDIR modelling at this early stage requires heterogeneous modelling tools and languages, in order to provide an adequate functional description of the different components (i.e. FDIR functions, environment, equipment, etc.) to be modelled. For this reason, an FDIR simulation framework is proposed in this paper. This framework is based on a number of tools already available in the Avionics Systems Laboratory at ESTEC, which are the Avionics Test Bench Functional Engineering Simulator (ATB FES), Matlab/Simulink, TASTE, and Real Time Developer Studio (RTDS). The paper then discusses the application of the proposed simulation framework to a real case-study, i.e. the FDIR modelling of a satellite in support of an actual ESA mission. Challenges and benefits of the approach are described. Finally, lessons learned and the generality of the proposed approach are discussed.

  16. Validating the simulation of large-scale parallel applications using statistical characteristics

    DOE PAGES

    Zhang, Deli; Wilke, Jeremiah; Hendry, Gilbert; ...

    2016-03-01

    Simulation is a widely adopted method to analyze and predict the performance of large-scale parallel applications. Validating the hardware model is highly important for complex simulations with a large number of parameters. Common practice involves calculating the percent error between the projected and the real execution time of a benchmark program. However, in a high-dimensional parameter space, this coarse-grained approach often suffers from parameter insensitivity, which may not be known a priori. Moreover, the traditional approach cannot be applied to the validation of software models, such as application skeletons used in online simulations. In this work, we present a methodology and a toolset for validating both hardware and software models by quantitatively comparing fine-grained statistical characteristics obtained from execution traces. Although statistical information has been used in tasks like performance optimization, this is the first attempt to apply it to simulation validation. Lastly, our experimental results show that the proposed evaluation approach offers significant improvement in fidelity when compared to evaluation using total execution time, and the proposed metrics serve as reliable criteria that progress toward automating the simulation tuning process.
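
    The contrast the authors draw, between a single percent error on total runtime and a fine-grained comparison of trace statistics, can be illustrated with a two-sample test on per-event latencies; the synthetic gamma-distributed traces and the Kolmogorov-Smirnov choice below are illustrative assumptions, not the paper's metrics.

```python
import numpy as np
from scipy.stats import ks_2samp

# Coarse vs. fine-grained validation on synthetic per-event latencies.
rng = np.random.default_rng(1)
real_latencies = rng.gamma(shape=2.0, scale=5.0, size=2000)   # measured, microseconds
sim_latencies = rng.gamma(shape=2.0, scale=5.5, size=2000)    # simulated, microseconds

# Coarse: aggregate totals can look acceptable even when distributions differ.
pct_error = 100 * abs(sim_latencies.sum() - real_latencies.sum()) / real_latencies.sum()

# Fine-grained: compare the full latency distributions.
ks_stat, p_value = ks_2samp(real_latencies, sim_latencies)
print(f"total-time error {pct_error:.1f}%, KS statistic {ks_stat:.3f} (p = {p_value:.2g})")
```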

  17. SDG and qualitative trend based model multiple scale validation

    NASA Astrophysics Data System (ADS)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods have weak completeness: they are carried out at a single scale and depend on human experience. An SDG (Signed Directed Graph) and qualitative-trend-based multiple-scale validation is therefore proposed. First the SDG model is built and qualitative trends are added to the model. Complete testing scenarios are then produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with outputs of the simulation model at different scales. Finally, the effectiveness of the approach is demonstrated by validating a reactor model.

  18. Modeling Clinical Outcomes in Prostate Cancer: Application and Validation of the Discrete Event Simulation Approach.

    PubMed

    Pan, Feng; Reifsnider, Odette; Zheng, Ying; Proskorovsky, Irina; Li, Tracy; He, Jianming; Sorensen, Sonja V

    2018-04-01

    The treatment landscape in prostate cancer has changed dramatically with the emergence of new medicines in the past few years. The traditional survival partition model (SPM) cannot accurately predict long-term clinical outcomes because it is limited in its ability to capture the key consequences associated with this changing treatment paradigm. The objective of this study was to introduce and validate a discrete-event simulation (DES) model for prostate cancer. A DES model was developed to simulate overall survival (OS) and other clinical outcomes based on patient characteristics, treatment received, and disease progression history. We tested and validated this model with clinical trial data from the abiraterone acetate phase III trial (COU-AA-302). The model was constructed with interim data (55% death) and validated with the final data (96% death). Predicted OS values were also compared with those from the SPM. The DES model's predicted time to chemotherapy and OS are highly consistent with the final observed data. The model accurately predicts the OS hazard ratio from the final data cut (predicted: 0.74; 95% confidence interval [CI] 0.64-0.85 and final actual: 0.74; 95% CI 0.6-0.88). The log-rank test to compare the observed and predicted OS curves indicated no statistically significant difference between observed and predicted curves. However, the predictions from the SPM based on interim data deviated significantly from the final data. Our study showed that a DES model with properly developed risk equations presents considerable improvements over the more traditional SPM in flexibility and predictive accuracy of long-term outcomes. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
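
    The basic mechanics of a discrete-event survival simulation, sampling per-patient event times and deriving arm-level outcomes, can be sketched as below; the event rates, arm sizes, and crude exponential hazard-ratio estimate are hypothetical and far simpler than the published model's risk equations.

```python
import numpy as np

# Per-patient event-time sampling in a discrete-event style: time to
# progression, then post-progression survival, summed to overall survival.
# Medians and sizes are hypothetical, chosen only to show the mechanics.
rng = np.random.default_rng(42)

def simulate_arm(n, median_progression, median_post):
    t_prog = rng.exponential(median_progression / np.log(2), n)   # months
    t_post = rng.exponential(median_post / np.log(2), n)          # months
    return t_prog + t_post                                        # overall survival

os_control = simulate_arm(500, median_progression=8.0, median_post=10.0)
os_treated = simulate_arm(500, median_progression=14.0, median_post=10.0)

# Crude hazard-ratio estimate under an exponential approximation (1/mean).
hr = os_control.mean() / os_treated.mean()
print(f"median OS {np.median(os_control):.1f} vs {np.median(os_treated):.1f} months, "
      f"crude HR ~ {hr:.2f}")
```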

  19. Developing R&D Portfolio Business Validity Simulation Model and System

    PubMed Central

    2015-01-01

    R&D has been recognized as a critical means for both companies and nations to gain competitiveness through the value it creates, such as patent value and new products. R&D is therefore a burden on decision makers, in that it is hard to decide how much money to invest, how much time to spend, and which technology to develop, since R&D consumes resources such as budget, time, and manpower. Although there is diverse research on R&D evaluation, business factors are not sufficiently considered, because almost all previous studies are technology-oriented evaluations based on a single R&D technology. To address this, we earlier proposed an R&D business-aspect evaluation model consisting of nine business model components. In this research, we develop a simulation model and system that evaluate a company's or industry's R&D portfolio from a business model point of view, and we clarify default and control parameters to facilitate the evaluator's business validity work in each evaluation module by integrating them into one screen. PMID:25893209

  20. Simulation of a polarized laser beam reflected at the sea surface: modeling and validation

    NASA Astrophysics Data System (ADS)

    Schwenger, Frédéric

    2015-05-01

    A 3-D simulation of the polarization-dependent reflection of a Gaussian shaped laser beam on the dynamic sea surface is presented. The simulation considers polarized or unpolarized laser sources and calculates the polarization states upon reflection at the sea surface. It is suitable for the radiance calculation of the scene in different spectral wavebands (e.g. near-infrared, SWIR, etc.) not including the camera degradations. The simulation also considers a bistatic configuration of laser source and receiver as well as different atmospheric conditions. In the SWIR, the detected total power of reflected laser light is compared with data collected in a field trial. Our computer simulation combines the 3-D simulation of a maritime scene (open sea/clear sky) with the simulation of polarized or unpolarized laser light reflected at the sea surface. The basic sea surface geometry is modeled by a composition of smooth wind driven gravity waves. To predict the input of a camera equipped with a linear polarizer, the polarized sea surface radiance must be calculated for the specific waveband. The s- and p-polarization states are calculated for the emitted sea surface radiance and the specularly reflected sky radiance to determine the total polarized sea surface radiance of each component. The states of polarization and the radiance of laser light specularly reflected at the wind-roughened sea surface are calculated by considering the s- and p- components of the electric field of laser light with respect to the specular plane of incidence. This is done by using the formalism of their coherence matrices according to E. Wolf [1]. Additionally, an analytical statistical sea surface BRDF (bidirectional reflectance distribution function) is considered for the reflection of laser light radiances. Validation of the simulation results is required to ensure model credibility and applicability to maritime laser applications. For validation purposes, field measurement data (images and

  1. Validation of 2D flood models with insurance claims

    NASA Astrophysics Data System (ADS)

    Zischg, Andreas Paul; Mosimann, Markus; Bernet, Daniel Benjamin; Röthlisberger, Veronika

    2018-02-01

    Flood impact modelling requires reliable models for the simulation of flood processes. In recent years, flood inundation models have been remarkably improved and widely used for flood hazard simulation, flood exposure and loss analyses. In this study, we validate a 2D inundation model for the purpose of flood exposure analysis at the river reach scale. We validate the BASEMENT simulation model with insurance claims using conventional validation metrics. The flood model is established on the basis of available topographic data in a high spatial resolution for four test cases. The validation metrics were calculated with two different datasets: a dataset of event documentation reporting flooded areas and a dataset of insurance claims. In three of the four test cases, the model fit based on insurance claims is slightly lower than the fit computed from the observed inundation areas. This comparison between two independent validation data sets suggests that validation metrics using insurance claims can be compared to conventional validation data, such as the flooded area. However, a validation on the basis of insurance claims might be more conservative in cases where model errors are more pronounced in areas with a high density of values at risk.
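
    Conventional validation metrics for binary flood-extent maps, of the kind referred to above, typically include the hit rate, false alarm ratio, and critical success index; a minimal sketch with synthetic maps (not the study's claims data) follows.

```python
import numpy as np

# Binary-map validation metrics for flood extents. Maps are synthetic.
rng = np.random.default_rng(7)
observed = rng.random((100, 100)) < 0.2                   # observed footprint
simulated = observed ^ (rng.random((100, 100)) < 0.05)    # model with some error

hits = np.sum(simulated & observed)
misses = np.sum(~simulated & observed)
false_alarms = np.sum(simulated & ~observed)

hit_rate = hits / (hits + misses)
far = false_alarms / (hits + false_alarms)                # false alarm ratio
csi = hits / (hits + misses + false_alarms)               # critical success index
print(f"hit rate {hit_rate:.2f}, FAR {far:.2f}, CSI {csi:.2f}")
```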

  2. Lessons Learned from a Cross-Model Validation between a Discrete Event Simulation Model and a Cohort State-Transition Model for Personalized Breast Cancer Treatment.

    PubMed

    Jahn, Beate; Rochau, Ursula; Kurzthaler, Christina; Paulden, Mike; Kluibenschädl, Martina; Arvandi, Marjan; Kühne, Felicitas; Goehler, Alexander; Krahn, Murray D; Siebert, Uwe

    2016-04-01

    Breast cancer is the most common malignancy among women in developed countries. We developed a model (the Oncotyrol breast cancer outcomes model) to evaluate the cost-effectiveness of a 21-gene assay when used in combination with Adjuvant! Online to support personalized decisions about the use of adjuvant chemotherapy. The goal of this study was to perform a cross-model validation. The Oncotyrol model evaluates the 21-gene assay by simulating a hypothetical cohort of 50-year-old women over a lifetime horizon using discrete event simulation. Primary model outcomes were life-years, quality-adjusted life-years (QALYs), costs, and incremental cost-effectiveness ratios (ICERs). We followed the International Society for Pharmacoeconomics and Outcomes Research-Society for Medical Decision Making (ISPOR-SMDM) best practice recommendations for validation and compared modeling results of the Oncotyrol model with the state-transition model developed by the Toronto Health Economics and Technology Assessment (THETA) Collaborative. Both models were populated with Canadian THETA model parameters, and outputs were compared. The differences between the models varied among the different validation end points. The smallest relative differences were in costs, and the greatest were in QALYs. All relative differences were less than 1.2%. The cost-effectiveness plane showed that small differences in the model structure can lead to different sets of nondominated test-treatment strategies with different efficiency frontiers. We faced several challenges: distinguishing between differences in outcomes due to different modeling techniques and initial coding errors, defining meaningful differences, and selecting measures and statistics for comparison (means, distributions, multivariate outcomes). Cross-model validation was crucial to identify and correct coding errors and to explain differences in model outcomes. In our comparison, small differences in either QALYs or costs led to changes in

  3. Validation of Mission Plans Through Simulation

    NASA Astrophysics Data System (ADS)

    St-Pierre, J.; Melanson, P.; Brunet, C.; Crabtree, D.

    2002-01-01

    The purpose of a spacecraft mission planning system is to automatically generate safe and optimized mission plans for a single spacecraft, or more functioning in unison. The system verifies user input syntax, conformance to commanding constraints, absence of duty cycle violations, timing conflicts, state conflicts, etc. Present-day constraint-based systems with state-based predictive models use verification rules derived from expert knowledge. A familiar solution found in Mission Operations Centers is to complement the planning system with a high fidelity spacecraft simulator. Often a dedicated workstation, the simulator is frequently used for operator training and procedure validation, and may be interfaced to actual control stations with command and telemetry links. While there are distinct advantages to having a planning system offer realistic operator training using the actual flight control console, physical verification of data transfer across layers and procedure validation, experience has revealed some drawbacks and inefficiencies in ground segment operations. With these considerations, two simulation-based mission plan validation projects are under way at the Canadian Space Agency (CSA): RVMP and ViSION. The tools proposed in these projects will automatically run scenarios and provide execution reports to operations planning personnel, prior to actual command upload. This can provide an important safeguard for system or human errors that can only be detected with high fidelity, interdependent spacecraft models running concurrently. The core element common to these projects is a spacecraft simulator, built with off-the-shelf components such as CAE's Real-Time Object-Based Simulation Environment (ROSE) technology, MathWork's MATLAB/Simulink, and Analytical Graphics' Satellite Tool Kit (STK). To complement these tools, additional components were developed, such as an emulated Spacecraft Test and Operations Language (STOL) interpreter and CCSDS TM

  4. Model-Based Verification and Validation of Spacecraft Avionics

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Sievers, Michael; Standley, Shaun

    2012-01-01

    Verification and Validation (V&V) at JPL is traditionally performed on flight or flight-like hardware running flight software. For some time, the complexity of avionics has increased exponentially while the time allocated for system integration and associated V&V testing has remained fixed. There is an increasing need to perform comprehensive system level V&V using modeling and simulation, and to use scarce hardware testing time to validate models; the norm for thermal and structural V&V for some time. Our approach extends model-based V&V to electronics and software through functional and structural models implemented in SysML. We develop component models of electronics and software that are validated by comparison with test results from actual equipment. The models are then simulated enabling a more complete set of test cases than possible on flight hardware. SysML simulations provide access and control of internal nodes that may not be available in physical systems. This is particularly helpful in testing fault protection behaviors when injecting faults is either not possible or potentially damaging to the hardware. We can also model both hardware and software behaviors in SysML, which allows us to simulate hardware and software interactions. With an integrated model and simulation capability we can evaluate the hardware and software interactions and identify problems sooner. The primary missing piece is validating SysML model correctness against hardware; this experiment demonstrated such an approach is possible.

  5. The internal validity of arthroscopic simulators and their effectiveness in arthroscopic education.

    PubMed

    Slade Shantz, Jesse Alan; Leiter, Jeff R S; Gottschalk, Tania; MacDonald, Peter Benjamin

    2014-01-01

    The purpose of this systematic review was to identify standard procedures for the validation of arthroscopic simulators and determine whether simulators improve the surgical skills of users. Arthroscopic simulator validation studies and randomized trials assessing the effectiveness of arthroscopic simulators in education were identified from online databases, as well as grey literature and reference lists. Only validation studies and randomized trials were included for review. Study heterogeneity was calculated and, where appropriate, study results were combined employing a random effects model. Four hundred and thirteen studies were reviewed. Thirteen studies met the inclusion criteria assessing the construct validity of simulators. A pooled analysis of internal validation studies determined that simulators could discriminate between novice and expert users, but not between novice and intermediate trainees, on time to completion of a simulated task. Only one study assessed the utility of a knee simulator in training arthroscopic skills directly, and demonstrated that the skill level of simulator-trained residents was greater than that of non-simulator-trained residents. The literature is too heterogeneous to determine the internal and transfer validity of currently available arthroscopic simulators. Evidence suggests that simulators can discriminate between novice and expert users, but discrimination between novice and intermediate trainees in surgical education should be paramount. International standards for the assessment of arthroscopic simulator validity should be developed to increase the use and effectiveness of simulators in orthopedic surgery.

  6. Validation of CT dose-reduction simulation

    PubMed Central

    Massoumzadeh, Parinaz; Don, Steven; Hildebolt, Charles F.; Bae, Kyongtae T.; Whiting, Bruce R.

    2009-01-01

    The objective of this research was to develop and validate a custom computed tomography dose-reduction simulation technique for producing images that have an appearance consistent with the same scan performed at a lower mAs (with fixed kVp, rotation time, and collimation). Synthetic noise is added to projection (sinogram) data, incorporating a stochastic noise model that includes energy-integrating detectors, tube-current modulation, bowtie beam filtering, and electronic system noise. Experimental methods were developed to determine the parameters required for each component of the noise model. As a validation, the outputs of the simulations were compared to measurements with cadavers in the image domain and with phantoms in both the sinogram and image domain, using an unbiased root-mean-square relative error metric to quantify agreement in noise processes. Four-alternative forced-choice (4AFC) observer studies were conducted to confirm the realistic appearance of simulated noise, and the effects of various system model components on visual noise were studied. The “just noticeable difference (JND)” in noise levels was analyzed to determine the sensitivity of observers to changes in noise level. Individual detector measurements were shown to be normally distributed (p>0.54), justifying the use of a Gaussian random noise generator for simulations. Phantom tests showed the ability to match original and simulated noise variance in the sinogram domain to within 5.6%±1.6% (standard deviation), which was then propagated into the image domain with errors less than 4.1%±1.6%. Cadaver measurements indicated that image noise was matched to within 2.6%±2.0%. More importantly, the 4AFC observer studies indicated that the simulated images were realistic, i.e., no detectable difference between simulated and original images (p=0.86) was observed. JND studies indicated that observers’ sensitivity to change in noise levels corresponded to a 25% difference in dose, which is
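
    The core noise-insertion idea, adding zero-mean noise in the sinogram domain so the total variance matches a lower-mAs acquisition, can be sketched as follows; the simple inverse-mAs variance model and all numbers are illustrative, omitting the paper's detector, bowtie, tube-current-modulation, and electronic-noise components.

```python
import numpy as np

# Emulate a lower-dose scan by adding synthetic Gaussian noise to the
# sinogram. Simplified variance model: quantum noise only, with variance
# inversely proportional to mAs.
def simulate_lower_dose(sinogram, var_full, mAs_full, mAs_target, rng):
    """Return a sinogram emulating acquisition at mAs_target < mAs_full."""
    var_target = var_full * (mAs_full / mAs_target)  # noise grows as dose drops
    var_added = var_target - var_full                # only the difference is synthetic
    return sinogram + rng.normal(0.0, np.sqrt(var_added), sinogram.shape)

rng = np.random.default_rng(0)
sino = rng.normal(5.0, 0.1, (720, 512))              # hypothetical projection data
half_dose = simulate_lower_dose(sino, var_full=0.01, mAs_full=200,
                                mAs_target=100, rng=rng)
print(half_dose.std())   # larger than the full-dose noise level
```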

  7. Challenges of NDE simulation tool validation, optimization, and utilization for composites

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Seebo, Jeffrey P.; Juarez, Peter

    2016-02-01

    Rapid, realistic nondestructive evaluation (NDE) simulation tools can aid in inspection optimization and prediction of inspectability for advanced aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of aerospace components; potentially shortening the time from material development to implementation by industry and government. Furthermore, ultrasound modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation is still far from the goal of rapidly simulating damage detection techniques for large scale, complex geometry composite components/vehicles containing realistic damage types. Ongoing work at NASA Langley Research Center is focused on advanced ultrasonic simulation tool development. This paper discusses challenges of simulation tool validation, optimization, and utilization for composites. Ongoing simulation tool development work is described along with examples of simulation validation and optimization challenges that are more broadly applicable to all NDE simulation tools. The paper will also discuss examples of simulation tool utilization at NASA to develop new damage characterization methods for composites, and associated challenges in experimentally validating those methods.

  8. Development and validation of a new population-based simulation model of osteoarthritis in New Zealand.

    PubMed

    Wilson, R; Abbott, J H

    2018-04-01

    To describe the construction and preliminary validation of a new population-based microsimulation model developed to analyse the health and economic burden and cost-effectiveness of treatments for knee osteoarthritis (OA) in New Zealand (NZ). We developed the New Zealand Management of Osteoarthritis (NZ-MOA) model, a discrete-time state-transition microsimulation model of the natural history of radiographic knee OA. In this article, we report on the model structure, derivation of input data, validation of baseline model parameters against external data sources, and validation of model outputs by comparison of the predicted population health loss with previous estimates. The NZ-MOA model simulates both the structural progression of radiographic knee OA and the stochastic development of multiple disease symptoms. Input parameters were sourced from NZ population-based data where possible, and from international sources where NZ-specific data were not available. The predicted distributions of structural OA severity and health utility detriments associated with OA were externally validated against other sources of evidence, and uncertainty resulting from key input parameters was quantified. The resulting lifetime and current population health-loss burden was consistent with estimates of previous studies. The new NZ-MOA model provides reliable estimates of the health loss associated with knee OA in the NZ population. The model structure is suitable for analysis of the effects of a range of potential treatments, and will be used in future work to evaluate the cost-effectiveness of recommended interventions within the NZ healthcare system. Copyright © 2018 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  9. Design and validation of a dynamic discrete event stochastic simulation model of mastitis control in dairy herds.

    PubMed

    Allore, H G; Schruben, L W; Erb, H N; Oltenacu, P A

    1998-03-01

    A dynamic stochastic simulation model for discrete events, SIMMAST, was developed to simulate the effect of mastitis on the composition of the bulk tank milk of dairy herds. Intramammary infections caused by Streptococcus agalactiae, Streptococcus spp. other than Strep. agalactiae, Staphylococcus aureus, and coagulase-negative staphylococci were modeled as were the milk, fat, and protein test day solutions for individual cows, which accounted for the fixed effects of days in milk, age at calving, season of calving, somatic cell count (SCC), and random effects of test day, cow yield differences from herdmates, and autocorrelated errors. Probabilities for the transitions among various states of udder health (uninfected or subclinically or clinically infected) were calculated to account for exposure, heifer infection, spontaneous recovery, lactation cure, infection or cure during the dry period, month of lactation, parity, within-herd yields, and the number of quarters with clinical intramammary infection in the previous and current lactations. The stochastic simulation model was constructed using estimates from the literature and also using data from 164 herds enrolled with Quality Milk Promotion Services that each had bulk tank SCC between 500,000 and 750,000/ml. Model parameters and outputs were validated against a separate data file of 69 herds from the Northeast Dairy Herd Improvement Association, each with a bulk tank SCC that was > or = 500,000/ml. Sensitivity analysis was performed on all input parameters for control herds. Using the validated stochastic simulation model, the control herds had a stable time average bulk tank SCC between 500,000 and 750,000/ml.
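
    The transition structure among udder-health states can be illustrated with a simple monthly Markov chain; the probabilities below are placeholders, not the covariate-adjusted estimates used in SIMMAST.

```python
import numpy as np

# Monthly transitions among udder-health states. The matrix entries are
# hypothetical placeholders; the study's probabilities are adjusted for
# exposure, parity, month of lactation, yield, and other covariates.
STATES = ["uninfected", "subclinical", "clinical"]
P = np.array([
    [0.92, 0.06, 0.02],   # from uninfected
    [0.15, 0.75, 0.10],   # from subclinical (spontaneous recovery possible)
    [0.10, 0.30, 0.60],   # from clinical
])

rng = np.random.default_rng(3)
state = 0                             # start uninfected
history = []
for month in range(10):               # roughly one lactation of test days
    state = rng.choice(3, p=P[state])
    history.append(STATES[state])
print(history)
```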

  10. Validation of the Monte Carlo simulator GATE for indium-111 imaging.

    PubMed

    Assié, K; Gardin, I; Véra, P; Buvat, I

    2005-07-07

    Monte Carlo simulations are useful for optimizing and assessing single photon emission computed tomography (SPECT) protocols, especially when aiming at measuring quantitative parameters from SPECT images. Before Monte Carlo simulated data can be trusted, the simulation model must be validated. The purpose of this work was to validate the use of GATE, a new Monte Carlo simulation platform based on GEANT4, for modelling indium-111 SPECT data, the quantification of which is of foremost importance for dosimetric studies. To that end, acquisitions of (111)In line sources in air and in water and of a cylindrical phantom were performed, together with the corresponding simulations. The simulation model included Monte Carlo modelling of the camera collimator and of a back-compartment accounting for photomultiplier tubes and associated electronics. Energy spectra, spatial resolution, sensitivity values, images and count profiles obtained for experimental and simulated data were compared. An excellent agreement was found between experimental and simulated energy spectra. For source-to-collimator distances varying from 0 to 20 cm, simulated and experimental spatial resolution differed by less than 2% in air, while the simulated sensitivity values were within 4% of the experimental values. The simulation of the cylindrical phantom closely reproduced the experimental data. These results suggest that GATE enables accurate simulation of (111)In SPECT acquisitions.

  11. Integrated Disposal Facility FY 2016: ILAW Verification and Validation of the eSTOMP Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freedman, Vicky L.; Bacon, Diana H.; Fang, Yilin

    2016-05-13

    This document describes two sets of simulations carried out to further verify and validate the eSTOMP simulator. In this report, a distinction is made between verification and validation, and the focus is on verifying eSTOMP through a series of published benchmarks on cementitious wastes, and validating eSTOMP based on a lysimeter experiment for the glassified waste. These activities are carried out within the context of a scientific view of validation that asserts that models can only be invalidated, and that model validation (and verification) is a subjective assessment.

  12. Notes on modeling and simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Redondo, Antonio

    These notes present a high-level overview of how modeling and simulation are carried out by practitioners. The discussion is of a general nature; no specific techniques are examined but the activities associated with all modeling and simulation approaches are briefly addressed. There is also a discussion of validation and verification and, at the end, a section on why modeling and simulation are useful.

  13. Modeling and simulation of maintenance treatment in first-line non-small cell lung cancer with external validation.

    PubMed

    Han, Kelong; Claret, Laurent; Sandler, Alan; Das, Asha; Jin, Jin; Bruno, Rene

    2016-07-13

    Maintenance treatment (MTx) in responders following first-line treatment has been investigated and practiced for many cancers. Modeling and simulation may support interpretation of interim data and development decisions. We aimed to develop a modeling framework to simulate overall survival (OS) for MTx in NSCLC using tumor growth inhibition (TGI) data. TGI metrics were estimated using longitudinal tumor size data from two Phase III first-line NSCLC studies evaluating bevacizumab and erlotinib as MTx in 1632 patients. Baseline prognostic factors and TGI metric estimates were assessed in multivariate parametric models to predict OS. The OS model was externally validated by simulating a third independent NSCLC study (n = 253) based on interim TGI data (up to progression-free survival database lock). The third study evaluated pemetrexed + bevacizumab vs. bevacizumab alone as MTx. Time-to-tumor-growth (TTG) was the best TGI metric to predict OS. TTG, baseline tumor size, ECOG score, Asian ethnicity, age, and gender were significant covariates in the final OS model. The OS model was qualified by simulating OS distributions and hazard ratios (HR) in the two studies used for model-building. Simulations of the third independent study based on interim TGI data showed that pemetrexed + bevacizumab MTx was unlikely to significantly prolong OS vs. bevacizumab alone given the current sample size (predicted HR: 0.81; 95 % prediction interval: 0.59-1.09). Predicted median OS was 17.3 months and 14.7 months in both arms, respectively. These simulations are consistent with the results of the final OS analysis published 2 years later (observed HR: 0.87; 95 % confidence interval: 0.63-1.21). Final observed median OS was 17.1 months and 13.2 months in both arms, respectively, consistent with our predictions. A robust TGI-OS model was developed for MTx in NSCLC. TTG captures treatment effect. The model successfully predicted the OS outcomes of an independent study

  14. Validated simulator for space debris removal with nets and other flexible tethers applications

    NASA Astrophysics Data System (ADS)

    Gołębiowski, Wojciech; Michalczyk, Rafał; Dyrek, Michał; Battista, Umberto; Wormnes, Kjetil

    2016-12-01

    In the context of active debris removal technologies and preparation activities for the e.Deorbit mission, a simulator for net-shaped elastic bodies dynamics and their interactions with rigid bodies has been developed. Its main application is to aid net design and test scenarios for space debris deorbitation. The simulator can model all the phases of the debris capturing process: net launch, flight and wrapping around the target. It handles coupled simulation of rigid and flexible bodies dynamics. Flexible bodies were implemented using the Cosserat rod model. It allows to simulate flexible threads or wires with elasticity and damping for stretching, bending and torsion. Threads may be combined into structures of any topology, so the software is able to simulate nets, pure tethers, tether bundles, cages, trusses, etc. Full contact dynamics was implemented. Programmatic interaction with the simulation is possible - i.e. for control implementation. The underlying model has been experimentally validated; because gravity strongly influences the dynamics, the experiment had to be performed in microgravity conditions. The validation experiment for parabolic flight was a downscaled process of Envisat capturing. The prepacked net was launched towards the satellite model, it expanded, hit the model and wrapped around it. The whole process was recorded with 2 fast stereographic camera sets for full 3D trajectory reconstruction. The trajectories were used to compare net dynamics to respective simulations and then to validate the simulation tool. The experiments were performed on board of a Falcon-20 aircraft, operated by the National Research Council in Ottawa, Canada. Validation results show that the model reflects the physics of the phenomenon accurately enough, so it may be used for scenario evaluation and mission design purposes. The functionalities of the simulator are described in detail in the paper, as well as its underlying model, sample cases and the methodology behind validation. Results are presented and
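
    The simulator itself uses Cosserat rods with full contact dynamics; as a much simpler illustration of a flexible-net integrator's basic loop (node states, internal spring-damper forces, explicit time step), consider the toy lumped mass-spring net below, with hypothetical parameters.

```python
import numpy as np

# Toy lumped mass-spring net (NOT the Cosserat-rod model): node positions
# and velocities, linear spring-damper forces along each thread segment,
# gravity, and one explicit Euler step. All parameters are hypothetical.
n = 5                                    # 5 x 5 grid of nodes
k, c, m, dt, g = 100.0, 0.5, 0.01, 1e-4, 9.81
rest = 1.0 / (n - 1)                     # natural length of each segment

grid = np.linspace(0.0, 1.0, n)
pos = np.stack(np.meshgrid(grid, grid), axis=-1).reshape(-1, 2)
pos = np.hstack([pos, np.zeros((n * n, 1))])   # flat net in the z = 0 plane
vel = np.zeros_like(pos)

edges = ([(i * n + j, i * n + j + 1) for i in range(n) for j in range(n - 1)]
         + [(i * n + j, (i + 1) * n + j) for i in range(n - 1) for j in range(n)])

def step(pos, vel):
    force = np.zeros_like(pos)
    force[:, 2] -= m * g                 # gravity on every node
    for a, b in edges:                   # spring-damper along each segment
        d = pos[b] - pos[a]
        length = np.linalg.norm(d)
        unit = d / length
        f = (k * (length - rest) + c * (vel[b] - vel[a]) @ unit) * unit
        force[a] += f
        force[b] -= f
    vel = vel + dt * force / m
    return pos + dt * vel, vel

pos, vel = step(pos, vel)
```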

  15. Two-Speed Gearbox Dynamic Simulation Predictions and Test Validation

    NASA Technical Reports Server (NTRS)

    Lewicki, David G.; DeSmidt, Hans; Smith, Edward C.; Bauman, Steven W.

    2010-01-01

    Dynamic simulations and experimental validation tests were performed on a two-stage, two-speed gearbox as part of the drive system research activities of the NASA Fundamental Aeronautics Subsonics Rotary Wing Project. The gearbox was driven by two electromagnetic motors and had two electromagnetic, multi-disk clutches to control output speed. A dynamic model of the system was created which included a direct current electric motor with proportional-integral-derivative (PID) speed control, a two-speed gearbox with dual electromagnetically actuated clutches, and an eddy current dynamometer. A six degree-of-freedom model of the gearbox accounted for the system torsional dynamics and included gear, clutch, shaft, and load inertias as well as shaft flexibilities and a dry clutch stick-slip friction model. Experimental validation tests were performed on the gearbox in the NASA Glenn gear noise test facility. Gearbox output speed and torque as well as drive motor speed and current were compared to those from the analytical predictions. The experiments correlate very well with the predictions, thus validating the dynamic simulation methodologies.
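
    The PID speed-control element of such a model can be sketched in a few lines: a controller driving a single rotational inertia against a speed-dependent load; the gains and plant values below are hypothetical, not those of the NASA test rig.

```python
# Minimal PID speed loop on one rotational inertia. Gains and plant values
# are hypothetical placeholders, not the gearbox rig's parameters.
J, dt = 0.05, 1e-3                 # inertia (kg m^2), time step (s)
kp, ki, kd = 2.0, 15.0, 0.01       # PID gains
target = 100.0                     # speed command (rad/s)

omega, integral, prev_err = 0.0, 0.0, target
for _ in range(5000):              # 5 s of simulated time
    err = target - omega
    integral += err * dt
    derivative = (err - prev_err) / dt
    torque = kp * err + ki * integral + kd * derivative
    load = 0.5 + 0.01 * omega      # simple speed-dependent load torque
    omega += dt * (torque - load) / J
    prev_err = err
print(f"final speed: {omega:.1f} rad/s")   # settles near the 100 rad/s command
```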

  16. Experimental validation of ultrasonic NDE simulation software

    NASA Astrophysics Data System (ADS)

    Dib, Gerges; Larche, Michael; Diaz, Aaron A.; Crawford, Susan L.; Prowant, Matthew S.; Anderson, Michael T.

    2016-02-01

    Computer modeling and simulation is becoming an essential tool for transducer design and insight into ultrasonic nondestructive evaluation (UT-NDE). As the popularity of simulation tools for UT-NDE increases, it becomes important to assess their reliability to model acoustic responses from defects in operating components and provide information that is consistent with in-field inspection data. This includes information about the detectability of different defect types for a given UT probe. Recently, a cooperative program between the Electrical Power Research Institute and the U.S. Nuclear Regulatory Commission was established to validate numerical modeling software commonly used for simulating UT-NDE of nuclear power plant components. In the first phase of this cooperative, extensive experimental UT measurements were conducted on machined notches with varying depth, length, and orientation in stainless steel plates. Then, the notches were modeled in CIVA, a semi-analytical NDE simulation platform developed by the French Commissariat a l'Energie Atomique, and their responses compared with the experimental measurements. Discrepancies between experimental and simulation results are due to either improper inputs to the simulation model, or to incorrect approximations and assumptions in the numerical models. To address the former, a variation study was conducted on the different parameters that are required as inputs for the model, specifically the specimen and transducer properties. Then, the ability of simulations to give accurate predictions regarding the detectability of the different defects was demonstrated. This includes the results in terms of the variations in defect amplitude indications, and the ratios between tip diffracted and specular signal amplitudes.

  17. Validating Semi-analytic Models of High-redshift Galaxy Formation Using Radiation Hydrodynamical Simulations

    NASA Astrophysics Data System (ADS)

    Côté, Benoit; Silvia, Devin W.; O’Shea, Brian W.; Smith, Britton; Wise, John H.

    2018-05-01

    We use a cosmological hydrodynamic simulation calculated with Enzo and the semi-analytic galaxy formation model (SAM) GAMMA to address the chemical evolution of dwarf galaxies in the early universe. The long-term goal of the project is to better understand the origin of metal-poor stars and the formation of dwarf galaxies and the Milky Way halo by cross-validating these theoretical approaches. We combine GAMMA with the merger tree of the most massive galaxy found in the hydrodynamic simulation and compare the star formation rate, the metallicity distribution function (MDF), and the age–metallicity relationship predicted by the two approaches. We found that the SAM can reproduce the global trends of the hydrodynamic simulation. However, there are degeneracies between the model parameters, and more constraints (e.g., star formation efficiency, gas flows) need to be extracted from the simulation to isolate the correct semi-analytic solution. Stochastic processes such as bursty star formation histories and star formation triggered by supernova explosions cannot be reproduced by the current version of GAMMA. Non-uniform mixing in the galaxy’s interstellar medium, coming primarily from self-enrichment by local supernovae, causes a broadening in the MDF that can be emulated in the SAM by convolving its predicted MDF with a Gaussian function having a standard deviation of ∼0.2 dex. We found that the most massive galaxy in the simulation retains nearly 100% of its baryonic mass within its virial radius, which is in agreement with what is needed in GAMMA to reproduce the global trends of the simulation.
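
    The broadening step mentioned above, convolving the SAM's predicted MDF with a Gaussian of roughly 0.2 dex, is straightforward to emulate; the sketch below uses a toy MDF, not GAMMA output.

```python
import numpy as np

# Broaden a metallicity distribution function (MDF) with a Gaussian kernel
# of sigma ~ 0.2 dex, emulating non-uniform ISM mixing. The input MDF is a
# hypothetical stand-in.
feh_bins = np.arange(-4.0, 0.0, 0.05)                    # [Fe/H] grid (dex)
mdf_sam = np.exp(-0.5 * ((feh_bins + 1.5) / 0.3) ** 2)   # toy sharp MDF
mdf_sam /= mdf_sam.sum()

sigma_dex = 0.2
kernel_x = np.arange(-1.0, 1.0 + 0.05, 0.05)
kernel = np.exp(-0.5 * (kernel_x / sigma_dex) ** 2)
kernel /= kernel.sum()

mdf_broadened = np.convolve(mdf_sam, kernel, mode="same")
print(mdf_broadened.sum())   # still ~1 away from the grid edges
```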

  18. Multireader multicase reader studies with binary agreement data: simulation, analysis, validation, and sizing.

    PubMed

    Chen, Weijie; Wunderlich, Adam; Petrick, Nicholas; Gallas, Brandon D

    2014-10-01

    We treat multireader multicase (MRMC) reader studies for which a reader's diagnostic assessment is converted to binary agreement (1: agree with the truth state, 0: disagree with the truth state). We present a mathematical model for simulating binary MRMC data with a desired correlation structure across readers, cases, and two modalities, assuming the expected probability of agreement is equal for the two modalities (P1 = P2). This model can be used to validate the coverage probabilities of 95% confidence intervals (of P1, P2, or P1 − P2 when P1 − P2 = 0), validate the type I error of a superiority hypothesis test, and size a noninferiority hypothesis test (which assumes P1 = P2). To illustrate the utility of our simulation model, we adapt the Obuchowski-Rockette-Hillis (ORH) method for the analysis of MRMC binary agreement data. Moreover, we use our simulation model to validate the ORH method for binary data and to illustrate sizing in a noninferiority setting. Our software package is publicly available on the Google code project hosting site for use in simulation, analysis, validation, and sizing of MRMC reader studies with binary agreement data.

  19. Analysis procedures and subjective flight results of a simulator validation and cue fidelity experiment

    NASA Technical Reports Server (NTRS)

    Carr, Peter C.; Mckissick, Burnell T.

    1988-01-01

    A joint experiment to investigate simulator validation and cue fidelity was conducted by the Dryden Flight Research Facility of NASA Ames Research Center (Ames-Dryden) and NASA Langley Research Center. The primary objective was to validate the use of a closed-loop pilot-vehicle mathematical model as an analytical tool for optimizing the tradeoff between simulator fidelity requirements and simulator cost. The validation process includes comparing model predictions with simulation and flight test results to evaluate various hypotheses for differences in motion and visual cues and information transfer. A group of five pilots flew air-to-air tracking maneuvers in the Langley differential maneuvering simulator and visual motion simulator and in an F-14 aircraft at Ames-Dryden. The simulators used motion and visual cueing devices including a g-seat, a helmet loader, wide field-of-view horizon, and a motion base platform.

  20. Model-based verification and validation of the SMAP uplink processes

    NASA Astrophysics Data System (ADS)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

    Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increases geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between designing the system (based on system-level requirements) and verifying those requirements and validating the system as a whole. This case study is an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes allow, by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.

  1. Development and validation of rear impact computer simulation model of an adult manual transit wheelchair with a seated occupant.

    PubMed

    Salipur, Zdravko; Bertocci, Gina

    2010-01-01

    It has been shown that ANSI WC19 transit wheelchairs that are crashworthy in frontal impact exhibit catastrophic failures in rear impact and may not be able to provide stable seating support and thus occupant protection for the wheelchair occupant. Thus far only limited sled test and computer simulation data have been available to study rear impact wheelchair safety. Computer modeling can be used as an economic and comprehensive tool to gain critical knowledge regarding wheelchair integrity and occupant safety. This study describes the development and validation of a computer model simulating an adult wheelchair-seated occupant subjected to a rear impact event. The model was developed in MADYMO and validated rigorously using the results of three similar sled tests conducted to specifications provided in the draft ISO/TC 173 standard. Outcomes from the model can provide critical wheelchair loading information to wheelchair and tiedown manufacturers, resulting in safer wheelchair designs for rear impact conditions. (c) 2009 IPEM. Published by Elsevier Ltd. All rights reserved.

  2. Current status of validation for robotic surgery simulators - a systematic review.

    PubMed

    Abboudi, Hamid; Khan, Mohammed S; Aboumarzouk, Omar; Guru, Khurshid A; Challacombe, Ben; Dasgupta, Prokar; Ahmed, Kamran

    2013-02-01

    To analyse studies validating the effectiveness of robotic surgery simulators. The MEDLINE(®), EMBASE(®) and PsycINFO(®) databases were systematically searched until September 2011. References from retrieved articles were reviewed to broaden the search. The simulator name, training tasks, participant level, training duration and evaluation scoring were extracted from each study. We also extracted data on feasibility, validity, cost-effectiveness, reliability and educational impact. We identified 19 studies investigating simulation options in robotic surgery. There are five different robotic surgery simulation platforms available on the market. In all, 11 studies sought opinion and compared performance between two different groups: 'expert' and 'novice'. Experts ranged in experience from 21 to 2200 robotic cases. The novice groups consisted of participants with no prior experience on a robotic platform and were often medical students or junior doctors. The Mimic dV-Trainer(®), ProMIS(®), SimSurgery Educational Platform(®) (SEP) and Intuitive systems have shown face, content and construct validity. The Robotic Surgical Simulator™ system has only been face and content validated. All of the simulators except SEP have shown educational impact. Feasibility and cost-effectiveness of simulation systems were not evaluated in any trial. Virtual reality simulators were shown to be effective training tools for junior trainees. Simulation training holds the greatest potential to be used as an adjunct to traditional training methods to equip the next generation of robotic surgeons with the skills required to operate safely. However, current simulation models have only been validated in small studies. There is no evidence to suggest one type of simulator provides more effective training than any other. More research is needed to validate simulated environments further and investigate the effectiveness of animal and cadaveric training in robotic surgery. © 2012 BJU

  3. Validation of Groundwater Models: Meaningful or Meaningless?

    NASA Astrophysics Data System (ADS)

    Konikow, L. F.

    2003-12-01

    Although numerical simulation models are valuable tools for analyzing groundwater systems, their predictive accuracy is limited. People who apply groundwater flow or solute-transport models, as well as those who make decisions based on model results, naturally want assurance that a model is "valid." To many people, model validation implies some authentication of the truth or accuracy of the model. History matching is often presented as the basis for model validation. Although such model calibration is a necessary modeling step, it is simply insufficient for model validation. Because of parameter uncertainty and solution non-uniqueness, declarations of validation (or verification) of a model are not meaningful. Post-audits represent a useful means to assess the predictive accuracy of a site-specific model, but they require the existence of long-term monitoring data. Model testing may yield invalidation, but that is an opportunity to learn and to improve the conceptual and numerical models. Examples of post-audits and of the application of a solute-transport model to a radioactive waste disposal site illustrate deficiencies in model calibration, prediction, and validation.

  4. Cultural Geography Model Validation

    DTIC Science & Technology

    2010-03-01

    the Cultural Geography Model (CGM), a government-owned, open source multi-agent system utilizing Bayesian networks, queuing systems, the Theory of... referent determined either from theory or SME opinion. 4. CGM Overview: The CGM is a government-owned, open source, data-driven multi-agent social... HSCB, validation, social network analysis. ABSTRACT: In the current warfighting environment, the military needs robust modeling and simulation (M&S

  5. Microscopic simulation model calibration and validation handbook.

    DOT National Transportation Integrated Search

    2006-01-01

    Microscopic traffic simulation models are widely used in the transportation engineering field. Because of their cost-effectiveness, risk-free nature, and high-speed benefits, areas of use include transportation system design, traffic operations, and ...

  6. Multireader multicase reader studies with binary agreement data: simulation, analysis, validation, and sizing

    PubMed Central

    Chen, Weijie; Wunderlich, Adam; Petrick, Nicholas; Gallas, Brandon D.

    2014-01-01

    Abstract. We treat multireader multicase (MRMC) reader studies for which a reader’s diagnostic assessment is converted to binary agreement (1: agree with the truth state, 0: disagree with the truth state). We present a mathematical model for simulating binary MRMC data with a desired correlation structure across readers, cases, and two modalities, assuming the expected probability of agreement is equal for the two modalities (P1=P2). This model can be used to validate the coverage probabilities of 95% confidence intervals (of P1, P2, or P1−P2 when P1−P2=0), validate the type I error of a superiority hypothesis test, and size a noninferiority hypothesis test (which assumes P1=P2). To illustrate the utility of our simulation model, we adapt the Obuchowski–Rockette–Hillis (ORH) method for the analysis of MRMC binary agreement data. Moreover, we use our simulation model to validate the ORH method for binary data and to illustrate sizing in a noninferiority setting. Our software package is publicly available on the Google code project hosting site for use in simulation, analysis, validation, and sizing of MRMC reader studies with binary agreement data. PMID:26158051
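
    A minimal sketch of this style of validation appears below. It draws correlated binary agreement data from a latent-Gaussian (probit) model with reader and case random effects, which is one simplified way to induce the desired correlation structure (the paper's actual model is more general), and tallies the empirical coverage of a naive 95% Wald interval for P1. All variance components and sizes are invented.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)
        n_readers, n_cases, p1 = 5, 100, 0.8
        mu = norm.ppf(p1)                      # latent mean so that E[agreement] = p1
        var_reader, var_case = 0.1, 0.2        # illustrative variance components
        var_eps = 1.0 - var_reader - var_case  # total latent variance fixed at 1

        covered, n_trials = 0, 2000
        for _ in range(n_trials):
            r = rng.normal(0.0, np.sqrt(var_reader), (n_readers, 1))
            c = rng.normal(0.0, np.sqrt(var_case), (1, n_cases))
            eps = rng.normal(0.0, np.sqrt(var_eps), (n_readers, n_cases))
            agree = (mu + r + c + eps > 0).astype(float)   # 1 = agrees with truth
            s = agree.mean(axis=0)                         # reader-averaged score per case
            p_hat = s.mean()
            se = s.std(ddof=1) / np.sqrt(n_cases)          # case-only (naive) standard error
            covered += p_hat - 1.96 * se <= p1 <= p_hat + 1.96 * se

        # Typically lands below 0.95: the naive interval ignores reader variability,
        # which is exactly the kind of defect such simulations are built to expose.
        print(covered / n_trials)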

  7. Calibration and validation of rockfall models

    NASA Astrophysics Data System (ADS)

    Frattini, Paolo; Valagussa, Andrea; Zenoni, Stefania; Crosta, Giovanni B.

    2013-04-01

    Calibrating and validating landslide models is extremely difficult due to the particular characteristics of landslides: limited recurrence in time, relatively low frequency of events, short durability of post-event traces, and poor availability of continuous monitoring data, especially for small landslides and rockfalls. For this reason, most rockfall models presented in the literature completely lack calibration and validation of the results. In this contribution, we explore different strategies for rockfall model calibration and validation starting from both a historical event and a full-scale field test. The event occurred in 2012 in Courmayeur (Western Alps, Italy) and caused serious damage to quarrying facilities. It was studied soon after its occurrence through a field campaign aimed at mapping the blocks arrested along the slope, the shape and location of the detachment area, and the traces of scars associated with impacts of blocks on the slope. The full-scale field test was performed by Geovert Ltd in the Christchurch area (New Zealand) after the 2011 earthquake. During the test, a number of large blocks were mobilized from the upper part of the slope and filmed with high-velocity cameras from different viewpoints. The movies of each released block were analysed to identify the block shape, the propagation path, the location of impacts, the height of the trajectory, and the velocity of the block along the path. Both calibration and validation of rockfall models should be based on optimizing the agreement between the actual trajectories or locations of arrested blocks and the simulated ones. A measure that describes this agreement is therefore needed. For calibration purposes, this measure should be simple enough to allow trial-and-error repetitions of the model for parameter optimization. In this contribution we explore different calibration/validation measures: (1) the percentage of simulated blocks arresting within a buffer of the

  8. Development and validation of a piloted simulation of a helicopter and external sling load

    NASA Technical Reports Server (NTRS)

    Shaughnessy, J. D.; Deaux, T. N.; Yenni, K. R.

    1979-01-01

    A generalized, real-time, piloted, visual simulation of a single-rotor helicopter, suspension system, and external load is described and validated for the full flight envelope of the U.S. Army CH-54 helicopter and cargo container as an example. The mathematical model described uses modified nonlinear classical rotor theory for both the main rotor and tail rotor, nonlinear fuselage aerodynamics, an elastic suspension system, nonlinear load aerodynamics, and a load-ground contact model. The implementation of the mathematical model on a large digital computing system is described, and validation of the simulation is discussed. The mathematical model is validated by comparing measured flight data with simulated data, by comparing linearized system matrices, eigenvalues, and eigenvectors with manufacturers' data, and by the subjective comparison of handling characteristics by experienced pilots. A visual landing display system for use in simulation, which generates the pilot's forward-looking real-world display, was examined, and a special head-up, down-looking load/landing zone display is described.

  9. Flight code validation simulator

    NASA Astrophysics Data System (ADS)

    Sims, Brent A.

    1996-05-01

    An end-to-end simulation capability for software development and validation of missile flight software on the actual embedded computer has been developed utilizing a 486 PC, an i860 DSP coprocessor, an embedded flight computer, and custom dual-port memory interface hardware. This system allows real-time, interrupt-driven embedded flight software development and checkout. The flight software runs in a Sandia Digital Airborne Computer and reads and writes actual hardware sensor locations in which Inertial Measurement Unit data resides. The simulator provides six-degree-of-freedom real-time dynamic simulation and accurate real-time discrete sensor data, and acts on commands and discretes from the flight computer. This system was utilized in the development and validation of the successful premier flight of the Digital Miniature Attitude Reference System in January 1995 at the White Sands Missile Range on a two-stage attitude-controlled sounding rocket.

  10. Simulation of Turbulent Flow Inside and Above Wind Farms: Model Validation and Layout Effects

    NASA Astrophysics Data System (ADS)

    Wu, Yu-Ting; Porté-Agel, Fernando

    2013-02-01

    A recently-developed large-eddy simulation framework is validated and used to investigate turbulent flow within and above wind farms under neutral conditions. Two different layouts are considered, consisting of thirty wind turbines occupying the same total area and arranged in aligned and staggered configurations, respectively. The subgrid-scale (SGS) turbulent stress is parametrized using a tuning-free Lagrangian scale-dependent dynamic SGS model. The turbine-induced forces are modelled using two types of actuator-disk models: (a) the `standard' actuator-disk model (ADM-NR), which calculates only the thrust force based on one-dimensional momentum theory and distributes it uniformly over the rotor area; and (b) the actuator-disk model with rotation (ADM-R), which uses blade-element momentum theory to calculate the lift and drag forces (that produce both thrust and rotation), and distributes them over the rotor disk based on the local blade and flow characteristics. Validation is performed by comparing simulation results with turbulence measurements collected with hot-wire anemometry inside and above an aligned model wind farm placed in a boundary-layer wind tunnel. In general, the ADM-R model yields improved predictions compared with the ADM-NR in the wakes of all the wind turbines, where including turbine-induced flow rotation and accounting for the non-uniformity of the turbine-induced forces in the ADM-R appear to be important. Another advantage of the ADM-R model is that, unlike the ADM-NR, it does not require a priori specification of the thrust coefficient (which varies within a wind farm). Finally, comparison of simulations of flow through both aligned and staggered wind farms shows important effects of farm layout on the flow structure and wind-turbine performance. For the limited-size wind farms considered in this study, the lateral interaction between cumulated wakes is stronger in the staggered case, which results in a farm wake that is more homogeneous
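
    For reference, the sketch below evaluates the thrust used by the ADM-NR: one-dimensional momentum theory with a prescribed thrust coefficient, distributed uniformly over the rotor area. The density, rotor diameter, thrust coefficient, and inflow speed are generic stand-ins, not the wind-tunnel values.

        import numpy as np

        rho, d, ct, u_inf = 1.2, 0.15, 0.8, 2.5    # air density (kg/m^3), rotor diameter (m),
                                                   # thrust coefficient, inflow speed (m/s)
        area = np.pi * (d / 2.0) ** 2
        thrust = 0.5 * rho * ct * area * u_inf ** 2   # T = 1/2 rho Ct A U^2, in newtons
        force_per_area = thrust / area                # uniform loading over the disk
        print(thrust, force_per_area)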

  11. Large Scale Simulation Platform for NODES Validation Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sotorrio, P.; Qin, Y.; Min, L.

    2017-04-27

    This report summarizes the Large Scale (LS) simulation platform created for the Eaton NODES project. The simulation environment consists of both a wholesale market simulator and a distribution simulator and includes the CAISO wholesale market model and a PG&E footprint of 25-75 feeders to validate scalability under a scenario of 33% RPS in California, with an additional 17% of DERs coming from distribution and customers. The simulator can generate hourly unit commitment, 5-minute economic dispatch, and 4-second AGC regulation signals. The simulator is also capable of simulating greater than 10k individual controllable devices. Simulated DERs include water heaters, EVs, residential and light commercial HVAC/buildings, and residential-level battery storage. Feeder-level voltage regulators and capacitor banks are also simulated for feeder-level real and reactive power management and Volt/Var control.

  12. An Integrated Approach to Conversion, Verification, Validation and Integrity of AFRL Generic Engine Model and Simulation (Postprint)

    DTIC Science & Technology

    2007-02-01

    Motivation for Modeling and Simulation Work; the Augmented Generic Engine Model (AGEM); Model Verification and Validation (V&V); Assessment of AGEM V&V

  13. Validation of Solar Sail Simulations for the NASA Solar Sail Demonstration Project

    NASA Technical Reports Server (NTRS)

    Braafladt, Alexander C.; Artusio-Glimpse, Alexandra B.; Heaton, Andrew F.

    2014-01-01

    NASA's Solar Sail Demonstration project partner L'Garde is currently assembling a flight-like sail assembly for a series of ground demonstration tests beginning in 2015. For future missions of this sail that might validate solar sail technology, it is necessary to have an accurate sail thrust model. One of the primary requirements of a proposed potential technology validation mission will be to demonstrate solar sail thrust over a set time period, which for this project is nominally 30 days. This requirement would be met by comparing a L'Garde-developed trajectory simulation to the as-flown trajectory. The current sail simulation baseline for L'Garde is a Systems Tool Kit (STK) plug-in that includes a custom-designed model of the L'Garde sail. The STK simulation has been verified for a flat-plate model by comparing it to the NASA-developed Solar Sail Spaceflight Simulation Software (S5). S5 matched STK with a high degree of accuracy, and the results of the validation indicate that the L'Garde STK model is accurate enough to meet the potential future mission requirements. Additionally, since the L'Garde sail deviates considerably from a flat plate, a force model for a non-flat sail provided by L'Garde was also tested and compared to a flat-plate model in S5. This result will be used in the future as a basis of comparison to the non-flat sail model being developed for STK.

  14. Validation and Simulation of Ares I Scale Model Acoustic Test - 3 - Modeling and Evaluating the Effect of Rainbird Water Deluge Inclusion

    NASA Technical Reports Server (NTRS)

    Strutzenberg, Louise L.; Putman, Gabriel C.

    2011-01-01

    The Ares I Scale Model Acoustics Test (ASMAT) is a series of live-fire tests of scaled rocket motors meant to simulate the conditions of the Ares I launch configuration. These tests have provided a well-documented set of high-fidelity measurements useful for validation, including data taken over a range of test conditions and containing phenomena like Ignition Over-Pressure and water suppression of acoustics. Building on dry simulations of the ASMAT tests with the vehicle at 5 ft elevation (100 ft real-vehicle elevation), wet simulations of the ASMAT test setup have been performed using the Loci/CHEM computational fluid dynamics software to explore the effect of including rainbird water suppression on the launch platform deck. Two-phase water simulation has been performed using an energy- and mass-coupled Lagrangian particle system module in which liquid-phase emissions are segregated into clouds of virtual particles and gas-phase mass transfer is accomplished through simple Weber-number-controlled breakup and boiling models. Comparisons have been performed to the dry 5 ft elevation cases, using configurations with and without launch mounts. These cases have been used to explore the interaction between rainbird spray patterns and launch mount geometry and to evaluate the acoustic sound pressure level knockdown achieved through above-deck rainbird deluge inclusion. This comparison has been anchored with validation from live-fire test data, which showed a reduction in rainbird effectiveness in the presence of a launch mount.

  15. Validation of Broadband Ground Motion Simulations for Japanese Crustal Earthquakes by the Recipe

    NASA Astrophysics Data System (ADS)

    Iwaki, A.; Maeda, T.; Morikawa, N.; Miyake, H.; Fujiwara, H.

    2015-12-01

    The Headquarters for Earthquake Research Promotion (HERP) of Japan has organized the broadband ground motion simulation method into a standard procedure called the "recipe" (HERP, 2009). In the recipe, the source rupture is represented by the characterized source model (Irikura and Miyake, 2011). The broadband ground motion time histories are computed by a hybrid approach: the 3-D finite-difference method (Aoi et al. 2004) and the stochastic Green's function method (Dan and Sato, 1998; Dan et al. 2000) for the long- (> 1 s) and short-period (< 1 s) components, respectively, using the 3-D velocity structure model. As the engineering significance of scenario earthquake ground motion prediction is increasing, thorough verification and validation are required for the simulation methods. This study presents the self-validation of the recipe for two MW6.6 crustal events in Japan, the 2000 Tottori and 2004 Chuetsu (Niigata) earthquakes. We first compare the simulated velocity time series with the observation. Main features of the velocity waveforms, such as the near-fault pulses and the large later phases on deep sediment sites are well reproduced by the simulations. Then we evaluate 5% damped pseudo acceleration spectra (PSA) in the framework of the SCEC Broadband Platform (BBP) validation (Dreger et al. 2015). The validation results are generally acceptable in the period range 0.1 - 10 s, whereas those in the shortest period range (0.01-0.1 s) are less satisfactory. We also evaluate the simulations with the 1-D velocity structure models used in the SCEC BBP validation exercise. Although the goodness-of-fit parameters for PSA do not significantly differ from those for the 3-D velocity structure model, noticeable differences in velocity waveforms are observed. Our results suggest the importance of 1) well-constrained 3-D velocity structure model for broadband ground motion simulations and 2) evaluation of time series of ground motion as well as response spectra.
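
    The hybrid combination step can be pictured with a pair of complementary filters at the 1 s crossover, as in the schematic sketch below. The random traces stand in for the finite-difference and stochastic Green's function waveforms, and the fourth-order Butterworth matching filters are an assumption, not the recipe's exact filters.

        import numpy as np
        from scipy.signal import butter, filtfilt

        fs, f_match = 100.0, 1.0                  # sampling rate (Hz), 1 s crossover (Hz)
        t = np.arange(0.0, 40.0, 1.0 / fs)
        det_wave = np.random.default_rng(1).normal(size=t.size)   # stand-in deterministic (> 1 s) trace
        sto_wave = np.random.default_rng(2).normal(size=t.size)   # stand-in stochastic (< 1 s) trace

        b_lo, a_lo = butter(4, f_match / (fs / 2.0), "lowpass")
        b_hi, a_hi = butter(4, f_match / (fs / 2.0), "highpass")
        broadband = filtfilt(b_lo, a_lo, det_wave) + filtfilt(b_hi, a_hi, sto_wave)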

  16. Prototyping and validating requirements of radiation and nuclear emergency plan simulator

    NASA Astrophysics Data System (ADS)

    Hamid, AHA.; Rozan, MZA.; Ibrahim, R.; Deris, S.; Selamat, A.

    2015-04-01

    Organizational incapability produces unrealistic, impractical, inadequate, and ambiguous mechanisms in radiological and nuclear emergency preparedness and response (EPR) plans, causing emergency plan disorder and severe disasters. In 65.6% of cases, these situations result from poorly defined and unidentified roles and duties of the disaster coordinator. Such unexpected conditions bring huge aftermath to the first responders, operators, workers, patients, and the community at large. Hence, in this report, we discuss the prototyping and validation of the Malaysian radiation and nuclear emergency preparedness and response plan simulation model (EPRM). A prototyping technique was required to formalize the simulation model requirements. Prototyping was carried out as systems-requirements validation to endorse the correctness of the model itself against the stakeholders' intentions in resolving the organizational incapability described above. We made assumptions for the proposed emergency preparedness and response model (EPRM) through the simulation software. Those assumptions provide two sets of expected mechanisms: planning and handling of the respective emergency plan, and bringing off the hazard involved. The resulting model, the RANEPF (Radiation and Nuclear Emergency Planning Framework) simulator, demonstrates training emergency-response prerequisites rather than intervention principles alone. The demonstrations involved determining the casualties' absorbed-dose range screening and coordinating the capacity planning of the expected trauma triage. Through user-centred design and a sociotechnical approach, the RANEPF simulator was strategized and simplified, though it is certainly still complex.

  17. Validation: Codes to compare simulation data to various observations

    NASA Astrophysics Data System (ADS)

    Cohn, J. D.

    2017-02-01

    Validation provides codes to compare several observations to simulated data: simulated stellar mass and star formation rate; the simulated stellar mass function against observed stellar mass functions from PRIMUS or SDSS-GALEX in several redshift bins spanning 0.01-1.0; and the simulated B-band luminosity function against the observed stellar mass function. It also creates plots for various attributes, including stellar mass functions and stellar mass to halo mass. These codes can model predictions (in some cases alongside observational data) to test other mock catalogs.

  18. Development and validation of a modified Hybrid-III six-year-old dummy model for simulating submarining in motor-vehicle crashes.

    PubMed

    Hu, Jingwen; Klinich, Kathleen D; Reed, Matthew P; Kokkolaras, Michael; Rupp, Jonathan D

    2012-06-01

    In motor-vehicle crashes, young school-aged children restrained by vehicle seat belt systems often suffer from abdominal injuries due to submarining. However, the current anthropomorphic test device, so-called "crash dummy", is not adequate for proper simulation of submarining. In this study, a modified Hybrid-III six-year-old dummy model capable of simulating and predicting submarining was developed using MADYMO (TNO Automotive Safety Solutions). The model incorporated improved pelvis and abdomen geometry and properties previously tested in a modified physical dummy. The model was calibrated and validated against four sled tests under two test conditions with and without submarining using a multi-objective optimization method. A sensitivity analysis using this validated child dummy model showed that dummy knee excursion, torso rotation angle, and the difference between head and knee excursions were good predictors for submarining status. It was also shown that restraint system design variables, such as lap belt angle, D-ring height, and seat coefficient of friction (COF), may have opposite effects on head and abdomen injury risks; therefore child dummies and dummy models capable of simulating submarining are crucial for future restraint system design optimization for young school-aged children. Copyright © 2011 IPEM. Published by Elsevier Ltd. All rights reserved.

  19. Multiphysics Simulation of Welding-Arc and Nozzle-Arc System: Mathematical-Model, Solution-Methodology and Validation

    NASA Astrophysics Data System (ADS)

    Pawar, Sumedh; Sharma, Atul

    2018-01-01

    This work presents a mathematical model and solution methodology for a multiphysics engineering problem: arc formation during welding and inside a nozzle. The general-purpose commercial CFD solver ANSYS FLUENT 13.0.0 is used in this work. Arc formation involves strongly coupled gas dynamics and electrodynamics, simulated by solving the coupled Navier-Stokes equations, Maxwell's equations, and the radiation heat-transfer equation. Validation of the present numerical methodology is demonstrated by excellent agreement with published results. The developed mathematical model and the user-defined functions (UDFs) are independent of the geometry and are applicable to any system that involves arc formation in a 2D axisymmetric coordinate system. The high-pressure flow of SF6 gas in the nozzle-arc system resembles the arc chamber of an SF6 gas circuit breaker; thus, this methodology can be extended to simulate the arcing phenomenon during current interruption.

  20. Development and validation of the Simulation Learning Effectiveness Inventory.

    PubMed

    Chen, Shiah-Lian; Huang, Tsai-Wei; Liao, I-Chen; Liu, Chienchi

    2015-10-01

    To develop and psychometrically test the Simulation Learning Effectiveness Inventory. High-fidelity simulation helps students develop clinical skills and competencies, yet reliable instruments measuring learning outcomes are scant. A descriptive cross-sectional survey was used to validate the psychometric properties of the instrument, which measures students' perception of simulation learning effectiveness. A purposive sample of 505 nursing students who had taken simulation courses was recruited from the department of nursing of a university in central Taiwan from January 2010 to June 2010. The study was conducted in two phases. In Phase I, question items were developed based on the literature review, and the preliminary psychometric properties of the inventory were evaluated using exploratory factor analysis. Phase II was conducted to evaluate the reliability and validity of the finalized inventory using confirmatory factor analysis. The results of exploratory and confirmatory factor analyses revealed that the instrument was composed of seven factors, named course arrangement, equipment resource, debriefing, clinical ability, problem-solving, confidence, and collaboration. A further second-order analysis showed comparable fits between a three second-order factor model (preparation, process, and outcome) and the seven first-order factor model. Internal consistency was supported by adequate Cronbach's alphas and composite reliability. Convergent and discriminant validities were also supported by confirmatory factor analysis. The study provides evidence that the Simulation Learning Effectiveness Inventory is reliable and valid for measuring student perception of learning effectiveness. The instrument is helpful in building evidence-based knowledge of the effect of simulation teaching on students' learning outcomes. © 2015 John Wiley & Sons Ltd.

  1. Validation of the updated ArthroS simulator: face and construct validity of a passive haptic virtual reality simulator with novel performance metrics.

    PubMed

    Garfjeld Roberts, Patrick; Guyver, Paul; Baldwin, Mathew; Akhtar, Kash; Alvand, Abtin; Price, Andrew J; Rees, Jonathan L

    2017-02-01

    To assess the construct and face validity of ArthroS, a passive haptic VR simulator; a secondary aim was to evaluate the novel performance metrics produced by this simulator. Two groups of 30 participants, each divided into novice, intermediate or expert based on arthroscopic experience, completed three separate tasks on either the knee or shoulder module of the simulator. Performance was recorded using 12 automatically generated performance metrics and video footage of the arthroscopic procedures. The videos were blindly assessed using a validated global rating scale (GRS). Participants completed a survey about the simulator's realism and training utility. This new simulator demonstrated construct validity of its tasks when evaluated against a GRS (p ≤ 0.003 in all cases). Regarding its automatically generated performance metrics, established outputs such as time taken (p ≤ 0.001) and instrument path length (p ≤ 0.007) also demonstrated good construct validity. However, two-thirds of the proposed 'novel metrics' the simulator reports could not distinguish participants based on arthroscopic experience. Face validity assessment rated the simulator as a realistic and useful tool for trainees, but the passive haptic feedback (a key feature of this simulator) was rated as less realistic. The ArthroS simulator has good task construct validity based on established objective outputs, but some of the novel performance metrics could not distinguish between levels of surgical experience. The passive haptic feedback of the simulator also needs improvement. If simulators could offer automated and validated performance feedback, this would facilitate improvements in the delivery of training by allowing trainees to practise and self-assess.

  2. Development of a Conservative Model Validation Approach for Reliable Analysis

    DTIC Science & Technology

    2015-01-01

    ...obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the... In Section 3, the proposed conservative model validation is briefly compared to the conventional model validation approach. Section 4 describes how to account

  3. Partial Validation of Multibody Program to Optimize Simulated Trajectories II (POST II) Parachute Simulation With Interacting Forces

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Ben; Queen, Eric M.

    2002-01-01

    A capability to simulate trajectories of multiple interacting rigid bodies has been developed. This capability uses the Program to Optimize Simulated Trajectories II (POST II). Previously, POST II had the ability to simulate multiple bodies without interacting forces. The current implementation is used for the simulation of parachute trajectories, in which the parachute and suspended bodies can be treated as rigid bodies. An arbitrary set of connecting lines can be included in the model and are treated as massless spring-dampers. This paper discusses details of the connection line modeling and results of several test cases used to validate the capability.

  4. Model Validation Against The Modelers’ Data Archive

    DTIC Science & Technology

    2014-08-01

    completion of the planned Jack Rabbit 2 field trials. The relevant task for the effort addressed here is Task 4 of the current Interagency Agreement, as...readily simulates the Prairie Grass sulfur dioxide plumes. Also, Jack Rabbit II field trials are set to be completed during FY16. Once these data are...available, they will also be used to validate the combined models. This validation may prove to be more useful, as the Jack Rabbit II will release

  5. Validation and Continued Development of Methods for Spheromak Simulation

    NASA Astrophysics Data System (ADS)

    Benedett, Thomas

    2016-10-01

    The HIT-SI experiment has demonstrated stable sustainment of spheromaks. Determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and study the effect of possible design choices on plasma behavior. A zero-beta Hall-MHD model has shown good agreement with experimental data at 14.5 kHz injector operation. Experimental observations at higher frequency, where the best performance is achieved, indicate pressure effects are important and likely required to attain quantitative agreement with simulations. Efforts to extend the existing validation to high frequency (36-68 kHz) using an extended MHD model implemented in the PSI-TET arbitrary-geometry 3D MHD code will be presented. An implementation of anisotropic viscosity, a feature observed to improve agreement between NIMROD simulations and experiment, will also be presented, along with investigations of flux conserver features and their impact on density control for future SIHI experiments. Work supported by DoE.

  6. Competency-Based Training and Simulation: Making a "Valid" Argument.

    PubMed

    Noureldin, Yasser A; Lee, Jason Y; McDougall, Elspeth M; Sweet, Robert M

    2018-02-01

    The use of simulation as an assessment tool is much more controversial than is its utility as an educational tool. However, without valid simulation-based assessment tools, the ability to objectively assess technical skill competencies in a competency-based medical education framework will remain challenging. The current literature in urologic simulation-based training and assessment uses a definition and framework of validity that is now outdated. This is probably due to the absence of awareness rather than an absence of comprehension. The following review article provides the urologic community an updated taxonomy on validity theory as it relates to simulation-based training and assessments and translates our simulation literature to date into this framework. While the old taxonomy considered validity as distinct subcategories and focused on the simulator itself, the modern taxonomy, for which we translate the literature evidence, considers validity as a unitary construct with a focus on interpretation of simulator data/scores.

  7. Validation of a Monte Carlo simulation of the Inveon PET scanner using GATE

    NASA Astrophysics Data System (ADS)

    Lu, Lijun; Zhang, Houjin; Bian, Zhaoying; Ma, Jianhua; Feng, Qiangjin; Chen, Wufan

    2016-08-01

    The purpose of this study is to validate the application of the GATE (Geant4 Application for Tomographic Emission) Monte Carlo simulation toolkit for modeling the performance characteristics of the Siemens Inveon small animal PET system. The simulation results were validated against experimental/published data in accordance with the NEMA NU-4 2008 protocol for standardized evaluation of the spatial resolution, sensitivity, scatter fraction (SF), and noise-equivalent count rate (NECR) of a preclinical PET system. The radial, tangential, and axial spatial resolutions of the simulated and experimental results agreed to within 18%. The simulated peak NECR of the mouse-size phantom agreed with the experimental result, while for the rat-size phantom the simulated value was higher than the experimental result. The simulated and experimental SFs of the mouse- and rat-size phantoms agreed to within 2%. These results demonstrate the feasibility of our GATE model to accurately simulate, within certain limits, all major performance characteristics of the Inveon PET system.
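
    Two of the NEMA NU-4 figures of merit named above reduce to simple count-rate arithmetic; the sketch below shows both, with placeholder count rates rather than Inveon measurements.

        def scatter_fraction(trues, scatters):
            """Scatter fraction, SF = S / (T + S)."""
            return scatters / (trues + scatters)

        def necr(trues, scatters, randoms):
            """Noise-equivalent count rate, NECR = T^2 / (T + S + R)."""
            return trues ** 2 / (trues + scatters + randoms)

        print(scatter_fraction(200e3, 20e3), necr(200e3, 20e3, 50e3))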

  8. I-15 San Diego, California, model validation and calibration report.

    DOT National Transportation Integrated Search

    2010-02-01

    The Integrated Corridor Management (ICM) initiative requires the calibration and validation of simulation models used in the Analysis, Modeling, and Simulation of Pioneer Site proposed integrated corridors. This report summarizes the results and proc...

  9. Validation of a model to investigate the effects of modifying cardiovascular disease (CVD) risk factors on the burden of CVD: the rotterdam ischemic heart disease and stroke computer simulation (RISC) model.

    PubMed

    van Kempen, Bob J H; Ferket, Bart S; Hofman, Albert; Steyerberg, Ewout W; Colkesen, Ersen B; Boekholdt, S Matthijs; Wareham, Nicholas J; Khaw, Kay-Tee; Hunink, M G Myriam

    2012-12-06

    We developed a Monte Carlo Markov model designed to investigate the effects of modifying cardiovascular disease (CVD) risk factors on the burden of CVD. Internal, predictive, and external validity of the model have not yet been established. The Rotterdam Ischemic Heart Disease and Stroke Computer Simulation (RISC) model was developed using data covering 5 years of follow-up from the Rotterdam Study. To prove 1) internal and 2) predictive validity, the incidences of coronary heart disease (CHD), stroke, CVD death, and non-CVD death simulated by the model over a 13-year period were compared with those recorded for 3,478 participants in the Rotterdam Study with at least 13 years of follow-up. 3) External validity was verified using 10 years of follow-up data from the European Prospective Investigation of Cancer (EPIC)-Norfolk study of 25,492 participants, for whom CVD and non-CVD mortality was compared. At year 5, the observed incidences (with simulated incidences in brackets) of CHD, stroke, and CVD and non-CVD mortality for the 3,478 Rotterdam Study participants were 5.30% (4.68%), 3.60% (3.23%), 4.70% (4.80%), and 7.50% (7.96%), respectively. At year 13, these percentages were 10.60% (10.91%), 9.90% (9.13%), 14.20% (15.12%), and 24.30% (23.42%). After recalibrating the model for the EPIC-Norfolk population, the 10-year observed (simulated) incidences of CVD and non-CVD mortality were 3.70% (4.95%) and 6.50% (6.29%). All observed incidences fell well within the 95% credibility intervals of the simulated incidences. We have confirmed the internal, predictive, and external validity of the RISC model. These findings provide a basis for analyzing the effects of modifying cardiovascular disease risk factors on the burden of CVD with the RISC model.
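
    The sketch below is a deliberately stripped-down Monte Carlo state-transition model in the same spirit: individuals move yearly from 'well' into CHD, stroke, CVD death, or non-CVD death, and cumulative incidences are read off at year 13. The transition probabilities are invented placeholders, not the Rotterdam Study estimates, and unlike the RISC model the non-well states here are absorbing.

        import numpy as np

        rng = np.random.default_rng(42)
        states = ["well", "chd", "stroke", "cvd_death", "noncvd_death"]
        p_from_well = np.array([0.975, 0.009, 0.006, 0.004, 0.006])  # annual, sums to 1

        n, years = 3478, 13
        state = np.zeros(n, dtype=int)                 # everyone starts 'well'
        for _ in range(years):
            well = state == 0
            state[well] = rng.choice(len(states), size=well.sum(), p=p_from_well)

        for k, name in enumerate(states[1:], start=1):
            print(f"{name}: {100 * (state == k).mean():.1f}% cumulative incidence")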

  10. Material model validation for laser shock peening process simulation

    NASA Astrophysics Data System (ADS)

    Amarchinta, H. K.; Grandhi, R. V.; Langer, K.; Stargel, D. S.

    2009-01-01

    Advanced mechanical surface enhancement techniques have been used successfully to increase the fatigue life of metallic components. These techniques impart deep compressive residual stresses into the component to counter potentially damage-inducing tensile stresses generated under service loading. Laser shock peening (LSP) is an advanced mechanical surface enhancement technique used predominantly in the aircraft industry. To reduce costs and make the technique available on a large-scale basis for industrial applications, simulation of the LSP process is required. Accurate simulation of the LSP process is a challenging task, because the process has many parameters, such as laser spot size, pressure profile, and material model, that must be precisely determined. This work focuses on investigating the appropriate material model to use in simulation and design. In the LSP process the material is subjected to strain rates of 10^6 s^-1, which is very high compared with conventional strain rates. The importance of an accurate material model increases because the material behaves significantly differently at such high strain rates. This work investigates the effect of multiple nonlinear material models for representing the elastic-plastic behavior of materials. Elastic perfectly plastic, Johnson-Cook, and Zerilli-Armstrong models are used, and the performance of each model is compared with available experimental results.
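
    Of the three models compared, the Johnson-Cook form makes the strain-rate dependence explicit; a minimal sketch follows. The parameter values are illustrative assumptions, not the calibrated LSP inputs.

        import numpy as np

        def johnson_cook(strain, strain_rate, temp, A=520e6, B=477e6, n=0.4,
                         C=0.025, eps0=1.0, T_room=293.0, T_melt=1900.0, m=1.0):
            """sigma = (A + B*eps^n) * (1 + C*ln(epsdot/eps0)) * (1 - T*^m), SI units."""
            t_star = (temp - T_room) / (T_melt - T_room)
            return ((A + B * strain ** n)
                    * (1.0 + C * np.log(strain_rate / eps0))
                    * (1.0 - t_star ** m))

        # Flow stress at 2% plastic strain and the ~1e6 1/s rates typical of LSP
        print(johnson_cook(strain=0.02, strain_rate=1e6, temp=293.0) / 1e6, "MPa")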

  11. CFD Simulation and Experimental Validation of Fluid Flow and Particle Transport in a Model of Alveolated Airways

    PubMed Central

    Ma, Baoshun; Ruwet, Vincent; Corieri, Patricia; Theunissen, Raf; Riethmuller, Michel; Darquenne, Chantal

    2009-01-01

    Accurate modeling of air flow and aerosol transport in the alveolated airways is essential for quantitative predictions of pulmonary aerosol deposition. However, experimental validation of such modeling studies has been scarce. The objective of this study is to validate CFD predictions of flow field and particle trajectory with experiments within a scaled-up model of alveolated airways. Steady flow (Re = 0.13) of silicone oil was captured by particle image velocimetry (PIV), and the trajectories of 0.5 mm and 1.2 mm spherical iron beads (representing 0.7 to 14.6 μm aerosol in vivo) were obtained by particle tracking velocimetry (PTV). At twelve selected cross sections, the velocity profiles obtained by CFD matched well with those by PIV (within 1.7% on average). The CFD predicted trajectories also matched well with PTV experiments. These results showed that air flow and aerosol transport in models of human alveolated airways can be simulated by CFD techniques with reasonable accuracy. PMID:20161301

  12. CFD Simulation and Experimental Validation of Fluid Flow and Particle Transport in a Model of Alveolated Airways.

    PubMed

    Ma, Baoshun; Ruwet, Vincent; Corieri, Patricia; Theunissen, Raf; Riethmuller, Michel; Darquenne, Chantal

    2009-05-01

    Accurate modeling of air flow and aerosol transport in the alveolated airways is essential for quantitative predictions of pulmonary aerosol deposition. However, experimental validation of such modeling studies has been scarce. The objective of this study is to validate CFD predictions of flow field and particle trajectory with experiments within a scaled-up model of alveolated airways. Steady flow (Re = 0.13) of silicone oil was captured by particle image velocimetry (PIV), and the trajectories of 0.5 mm and 1.2 mm spherical iron beads (representing 0.7 to 14.6 μm aerosol in vivo) were obtained by particle tracking velocimetry (PTV). At twelve selected cross sections, the velocity profiles obtained by CFD matched well with those by PIV (within 1.7% on average). The CFD predicted trajectories also matched well with PTV experiments. These results showed that air flow and aerosol transport in models of human alveolated airways can be simulated by CFD techniques with reasonable accuracy.
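
    One plausible form of the profile-agreement figure quoted in both versions of this record (velocity profiles matching "within 1.7% on average") is the mean absolute CFD-PIV difference normalized by the profile's peak PIV velocity, sketched below with toy arrays; the study's exact normalization is an assumption here.

        import numpy as np

        def mean_relative_deviation(u_cfd, u_piv):
            """Mean |CFD - PIV| normalized by the peak PIV velocity of the profile."""
            return np.mean(np.abs(u_cfd - u_piv)) / np.max(np.abs(u_piv))

        u_piv = np.array([0.0, 0.35, 0.62, 0.80, 0.88, 0.80, 0.62, 0.35, 0.0])
        u_cfd = u_piv * 1.015            # toy CFD profile, 1.5% high everywhere
        print(100 * mean_relative_deviation(u_cfd, u_piv), "%")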

  13. Assessing Discriminative Performance at External Validation of Clinical Prediction Models

    PubMed Central

    Nieboer, Daan; van der Ploeg, Tjeerd; Steyerberg, Ewout W.

    2016-01-01

    Introduction: External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures to judge any changes in c-statistic from development to external validation setting. Methods: We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated it in the validation set. We concentrated on two scenarios: 1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set compared to the development set, and 2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development set. Furthermore we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury. Results: The permutation test indicated that the validation and development set were homogeneous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2. Conclusion: The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population. To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients.

  14. Assessing Discriminative Performance at External Validation of Clinical Prediction Models.

    PubMed

    Nieboer, Daan; van der Ploeg, Tjeerd; Steyerberg, Ewout W

    2016-01-01

    External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures to judge any changes in c-statistic from development to external validation setting. We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated it in the validation set. We concentrated on two scenarios: 1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set compared to the development set, and 2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development set. Furthermore we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury. The permutation test indicated that the validation and development set were homogeneous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2. The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population. To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients.
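
    For concreteness, the sketch below computes the c-statistic by the rank (Mann-Whitney) method and runs the membership-shuffling permutation test for the difference in c-statistic between development and validation sets, on synthetic data mimicking scenario 2's narrower validation case-mix. It illustrates the test's mechanics under these assumptions; it is not the authors' code.

        import numpy as np

        rng = np.random.default_rng(0)

        def c_statistic(y, score):
            """AUC: chance a random event outranks a random non-event (no tie handling)."""
            ranks = np.argsort(np.argsort(score)) + 1.0
            n1 = y.sum()
            n0 = y.size - n1
            return (ranks[y == 1].sum() - n1 * (n1 + 1) / 2.0) / (n1 * n0)

        def perm_test_c_diff(y_dev, s_dev, y_val, s_val, n_perm=2000):
            observed = c_statistic(y_dev, s_dev) - c_statistic(y_val, s_val)
            y_all = np.concatenate([y_dev, y_val])
            s_all = np.concatenate([s_dev, s_val])
            n_dev, count = y_dev.size, 0
            for _ in range(n_perm):
                idx = rng.permutation(y_all.size)
                d = (c_statistic(y_all[idx[:n_dev]], s_all[idx[:n_dev]])
                     - c_statistic(y_all[idx[n_dev:]], s_all[idx[n_dev:]]))
                count += abs(d) >= abs(observed)
            return count / n_perm                     # two-sided permutation p-value

        # Identical predictor effects, less heterogeneous validation case-mix
        x_dev, x_val = rng.normal(0, 1.5, 500), rng.normal(0, 1.0, 500)
        y_dev = rng.binomial(1, 1.0 / (1.0 + np.exp(-x_dev)))
        y_val = rng.binomial(1, 1.0 / (1.0 + np.exp(-x_val)))
        print(perm_test_c_diff(y_dev, x_dev, y_val, x_val))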

  15. Prototyping and validating requirements of radiation and nuclear emergency plan simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamid, AHA., E-mail: amyhamijah@nm.gov.my; Faculty of Computing, Universiti Teknologi Malaysia; Rozan, MZA.

    2015-04-29

    Organizational incapability produces unrealistic, impractical, inadequate, and ambiguous mechanisms in radiological and nuclear emergency preparedness and response (EPR) plans, causing emergency plan disorder and severe disasters. In 65.6% of cases, these situations result from poorly defined and unidentified roles and duties of the disaster coordinator. Such unexpected conditions bring huge aftermath to the first responders, operators, workers, patients, and the community at large. Hence, in this report, we discuss the prototyping and validation of the Malaysian radiation and nuclear emergency preparedness and response plan simulation model (EPRM). A prototyping technique was required to formalize the simulation model requirements. Prototyping was carried out as systems-requirements validation to endorse the correctness of the model itself against the stakeholders' intentions in resolving the organizational incapability described above. We made assumptions for the proposed emergency preparedness and response model (EPRM) through the simulation software. Those assumptions provide two sets of expected mechanisms: planning and handling of the respective emergency plan, and bringing off the hazard involved. The resulting model, the RANEPF (Radiation and Nuclear Emergency Planning Framework) simulator, demonstrates training emergency-response prerequisites rather than intervention principles alone. The demonstrations involved determining the casualties' absorbed-dose range screening and coordinating the capacity planning of the expected trauma triage. Through user-centred design and a sociotechnical approach, the RANEPF simulator was strategized and simplified, though it is certainly still complex.

  16. Simulation verification techniques study: Simulation performance validation techniques document. [for the space shuttle system

    NASA Technical Reports Server (NTRS)

    Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.

    1975-01-01

    Techniques and support software for the efficient performance of simulation validation are discussed. Overall validation software structure, the performance of validation at various levels of simulation integration, guidelines for check-case formulation, methods for real-time acquisition and formatting of data from an all-up operational simulator, and methods and criteria for comparison and evaluation of simulation data are included. Vehicle subsystem modules, module integration, special test requirements, and reference data formats are also described.

  17. Use of midlatitude soil moisture and meteorological observations to validate soil moisture simulations with biosphere and bucket models

    NASA Technical Reports Server (NTRS)

    Robock, Alan; Vinnikov, Konstantin YA.; Schlosser, C. Adam; Speranskaya, Nina A.; Xue, Yongkang

    1995-01-01

    Soil moisture observations in sites with natural vegetation were made for several decades in the former Soviet Union at hundreds of stations. In this paper, the authors use data from six of these stations from different climatic regimes, along with ancillary meteorological and actinometric data, to demonstrate a method to validate soil moisture simulations with biosphere and bucket models. Some early and current general circulation models (GCMs) use bucket models for soil hydrology calculations. More recently, the Simple Biosphere Model (SiB) was developed to incorporate the effects of vegetation on fluxes of moisture, momentum, and energy at the earth's surface into soil hydrology models. Until now, the bucket and SiB have been verified by comparison with actual soil moisture data only on a limited basis. In this study, a Simplified SiB (SSiB) soil hydrology model and a 15-cm bucket model are forced by observed meteorological and actinometric data every 3 h for 6-yr simulations at the six stations. The model calculations of soil moisture are compared to observations of soil moisture, literally 'ground truth,' snow cover, surface albedo, and net radiation, and with each other. For three of the stations, the SSiB and 15-cm bucket models produce good simulations of seasonal cycles and interannual variations of soil moisture. For the other three stations, there are large errors in the simulations by both models. Inconsistencies in specification of field capacity may be partly responsible. There is no evidence that the SSiB simulations are superior in simulating soil moisture variations. In fact, the models are quite similar since SSiB implicitly has a bucket embedded in it. One of the main differences between the models is in the treatment of runoff due to melting snow in the spring -- SSiB incorrectly puts all the snowmelt into runoff. While producing similar soil moisture simulations, the models produce very different surface latent and sensible heat fluxes, which
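
    A bucket scheme of the kind compared here fits in a few lines; the sketch below advances a Manabe-style 15-cm bucket one day at a time, with a moisture-limited evaporation efficiency beta = W/Wmax and overflow runoff at field capacity. The forcing values are illustrative, not the station data used in the study.

        def bucket_step(w, precip, pet, w_max=0.15):
            """One step of a 15-cm bucket; w, precip, pet in metres of water."""
            evap = pet * (w / w_max)              # beta-limited evaporation
            w_new = w + precip - evap
            runoff = max(w_new - w_max, 0.0)      # bucket spills at field capacity
            return min(w_new, w_max), evap, runoff

        w = 0.075                                  # start half full
        for day in range(90):                      # 90 days of uniform toy forcing
            w, evap, runoff = bucket_step(w, precip=0.004, pet=0.002)
        print(w)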

  18. Piloted Evaluation of a UH-60 Mixer Equivalent Turbulence Simulation Model

    NASA Technical Reports Server (NTRS)

    Lusardi, Jeff A.; Blanken, Chris L.; Tischler, Mark B.

    2002-01-01

    A simulation study of a recently developed hover/low speed Mixer Equivalent Turbulence Simulation (METS) model for the UH-60 Black Hawk helicopter was conducted in the NASA Ames Research Center Vertical Motion Simulator (VMS). The experiment was a continuation of previous work to develop a simple, but validated, turbulence model for hovering rotorcraft. To validate the METS model, two experienced test pilots replicated precision hover tasks that had been conducted in an instrumented UH-60 helicopter in turbulence. Objective simulation data were collected for comparison with flight test data, and subjective data were collected that included handling qualities ratings and pilot comments for increasing levels of turbulence. Analyses of the simulation results show good analytic agreement between the METS model and flight test data, with favorable pilot perception of the simulated turbulence. Precision hover tasks were also repeated using the more complex rotating-frame SORBET (Simulation Of Rotor Blade Element Turbulence) model to generate turbulence. Comparisons of the empirically derived METS model with the theoretical SORBET model show good agreement providing validation of the more complex blade element method of simulating turbulence.

  19. Methodologies for validating ray-based forward model using finite element method in ultrasonic array data simulation

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; Nixon, Andrew; Barber, Tom; Budyn, Nicolas; Bevan, Rhodri; Croxford, Anthony; Wilcox, Paul

    2018-04-01

    In this paper, a methodology for using a finite element (FE) model to validate a ray-based model in the simulation of full matrix capture (FMC) ultrasonic array data sets is proposed. The overall aim is to separate the signal contributions from different interactions in the FE results so that each individual component in the ray-based model results can be compared more easily. This is achieved by combining the results from multiple FE models of the system of interest that include progressively more geometrical features while preserving the same mesh structure. It is shown that the proposed techniques allow the interactions from a large number of different ray paths to be isolated in the FE results and compared directly to the results from a ray-based forward model.

  20. Development and validation of a laparoscopic hysterectomy cuff closure simulation model for surgical training.

    PubMed

    Tunitsky-Bitton, Elena; Propst, Katie; Muffly, Tyler

    2016-03-01

    The number of robotically assisted hysterectomies is increasing, and therefore, the opportunities for trainees to become competent in performing traditional laparoscopic hysterectomy are decreasing. Simulation-based training is ideal for filling this gap in training. The objective of the study was to design a surgical model for training in laparoscopic vaginal cuff closure and to present evidence of its validity and reliability as an assessment and training tool. Participants included gynecology staff and trainees at 2 tertiary care centers. Experienced surgeons were also recruited at the combined International Urogynecologic Association and American Urogynecologic Society scientific meeting. Participants included 19 experts and 21 trainees. All participants were recorded using the laparoscopic hysterectomy cuff closure simulation model. The model was constructed using an advanced uterine manipulation system with a sacrocolpopexy tip/vaginal stent and a vaginal cuff constructed from neoprene material, lined with swimsuit material (nylon and spandex), and secured to the vaginal stent with a plastic cable tie. The uterine manipulation system was attached to a Fundamentals of Laparoscopic Surgery box trainer using a metal bracket. Performance was evaluated using the Global Operative Assessment of Laparoscopic Skills scale. In addition, needle handling, knot tying, and incorporation of the epithelial edge were also evaluated. The Student t test was used to compare the scores and the operating times between the groups. Interrater reliability between the scores from the 2 masked experts was measured using the intraclass correlation coefficient. Total and annual experience with laparoscopic suturing, and specifically vaginal cuff closure, varied greatly among the participants. For construct validity, the participants in the expert group received significantly higher scores in each of the domains of the Global Operative Assessment of Laparoscopic Skills

  1. Monte Carlo modeling and simulations of the High Definition (HD120) micro MLC and validation against measurements for a 6 MV beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borges, C.; Zarza-Moreno, M.; Heath, E.

    2012-01-15

    Purpose: The most recent Varian micro multileaf collimator (MLC), the High Definition (HD120) MLC, was modeled using the BEAMNRC Monte Carlo code. This model was incorporated into a Varian medical linear accelerator, for a 6 MV beam, in static and dynamic mode. The model was validated by comparing simulated profiles with measurements. Methods: The Varian Trilogy (2300C/D) accelerator model was accurately implemented using the state-of-the-art Monte Carlo simulation program BEAMNRC and validated against off-axis and depth dose profiles measured using ionization chambers, by adjusting the energy and the full width at half maximum (FWHM) of the initial electron beam. The HD120 MLC was modeled by developing a new BEAMNRC component module (CM), designated HDMLC, adapting the available DYNVMLC CM and incorporating the specific characteristics of this new micro MLC. The leaf dimensions were provided by the manufacturer. The geometry was visualized by tracing particles through the CM and recording their positions when a leaf boundary is crossed. The leaf material density and the abutting air gap between leaves were adjusted in order to obtain good agreement between the simulated leakage profiles and EBT2 film measurements performed in a solid water phantom. To validate the HDMLC implementation, additional MLC static patterns were also simulated and compared to additional measurements. Furthermore, the ability to simulate dynamic MLC fields was implemented in the HDMLC CM. The simulation results for these fields were compared with EBT2 film measurements performed in a solid water phantom. Results: Overall, the discrepancies, with and without the MLC, between the open-field simulations and the measurements using ionization chambers in a water phantom are below 2% for the off-axis profiles, and for the depth-dose profiles they are below 2% after the depth of maximum dose and below 4% in the build-up region. Under the conditions of these simulations, this tungsten-based MLC has a density of

  2. External validation of type 2 diabetes computer simulation models: definitions, approaches, implications and room for improvement-a protocol for a systematic review.

    PubMed

    Ogurtsova, Katherine; Heise, Thomas L; Linnenkamp, Ute; Dintsios, Charalabos-Markos; Lhachimi, Stefan K; Icks, Andrea

    2017-12-29

    Type 2 diabetes mellitus (T2DM), a highly prevalent chronic disease, puts a large burden on individual health and health care systems. Computer simulation models, used to evaluate the clinical and economic effectiveness of various interventions to handle T2DM, have become a well-established tool in diabetes research. Despite the broad consensus about the general importance of validation, especially external validation, as a crucial instrument for assessing and controlling the quality of these models, there are no systematic reviews comparing such validation of diabetes models. As a result, the main objectives of this systematic review are to identify and appraise the different approaches used for the external validation of existing models covering the development and progression of T2DM. We will perform adapted searches by applying respective search strategies to identify suitable studies from 14 electronic databases. Retrieved study records will be included or excluded based on predefined eligibility criteria as defined in this protocol. Among others, a publication filter will exclude studies published before 1995. We will run abstract and full text screenings and then extract data from all selected studies by filling in a predefined data extraction spreadsheet. We will undertake a descriptive, narrative synthesis of findings to address the study objectives. We will pay special attention to aspects of the quality of these models in regard to external validation based upon ISPOR and ADA recommendations as well as Mount Hood Challenge reports. All critical stages within the screening, data extraction and synthesis processes will be conducted by at least two authors. This protocol adheres to PRISMA and PRISMA-P standards. The proposed systematic review will provide a broad overview of the current practice in the external validation of models with respect to T2DM incidence and progression in humans built on simulation techniques. PROSPERO CRD42017069983.

  3. Modelling and simulation of a heat exchanger

    NASA Technical Reports Server (NTRS)

    Xia, Lei; Deabreu-Garcia, J. Alex; Hartley, Tom T.

    1991-01-01

    Two models for two different control systems are developed for a parallel heat exchanger. First, by spatially lumping the heat exchanger model, a good approximate model of high system order is produced. Model reduction techniques are then applied to obtain low-order models suitable for dynamic analysis and control design. The simulation method is discussed to ensure valid simulation results.

  4. Validation of simulated earthquake ground motions based on evolution of intensity and frequency content

    USGS Publications Warehouse

    Rezaeian, Sanaz; Zhong, Peng; Hartzell, Stephen; Zareian, Farzin

    2015-01-01

    Simulated earthquake ground motions can be used in many recent engineering applications that require time series as input excitations. However, applicability and validation of simulations are subjects of debate in the seismological and engineering communities. We propose a validation methodology at the waveform level and directly based on characteristics that are expected to influence most structural and geotechnical response parameters. In particular, three time-dependent validation metrics are used to evaluate the evolving intensity, frequency, and bandwidth of a waveform. These validation metrics capture nonstationarities in intensity and frequency content of waveforms, making them ideal to address nonlinear response of structural systems. A two-component error vector is proposed to quantify the average and shape differences between these validation metrics for a simulated and recorded ground-motion pair. Because these metrics are directly related to the waveform characteristics, they provide easily interpretable feedback to seismologists for modifying their ground-motion simulation models. To further simplify the use and interpretation of these metrics for engineers, it is shown how six scalar key parameters, including duration, intensity, and predominant frequency, can be extracted from the validation metrics. The proposed validation methodology is a step forward in paving the road for utilization of simulated ground motions in engineering practice and is demonstrated using examples of recorded and simulated ground motions from the 1994 Northridge, California, earthquake.
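
    As a rough sketch of how such a two-component error vector might be computed, the snippet below assumes the evolving intensity is summarized by a normalized cumulative squared-acceleration (Husid-type) curve and takes the 'average' and 'shape' components as the mean offset and the RMS of the demeaned difference; the metrics defined in the paper itself may differ in detail.

```python
import numpy as np

def evolving_intensity(accel, dt):
    """Husid-type curve: cumulative squared acceleration vs. time,
    normalized by its final value so curves are comparable."""
    cum = np.cumsum(accel**2) * dt
    return cum / cum[-1]

def error_vector(recorded, simulated, dt):
    """Two-component error between evolving-intensity curves:
    (average difference, shape difference). The shape component is
    the RMS of the difference after removing its mean."""
    diff = evolving_intensity(simulated, dt) - evolving_intensity(recorded, dt)
    e_avg = diff.mean()
    e_shape = np.sqrt(np.mean((diff - e_avg)**2))
    return e_avg, e_shape

# Example with synthetic records sampled at 100 Hz (illustrative only)
rng = np.random.default_rng(0)
dt = 0.01
t = np.arange(0, 20, dt)
rec = rng.standard_normal(t.size) * np.exp(-0.2 * t)   # "recorded" motion
sim = rng.standard_normal(t.size) * np.exp(-0.25 * t)  # "simulated" motion
print(error_vector(rec, sim, dt))
```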

  5. Empirical validation of an agent-based model of wood markets in Switzerland

    PubMed Central

    Hilty, Lorenz M.; Lemm, Renato; Thees, Oliver

    2018-01-01

    We present an agent-based model of wood markets and show our efforts to validate this model using empirical data from different sources, including interviews, workshops, experiments, and official statistics. Our own surveys closed gaps where data were not available. Our approach to model validation used a variety of techniques, including the replication of historical production amounts, prices, and survey results, as well as a historical case study of a large sawmill entering the market and becoming insolvent only a few years later. Validating the model using this case provided additional insights, showing how the model can be used to simulate scenarios of resource availability and resource allocation. We conclude that the outcome of the rigorous validation qualifies the model to simulate scenarios concerning resource availability and allocation in our study region. PMID:29351300

  6. Finite Element Model and Validation of Nasal Tip Deformation

    PubMed Central

    Manuel, Cyrus T; Harb, Rani; Badran, Alan; Ho, David; Wong, Brian JF

    2016-01-01

    Nasal tip mechanical stability is important for functional and cosmetic nasal airway surgery. Palpation of the nasal tip provides information on tip strength to the surgeon, though it is a purely subjective assessment. Providing a means to simulate nasal tip deformation with a validated model can offer a more objective approach in understanding the mechanics and nuances of the nasal tip support and eventual nasal mechanics as a whole. Herein we present validation of a finite element (FE) model of the nose using physical measurements recorded using an ABS plastic-silicone nasal phantom. Three-dimensional photogrammetry was used to capture the geometry of the phantom at rest and while under steady state load. The silicone used to make the phantom was mechanically tested and characterized using a linear elastic constitutive model. Surface point clouds of the silicone and FE model were compared for both the loaded and unloaded state. The average Hausdorff distance between actual measurements and FE simulations across the nose was 0.39 mm ± 1.04 mm and deviated up to 2 mm at the outermost boundaries of the model. FE simulation and measurements were in near complete agreement in the immediate vicinity of the nasal tip with millimeter accuracy. We have demonstrated validation of a two-component nasal FE model, which could be used to model more complex modes of deformation where direct measurement may be challenging. This is the first step in developing a nasal model to simulate nasal mechanics and ultimately the interaction between geometry and airflow. PMID:27633018

  7. Finite Element Model and Validation of Nasal Tip Deformation.

    PubMed

    Manuel, Cyrus T; Harb, Rani; Badran, Alan; Ho, David; Wong, Brian J F

    2017-03-01

    Nasal tip mechanical stability is important for functional and cosmetic nasal airway surgery. Palpation of the nasal tip provides information on tip strength to the surgeon, though it is a purely subjective assessment. Providing a means to simulate nasal tip deformation with a validated model can offer a more objective approach in understanding the mechanics and nuances of the nasal tip support and eventual nasal mechanics as a whole. Herein we present validation of a finite element (FE) model of the nose using physical measurements recorded using an ABS plastic-silicone nasal phantom. Three-dimensional photogrammetry was used to capture the geometry of the phantom at rest and while under steady state load. The silicone used to make the phantom was mechanically tested and characterized using a linear elastic constitutive model. Surface point clouds of the silicone and FE model were compared for both the loaded and unloaded state. The average Hausdorff distance between actual measurements and FE simulations across the nose was 0.39 ± 1.04 mm and deviated up to 2 mm at the outermost boundaries of the model. FE simulation and measurements were in near complete agreement in the immediate vicinity of the nasal tip with millimeter accuracy. We have demonstrated validation of a two-component nasal FE model, which could be used to model more complex modes of deformation where direct measurement may be challenging. This is the first step in developing a nasal model to simulate nasal mechanics and ultimately the interaction between geometry and airflow.
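
    The point-cloud comparison described above can be illustrated with a short sketch. The cloud_deviation helper, the synthetic plane data, and the use of SciPy's KD-tree and directed_hausdorff are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.spatial.distance import directed_hausdorff

def cloud_deviation(measured, simulated):
    """Compare two (N, 3) surface point clouds: mean/std of
    nearest-neighbour distances plus the symmetric Hausdorff distance."""
    d_ms = cKDTree(simulated).query(measured)[0]  # measured -> simulated
    d_sm = cKDTree(measured).query(simulated)[0]  # simulated -> measured
    nn = np.concatenate([d_ms, d_sm])
    h = max(directed_hausdorff(measured, simulated)[0],
            directed_hausdorff(simulated, measured)[0])
    return nn.mean(), nn.std(), h

# Synthetic example: a random surface patch vs. a slightly perturbed copy
rng = np.random.default_rng(1)
pts = rng.uniform(0, 50, size=(2000, 3))
mean_d, std_d, hausdorff = cloud_deviation(pts, pts + rng.normal(0, 0.4, pts.shape))
print(f"mean {mean_d:.2f} mm, std {std_d:.2f} mm, Hausdorff {hausdorff:.2f} mm")
```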

  8. Simulation validation and management

    NASA Astrophysics Data System (ADS)

    Illgen, John D.

    1995-06-01

    Illgen Simulation Technologies, Inc., has been working on interactive verification and validation programs for the past six years. As a result, it has evolved a methodology that has been adopted and successfully implemented by a number of different verification and validation programs. This methodology employs a unique application of computer-aided software engineering (CASE) tools to reverse engineer source code and produce analytical outputs (flow charts and tables) that aid the engineer/analyst in the verification and validation process. We have found that the use of CASE tools saves time, which equates to improvements in both schedule and cost. This paper will describe the ISTI-developed methodology and how CASE tools are used in its support. Case studies will be discussed.

  9. Validation and Simulation of ARES I Scale Model Acoustic Test -1- Pathfinder Development

    NASA Technical Reports Server (NTRS)

    Putnam, G. C.

    2011-01-01

    The Ares I Scale Model Acoustics Test (ASMAT) is a series of live-fire tests of scaled rocket motors meant to simulate the conditions of the Ares I launch configuration. These tests have provided a well-documented set of high-fidelity measurements useful for validation, including data taken over a range of test conditions and containing phenomena like ignition overpressure and water suppression of acoustics. To take advantage of these data, a digital representation of the ASMAT test setup has been constructed and test firings of the motor have been simulated using the Loci/CHEM computational fluid dynamics software. In this first of a series of papers, results from ASMAT simulations with the rocket in a held-down configuration and without water suppression are compared to acoustic data collected from similar live-fire tests to assess the accuracy of the simulations. Detailed evaluations of the mesh features, mesh length scales relative to acoustic signals, Courant-Friedrichs-Lewy numbers, and spatial residual sources have been performed to support this assessment. Results of acoustic comparisons have shown good correlation with the amplitude and temporal shape of pressure features and reasonable spectral accuracy up to approximately 1000 Hz. Major plume and acoustic features have been well captured, including the plume shock structure, the igniter pulse transient, and the ignition overpressure. Finally, acoustic propagation patterns illustrated a previously unconsidered issue of tower placement in line with the high-intensity overpressure propagation path.

  10. Modeling and Validation of Microwave Ablations with Internal Vaporization

    PubMed Central

    Chiang, Jason; Birla, Sohan; Bedoya, Mariajose; Jones, David; Subbiah, Jeyam; Brace, Christopher L.

    2014-01-01

    Numerical simulation is increasingly being utilized for computer-aided design of treatment devices, analysis of ablation growth, and clinical treatment planning. Simulation models to date have incorporated electromagnetic wave propagation and heat conduction, but not other relevant physics such as water vaporization and mass transfer. Such physical changes are particularly noteworthy during the intense heat generation associated with microwave heating. In this work, a numerical model was created that integrates microwave heating with water vapor generation and transport by using porous media assumptions in the tissue domain. The heating physics of the water vapor model was validated through temperature measurements taken at locations 5, 10 and 20 mm away from the heating zone of the microwave antenna in a homogenized ex vivo bovine liver setup. The cross-sectional area of water vapor transport was validated through intra-procedural computed tomography (CT) during microwave ablations in homogenized ex vivo bovine liver. Iso-density contours from CT images were compared to vapor concentration contours from the numerical model at intermittent time points using the Jaccard index. In general, there was an improving correlation in ablation size dimensions as the ablation procedure proceeded, with a Jaccard index of 0.27, 0.49, 0.61, 0.67 and 0.69 at 1, 2, 3, 4, and 5 minutes, respectively. This study demonstrates the feasibility and validity of incorporating water vapor concentration into thermal ablation simulations and validating such models experimentally. PMID:25330481
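
    As a minimal illustration of the contour comparison, the Jaccard index of two binary masks on a shared grid is their intersection over union; the disc geometry below is synthetic and merely stands in for rasterized CT and model contours.

```python
import numpy as np

def jaccard_index(mask_a, mask_b):
    """Jaccard index (intersection over union) of two boolean masks,
    e.g. a CT iso-density contour vs. a simulated vapor-concentration
    contour rasterized on the same grid."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 1.0

# Example: two overlapping discs on a 200x200 grid
yy, xx = np.mgrid[:200, :200]
ct_zone  = (xx - 100)**2 + (yy - 100)**2 < 40**2
sim_zone = (xx - 110)**2 + (yy - 100)**2 < 40**2
print(f"Jaccard index: {jaccard_index(ct_zone, sim_zone):.2f}")
```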

  11. Experimental Validation of a Closed Brayton Cycle System Transient Simulation

    NASA Technical Reports Server (NTRS)

    Johnson, Paul K.; Hervol, David S.

    2006-01-01

    The Brayton Power Conversion Unit (BPCU) located at NASA Glenn Research Center (GRC) in Cleveland, Ohio was used to validate the results of a computational code known as Closed Cycle System Simulation (CCSS). Conversion system thermal transient behavior was the focus of this validation. The BPCU was operated at various steady state points and then subjected to transient changes involving shaft rotational speed and thermal energy input. These conditions were then duplicated in CCSS. Validation of the CCSS BPCU model provides confidence in developing future Brayton power system performance predictions, and helps to guide high power Brayton technology development.

  12. Virtual reality simulator training for laparoscopic colectomy: what metrics have construct validity?

    PubMed

    Shanmugan, Skandan; Leblanc, Fabien; Senagore, Anthony J; Ellis, C Neal; Stein, Sharon L; Khan, Sadaf; Delaney, Conor P; Champagne, Bradley J

    2014-02-01

    Virtual reality simulation for laparoscopic colectomy has been used for training of surgical residents and has been considered as a model for technical skills assessment of board-eligible colorectal surgeons. However, construct validity (the ability to distinguish between skill levels) must be confirmed before widespread implementation. This study was designed to determine specifically which metrics for laparoscopic sigmoid colectomy have evidence of construct validity. General surgeons who had performed fewer than 30 laparoscopic colon resections and laparoscopic colorectal experts (>200 laparoscopic colon resections) performed laparoscopic sigmoid colectomy on the LAP Mentor model. All participants received a 15-minute instructional warm-up and had never used the simulator before the study. Performance was then compared between the groups for 21 metrics (procedural, 14; intraoperative errors, 7) to determine specifically which measurements demonstrate construct validity. Performance was compared with the Mann-Whitney U test (p < 0.05 was significant). Fifty-three surgeons (29 general surgeons and 24 colorectal surgeons) enrolled in the study. The virtual reality simulator for laparoscopic sigmoid colectomy demonstrated construct validity for 8 of 14 procedural metrics by distinguishing levels of surgical experience (p < 0.05). The most discriminatory procedural metrics (p < 0.01) favoring experts were reduced instrument path length, accuracy of the peritoneal/medial mobilization, and dissection of the inferior mesenteric artery. Intraoperative errors were not discriminatory for most metrics and favored general surgeons for colonic wall injury (general surgeons, 0.7; colorectal surgeons, 3.5; p = 0.045). Individual variability within the general surgeon and colorectal surgeon groups was not accounted for. The virtual reality simulator for laparoscopic sigmoid colectomy demonstrated construct validity for 8 procedure-specific metrics. However, using virtual
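
    A brief sketch of the group comparison described above, using SciPy's Mann-Whitney U test; the per-surgeon metric values are invented for illustration and are not the study's data.

```python
from scipy.stats import mannwhitneyu

# Hypothetical per-surgeon values of one procedural metric
# (instrument path length, arbitrary units); not study data.
general_surgeons   = [148, 132, 155, 160, 141, 150, 139, 162]
colorectal_experts = [110, 118, 105, 122, 114, 109]

stat, p = mannwhitneyu(general_surgeons, colorectal_experts,
                       alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")  # p < 0.05 -> metric discriminates groups
```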

  13. Validation of a Monte Carlo model used for simulating tube current modulation in computed tomography over a wide range of phantom conditions/challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bostani, Maryam, E-mail: mbostani@mednet.ucla.edu; McMillan, Kyle; Cagnon, Chris H.

    2014-11-01

    Purpose: Monte Carlo (MC) simulation methods have been widely used in patient dosimetry in computed tomography (CT), including estimating patient organ doses. However, most simulation methods have undergone a limited set of validations, often using homogeneous phantoms with simple geometries. As clinical scanning has become more complex and the use of tube current modulation (TCM) has become pervasive in the clinic, MC simulations should include these techniques in their methodologies and therefore should also be validated using a variety of phantoms with different shapes and material compositions that result in a variety of differently modulated tube current profiles. The purpose of this work is to perform the measurements and simulations to validate a Monte Carlo model under a variety of test conditions where fixed tube current (FTC) and TCM were used. Methods: A previously developed MC model for estimating dose from CT scans that models TCM, built using the MCNPX platform, was used for CT dose quantification. In order to validate the suitability of this model to accurately simulate patient dose from FTC and TCM CT scans, measurements and simulations were compared over a wide range of conditions. Phantoms used for testing ranged from simple geometries with homogeneous composition (16 and 32 cm computed tomography dose index phantoms) to more complex phantoms, including a rectangular homogeneous water-equivalent phantom, an elliptical phantom with three sections (each section homogeneous but of a different material), and a heterogeneous, complex-geometry anthropomorphic phantom. Each phantom requires varying levels of x-, y- and z-modulation. Each phantom was scanned on a multidetector row CT (Sensation 64) scanner under the conditions of both FTC and TCM. Dose measurements were made at various surface and depth positions within each phantom. Simulations using each phantom were performed for FTC, detailed x-y-z TCM, and z-axis-only TCM to

  14. Development and validation of the Simulation Learning Effectiveness Scale for nursing students.

    PubMed

    Pai, Hsiang-Chu

    2016-11-01

    To develop and validate the Simulation Learning Effectiveness Scale, which is based on Bandura's social cognitive theory. A simulation programme is a significant teaching strategy for nursing students. Nevertheless, there are few evidence-based instruments that validate the effectiveness of simulation learning in Taiwan. This is a quantitative descriptive design. In Study 1, a nonprobability convenience sample of 151 student nurses completed the Simulation Learning Effectiveness Scale. Exploratory factor analysis was used to examine the factor structure of the instrument. In Study 2, which involved 365 student nurses, confirmatory factor analysis and structural equation modelling were used to analyse the construct validity of the Simulation Learning Effectiveness Scale. In Study 1, exploratory factor analysis yielded three components: self-regulation, self-efficacy and self-motivation. The three factors explained 29·09, 27·74 and 19·32% of the variance, respectively. The final 12-item instrument with the three factors explained 76·15% of the variance. Cronbach's alpha was 0·94. In Study 2, confirmatory factor analysis identified a second-order factor termed Simulation Learning Effectiveness Scale. Goodness-of-fit indices showed an acceptable fit overall for the full model (χ²/df(51) = 3·54, comparative fit index = 0·96, Tucker-Lewis index = 0·95 and standardised root-mean-square residual = 0·035). In addition, teacher's competence in encouraging learning, and self-reflection and insight, were significantly and positively associated with the Simulation Learning Effectiveness Scale; teacher's competence in encouraging learning was also significantly and positively associated with self-reflection and insight. Overall, these variables explained 21·9% of the variance in the students' learning effectiveness. The Simulation Learning Effectiveness Scale is a reliable and valid means to assess simulation learning effectiveness for nursing students.

  15. Using Modeling and Simulation to Predict Operator Performance and Automation-Induced Complacency With Robotic Automation: A Case Study and Empirical Validation.

    PubMed

    Wickens, Christopher D; Sebok, Angelia; Li, Huiyang; Sarter, Nadine; Gacy, Andrew M

    2015-09-01

    The aim of this study was to develop and validate a computational model of the automation complacency effect, as operators work on a robotic arm task supported by three different degrees of automation. Some computational models of complacency in human-automation interaction exist, but those were formed and validated within the context of fairly simplified monitoring failures. This research extends model validation to a much more complex task, so that system designers can establish, without the need for human-in-the-loop (HITL) experimentation, the merits and shortcomings of different automation degrees. We developed a realistic simulation of a space-based robotic arm task that could be carried out with three different levels of trajectory visualization and execution automation support. Using this simulation, we performed HITL testing. Complacency was induced via several trials of correctly performing automation and then was assessed on trials when automation failed. Following a cognitive task analysis of the robotic arm operation, we developed a multicomponent model of the robotic operator and his or her reliance on automation, based in part on visual scanning. The comparison of model predictions with empirical results revealed that the model accurately predicted routine performance and the responses to failures after complacency developed. However, the scanning models do not account for the entire attention allocation effects of complacency. Complacency modeling can provide a useful tool for predicting the effects of different types of imperfect automation. The results from this research suggest that focus should be given to supporting situation awareness in automation development. © 2015, Human Factors and Ergonomics Society.

  16. SWAT model uncertainty analysis, calibration and validation for runoff simulation in the Luvuvhu River catchment, South Africa

    NASA Astrophysics Data System (ADS)

    Thavhana, M. P.; Savage, M. J.; Moeletsi, M. E.

    2018-06-01

    The soil and water assessment tool (SWAT) was calibrated for the Luvuvhu River catchment, South Africa, in order to simulate runoff. The model was executed through QSWAT, an interface between SWAT and QGIS. Data from four weather stations and four weir stations evenly distributed over the catchment were used. The model was run for the 33-year period 1983-2015. Sensitivity analysis, calibration and validation were conducted using the sequential uncertainty fitting (SUFI-2) algorithm through its interface with the SWAT calibration and uncertainty procedures (SWAT-CUP). The calibration period was 1986 to 2005, and the validation period was 2006 to 2015. Six model efficiency measures were used, namely: the coefficient of determination (R2), the Nash-Sutcliffe efficiency (NSE) index, the root mean square error (RMSE)-observations standard deviation ratio (RSR), percent bias (PBIAS), the probability (P)-factor and the correlation coefficient (R)-factor. Initial results indicated an over-estimation of low flows, with a regression slope of less than 0.7. Twelve model parameters were included in the sensitivity analysis, with four (ALPHA_BF, CN2, GW_DELAY and SOL_K) found to be more distinguishable and sensitive to streamflow (p < 0.05). The SUFI-2 algorithm, through the SWAT-CUP interface, was capable of capturing the model's behaviour, with calibration results showing an R2 of 0.63, an NSE index of 0.66, an RSR of 0.56 and a positive PBIAS of 16.3, while validation results revealed an R2 of 0.52, an NSE of 0.48, an RSR of 0.72 and a PBIAS of 19.90. The model produced a P-factor of 0.67 and an R-factor of 0.68 during calibration, and 0.69 and 0.53, respectively, during validation. Although the performance indicators yielded fair and acceptable results, the P-factor was still below the recommended model performance of 70%. Apart from the unacceptable P-factor values, the results obtained in this study demonstrate acceptable model performance during calibration while
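
    The scalar efficiency measures quoted above are straightforward to compute from paired observed and simulated flow series. A minimal sketch follows, assuming the common SWAT-CUP sign convention for PBIAS (positive indicates model under-estimation); the short flow series is invented for illustration.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, <0 is worse than the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1 - np.sum((obs - sim)**2) / np.sum((obs - obs.mean())**2)

def rsr(obs, sim):
    """RMSE divided by the standard deviation of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.sqrt(np.mean((obs - sim)**2)) / obs.std()

def pbias(obs, sim):
    """Percent bias; positive values indicate model under-estimation
    under the sign convention assumed here."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100 * np.sum(obs - sim) / np.sum(obs)

# Example with a short synthetic flow series (m^3/s)
observed  = [12.0, 30.5, 22.1, 8.4, 5.2, 41.0, 18.3]
simulated = [10.2, 27.9, 24.0, 9.1, 4.0, 33.5, 16.8]
print(f"NSE = {nse(observed, simulated):.2f}, "
      f"RSR = {rsr(observed, simulated):.2f}, "
      f"PBIAS = {pbias(observed, simulated):.1f}%")
```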

  17. Validation of a DICE Simulation Against a Discrete Event Simulation Implemented Entirely in Code.

    PubMed

    Möller, Jörgen; Davis, Sarah; Stevenson, Matt; Caro, J Jaime

    2017-10-01

    Modeling is an essential tool for health technology assessment, and various techniques for conceptualizing and implementing such models have been described. Recently, a new method has been proposed, the discretely integrated condition event (DICE) simulation, that enables frequently employed approaches to be specified using a common, simple structure that can be entirely contained and executed within widely available spreadsheet software. To assess whether a DICE simulation provides results equivalent to an existing discrete event simulation, a comparison was undertaken. A model of osteoporosis and its management, programmed entirely in Visual Basic for Applications and made public by the National Institute for Health and Care Excellence (NICE) Decision Support Unit, was downloaded and used to guide construction of its DICE version in Microsoft Excel®. The DICE model was then run using the same inputs and settings, and the results were compared. The DICE version produced results that are nearly identical to the original ones, with differences that would not affect the decision direction of the incremental cost-effectiveness ratios (<1% discrepancy), despite the stochastic nature of the models. The main limitation of the simple DICE version is its slow execution speed. DICE simulation did not alter the results and, thus, should provide a valid way to design and implement decision-analytic models without requiring specialized software or custom programming. Additional efforts need to be made to speed up execution.
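
    To make the comparison concrete: the skeleton shared by a coded DES and its DICE re-expression is an event queue plus time-to-event sampling. The sketch below is a generic single-patient fracture example with invented parameters, not the NICE osteoporosis model.

```python
import heapq
import random

def des_run(horizon_years=10.0, fracture_rate=0.08, seed=1):
    """Minimal discrete event simulation of one patient: time-to-event
    sampling from an exponential distribution, an event queue, and an
    accumulating counter: the skeleton a DICE table re-expresses as
    'conditions' and 'events' in a spreadsheet."""
    rng = random.Random(seed)
    events, fractures = [], 0
    heapq.heappush(events, (rng.expovariate(fracture_rate), "fracture"))
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon_years:
            break
        if kind == "fracture":
            fractures += 1
            heapq.heappush(events, (t + rng.expovariate(fracture_rate), "fracture"))
    return fractures

# Average over many simulated patients
counts = [des_run(seed=s) for s in range(1000)]
print(sum(counts) / len(counts), "fractures per patient over 10 years")
```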

  18. The Roles of Verification, Validation and Uncertainty Quantification in the NASA Standard for Models and Simulations

    NASA Technical Reports Server (NTRS)

    Zang, Thomas A.; Luckring, James M.; Morrison, Joseph H.; Blattnig, Steve R.; Green, Lawrence L.; Tripathi, Ram K.

    2007-01-01

    The National Aeronautics and Space Administration (NASA) recently issued an interim version of the Standard for Models and Simulations (M&S Standard) [1]. The action to develop the M&S Standard was identified in an internal assessment [2] of agency-wide changes needed in the wake of the Columbia Accident [3]. The primary goal of this standard is to ensure that the credibility of M&S results is properly conveyed to those making decisions affecting human safety or mission success criteria. The secondary goal is to assure that the credibility of the results from models and simulations meets the project requirements (for credibility). This presentation explains the motivation and key aspects of the M&S Standard, with a special focus on the requirements for verification, validation and uncertainty quantification. Some pilot applications of this standard to computational fluid dynamics applications will be provided as illustrations. The authors of this paper are the members of the team that developed the initial three drafts of the standard, the last of which benefited from extensive comments from most of the NASA Centers. The current version (number 4) incorporates modifications made by a team representing 9 of the 10 NASA Centers. A permanent version of the M&S Standard is expected by December 2007. The scope of the M&S Standard is confined to those uses of M&S that support program and project decisions that may affect human safety or mission success criteria. Such decisions occur, in decreasing order of importance, in the operations, the test & evaluation, and the design & analysis phases. Requirements are placed on (1) program and project management, (2) models, (3) simulations and analyses, (4) verification, validation and uncertainty quantification (VV&UQ), (5) recommended practices, (6) training, (7) credibility assessment, and (8) reporting results to decision makers. A key component of (7) and (8) is the use of a Credibility Assessment Scale, some of the details

  19. Modeling and Simulation at NASA

    NASA Technical Reports Server (NTRS)

    Steele, Martin J.

    2009-01-01

    This slide presentation is composed of two topics. The first reviews the use of modeling and simulation (M&S), particularly as it relates to the Constellation program and discrete event simulation (DES). DES is defined as process and system analysis, through time-based and resource-constrained probabilistic simulation models, that provides insight into operational system performance. The DES shows that the cycle for a launch, from manufacturing and assembly through launch and recovery, is about 45 days, and that approximately 4 launches per year are practicable. The second topic reviews the NASA Standard for Modeling and Simulation. The Columbia Accident Investigation Board made some recommendations related to models and simulations. Some of the ideas inherent in the new standard are the documentation of M&S activities, an assessment of credibility, and reporting to decision makers, which should include the analysis of the results, a statement as to the uncertainty in the results, and the credibility of the results. There is also discussion of the verification and validation (V&V) of models and of the different types of models and simulations.

  20. Predictive Validation of an Influenza Spread Model

    PubMed Central

    Hyder, Ayaz; Buckeridge, David L.; Leung, Brian

    2013-01-01

    Background Modeling plays a critical role in mitigating impacts of seasonal influenza epidemics. Complex simulation models are currently at the forefront of evaluating optimal mitigation strategies at multiple scales and levels of organization. Given their evaluative role, these models remain limited in their ability to predict and forecast future epidemics, leading some researchers and public-health practitioners to question their usefulness. The objective of this study is to evaluate the predictive ability of an existing complex simulation model of influenza spread. Methods and Findings We used extensive data on past epidemics to demonstrate the process of predictive validation. This involved generalizing an individual-based model for influenza spread and fitting it to laboratory-confirmed influenza infection data from a single observed epidemic (1998-1999). Next, we used the fitted model and modified two of its parameters based on data on real-world perturbations (vaccination coverage by age group and strain type). Simulating epidemics under these changes allowed us to estimate the deviation/error between the expected epidemic curve under perturbation and observed epidemics taking place from 1999 to 2006. Our model was able to forecast absolute intensity and epidemic peak week several weeks in advance with reasonable reliability, which depended on the method of forecasting (static or dynamic). Conclusions Good predictive ability of influenza epidemics is critical for implementing mitigation strategies in an effective and timely manner. Through the process of predictive validation applied to a current complex simulation model of influenza spread, we provided users of the model (e.g. public-health officials and policy-makers) with quantitative metrics and practical recommendations on mitigating impacts of seasonal influenza epidemics. This methodology may be applied to other models of communicable infectious diseases to test and potentially improve their predictive

  1. Predictive Model for Particle Residence Time Distributions in Riser Reactors. Part 1: Model Development and Validation

    DOE PAGES

    Foust, Thomas D.; Ziegler, Jack L.; Pannala, Sreekanth; ...

    2017-02-28

    In this computational study, we model the mixing of biomass pyrolysis vapor with solid catalyst in circulating riser reactors, with a focus on the determination of solid catalyst residence time distributions (RTDs). A comprehensive set of 2D and 3D simulations were conducted for a pilot-scale riser using the Eulerian-Eulerian two-fluid modeling framework, with and without sub-grid-scale models for the gas-solids interaction. A validation test case was also simulated and compared to experiments, showing agreement in the pressure gradient and in the RTD mean and spread. For the simulation cases, it was found that for accurate RTD prediction the Johnson and Jackson partial-slip solids boundary condition was required for all models, and that a sub-grid model is useful so that ultra-high-resolution grids, which are very computationally intensive, are not required. Finally, we discovered a 2/3 scaling relation for the RTD mean and spread when comparing resolved 2D simulations to validated unresolved 3D sub-grid-scale model simulations.
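
    The RTD mean and spread referred to above are the first and second (central) moments of the exit-age distribution E(t). A minimal sketch follows, with a synthetic log-normal-shaped tracer curve standing in for simulation output.

```python
import numpy as np

def rtd_moments(t, e):
    """Mean residence time and spread (variance) from an exit-age
    distribution E(t): first and second central moments, computed
    after normalizing E(t) to unit area."""
    e = np.asarray(e, float)
    e = e / np.trapz(e, t)
    mean = np.trapz(t * e, t)
    var = np.trapz((t - mean)**2 * e, t)
    return mean, var

# Synthetic tracer response with a log-normal shape (illustrative only)
t = np.linspace(0.01, 30, 600)                       # seconds
e = np.exp(-(np.log(t) - np.log(5.0))**2 / 0.5) / t
mean, var = rtd_moments(t, e)
print(f"mean = {mean:.2f} s, spread (variance) = {var:.2f} s^2")
```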

  2. Validation of a Full-Immersion Simulation Platform for Percutaneous Nephrolithotomy Using Three-Dimensional Printing Technology.

    PubMed

    Ghazi, Ahmed; Campbell, Timothy; Melnyk, Rachel; Feng, Changyong; Andrusco, Alex; Stone, Jonathan; Erturk, Erdal

    2017-12-01

    The restriction of resident hours with an increasing focus on patient safety and a reduced caseload has impacted surgical training. A complex and complication prone procedure such as percutaneous nephrolithotomy (PCNL) with a steep learning curve may create an unsafe environment for hands-on resident training. In this study, we validate a high fidelity, inanimate PCNL model within a full-immersion simulation environment. Anatomically correct models of the human pelvicaliceal system, kidney, and relevant adjacent structures were created using polyvinyl alcohol hydrogels and three-dimensional-printed injection molds. All steps of a PCNL were simulated including percutaneous renal access, nephroscopy, and lithotripsy. Five experts (>100 caseload) and 10 novices (<20 caseload) from both urology (full procedure) and interventional radiology (access only) departments completed the simulation. Face and content validity were calculated using model ratings for similarity to the real procedure and usefulness as a training tool. Differences in performance among groups with various levels of experience using clinically relevant procedural metrics were used to calculate construct validity. The model was determined to have an excellent face and content validity with an average score of 4.5/5.0 and 4.6/5.0, respectively. There were significant differences between novice and expert operative metrics including mean fluoroscopy time, the number of percutaneous access attempts, and number of times the needle was repositioned. Experts achieved better stone clearance with fewer procedural complications. We demonstrated the face, content, and construct validity of an inanimate, full task trainer for PCNL. Construct validity between experts and novices was demonstrated using incorporated procedural metrics, which permitted the accurate assessment of performance. While hands-on training under supervision remains an integral part of any residency, this full-immersion simulation provides a

  3. Validation of Robotic Surgery Simulator (RoSS).

    PubMed

    Kesavadas, Thenkurussi; Stegemann, Andrew; Sathyaseelan, Gughan; Chowriappa, Ashirwad; Srimathveeravalli, Govindarajan; Seixas-Mikelus, Stéfanie; Chandrasekhar, Rameella; Wilding, Gregory; Guru, Khurshid

    2011-01-01

    Recent growth of the daVinci Robotic Surgical System as a minimally invasive surgery tool has led to a call for better training of future surgeons. In this paper, a new virtual reality simulator, called RoSS, is presented. Initial results from two studies, of face and content validity, are very encouraging. 90% of the cohort of expert robotic surgeons felt that the simulator was excellent or somewhat close to the touch and feel of the daVinci console. Content validity of the simulator received 90% approval in some cases. These studies demonstrate that RoSS has the potential of becoming an important training tool for the daVinci surgical robot.

  4. Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.

    2013-12-01

    Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to their improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a detrimental source for human exposures to potential carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies to better understand the process and guide field investigation. While multiple analytical and numerical models were developed to simulate the vapor intrusion process, detailed validation of these models against well controlled experiments is still lacking, due to the complexity and uncertainties associated with site characterization and soil gas flux and indoor air concentration measurement. In this work, we present an effort to validate a three-dimensional vapor intrusion model based on a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of indoor air concentration based on the most uncertain input parameters.
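
    A minimal sketch of the Monte Carlo step described above: sample the most uncertain inputs from assumed distributions and propagate them through a deliberately simplified attenuation relation to obtain a distribution of indoor air concentration. All distributions and the screening relation are illustrative assumptions, not the study's model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical uncertain inputs (illustrative ranges, not site data):
source_conc  = rng.lognormal(mean=np.log(50.0), sigma=0.5, size=n)   # ug/m^3 soil gas
atten_factor = rng.lognormal(mean=np.log(1e-3), sigma=0.8, size=n)   # indoor/source ratio
air_exchange = rng.uniform(0.25, 1.0, size=n)                        # 1/h, dilution proxy

# Toy screening relation: indoor concentration scales with the
# attenuation factor and inversely with the air exchange rate.
indoor = source_conc * atten_factor / air_exchange

print("median: %.3g ug/m^3" % np.median(indoor))
print("95th percentile: %.3g ug/m^3" % np.percentile(indoor, 95))
```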

  5. Validation of Finite Element Crash Test Dummy Models for Predicting Orion Crew Member Injuries During a Simulated Vehicle Landing

    NASA Technical Reports Server (NTRS)

    Tabiei, Al; Lawrence, Charles; Fasanella, Edwin L.

    2009-01-01

    A series of crash tests was conducted with dummies during simulated Orion crew module landings at Wright-Patterson Air Force Base. These tests covered several crew configurations with and without astronaut suits. Some test results were collected and are presented. In addition, finite element models of the tests were developed and are presented. The finite element models were validated using the experimental data, and the test responses were compared with the computed results. Occupant crash data, such as forces, moments, and accelerations, were collected from the simulations and compared with injury criteria to assess occupant survivability and injury. Some of the injury criteria published in the literature are summarized for completeness. These criteria were used to determine potential injury during crew impact events.

  6. The validity of flow approximations when simulating catchment-integrated flash floods

    NASA Astrophysics Data System (ADS)

    Bout, B.; Jetten, V. G.

    2018-01-01

    Within hydrological models, flow approximations are commonly used to reduce computation time. The validity of these approximations is strongly determined by flow height, flow velocity and the spatial resolution of the model. In this presentation, the validity and performance of the kinematic, diffusive and dynamic flow approximations are investigated for use in a catchment-based flood model. In particular, the validity during flood events and for varying spatial resolutions is investigated. The OpenLISEM hydrological model is extended to implement both these flow approximations and channel flooding based on dynamic flow. The flow approximations are used to recreate measured discharge in three catchments, among which is the hydrograph of the 2003 flood event in the Fella river basin. Furthermore, spatial resolutions are varied for the flood simulation in order to investigate the influence of spatial resolution on these flow approximations. Results show that the kinematic, diffusive and dynamic flow approximations provide the lowest to highest accuracy, respectively, in recreating measured discharge. Kinematic flow, which is commonly used in hydrological modelling, substantially over-estimates hydrological connectivity in simulations with a spatial resolution below 30 m. Since the spatial resolutions of models have strongly increased over the past decades, the usage of routed kinematic flow should be reconsidered. The combination of diffusive or dynamic overland flow and dynamic channel flooding provides high accuracy in recreating the 2003 Fella river flood event. Finally, in the case of flood events, spatial modelling with kinematic flow substantially over-estimates hydrological connectivity and flow concentration, since pressure forces are removed, leading to significant errors.

  7. Longitudinal train dynamics model for a rail transit simulation system

    DOE PAGES

    Wang, Jinghui; Rakha, Hesham A.

    2018-01-01

    The paper develops a longitudinal train dynamics model in support of microscopic railway transportation simulation. The model can be calibrated without any mechanical data, making it ideal for implementation in transportation simulators. The calibration and validation work is based on data collected from the Portland light rail train fleet. The calibration procedure is mathematically formulated as a constrained non-linear optimization problem. The validity of the model is assessed by comparing instantaneous model predictions against field observations, and also evaluated in the domains of acceleration/deceleration versus speed and acceleration/deceleration versus distance. A test is conducted to investigate the adequacy of the model in a simulation implementation. The results demonstrate that the proposed model can adequately capture instantaneous train dynamics and provides good performance in the simulation test. Thus, the model provides a simple theoretical foundation for microscopic simulators and will significantly support the planning, management and control of railway transportation systems.

  8. Longitudinal train dynamics model for a rail transit simulation system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jinghui; Rakha, Hesham A.

    The paper develops a longitudinal train dynamics model in support of microscopic railway transportation simulation. The model can be calibrated without any mechanical data, making it ideal for implementation in transportation simulators. The calibration and validation work is based on data collected from the Portland light rail train fleet. The calibration procedure is mathematically formulated as a constrained non-linear optimization problem. The validity of the model is assessed by comparing instantaneous model predictions against field observations, and also evaluated in the domains of acceleration/deceleration versus speed and acceleration/deceleration versus distance. A test is conducted to investigate the adequacy of the model in a simulation implementation. The results demonstrate that the proposed model can adequately capture instantaneous train dynamics and provides good performance in the simulation test. Thus, the model provides a simple theoretical foundation for microscopic simulators and will significantly support the planning, management and control of railway transportation systems.

  9. Comparison of validation methods for forming simulations

    NASA Astrophysics Data System (ADS)

    Schug, Alexander; Kapphan, Gabriel; Bardl, Georg; Hinterhölzl, Roland; Drechsler, Klaus

    2018-05-01

    The forming simulation of fibre-reinforced thermoplastics could reduce development time and improve forming results. But to take advantage of the full potential of the simulations, it has to be ensured that the predictions of material behaviour are correct. For that reason, a thorough validation of the material model has to be conducted after characterising the material. Relevant aspects for the validation of the simulation are, for example, the outer contour, the occurrence of defects and the fibre paths. Various methods are available to measure these features. Most relevant, and also most difficult to measure, are the emerging fibre orientations. For that reason, the focus of this study was on measuring this feature. The aim was to give an overview of the properties of different measuring systems and to select the most promising systems for a comparison survey. An optical system, an eddy-current system and a computer-assisted tomography system were selected, with the focus on measuring the fibre orientations. Different formed 3D parts made of unidirectional glass-fibre and carbon-fibre reinforced thermoplastics were measured. Advantages and disadvantages of the tested systems were revealed. Optical measurement systems are easy to use but are limited to the surface plies. With an eddy-current system lower plies can also be measured, but it is only suitable for carbon fibres. Using a computer-assisted tomography system all plies can be measured, but the system is limited to small parts and challenging to evaluate.

  10. Experimental validation of finite element model analysis of a steel frame in simulated post-earthquake fire environments

    NASA Astrophysics Data System (ADS)

    Huang, Ying; Bevans, W. J.; Xiao, Hai; Zhou, Zhi; Chen, Genda

    2012-04-01

    During or after an earthquake event, building systems often experience large strains due to shaking effects, as observed during recent earthquakes, causing permanent inelastic deformation. In addition to the inelastic deformation induced by the earthquake itself, post-earthquake fires associated with short-circuiting of electrical systems and leakage from gas devices can further strain the already damaged structures, potentially leading to a progressive collapse of buildings. Under these harsh environments, measurements on the affected building by various sensors can provide only limited structural health information. Finite element model analysis, on the other hand, if validated by predesigned experiments, can provide detailed information on the structural behavior of entire structures. In this paper, a temperature-dependent nonlinear 3-D finite element model (FEM) of a one-story steel frame is set up in ABAQUS, based on the material properties of steel cited from EN 1993-1-2 and the AISC manuals. The FEM is validated by testing the modeled steel frame in simulated post-earthquake environments. Comparisons between the FEM analysis and the experimental results show that the FEM predicts the structural behavior of the steel frame in post-earthquake fire conditions reasonably well. With experimental validation, FEM analysis of critical structures in these harsh environments could continuously predict structural behavior, better assisting fire fighters in their rescue efforts and saving fire victims.

  11. The Challenge of Grounding Planning in Simulation with an Interactive Model Development Environment

    NASA Technical Reports Server (NTRS)

    Clement, Bradley J.; Frank, Jeremy D.; Chachere, John M.; Smith, Tristan B.; Swanson, Keith J.

    2011-01-01

    A principal obstacle to fielding automated planning systems is the difficulty of modeling. Physical systems are conventionally modeled based on specification documents and the modeler's understanding of the system. Thus, the model is developed in a way that is disconnected from the system's actual behavior and is vulnerable to manual error. Another obstacle to fielding planners is testing and validation. For a space mission, generated plans must be validated, often by translating them into command sequences that are run in a simulation testbed. Testing in this way is complex and onerous because of the large number of possible plans and states of the spacecraft. However, if used as a source of domain knowledge, the simulator can ease validation. This paper poses a challenge: to ground planning models in the system physics represented by simulation. A proposed interactive model development environment illustrates the integration of planning and simulation to meet the challenge. This integration reveals research paths for automated model construction and validation.

  12. Validation of Potential Models for Li2O in Classical Molecular Dynamics Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oda, Takuji; Oya, Yasuhisa; Tanaka, Satoru

    2007-08-01

    Four Buckingham-type pairwise potential models for Li2O were assessed by molecular statics and dynamics simulations. In the static simulations, all models afforded acceptable agreement with experimental values and ab initio calculation results for the crystalline properties. Moreover, the superionic phase transition was reproduced in the dynamics simulations. However, the Li diffusivity and the lattice expansion were not adequately reproduced at the same time by any model. When using these models in future radiation simulations, these features should be taken into account in order to reduce the model dependency of the results.
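    The Buckingham pairwise form assessed in the study is simple to state; the sketch below evaluates it with placeholder Li-O parameters (A, rho, C are hypothetical, and the long-range Coulomb term that such ionic models add, typically via Ewald summation, is omitted here).

    ```python
    import numpy as np

    def buckingham(r, A, rho, C):
        """Buckingham pairwise potential V(r) = A*exp(-r/rho) - C/r**6,
        the functional form assessed in the study (energy in eV, r in angstrom)."""
        return A * np.exp(-r / rho) - C / r**6

    # Hypothetical Li-O parameters for illustration only; each of the four
    # models compared in the paper carries its own fitted (A, rho, C) set.
    r = np.linspace(1.5, 4.0, 6)
    print(buckingham(r, A=800.0, rho=0.30, C=30.0))
    ```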

  13. Using the split Hopkinson pressure bar to validate material models.

    PubMed

    Church, Philip; Cornish, Rory; Cullis, Ian; Gould, Peter; Lewtas, Ian

    2014-08-28

    This paper discusses the use of the split Hopkinson pressure bar with particular reference to the requirements of materials modelling at QinetiQ: to deploy validated material models for numerical simulations that are physically based and have as little characterization overhead as possible. In order to have confidence that the models have a wide range of applicability, this means, at most, characterizing the models at low rate and then validating them at high rate. The split Hopkinson pressure bar (SHPB) is ideal for this purpose. It is also a very useful tool for analysing material behaviour under non-shock wave loading. This requires understanding the output of the test and developing techniques for reliable comparison of simulations with SHPB data. For materials other than metals, comparison with an output stress versus strain curve is not sufficient, as the assumptions built into the classical analysis are generally violated. The method described in this paper compares the simulations with as much validation data as can be derived from the deployed instrumentation, including the raw strain gauge data on the input and output bars, which avoids any assumptions about stress equilibrium. One has to take into account Pochhammer-Chree oscillations and their effect on the specimen, and recognize that this is itself also a valuable validation test of the material model. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
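    For reference, the classical one-wave SHPB reduction whose assumptions the paper warns about looks like the following sketch; the bar and specimen parameters in the usage example are assumptions of the illustration, not QinetiQ values.

    ```python
    import numpy as np

    def shpb_classical(eps_r, eps_t, dt, c0, E_bar, A_bar, A_spec, L_spec):
        """Classical one-wave SHPB reduction: specimen stress from the
        transmitted gauge, strain rate and strain from the reflected gauge.
        Assumes stress equilibrium and 1D wave propagation, exactly the
        assumptions that are generally violated for non-metals."""
        strain_rate = -2.0 * c0 * eps_r / L_spec
        strain = np.cumsum(strain_rate) * dt           # time integration
        stress = E_bar * (A_bar / A_spec) * eps_t
        return strain, strain_rate, stress

    # Example with synthetic gauge signals (placeholders, not test data):
    t = np.linspace(0.0, 2e-4, 400)
    eps_r = -4e-3 * np.sin(np.pi * t / 2e-4) ** 2      # reflected pulse
    eps_t = 2e-3 * np.sin(np.pi * t / 2e-4) ** 2       # transmitted pulse
    strain, rate, stress = shpb_classical(eps_r, eps_t, t[1] - t[0],
                                          c0=5000.0, E_bar=200e9,
                                          A_bar=2.0e-4, A_spec=0.8e-4,
                                          L_spec=5e-3)
    print(stress.max(), strain[-1])
    ```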

  14. Development and Validation of a Monte Carlo Simulation Tool for Multi-Pinhole SPECT

    PubMed Central

    Mok, Greta S. P.; Du, Yong; Wang, Yuchuan; Frey, Eric C.; Tsui, Benjamin M. W.

    2011-01-01

    Purpose In this work, we developed and validated a Monte Carlo simulation (MCS) tool for investigation and evaluation of multi-pinhole (MPH) SPECT imaging. Procedures This tool was based on a combination of the SimSET and MCNP codes. Photon attenuation and scatter in the object, as well as penetration and scatter through the collimator detector, are modeled in this tool. It allows accurate and efficient simulation of MPH SPECT with focused pinhole apertures and user-specified photon energy, aperture material, and imaging geometry. The MCS method was validated by comparing the point response function (PRF), detection efficiency (DE), and image profiles obtained from point sources and phantom experiments. A prototype single-pinhole collimator and focused four- and five-pinhole collimators fitted on a small animal imager were used for the experimental validations. We have also compared computational speed among various simulation tools for MPH SPECT, including SimSET-MCNP, MCNP, SimSET-GATE, and GATE for simulating projections of a hot sphere phantom. Results We found good agreement between the MCS and experimental results for PRF, DE, and image profiles, indicating the validity of the simulation method. The relative computational speeds for SimSET-MCNP, MCNP, SimSET-GATE, and GATE are 1: 2.73: 3.54: 7.34, respectively, for 120-view simulations. We also demonstrated the application of this MCS tool in small animal imaging by generating a set of low-noise MPH projection data of a 3D digital mouse whole body phantom. Conclusions The new method is useful for studying MPH collimator designs, data acquisition protocols, image reconstructions, and compensation techniques. It also has great potential to be applied for modeling the collimator-detector response with penetration and scatter effects for MPH in the quantitative reconstruction method. PMID:19779896
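    A common analytic sanity check on a simulated single-pinhole detection efficiency is the textbook geometric formula g = d^2 sin^3(theta) / (16 h^2); the sketch below evaluates it with illustrative dimensions, not those of the prototype collimators.

    ```python
    import numpy as np

    def pinhole_efficiency(d_eff, h, theta):
        """Geometric detection efficiency of a single pinhole for a point
        source at distance h from the aperture plane, incidence angle theta:
        g = d_eff**2 * sin(theta)**3 / (16 * h**2) (textbook approximation)."""
        return d_eff**2 * np.sin(theta)**3 / (16.0 * h**2)

    # Example: 1 mm effective aperture, source 30 mm away, on-axis (cm units)
    print(pinhole_efficiency(0.1, 3.0, np.pi / 2))  # about 6.9e-5
    ```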

  15. Development and validation of a new soot formation model for gas turbine combustor simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Domenico, Massimiliano; Gerlinger, Peter; Aigner, Manfred

    2010-02-15

    In this paper a new soot formation model for gas turbine combustor simulations is presented. A sectional approach for the description of polycyclic aromatic hydrocarbons (PAHs) and a two-equation model for soot particle dynamics are introduced. By including the PAH chemistry the formulation becomes more general, in that soot formation is neither directly linked to the fuel nor to C2-like species, as is the case in simpler soot models currently available for CFD applications. At the same time, the sectional approach for the PAHs keeps the required computational resources low compared to models based on a more detailed description of the PAH kinetics. These features of the new model allow an accurate yet affordable calculation of soot in complex gas turbine combustion chambers. A careful model validation is presented for diffusion and partially premixed flames. Fuels ranging from methane to kerosene are investigated, so flames with different sooting characteristics are covered. Excellent agreement with experimental data is achieved for all configurations investigated. A fundamental feature of the new model is that, with a single set of constants, it is able to accurately describe the soot dynamics of different fuels at different operating conditions. (author)
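    The two-equation idea (one transported soot number density, one soot mass) can be sketched as a pair of coupled ODEs. All rate expressions and constants below are illustrative placeholders, not the model's fitted PAH-sectional chemistry; the N^(1/3) M^(2/3) factor is the usual monodisperse surface-area proxy.

    ```python
    from scipy.integrate import solve_ivp

    def soot_rhs(t, y, c_pah, c_ox):
        """Minimal two-equation soot sketch: number density N and mass M
        with nucleation, surface growth, oxidation and coagulation terms.
        Rate constants are hypothetical placeholders."""
        N, M = y
        nuc = 1e4 * c_pah                              # nucleation from PAHs
        area = N**(1.0 / 3.0) * M**(2.0 / 3.0)         # surface-area proxy
        growth = 5e2 * c_pah * area                    # surface growth
        oxid = 1e3 * c_ox * area                       # oxidation
        coag = 1e-2 * N**2                             # coagulation (N only)
        return [nuc - coag, 1e-3 * nuc + growth - oxid]

    sol = solve_ivp(soot_rhs, (0.0, 1e-2), [1e3, 1e-6], args=(1e-4, 1e-5))
    print(sol.y[:, -1])   # final N and M for this toy trajectory
    ```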

  16. Use case driven approach to develop simulation model for PCS of APR1400 simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong Wook, Kim; Hong Soo, Kim; Hyeon Tae, Kang

    2006-07-01

    The full-scope simulator is being developed to evaluate specific design features and to support the iterative design and validation in the Man-Machine Interface System (MMIS) design of the Advanced Power Reactor (APR) 1400. The simulator consists of a process model, a control logic model, and the MMI for the APR1400, as well as the Power Control System (PCS). In this paper, a use case driven approach is proposed to develop a simulation model for the PCS. In this approach, a system is considered from the point of view of its users. The user's view of the system is based on interactions with the system and the resultant responses. In the use case driven approach, we initially consider the system as a black box and look at its interactions with the users. From these interactions, use cases of the system are identified. Then the system is modeled using these use cases as functions. Lower levels expand the functionalities of each of these use cases. Hence, starting from the topmost level view of the system, we proceed down to the lowest level (the internal view of the system). The model of the system thus developed is use case driven. This paper introduces the functionality of the PCS simulation model, including a requirement analysis based on use cases, and the validation results from development of the PCS model. The PCS simulation model using use cases will first be used during full-scope simulator development for a nuclear power plant and will be supplied to the Shin-Kori 3 and 4 plants. Use case based simulation model development can be useful for the design and implementation of simulation models. (authors)

  17. Validation of SWAT+ at field level and comparison with previous SWAT models in simulating hydrologic quantity

    NASA Astrophysics Data System (ADS)

    GAO, J.; White, M. J.; Bieger, K.; Yen, H.; Arnold, J. G.

    2017-12-01

    Over the past 20 years, the Soil and Water Assessment Tool (SWAT) has been adopted by many researchers to assess water quantity and quality in watersheds around the world. As the demand for model support, maintenance, and future development has increased, the SWAT source code and data have undergone major modifications over the past few years. To make the model more flexible in terms of interactions of spatial units and processes occurring in watersheds, a completely revised version of SWAT (SWAT+) was developed to improve SWAT's capability in water resource modelling and management. There have been only a few applications of SWAT+ in large watersheds, however, and no study has validated the new model at the field level and assessed its performance. To test the basic hydrologic function of SWAT+, it was implemented for five field cases across five states in the U.S., and the results created by SWAT+ were compared with those from the previous SWAT models at the same fields. Additionally, an automatic calibration tool was used to test which model can be calibrated well more easily within a limited number of parameter adjustments. The goal of the study was to evaluate the performance of SWAT+ in simulating stream flow at the field level at different geographical locations. The results demonstrate that SWAT+ performed similarly to the previous SWAT model, but the flexibility offered by SWAT+ via the connection of different spatial objects can result in a spatially more accurate simulation of hydrological processes, especially for watersheds with artificial facilities. Autocalibration showed that a satisfactory result can be obtained much more easily with SWAT+ than with the previous SWAT. Although many capabilities have already been enhanced in SWAT+, some inaccuracies remain in the simulation; these will be addressed as scientific knowledge of the hydrologic processes in specific watersheds advances. Currently, SWAT+ is prerelease, and any errors are being addressed.
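    The abstract does not name its goodness-of-fit criteria; in SWAT studies the conventional choices are the Nash-Sutcliffe efficiency and percent bias, sketched here on placeholder streamflow values.

    ```python
    import numpy as np

    def nse(obs, sim):
        """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model
        does no better than the mean of the observations."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 1.0 - np.sum((obs - sim)**2) / np.sum((obs - obs.mean())**2)

    def pbias(obs, sim):
        """Percent bias; positive values indicate model underestimation
        under the common SWAT reporting convention."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return 100.0 * np.sum(obs - sim) / np.sum(obs)

    # e.g. daily streamflow at one field site (illustrative numbers)
    obs = [1.2, 3.4, 2.2, 0.9, 1.5]
    sim = [1.0, 3.1, 2.5, 1.1, 1.4]
    print(nse(obs, sim), pbias(obs, sim))
    ```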

  18. Si amorphization by focused ion beam milling: Point defect model with dynamic BCA simulation and experimental validation.

    PubMed

    Huang, J; Loeffler, M; Muehle, U; Moeller, W; Mulders, J J L; Kwakman, L F Tz; Van Dorp, W F; Zschech, E

    2018-01-01

    A Ga focused ion beam (FIB) is often used in the preparation of samples for transmission electron microscopy (TEM) analysis. In the case of a crystalline Si sample, an amorphous near-surface layer is formed by the FIB process. In order to optimize the FIB recipe by minimizing the amorphization, it is important to predict the amorphous layer thickness by simulation. Molecular Dynamics (MD) simulation has been used to describe the amorphization; however, it is limited by computational power for realistic FIB process simulation. Binary Collision Approximation (BCA) simulation, on the other hand, has been used to simulate ion-solid interaction processes at a realistic scale. In this study, a Point Defect Density approach is introduced into a dynamic BCA simulation that considers dynamic ion-solid interactions. We used this method to predict the c-Si amorphization caused by FIB milling of Si. To validate the method, dedicated TEM studies were performed, showing that the amorphous layer thickness predicted by the numerical simulation is consistent with the experimental data. In summary, the thickness of the near-surface Si amorphization layer caused by FIB milling can be well predicted using the Point Defect Density approach within the dynamic BCA model. Copyright © 2017 Elsevier B.V. All rights reserved.
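    The Point Defect Density idea can be sketched in a few lines: accumulate a per-ion damage profile over fluence, saturate it dynamically, and call the material amorphous where a critical defect fraction is exceeded. The Gaussian damage profile and the 10% threshold below are illustrative assumptions; in the paper the per-ion damage comes from the dynamic BCA simulation.

    ```python
    import numpy as np

    depth = np.linspace(0.0, 50.0, 501)            # nm below the surface

    def damage_per_ion(z, Rd=10.0, dRd=4.0):
        """Hypothetical Gaussian damage profile (displaced fraction per ion);
        a dynamic BCA code would supply this distribution instead."""
        return np.exp(-0.5 * ((z - Rd) / dRd)**2)

    fluence_step = 1e-4                            # arbitrary dose units
    defects = np.zeros_like(depth)
    for _ in range(2000):                          # dose accumulation loop
        # the (1 - defects) factor crudely saturates damage, mimicking the
        # dynamic (already-amorphized) target of a dynamic BCA simulation
        defects += fluence_step * damage_per_ion(depth) * (1.0 - defects)

    amorph = defects > 0.10                        # critical defect density
    if amorph.any():
        print(f"predicted amorphous layer down to {depth[amorph].max():.1f} nm")
    else:
        print("defect density below amorphization threshold everywhere")
    ```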

  19. On validating remote sensing simulations using coincident real data

    NASA Astrophysics Data System (ADS)

    Wang, Mingming; Yao, Wei; Brown, Scott; Goodenough, Adam; van Aardt, Jan

    2016-05-01

    The remote sensing community often requires data simulation, either via spectral/spatial downsampling or through virtual, physics-based models, to assess systems and algorithms. The Digital Imaging and Remote Sensing Image Generation (DIRSIG) model is one such first-principles, physics-based model for simulating imagery for a range of modalities. Complex simulation of vegetation environments has subsequently become possible as scene rendering technology and software have advanced. This in turn has raised questions about the validity of such complex models, with potential multiple scattering, bidirectional reflectance distribution function (BRDF), and similar phenomena that could impact results in the case of complex vegetation scenes. We selected three sites located in the Pacific Southwest domain (Fresno, CA) of the National Ecological Observatory Network (NEON). These sites represent oak savanna, hardwood forests, and conifer-manzanita-mixed forests. We constructed corresponding virtual scenes, using airborne LiDAR and imaging spectroscopy data from NEON, ground-based LiDAR data, and field-collected spectra to characterize the scenes. Imaging spectroscopy data for these virtual sites were then generated using the DIRSIG simulation environment. This simulated imagery was compared to real AVIRIS imagery (15 m spatial resolution; 12 pixels/scene) and NEON Airborne Observation Platform (AOP) data (1 m spatial resolution; 180 pixels/scene). These tests were performed using a distribution-comparison approach for select spectral statistics, e.g., those establishing the spectra's shape, for each simulated-versus-real distribution pair. The initial comparison results of the spectral distributions indicated that the shapes of spectra between the virtual and real sites were closely matched.
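    A distribution comparison of a per-pixel spectral statistic between simulated and real imagery can be as simple as a two-sample KS test; the populations below are synthetic stand-ins for the DIRSIG and AVIRIS/AOP pixel sets, and the statistic itself is a hypothetical shape metric.

    ```python
    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(0)
    stat_dirsig = rng.normal(0.82, 0.05, 180)   # e.g. 180 pixels/scene (AOP)
    stat_real = rng.normal(0.80, 0.06, 180)

    ks_stat, p_value = ks_2samp(stat_dirsig, stat_real)
    print(f"KS statistic = {ks_stat:.3f}, p = {p_value:.3f}")
    # A large p suggests the simulated and observed distributions of this
    # spectral statistic are statistically indistinguishable.
    ```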

  20. Dynamic Simulation of Human Gait Model With Predictive Capability.

    PubMed

    Sun, Jinming; Wu, Shaoli; Voglewede, Philip A

    2018-03-01

    In this paper, it is proposed that the central nervous system (CNS) controls human gait using a predictive control approach in conjunction with classical feedback control, rather than exclusively classical feedback control, which acts on past error. To validate this proposition, a dynamic model of human gait is developed using a novel predictive approach to investigate the principles of the CNS. The model comprises two parts: a plant model that represents the dynamics of human gait and a controller that represents the CNS. The plant model is a seven-segment, six-joint model with nine degrees-of-freedom (DOF). The plant model is validated using data collected from able-bodied human subjects. The proposed controller utilizes model predictive control (MPC). MPC uses an internal model to predict the output in advance, compares the predicted output to the reference, and optimizes the control input so that the predicted error is minimal. To decrease the complexity of the model, two joints are controlled using a proportional-derivative (PD) controller. The developed predictive human gait model is validated by simulating able-bodied human gait. The simulation results show that the developed model is able to simulate kinematic output close to experimental data.
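    The receding-horizon idea behind MPC is illustrated below for a double-integrator stand-in for one controlled joint; the plant, weights, and horizon are all assumptions of the sketch, not the paper's seven-segment model.

    ```python
    import numpy as np

    dt, N = 0.01, 20                       # step size and prediction horizon
    A = np.array([[1.0, dt], [0.0, 1.0]])  # double-integrator "joint" model
    B = np.array([[0.0], [dt]])
    C = np.array([[1.0, 0.0]])
    Q, R = 100.0, 0.01                     # tracking vs. effort weights

    # Batch prediction matrices: predicted outputs y = F x0 + G u
    F = np.vstack([C @ np.linalg.matrix_power(A, k + 1) for k in range(N)])
    G = np.zeros((N, N))
    for i in range(N):
        for j in range(i + 1):
            G[i, j] = (C @ np.linalg.matrix_power(A, i - j) @ B).item()

    def mpc_step(x0, ref):
        """Solve min_u Q*||ref - F x0 - G u||^2 + R*||u||^2 in closed form,
        then apply only the first input (receding horizon)."""
        H = Q * (G.T @ G) + R * np.eye(N)
        g = Q * G.T @ (ref - F @ x0)
        return np.linalg.solve(H, g)[0]

    x = np.array([0.0, 0.0])
    ref = np.ones(N)                       # track a unit step in joint angle
    for _ in range(300):
        x = A @ x + (B * mpc_step(x, ref)).ravel()
    print(x)                               # state approaches the reference [1, 0]
    ```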

  1. Validation of a 3D computational fluid-structure interaction model simulating flow through an elastic aperture.

    PubMed

    Quaini, A; Canic, S; Glowinski, R; Igo, S; Hartley, C J; Zoghbi, W; Little, S

    2012-01-10

    This work presents a validation of a fluid-structure interaction computational model simulating the flow conditions in an in vitro mock heart chamber modeling mitral valve regurgitation during the ejection phase, during which the trans-valvular pressure drop and valve displacement are not as large. The mock heart chamber was developed to study the use of 2D and 3D color Doppler techniques in imaging the clinically relevant complex intra-cardiac flow events associated with mitral regurgitation. Computational models are expected to play an important role in supporting, refining, and reinforcing the emerging 3D echocardiographic applications. We have developed a 3D computational fluid-structure interaction algorithm based on a semi-implicit, monolithic method, combined with an arbitrary Lagrangian-Eulerian approach to capture the fluid domain motion. The mock regurgitant mitral valve, corresponding to an elastic plate with a geometric orifice, was modeled using 3D elasticity, while the blood flow was modeled using the 3D Navier-Stokes equations for an incompressible, viscous fluid. The two are coupled via the kinematic and dynamic conditions describing the two-way coupling. The pressure, the flow rate, and the orifice plate displacement were measured and compared with numerical simulation results. An in-line flow meter was used to measure the flow, pressure transducers were used to measure the pressure, and a Doppler method developed by one of the authors was used to measure the axial displacement of the orifice plate. The maximum recorded difference between experiment and numerical simulation was 4% for the flow rate, 3.6% for the pressure, and 15% for the orifice displacement, showing excellent agreement between the two. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Recent Advances in Simulation of Eddy Current Testing of Tubes and Experimental Validations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reboud, C.; Premel, D.; Lesselier, D.

    2007-03-21

    Eddy current testing (ECT) is widely used in iron and steel industry for the inspection of tubes during manufacturing. A collaboration between CEA and the Vallourec Research Center led to the development of new numerical functionalities dedicated to the simulation of ECT of non-magnetic tubes by external probes. The achievement of experimental validations led us to the integration of these models into the CIVA platform. Modeling approach and validation results are discussed here. A new numerical scheme is also proposed in order to improve the accuracy of the model.

  3. Recent Advances in Simulation of Eddy Current Testing of Tubes and Experimental Validations

    NASA Astrophysics Data System (ADS)

    Reboud, C.; Prémel, D.; Lesselier, D.; Bisiaux, B.

    2007-03-01

    Eddy current testing (ECT) is widely used in iron and steel industry for the inspection of tubes during manufacturing. A collaboration between CEA and the Vallourec Research Center led to the development of new numerical functionalities dedicated to the simulation of ECT of non-magnetic tubes by external probes. The achievement of experimental validations led us to the integration of these models into the CIVA platform. Modeling approach and validation results are discussed here. A new numerical scheme is also proposed in order to improve the accuracy of the model.

  4. Turbofan Engine Post-Instability Behavior - Computer Simulations, Test Validation, and Application of Simulations,

    DTIC Science & Technology

    COMPRESSORS, *AIR FLOW, TURBOFAN ENGINES, TRANSIENTS, SURGES, STABILITY, COMPUTERIZED SIMULATION, EXPERIMENTAL DATA, VALIDATION, DIGITAL SIMULATION, INLET GUIDE VANES, ROTATION, STALLING, RECOVERY, HYSTERESIS

  5. Model-Based Verification and Validation of the SMAP Uplink Processes

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun

    2013-01-01

    This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes allow, by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.

  6. Validation of virtual-reality-based simulations for endoscopic sinus surgery.

    PubMed

    Dharmawardana, N; Ruthenbeck, G; Woods, C; Elmiyeh, B; Diment, L; Ooi, E H; Reynolds, K; Carney, A S

    2015-12-01

    Virtual reality (VR) simulators provide an alternative to real patients for practicing surgical skills but require validation to ensure accuracy. Here, we validate the use of a virtual reality sinus surgery simulator with haptic feedback for training in Otorhinolaryngology - Head & Neck Surgery (OHNS). Participants were recruited from final-year medical students, interns, resident medical officers (RMOs), OHNS registrars and consultants. All participants completed an online questionnaire after performing four separate simulation tasks. These responses were then used to assess face, content and construct validity. ANOVA with post hoc correlation was used for statistical analysis. The following groups were compared: (i) medical students/interns, (ii) RMOs, (iii) registrars and (iv) consultants. Face validity results showed a statistically significant (P < 0.05) difference between the consultant group and the others, while there was no significant difference between medical students/interns and RMOs. Variability within groups was not significant. Content validity results based on consultant scoring and comments indicated that the simulations need further development in several areas to be effective for registrar-level teaching. However, students, interns and RMOs indicated that the simulations provide a useful tool for learning OHNS-related anatomy and as an introduction to ENT-specific procedures. The VR simulations have been validated for teaching sinus anatomy and nasendoscopy to medical students, interns and RMOs. However, they require further development before they can be regarded as a valid tool for more advanced surgical training. © 2015 John Wiley & Sons Ltd.

  7. Pre-engineering Spaceflight Validation of Environmental Models and the 2005 HZETRN Simulation Code

    NASA Technical Reports Server (NTRS)

    Nealy, John E.; Cucinotta, Francis A.; Wilson, John W.; Badavi, Francis F.; Dachev, Ts. P.; Tomov, B. T.; Walker, Steven A.; DeAngelis, Giovanni; Blattnig, Steve R.; Atwell, William

    2006-01-01

    The HZETRN code has been identified by NASA for engineering design in the next phase of space exploration, highlighting a return to the Moon in preparation for a Mars mission. In response, a new series of algorithms, beginning with 2005 HZETRN, will be issued, correcting some prior limitations and improving control of propagated errors, along with established code verification processes. Code validation will use new and improved low Earth orbit (LEO) environmental models together with a recently improved International Space Station (ISS) shield model to validate computational models and procedures against measured data taken aboard the ISS. These validated models will provide a basis for flight-testing the designs of future space vehicles and systems of the Constellation program in the LEO environment.

  8. Adjustment and validation of a simulation tool for CSP plants based on parabolic trough technology

    NASA Astrophysics Data System (ADS)

    García-Barberena, Javier; Ubani, Nora

    2016-05-01

    The present work describes the validation process carried out for a simulation tool specifically designed for the energy yield assessment of concentrating solar power (CSP) plants based on parabolic trough (PT) technology. The validation was carried out by comparing the model estimations with real data collected from a commercial CSP plant. In order to adjust the model parameters used in the simulation, 12 different days were selected from one year of operational data measured at the real plant. The 12 days were simulated and the estimations compared with the measured data, focusing on the most important variables from the simulation point of view: temperatures, pressures and mass flow of the solar field, gross power, parasitic power, and net power delivered by the plant. Based on these 12 days, the key parameters of the model were fixed and a whole year was simulated. The results obtained for the complete year showed very good agreement for the gross and net total electric production, with a bias of 1.47% and 2.02%, respectively. The results proved that the simulation software describes the real operation of the power plant with great accuracy and correctly reproduces its transient behavior.

  9. Development and implementation of centralized simulation training: evaluation of feasibility, acceptability and construct validity.

    PubMed

    Shamim Khan, Mohammad; Ahmed, Kamran; Gavazzi, Andrea; Gohil, Rishma; Thomas, Libby; Poulsen, Johan; Ahmed, Munir; Jaye, Peter; Dasgupta, Prokar

    2013-03-01

    WHAT'S KNOWN ON THE SUBJECT? AND WHAT DOES THE STUDY ADD?: A competent urologist should not only have effective technical skills, but also other attributes that would make him/her a complete surgeon. These include team-working, communication and decision-making skills. Although evidence for the effectiveness of simulation exists for individual simulators, there is a paucity of evidence for the utility and effectiveness of these simulators in training programmes that aim to combine technical and non-technical skills training. This article explains the process of development and validation of a centrally coordinated simulation programme (participants: South-East Region Specialist Registrars) under the umbrella of the British Association of Urological Surgeons (BAUS) and the London Deanery. This programme incorporated training of both technical (synthetic, animal and virtual reality models) and non-technical skills (simulated operating theatres). The aim was to establish the feasibility and acceptability of a centralized, simulation-based training programme. Simulation is increasingly establishing its role in urological training, with two areas relevant to urologists: (i) technical skills and (ii) non-technical skills. For this London Deanery supported pilot Simulation and Technology enhanced Learning Initiative (STeLI) project, we developed a structured multimodal simulation training programme. The programme incorporated: (i) technical skills training using virtual-reality simulators (Uro-mentor and Perc-mentor [Symbionix, Cleveland, OH, USA], Procedicus MIST-Nephrectomy [Mentice, Gothenburg, Sweden] and SEP Robotic simulator [Sim Surgery, Oslo, Norway]); bench-top models (synthetic models for cystoscopy, transurethral resection of the prostate, transurethral resection of bladder tumour, and ureteroscopy); and a European (Aalborg, Denmark) wet-lab training facility; as well as (ii) non-technical skills/crisis resource management (CRM), using SimMan (Laerdal Medical Ltd, Orpington, UK

  10. Comparison of simulator fidelity model predictions with in-simulator evaluation data

    NASA Technical Reports Server (NTRS)

    Parrish, R. V.; Mckissick, B. T.; Ashworth, B. R.

    1983-01-01

    A full factorial in-simulator experiment on a single-axis, multiloop, compensatory pitch tracking task is described. The experiment was conducted to provide data to validate extensions to an analytic, closed-loop model of a real-time digital simulation facility. The results of the experiment, encompassing various simulation fidelity factors such as visual delay, digital integration algorithms, computer iteration rates, control loading bandwidths and proprioceptive cues, and g-seat kinesthetic cues, are compared with predictions obtained from the analytic model incorporating an optimal control model of the human pilot. The in-simulator results show more sensitivity to the g-seat and control loader conditions than was predicted by the model. However, the model predictions are generally upheld, although the predicted magnitudes of the states and of the error terms are sometimes off considerably. Of particular concern are the large sensitivity difference for one control loader condition and the model/in-simulator mismatch in the magnitude of the plant states when the other states match.

  11. Methodology for turbulence code validation: Quantification of simulation-experiment agreement and application to the TORPEX experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricci, Paolo; Theiler, C.; Fasoli, A.

    A methodology for plasma turbulence code validation is discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The present work extends the analysis carried out in a previous paper [P. Ricci et al., Phys. Plasmas 16, 055703 (2009)] where the validation observables were introduced. Here, it is discussed how to quantify the agreement between experiments and simulations with respect to each observable, how to define a metric to evaluate this agreement globally, and, finally, how to assess the quality of a validation procedure. The methodology is then applied to the simulation of the basic plasma physics experiment TORPEX [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulation models.
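    The exact observables, weights and normalizations are defined in the cited papers; the sketch below only illustrates the general pattern of such a methodology, i.e. per-observable discrepancies normalized by combined uncertainties and merged into one weighted global metric (all numbers and weights are placeholders).

    ```python
    import numpy as np

    def observable_agreement(exp, sim, d_exp, d_sim):
        """Normalized discrepancy for one observable: experiment/simulation
        difference measured against the combined uncertainties."""
        return np.mean(np.abs(exp - sim) / np.sqrt(d_exp**2 + d_sim**2))

    def composite_metric(agreements, weights):
        """Global agreement as a weighted average of per-observable
        discrepancies (smaller is better); weighting loosely mirrors the
        idea that directly measured quantities should count more."""
        a, w = np.asarray(agreements), np.asarray(weights)
        return np.sum(w * a) / np.sum(w)

    d_profile = observable_agreement(np.array([1.0, 1.2]), np.array([1.1, 1.0]),
                                     np.array([0.1, 0.1]), np.array([0.05, 0.05]))
    print(composite_metric([d_profile, 0.8], weights=[2.0, 1.0]))
    ```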

  12. Performance evaluation of an agent-based occupancy simulation model

    DOE PAGES

    Luo, Xuan; Lam, Khee Poh; Chen, Yixing; ...

    2017-01-17

    Occupancy is an important factor driving building performance. Static and homogeneous occupant schedules, commonly used in building performance simulation, contribute to issues such as performance gaps between simulated and measured energy use in buildings. Stochastic occupancy models have been recently developed and applied to better represent spatial and temporal diversity of occupants in buildings. However, there is very limited evaluation of the usability and accuracy of these models. This study used measured occupancy data from a real office building to evaluate the performance of an agent-based occupancy simulation model: the Occupancy Simulator. The occupancy patterns of various occupant types were first derived from the measured occupant schedule data using statistical analysis. Then the performance of the simulation model was evaluated and verified based on (1) whether the distribution of observed occupancy behavior patterns follows the theoretical ones included in the Occupancy Simulator, and (2) whether the simulator can reproduce a variety of occupancy patterns accurately. Results demonstrated the feasibility of applying the Occupancy Simulator to simulate a range of occupancy presence and movement behaviors for regular types of occupants in office buildings, and to generate stochastic occupant schedules at the room and individual occupant levels for building performance simulation. For future work, model validation is recommended, which includes collecting and using detailed interval occupancy data of all spaces in an office building to validate the simulated occupant schedules from the Occupancy Simulator.
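    A minimal sketch of the kind of stochastic occupancy model being evaluated: a two-state (absent/present) Markov chain with time-of-day-dependent transition probabilities. The probabilities below are illustrative placeholders, not parameters derived from the measured data.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    steps_per_day = 24 * 4                     # 15-minute resolution

    def p_arrive(t):   # probability absent -> present in one step
        hour = (t / 4) % 24
        return 0.4 if 8 <= hour < 10 else 0.02

    def p_depart(t):   # probability present -> absent in one step
        hour = (t / 4) % 24
        return 0.5 if 17 <= hour < 19 else 0.05

    def simulate_day():
        """One stochastic daily schedule for a single occupant (0/1 trace)."""
        state, trace = 0, []
        for t in range(steps_per_day):
            p_present = p_arrive(t) if state == 0 else 1.0 - p_depart(t)
            state = int(rng.random() < p_present)
            trace.append(state)
        return np.array(trace)

    days = np.array([simulate_day() for _ in range(100)])
    print("mean presence profile (first two hours):", days.mean(axis=0)[:8])
    ```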

  13. Performance evaluation of an agent-based occupancy simulation model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Xuan; Lam, Khee Poh; Chen, Yixing

    Occupancy is an important factor driving building performance. Static and homogeneous occupant schedules, commonly used in building performance simulation, contribute to issues such as performance gaps between simulated and measured energy use in buildings. Stochastic occupancy models have been recently developed and applied to better represent spatial and temporal diversity of occupants in buildings. However, there is very limited evaluation of the usability and accuracy of these models. This study used measured occupancy data from a real office building to evaluate the performance of an agent-based occupancy simulation model: the Occupancy Simulator. The occupancy patterns of various occupant types were first derived from the measured occupant schedule data using statistical analysis. Then the performance of the simulation model was evaluated and verified based on (1) whether the distribution of observed occupancy behavior patterns follows the theoretical ones included in the Occupancy Simulator, and (2) whether the simulator can reproduce a variety of occupancy patterns accurately. Results demonstrated the feasibility of applying the Occupancy Simulator to simulate a range of occupancy presence and movement behaviors for regular types of occupants in office buildings, and to generate stochastic occupant schedules at the room and individual occupant levels for building performance simulation. For future work, model validation is recommended, which includes collecting and using detailed interval occupancy data of all spaces in an office building to validate the simulated occupant schedules from the Occupancy Simulator.

  14. Construct validation of a novel hybrid surgical simulator.

    PubMed

    Broe, D; Ridgway, P F; Johnson, S; Tierney, S; Conlon, K C

    2006-06-01

    Simulated minimal access surgery has improved recently as both a learning and an assessment tool. The construct validation of a novel simulator, ProMis, is described for use by residents in training. ProMis is a surgical simulator in which tasks can be designed in both virtual and actual reality. A pilot group of surgical residents ranging from novice to expert completed three standardized tasks: orientation, dissection, and basic suturing. The tasks were tested for construct validity. Two experienced surgeons examined the recorded tasks in a blinded fashion using an objective structured assessment of technical skills format (OSATS: task-specific checklist and global rating score) as well as metrics delivered by the simulator. The findings showed excellent interrater reliability (Cronbach's alpha of 0.88 for the checklist and 0.93 for the global rating). The median scores of the experience groups were statistically different on both the global rating and the task-specific checklists (p < 0.05). The scores for the orientation task alone did not reach significance (p = 0.1), suggesting that modification is required before ProMis could be used in isolation as an assessment tool. The three simulated tasks in combination are construct valid for differentiating experience levels among surgeons in training. This hybrid simulator has the potential added benefits of marrying the virtual with the actual, and of combining simple box-trainer tasks with advanced virtual reality simulation.

  15. Validation and Simulation of Ares I Scale Model Acoustic Test - 2 - Simulations at 5 Foot Elevation for Evaluation of Launch Mount Effects

    NASA Technical Reports Server (NTRS)

    Strutzenberg, Louise L.; Putman, Gabriel C.

    2011-01-01

    The Ares I Scale Model Acoustics Test (ASMAT) is a series of live-fire tests of scaled rocket motors meant to simulate the conditions of the Ares I launch configuration. These tests have provided a well-documented set of high-fidelity measurements useful for validation, including data taken over a range of test conditions and containing phenomena like Ignition Over-Pressure and water suppression of acoustics. Expanding from initial simulations of the ASMAT setup in a held-down configuration, simulations have been performed using the Loci/CHEM computational fluid dynamics software for ASMAT tests of the vehicle at 5 ft elevation (100 ft real vehicle elevation) with worst-case drift in the direction of the launch tower. These tests were performed without water suppression and compared the acoustic emissions for launch structures with and without launch mounts. In addition, simulation results were compared to acoustic and imagery data collected from similar live-fire tests to assess the accuracy of the simulations. Simulations showed a marked change in the pattern of emissions after removal of the launch mount, with a reduction in the overall acoustic environment experienced by the vehicle and the formation of highly directed acoustic waves moving across the platform deck. Comparisons of simulation results to live-fire test data showed good amplitude and temporal correlation, and imagery comparisons over the visible and infrared wavelengths showed qualitative capture of all plume and pressure wave evolution features.

  16. Highlights of Transient Plume Impingement Model Validation and Applications

    NASA Technical Reports Server (NTRS)

    Woronowicz, Michael

    2011-01-01

    This paper describes highlights of an ongoing validation effort conducted to assess the viability of applying a set of analytic point source transient free molecule equations to model behavior ranging from molecular effusion to rocket plumes. The validation effort includes encouraging comparisons to both steady and transient studies involving experimental data and direct simulation Monte Carlo results. Finally, this model is applied to describe features of two exotic transient scenarios involving NASA Goddard Space Flight Center satellite programs.

  17. Developing Cognitive Models for Social Simulation from Survey Data

    NASA Astrophysics Data System (ADS)

    Alt, Jonathan K.; Lieberman, Stephen

    The representation of human behavior and cognition continues to challenge the modeling and simulation community. The use of survey and polling instruments to inform belief states, issue stances and action choice models provides a compelling means of developing models and simulations with empirical data. Using these types of data to populate social simulations can greatly enhance the feasibility of validation efforts, the reusability of social and behavioral modeling frameworks, and the testable reliability of simulations. We provide a case study demonstrating these effects, document the use of survey data to develop cognitive models, and suggest future paths forward for social and behavioral modeling.

  18. Advanced Space Shuttle simulation model

    NASA Technical Reports Server (NTRS)

    Tatom, F. B.; Smith, S. R.

    1982-01-01

    A non-recursive model (based on von Karman spectra) for atmospheric turbulence along the flight path of the shuttle orbiter was developed. It provides for simulation of instantaneous vertical and horizontal gusts at the vehicle center-of-gravity, and also for simulation of instantaneous gust gradients. Based on this model, time series for both gusts and gust gradients were generated and stored on a series of magnetic tapes, entitled Shuttle Simulation Turbulence Tapes (SSTT). The time series are designed to represent atmospheric turbulence from ground level to an altitude of 120,000 meters. A description of the turbulence generation procedure is provided, the results of validating the simulated turbulence are described, and conclusions and recommendations are presented. One-dimensional von Karman spectra are tabulated, and the minimum frequency simulated is discussed. The results of spectral and statistical analyses of the SSTT are presented.
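    The longitudinal one-dimensional von Karman power spectral density on which such turbulence models are based has the standard handbook form sketched below; the parameter values are illustrative, and the expression should be checked against the report's tabulated spectra before reuse.

    ```python
    import numpy as np

    def von_karman_longitudinal(omega, sigma, L, V):
        """One-dimensional von Karman PSD of longitudinal turbulence (standard
        handbook form): sigma is the RMS gust intensity, L the scale length,
        V the airspeed, omega the temporal frequency in rad/s."""
        Omega = omega / V                       # spatial frequency (rad/m)
        return (2.0 * sigma**2 * L / (np.pi * V)) / (
            1.0 + (1.339 * L * Omega)**2) ** (5.0 / 6.0)

    w = np.logspace(-2, 2, 5)                   # rad/s
    print(von_karman_longitudinal(w, sigma=1.5, L=530.0, V=100.0))
    ```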

  19. Beware of external validation! - A Comparative Study of Several Validation Techniques used in QSAR Modelling.

    PubMed

    Majumdar, Subhabrata; Basak, Subhash C

    2018-04-26

    Proper validation is an important aspect of QSAR modelling. External validation is one of the widely used validation methods in QSAR, where the model is built on a subset of the data and validated on the rest of the samples. However, its effectiveness for datasets with a small number of samples but a large number of predictors remains suspect. Calculating hundreds or thousands of molecular descriptors using currently available software has become the norm in QSAR research, owing to computational advances in the past few decades. Thus, for n chemical compounds and p descriptors calculated for each molecule, the typical chemometric dataset today has a high value of p but small n (i.e., n < p). Motivated by evidence in the recent literature of the inadequacy of external validation in estimating the true predictive capability of a statistical model, this paper performs an extensive comparative study of this method against several other validation techniques. We compared four validation methods: leave-one-out (LOO), K-fold, external and multi-split validation, using statistical models built with LASSO regression, which simultaneously performs variable selection and modelling. We used 300 simulated datasets and one real dataset of 95 congeneric amine mutagens for this evaluation. External validation metrics show high variation among different random splits of the data and hence are not recommended for predictive QSAR models. LOO had the overall best performance among all validation methods applied in our scenario, and results from external validation were too unstable for the datasets we analyzed. Based on our findings, we recommend the LOO procedure for validating QSAR predictive models built on high-dimensional small-sample data. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
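    A minimal reproduction of the comparison on synthetic n < p data, using scikit-learn's LASSO with leave-one-out versus a single external hold-out split; the dataset sizes, noise level and hyperparameters are assumptions of the sketch.

    ```python
    import numpy as np
    from sklearn.linear_model import LassoCV
    from sklearn.model_selection import LeaveOneOut, cross_val_predict, train_test_split
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(1)
    X = rng.normal(size=(60, 300))   # 60 "compounds", 300 "descriptors"
    y = X[:, :5] @ rng.normal(size=5) + 0.5 * rng.normal(size=60)

    model = LassoCV(cv=5, max_iter=50000)  # variable selection + fitting

    # Leave-one-out: every sample predicted by a model trained on the rest.
    q2_loo = r2_score(y, cross_val_predict(model, X, y, cv=LeaveOneOut()))

    # External validation: one random hold-out split (the unstable scheme).
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    q2_ext = r2_score(y_te, model.fit(X_tr, y_tr).predict(X_te))

    print(f"LOO q2 = {q2_loo:.2f}, external q2 = {q2_ext:.2f}")
    # Re-running with different random_state values shows how strongly the
    # external estimate varies across splits, the paper's central point.
    ```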

  20. Continued Development and Validation of Methods for Spheromak Simulation

    NASA Astrophysics Data System (ADS)

    Benedett, Thomas

    2015-11-01

    The HIT-SI experiment has demonstrated stable sustainment of spheromaks; determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and provide an intermediate step between theory and future experiments. A zero-beta Hall-MHD model has shown good agreement with experimental data at 14.5 kHz injector operation. Experimental observations at higher frequency, where the best performance is achieved, indicate pressure effects are important and likely required to attain quantitative agreement with simulations. Efforts to extend the existing validation to high frequency (~ 36-68 kHz) using an extended MHD model implemented in the PSI-TET arbitrary-geometry 3D MHD code will be presented. Results from verification of the PSI-TET extended MHD model using the GEM magnetic reconnection challenge will also be presented along with investigation of injector configurations for future SIHI experiments using Taylor state equilibrium calculations. Work supported by DoE.

  1. Validation of the ArthroS virtual reality simulator for arthroscopic skills.

    PubMed

    Stunt, J J; Kerkhoffs, G M M J; van Dijk, C N; Tuijthof, G J M

    2015-11-01

    Virtual reality simulator training has become important for acquiring arthroscopic skills. A new simulator for knee arthroscopy, ArthroS™, has been developed. The purpose of this study was to demonstrate face and construct validity, following a protocol used previously to validate arthroscopic simulators. Twenty-seven participants were divided into three groups with different levels of arthroscopic experience. Participants answered questions regarding general information and the outer appearance of the simulator for face validity. Construct validity was assessed with one standardized navigation task. Face validity, educational value and user friendliness were further determined by giving participants three exercises and asking them to fill out a questionnaire. Construct validity was demonstrated between experts and beginners. Median task times were not significantly different across repetitions between novices and intermediates, or between intermediates and experts. Median face validity was 8.3 for the outer appearance, 6.5 for the intra-articular joint and 4.7 for the surgical instruments. Educational value and user friendliness were perceived as unsatisfactory, especially because of the lack of tactile feedback. The ArthroS™ demonstrated construct validity between novices and experts, but did not demonstrate full face validity. Future improvements should mainly focus on the development of tactile feedback. A newly presented simulator must be validated to prove that it actually contributes to proficiency of skills.

  2. Three phase heat and mass transfer model for unsaturated soil freezing process: Part 2 - model validation

    NASA Astrophysics Data System (ADS)

    Zhang, Yaning; Xu, Fei; Li, Bingxi; Kim, Yong-Song; Zhao, Wenke; Xie, Gongnan; Fu, Zhongbin

    2018-04-01

    This study aims to validate the three-phase heat and mass transfer model developed in the first part (Three phase heat and mass transfer model for unsaturated soil freezing process: Part 1 - model development). Experimental results from previous studies and experiments were used for the validation. The results showed that the correlation coefficients for the simulated and experimental water contents at different soil depths were between 0.83 and 0.92, and those for the simulated and experimental liquid water contents at different soil temperatures were between 0.95 and 0.99. With these high accuracies, the developed model can be used to predict the water contents at different soil depths and temperatures.

  3. Improvement of mathematical models for simulation of vehicle handling : volume 7 : technical manual for the general simulation

    DOT National Transportation Integrated Search

    1980-03-01

    This volume is the technical manual for the general simulation. Mathematical modelling of the vehicle and of the human driver is presented in detail, as are differences between the APL simulation and the current one. Information on model validation a...

  4. Validation of a novel laparoscopic adjustable gastric band simulator.

    PubMed

    Sankaranarayanan, Ganesh; Adair, James D; Halic, Tansel; Gromski, Mark A; Lu, Zhonghua; Ahn, Woojin; Jones, Daniel B; De, Suvranu

    2011-04-01

    Morbid obesity accounts for more than 90,000 deaths per year in the United States. Laparoscopic adjustable gastric banding (LAGB) is the second most common weight loss procedure performed in the US and the most common in Europe and Australia. Simulation in surgical training is a rapidly advancing field that has been adopted by many to prepare surgeons for surgical techniques and procedures. The aim of our study was to determine face, construct, and content validity for a novel virtual reality laparoscopic adjustable gastric band simulator. Twenty-eight subjects were categorized into two groups (expert and novice), determined by their skill level in laparoscopic surgery. Experts consisted of subjects who had at least 4 years of laparoscopic training and operative experience. Novices consisted of subjects with medical training but with less than 4 years of laparoscopic training. The subjects used the virtual reality laparoscopic adjustable band surgery simulator. They were automatically scored according to various tasks. The subjects then completed a questionnaire to evaluate face and content validity. On a 5-point Likert scale (1 = lowest score, 5 = highest score), the mean score for visual realism was 4.00 ± 0.67 and the mean score for realism of the interface and tool movements was 4.07 ± 0.77 (face validity). There were significant differences in the performances of the two subject groups (expert and novice) based on total scores (p < 0.001) (construct validity). Mean score for utility of the simulator, as addressed by the expert group, was 4.50 ± 0.71 (content validity). We created a virtual reality laparoscopic adjustable gastric band simulator. Our initial results demonstrate excellent face, construct, and content validity findings. To our knowledge, this is the first virtual reality simulator with haptic feedback for training residents and surgeons in the laparoscopic adjustable gastric banding procedure.
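    Construct validity of this kind is typically established by testing whether expert and novice score distributions differ; below is a minimal sketch with placeholder scores (not study data) using a nonparametric two-sample test.

    ```python
    import numpy as np
    from scipy.stats import mannwhitneyu

    # Illustrative simulator scores for the two experience groups
    experts = np.array([88, 92, 81, 95, 90, 87, 84, 91, 89, 93, 86, 90, 85, 94])
    novices = np.array([62, 70, 58, 75, 66, 61, 72, 68, 59, 64, 71, 63, 69, 60])

    u_stat, p = mannwhitneyu(experts, novices, alternative="two-sided")
    print(f"U = {u_stat}, p = {p:.4f}")  # p < 0.001 would support construct validity
    ```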

  5. Modeling and Simulation Verification, Validation and Accreditation (VV&A): A New Undertaking for the Exploration Systems Mission Directorate

    NASA Technical Reports Server (NTRS)

    Prill, Mark E.

    2005-01-01

    The overall presentation was focused to provide, for the Verification, Validation and Accreditation (VV&A) session audience, a snapshot review of the Exploration Systems Mission Directorate's (ESMD) investigation into implementation of a modeling and simulation (M&S) VV&A program. The presentation provides some legacy ESMD reference material, including information on the then-current organizational structure and the M&S (Simulation Based Acquisition (SBA)) focus contained therein, to provide a context for the proposed M&S VV&A approach. This reference material briefly highlights the SBA goals and objectives, and outlines FY05 M&S development and implementation consistent with the Subjective Assessment, Constructive Assessment, Operator-in-the-Loop Assessment, Hardware-in-the-Loop Assessment, and In-Service Operations Assessment M&S construct, the NASA Exploration Information Ontology Model (NExIOM) data model, and integration with the Windchill-based Integrated Collaborative Environment (ICE). The presentation then addresses the ESMD team's initial conclusions regarding an M&S VV&A program, summarizes the general VV&A implementation approach anticipated, and outlines some of the recognized VV&A program challenges, all within the broader context of the overarching Integrated Modeling and Simulation (IM&S) environment at both the ESMD and Agency (NASA) levels. The presentation concludes with a status report on the current M&S organization's progress to date relative to the recommended IM&S implementation activity.

  6. A method for simulating sediment incipient motion varying with time and space in an ocean model (FVCOM): development and validation

    NASA Astrophysics Data System (ADS)

    Zhu, Zichen; Wang, Yongzhi; Bian, Shuhua; Hu, Zejian; Liu, Jianqiang; Liu, Lejun

    2017-11-01

    We modified the simulation of sediment incipient motion in a numerical model and evaluated the impact of this modification using a study case of the coastal area around Weihai, China. The modified and unmodified versions of the model were validated by comparing simulated and observed data of currents, waves, and suspended sediment concentration (SSC) measured from July 25 to July 26, 2006. A fitted Shields diagram was introduced into the sediment model so that the critical erosional shear stress could vary with time. The simulated SSC patterns thus more closely reflected the observed values: the relative error of the variation range decreased by up to 34.5% and the relative error of the simulated temporally averaged SSC decreased by up to 36%. In the modified model, the critical shear stress values of the simulated silt with a diameter of 0.035 mm and mud with a diameter of 0.004 mm varied from 0.05 to 0.13 N/m2 and from 0.05 to 0.14 N/m2, respectively, instead of remaining constant as in the unmodified model. In addition, a method of applying spatially varying fractions of the mixed-grain-size sediment improved the simulated SSC distribution to fit the remote sensing map better, and the modified model reproduced the zonal area of high SSC between Heini Bay and the erosion groove. The relative mean absolute error was reduced by between 6% and 79%, depending on the regional attributes, when the modified method was used to simulate incipient sediment motion. This higher accuracy came at the cost of a 1.52% decrease in computation speed.
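    The paper's fitted Shields diagram is not reproduced in the abstract; as a stand-in, the sketch below computes a critical shear stress from the widely used Soulsby (1997) fit to the Shields curve. The fluid and sediment constants are illustrative, and the fit is known to be rough for cohesive mud.

    ```python
    import numpy as np

    def critical_shear_stress(d, rho_s=2650.0, rho=1025.0, nu=1.05e-6, g=9.81):
        """Critical erosional shear stress (N/m2) for grain diameter d (m),
        using the Soulsby (1997) fit to the Shields curve as a stand-in for
        the fitted diagram introduced in the paper."""
        s = rho_s / rho
        d_star = (g * (s - 1.0) / nu**2) ** (1.0 / 3.0) * d   # dimensionless grain size
        theta_cr = 0.30 / (1.0 + 1.2 * d_star) \
                   + 0.055 * (1.0 - np.exp(-0.020 * d_star))  # Shields parameter
        return theta_cr * g * (rho_s - rho) * d

    # Silt of d = 0.035 mm, as in the study area (illustrative constants)
    print(critical_shear_stress(35e-6))   # on the order of 0.1 N/m2
    ```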

  7. Validation of an Integrated Airframe and Turbofan Engine Simulation for Evaluation of Propulsion Control Modes

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.; Sowers, T Shane; Liu, Yuan; Owen, A. Karl; Guo, Ten-Huei

    2015-01-01

    The National Aeronautics and Space Administration (NASA) has developed independent airframe and engine models that have been integrated into a single real-time aircraft simulation for piloted evaluation of propulsion control algorithms. In order to have confidence in the results of these evaluations, the integrated simulation must be validated to demonstrate that its behavior is realistic and that it meets the appropriate Federal Aviation Administration (FAA) certification requirements for aircraft. The paper describes the test procedures and results, demonstrating that the integrated simulation generally meets the FAA requirements and is thus a valid testbed for evaluation of propulsion control modes.

  8. V-SUIT Model Validation Using PLSS 1.0 Test Results

    NASA Technical Reports Server (NTRS)

    Olthoff, Claas

    2015-01-01

    The dynamic portable life support system (PLSS) simulation software Virtual Space Suit (V-SUIT) has been under development at the Technische Universität München since 2011 as a spin-off from the Virtual Habitat (V-HAB) project. The MATLAB™-based V-SUIT simulates space suit portable life support systems and their interaction with a detailed, dynamic human model, as well as the dynamic external environment of a space suit moving on a planetary surface. To demonstrate the feasibility of a large, system-level simulation like V-SUIT, a model of NASA's PLSS 1.0 prototype was created. This prototype was run through an extensive series of tests in 2011. Since the test setup was heavily instrumented, it produced a wealth of data, making it ideal for model validation. The implemented model includes all components of the PLSS in both the ventilation and thermal loops. The major components are modeled in greater detail, while smaller and ancillary components are low-fidelity black box models. The major components include the Rapid Cycle Amine (RCA) CO2 removal system, the Primary and Secondary Oxygen Assembly (POS/SOA), the Pressure Garment System Volume Simulator (PGSVS), the Human Metabolic Simulator (HMS), the heat exchanger between the ventilation and thermal loops, the Space Suit Water Membrane Evaporator (SWME) and finally the Liquid Cooling Garment Simulator (LCGS). Using the created model, dynamic simulations were performed using the same test points used during PLSS 1.0 testing. The results of the simulation were then compared to the test data, with special focus on absolute values during the steady-state phases and dynamic behavior during the transitions between test points. Quantified simulation results are presented that demonstrate which areas of the V-SUIT model are in need of further refinement and which are sufficiently close to the test results. Finally, lessons learned from the modelling and validation process are given in combination

  9. Applicability of Monte Carlo cross validation technique for model development and validation using generalised least squares regression

    NASA Astrophysics Data System (ADS)

    Haddad, Khaled; Rahman, Ataur; A Zaman, Mohammad; Shrestha, Surendra

    2013-03-01

    In regional hydrologic regression analysis, model selection and validation are regarded as important steps. Here, model selection is usually based on some measure of goodness-of-fit between the model prediction and observed data. In Regional Flood Frequency Analysis (RFFA), leave-one-out (LOO) validation or a fixed-percentage leave-out validation (e.g., 10%) is commonly adopted to assess the predictive ability of regression-based prediction equations. This paper develops a Monte Carlo Cross Validation (MCCV) technique (which has been widely adopted in chemometrics and econometrics) for RFFA using Generalised Least Squares Regression (GLSR) and compares it with the most commonly adopted LOO validation approach. The study uses simulated and regional flood data from the state of New South Wales in Australia. It is found that, when developing hydrologic regression models, application of the MCCV is likely to result in a more parsimonious model than the LOO. It has also been found that the MCCV can provide a more realistic estimate of a model's predictive ability when compared with the LOO.
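    MCCV itself is straightforward to express: many random train/test splits with the prediction error averaged across them. The sketch below uses ordinary least squares as a stand-in for the paper's GLSR model, with synthetic catchment data.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import ShuffleSplit
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(7)
    X = rng.normal(size=(80, 4))               # e.g. catchment descriptors
    y = X @ np.array([1.0, 0.5, 0.0, 0.0]) + rng.normal(scale=0.3, size=80)

    # Monte Carlo Cross Validation: 200 random 70/30 splits
    mccv = ShuffleSplit(n_splits=200, test_size=0.3, random_state=0)
    errors = []
    for train, test in mccv.split(X):
        model = LinearRegression().fit(X[train], y[train])
        errors.append(mean_squared_error(y[test], model.predict(X[test])))

    print(f"MCCV predictive MSE: {np.mean(errors):.3f} +/- {np.std(errors):.3f}")
    ```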

  10. Validation and Continued Development of Methods for Spheromak Simulation

    NASA Astrophysics Data System (ADS)

    Benedett, Thomas

    2017-10-01

    The HIT-SI experiment has demonstrated stable sustainment of spheromaks. Determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and study the effect of possible design choices on plasma behavior. An extended MHD model has shown good agreement with experimental data at 14 kHz injector operation. Efforts to extend the existing validation to a range of higher frequencies (36, 53, 68 kHz) using the PSI-Tet 3D extended MHD code will be presented, along with simulations of potential combinations of flux conserver features and helicity injector configurations and their impact on current drive performance, density control, and temperature for future SIHI experiments. Work supported by USDoE.

  11. Creation and Validation of a Novel Mobile Simulation Laboratory for High Fidelity, Prehospital, Difficult Airway Simulation.

    PubMed

    Bischof, Jason J; Panchal, Ashish R; Finnegan, Geoffrey I; Terndrup, Thomas E

    2016-10-01

    Introduction: Endotracheal intubation (ETI) is a complex clinical skill complicated by the inherent challenge of providing care in the prehospital setting. The literature reports a low success rate for prehospital ETI attempts, due partly to the care environment and partly to the lack of consistent, standardized training opportunities for prehospital providers in ETI. Hypothesis/Problem: A mobile simulation laboratory (MSL) for studying clinically critical interventions is needed in the prehospital setting to enhance instruction and maintain proficiency. This report describes the development and validation of a prehospital airway simulator and MSL that mimics in situ care provided in an ambulance. The MSL was a Type 3 ambulance with four cameras allowing audio-video recordings of observable behaviors. The prehospital airway simulator is a modified airway mannequin with increased static tongue pressure and a rigid cervical collar. Airway experts validated the model in a static setting through ETI at varying tongue pressures, with a goal of a Grade 3 Cormack-Lehane (CL) laryngeal view. Following completion of this development, the MSL was launched with the prehospital airway simulator to distant communities utilizing a single facilitator/driver. Paramedics were recruited to perform ETI in the MSL, and the detailed airway management observations were stored for further analysis. Nineteen airway experts performed 57 ETI attempts at varying tongue pressures, demonstrating higher CL grades at higher tongue pressures. A tongue pressure of 60 mm Hg generated a 31% rate of Grade 3/4 CL views and was chosen for the prehospital trials. The MSL was launched and tested by 18 paramedics. First-pass success was 33%, with another 33% failing to intubate within three attempts. The MSL created was configured to deliver, record, and assess intubator behaviors with a difficult airway simulation. The MSL created a reproducible, high fidelity, mobile learning environment for assessment of

  12. Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code

    NASA Astrophysics Data System (ADS)

    Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.

    2015-12-01

    WEC-Sim is an open source code for modeling wave energy converter performance in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation and, as a result, are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at Oregon State University's Directional Wave Basin at the Hinsdale Wave Research Laboratory. Phase 1 of experimental testing focused on device characterization and was completed in Fall 2015. Phase 2 is focused on WEC performance and is scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields, motions in 6 DOF, multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable power-take-off system which can be used to generate or absorb wave energy. Numerical simulations of the experiments using WEC-Sim will be
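
    For reference, the Cummins impulse-response formulation mentioned above solves, for each degree of freedom, an equation of motion of the general form (standard notation, not quoted from the WEC-Sim documentation):

        (M + A_\infty)\,\ddot{X}(t)
            = -\int_0^t K(t-\tau)\,\dot{X}(\tau)\,d\tau
              - C\,X(t) + F_{exc}(t) + F_{PTO}(t)

    where $M$ is the body mass matrix, $A_\infty$ the added mass at infinite frequency, $K$ the radiation impulse-response kernel, $C$ the hydrostatic restoring matrix, and $F_{exc}$, $F_{PTO}$ the wave excitation and power-take-off forces.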

  13. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important
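
    The closure problem described above enters through the Reynolds-stress term; most of the widely used RANS models relate it to the mean strain rate through the Boussinesq eddy-viscosity hypothesis, which in standard form reads:

        -\overline{u_i' u_j'}
            = \nu_t \left( \frac{\partial \bar{u}_i}{\partial x_j}
                         + \frac{\partial \bar{u}_j}{\partial x_i} \right)
              - \frac{2}{3}\,k\,\delta_{ij}

    where $\nu_t$ is the modeled eddy viscosity, $k$ the turbulent kinetic energy, and $\delta_{ij}$ the Kronecker delta; the many RANS models in use differ chiefly in how they compute $\nu_t$.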

  14. Integrated Modeling and Simulation Verification, Validation, and Accreditation Strategy for Exploration Systems Mission Directorate

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P.

    2006-01-01

    Models and simulations (M&S) are critical resources in the exploration of space. They support program management, systems engineering, integration, analysis, test, and operations, and provide critical information and data supporting key analyses and decisions (technical, cost, and schedule). Consequently, there is a clear need to establish a solid understanding of M&S strengths and weaknesses, and the bounds within which they can credibly support decision-making. Their usage requires the implementation of a rigorous approach to verification, validation and accreditation (VV&A) and the establishment of formal processes and practices associated with their application. To ensure decision-making is suitably supported by information (data, models, test beds) from activities (studies, exercises) and M&S applications that are understood and characterized, ESMD is establishing formal, tailored VV&A processes and practices. In addition, to ensure the successful application of M&S within ESMD, a formal process for the certification of analysts that use M&S is being implemented. This presentation will highlight NASA's Exploration Systems Mission Directorate (ESMD) management approach for M&S VV&A to ensure decision-makers receive timely information on a model's fidelity, credibility, and quality.

  15. Validating a driving simulator using surrogate safety measures.

    PubMed

    Yan, Xuedong; Abdel-Aty, Mohamed; Radwan, Essam; Wang, Xuesong; Chilakapati, Praveen

    2008-01-01

    Traffic crash statistics and previous research have shown an increased risk of traffic crashes at signalized intersections. How to diagnose safety problems and develop effective countermeasures to reduce the crash rate at intersections is a key task for traffic engineers and researchers. This study aims at investigating whether the driving simulator can be used as a valid tool to assess traffic safety at signalized intersections. In support of the research objective, this simulator validity study was conducted from two perspectives, a traffic parameter (speed) and a safety parameter (crash history). A signalized intersection with as many important features as possible (including roadway geometries, traffic control devices, intersection surroundings, and buildings) was replicated in a high-fidelity driving simulator. A driving simulator experiment with eight scenarios at the intersection was conducted to determine whether the subjects' speed behavior and traffic risk patterns in the driving simulator were similar to those found at the real intersection. The experiment results showed that speed data observed from the field and in the simulator experiment both follow normal distributions and have equal means for each intersection approach, which validated the driving simulator in absolute terms. Furthermore, this study used an innovative approach of contrasting surrogate safety measures from the simulator with the crash analysis for the field data. The simulator experiment results indicated that, compared to the right-turn lane with the low rear-end crash history record (2 crashes), subjects showed a series of more risky behaviors at the right-turn lane with the high rear-end crash history record (16 crashes), including a higher deceleration rate (1.80+/-1.20 m/s(2) versus 0.80+/-0.65 m/s(2)), a higher non-stop right-turn rate on red (81.67% versus 57.63%), higher right-turn speed at the stop line (18.38+/-8.90 km/h versus 14.68+/-6.04 km/h), and shorter following distance (30
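
    The "absolute validation" step described above (both speed samples normal, with equal means) reduces to a pair of standard statistical tests. A minimal sketch with synthetic speed samples standing in for the field and simulator observations:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        field = rng.normal(52.0, 6.0, 120)   # spot speeds observed in the field [km/h]
        sim   = rng.normal(51.2, 6.5, 120)   # speeds measured in the simulator [km/h]

        # Normality of each sample (Shapiro-Wilk)
        print("field normality p =", stats.shapiro(field).pvalue)
        print("sim   normality p =", stats.shapiro(sim).pvalue)

        # Equality of means; failing to reject supports absolute validity
        print("equal-means p =", stats.ttest_ind(field, sim, equal_var=False).pvalue)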

  16. OR fire virtual training simulator: design and face validity.

    PubMed

    Dorozhkin, Denis; Olasky, Jaisa; Jones, Daniel B; Schwaitzberg, Steven D; Jones, Stephanie B; Cao, Caroline G L; Molina, Marcos; Henriques, Steven; Wang, Jinling; Flinn, Jeff; De, Suvranu

    2017-09-01

    The Virtual Electrosurgical Skill Trainer is a tool for training surgeons in the safe operation of electrosurgery tools in both open and minimally invasive surgery. This training includes a dedicated team-training module that focuses on operating room (OR) fire prevention and response. The module was developed to allow trainees, practicing surgeons, anesthesiologists, and nurses to interact with a virtual OR environment, which includes anesthesia apparatus, electrosurgical equipment, a virtual patient, and a fire extinguisher. Wearing a head-mounted display, participants must correctly identify the "fire triangle" elements and then successfully contain an OR fire. Within these virtual reality scenarios, trainees learn to react appropriately to the simulated emergency. A study targeted at establishing the face validity of the virtual OR fire simulator was undertaken at the 2015 Society of American Gastrointestinal and Endoscopic Surgeons conference. Forty-nine subjects with varying experience participated in this Institutional Review Board-approved study. The subjects were asked to complete the OR fire training/prevention sequence in the VEST simulator. Subjects were then asked to answer a subjective preference questionnaire consisting of sixteen questions focused on the usefulness and fidelity of the simulator. On a 5-point scale, 12 of 13 questions were rated at a mean of 3 or greater (92%). Five questions were rated above 4 (38%), particularly those focusing on the simulator's effectiveness and its usefulness in OR fire safety training. A total of 33 of the 49 participants (67%) chose the virtual OR fire trainer over traditional training methods such as a textbook or an animal model. Training for OR fire emergencies in fully immersive VR environments, such as the VEST trainer, may be the ideal training modality. The face validity of the OR fire training module of the VEST simulator was successfully established on many aspects of the simulation.

  17. Using a simulation assistant in modeling manufacturing systems

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, S. X.; Wolfsberger, John W.

    1988-01-01

    Numerous simulation languages exist for modeling discrete event processes, and many have now been ported to microcomputers. Graphics and animation capabilities were added to many of these languages to help users build models and evaluate the simulation results. With all these languages and added features, the user is still plagued with learning the simulation language. Furthermore, the time to construct and then to validate the simulation model is always greater than originally anticipated. One approach to minimize the time requirement is to use pre-defined macros that describe various common processes or operations in a system. The development of a simulation assistant for modeling discrete event manufacturing processes is presented. A simulation assistant is defined as an interactive intelligent software tool that assists the modeler in writing a simulation program by translating the modeler's symbolic description of the problem and then automatically generating the corresponding simulation code. The simulation assistant is discussed with emphasis on an overview of the simulation assistant, the elements of the assistant, and the five manufacturing simulation generators. A typical manufacturing system is modeled using the simulation assistant, and the advantages and disadvantages are discussed.

  18. U.S. 75 Dallas, Texas, Model Validation and Calibration Report

    DOT National Transportation Integrated Search

    2010-02-01

    This report presents the model validation and calibration results of the Integrated Corridor Management (ICM) analysis, modeling, and simulation (AMS) for the U.S. 75 Corridor in Dallas, Texas. The purpose of the project was to estimate the benefits ...

  19. [Comparison between the Range of Movement Canine Real Cervical Spine and Numerical Simulation - Computer Model Validation].

    PubMed

    Srnec, R; Horák, Z; Sedláček, R; Sedlinská, M; Krbec, M; Nečas, A

    2017-01-01

    PURPOSE OF THE STUDY: In developing new or modifying existing surgical treatment methods for spine conditions, an integral part of ex vivo experiments is the assessment of the mechanical, kinematic and dynamic properties of the created constructions. The aim of the study is to create an appropriately validated numerical model of the canine cervical spine in order to obtain a tool for basic research to be applied in cervical spine surgeries. For this purpose, the dog is a suitable model due to the occurrence of similar cervical spine conditions in some breeds of dogs and in humans. The obtained model can also be used in research and in clinical veterinary practice. MATERIAL AND METHODS: In order to create a 3D spine model, a LightSpeed 16 (GE, Milwaukee, USA) multidetector computed tomography scanner was used to scan the cervical spine of a Doberman Pinscher. The data were transferred to Mimics 12 software (Materialise HQ, Belgium), in which the individual vertebrae were segmented on CT scans by thresholding. The vertebral geometry was exported to Rhinoceros software (McNeel North America, USA) for modelling, and subsequently the specialised software Abaqus (Dassault Systemes, France) was used to analyse the response of the physiological spine model to external load by the finite element method (FEM). All the FEM-based numerical simulations were treated as non-linear contact static tasks. In the FEM analyses, the angles between individual spinal segments were monitored as a function of ventroflexion/dorsiflexion. The data were validated using latero-lateral radiographs of the cervical spine of large-breed dogs with no evident clinical signs of cervical spine conditions. The radiographs covering the cervical spine range of motion were taken in three different positions: the neutral position, maximal ventroflexion and maximal dorsiflexion. On the X-rays, vertebral inclination angles in the monitored spine positions were measured and compared with the results obtained from the FEM analyses of the

  20. Validation of landsurface processes in the AMIP models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, T J

    The Atmospheric Model Intercomparison Project (AMIP) is a commonly accepted protocol for testing the performance of the world's atmospheric general circulation models (AGCMs) under common specifications of radiative forcings (in solar constant and carbon dioxide concentration) and observed ocean boundary conditions (Gates 1992, Gates et al. 1999). From the standpoint of land-surface specialists, the AMIP affords an opportunity to investigate the behaviors of a wide variety of land-surface schemes (LSS) that are coupled to their "native" AGCMs (Phillips et al. 1995, Phillips 1999). In principle, therefore, the AMIP permits consideration of an overarching question: "To what extent does an AGCM's performance in simulating continental climate depend on the representations of land-surface processes by the embedded LSS?" There are, of course, some formidable obstacles to satisfactorily addressing this question. First, there is the dilemma of how to effectively validate simulation performance, given the present dearth of global land-surface data sets. Even if this data problem were to be alleviated, some inherent methodological difficulties would remain: in the context of the AMIP, it is not possible to validate a given LSS per se, since the associated land-surface climate simulation is a product of the coupled AGCM/LSS system. Moreover, aside from the intrinsic differences in LSS across the AMIP models, the varied representations of land-surface characteristics (e.g. vegetation properties, surface albedos and roughnesses, etc.) and related variations in land-surface forcings further complicate such an attribution process. Nevertheless, it may be possible to develop validation methodologies/statistics that are sufficiently penetrating to reveal "signatures" of particular LSS representations (e.g. "bucket" vs. more complex parameterizations of hydrology) in the AMIP land-surface simulations.

  1. Creation and validation of a simulator for corneal rust ring removal.

    PubMed

    Mednick, Zale; Tabanfar, Reza; Alexander, Ashley; Simpson, Sarah; Baxter, Stephanie

    2017-10-01

    To create and validate a simulation model for corneal rust ring removal. Rust rings were created on cadaveric eyes with the use of small particles of metal. The eyes were mounted on suction plates at slit lamps and the trainees practiced rust ring removal. An inexperienced cohort of medical students and first year ophthalmology residents (n=11), and an experienced cohort of senior residents and faculty (n=11) removed the rust rings from the eyes with the use of a burr. Rust ring removal was evaluated based on removal time, percentage of rust removed and incidence of corneal perforation. A survey was administered to participants to determine face validity. Time for rust ring removal was longer in the inexperienced group at 187±93 seconds (range of 66-408 seconds), compared to the experienced group at 117±54 seconds (range of 55-240 seconds) (p=0.046). Removal speed was similar between groups, at 4847±4355 pixels/minute and 7206±5181 pixels/minute in the inexperienced and experienced groups, respectively (p=0.26). Removal percentage values were similar between groups, at 61±15% and 69±18% (p=0.38). There were no corneal perforations. 100% (22/22) of survey respondents believed the simulator would be a valuable practice tool, and 89% (17/19) felt the simulation was a valid representation of the clinical correlate. The corneal rust ring simulator presented here is a valid training tool that could be used by early trainees to gain greater comfort level before attempting rust ring removal on a live patient. Copyright © 2017 Canadian Ophthalmological Society. Published by Elsevier Inc. All rights reserved.

  2. Space shuttle simulation model

    NASA Technical Reports Server (NTRS)

    Tatom, F. B.; Smith, S. R.

    1980-01-01

    The effects of atmospheric turbulence in both horizontal and near horizontal flight, during the return of the space shuttle, are important for determining design, control, and 'pilot-in-the-loop' effects. A nonrecursive model (based on von Karman spectra) for atmospheric turbulence along the flight path of the shuttle orbiter was developed which provides for simulation of instantaneous vertical and horizontal gusts at the vehicle center-of-gravity, and also for simulation of instantaneous gust gradients. Based on this model, the time series for both gusts and gust gradients were generated and stored on a series of magnetic tapes which are entitled shuttle simulation turbulence tapes (SSTT). The time series are designed to represent atmospheric turbulence from ground level to an altitude of 10,000 meters. The turbulence generation procedure is described as well as the results of validating the simulated turbulence. Conclusions and recommendations are presented and references cited. The tabulated one dimensional von Karman spectra and the results of spectral and statistical analyses of the SSTT are contained in the appendix.
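
    The kind of nonrecursive, spectrum-based generator described above can be sketched by shaping white noise in the frequency domain with the square root of a target von Karman spectrum. The parameters, the frozen-turbulence conversion, and the final normalization below are illustrative assumptions, not the SSTT generation procedure itself:

        import numpy as np

        def von_karman_gusts(n, dt, sigma=1.0, L=200.0, V=50.0, seed=0):
            """Generate a gust time series with a von Karman longitudinal spectrum.

            n     -- number of samples
            dt    -- time step [s]
            sigma -- turbulence intensity [m/s]
            L     -- turbulence length scale [m]
            V     -- mean airspeed [m/s] (frozen-turbulence conversion)
            """
            rng = np.random.default_rng(seed)
            f = np.fft.rfftfreq(n, dt)              # temporal frequency [Hz]
            omega = 2.0 * np.pi * f / V             # spatial frequency [rad/m]
            # longitudinal von Karman power spectral density
            psd = sigma**2 * (2.0 * L / np.pi) \
                  / (1.0 + (1.339 * L * omega)**2)**(5.0 / 6.0)
            shaped = np.fft.rfft(rng.standard_normal(n)) * np.sqrt(psd)
            gust = np.fft.irfft(shaped, n)
            return gust * sigma / gust.std()        # rescale to the target intensity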

  3. Development of full regeneration establishment models for the forest vegetation simulator

    Treesearch

    John D. Shaw

    2015-01-01

    For most simulation modeling efforts, the goal of model developers is to produce simulations that represent reality as faithfully as possible. Achieving this goal commonly requires a considerable amount of data to set the initial parameters, followed by validation and model improvement – both of which require even more data. The Forest Vegetation Simulator (FVS...

  4. Validation of High-Fidelity CFD/CAA Framework for Launch Vehicle Acoustic Environment Simulation against Scale Model Test Data

    NASA Technical Reports Server (NTRS)

    Liever, Peter A.; West, Jeffrey S.; Harris, Robert E.

    2016-01-01

    A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate Discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, Hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured mesh Discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of solving 2nd, 3rd, and 4th order Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation has been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.

  5. Validation of High-Fidelity CFD/CAA Framework for Launch Vehicle Acoustic Environment Simulation against Scale Model Test Data

    NASA Technical Reports Server (NTRS)

    Liever, Peter A.; West, Jeffrey S.

    2016-01-01

    A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, Hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of solving 2nd, 3rd, and 4th order Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation has been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.

  6. The Validity and Incremental Validity of Knowledge Tests, Low-Fidelity Simulations, and High-Fidelity Simulations for Predicting Job Performance in Advanced-Level High-Stakes Selection

    ERIC Educational Resources Information Center

    Lievens, Filip; Patterson, Fiona

    2011-01-01

    In high-stakes selection among candidates with considerable domain-specific knowledge and experience, investigations of whether high-fidelity simulations (assessment centers; ACs) have incremental validity over low-fidelity simulations (situational judgment tests; SJTs) are lacking. Therefore, this article integrates research on the validity of…

  7. Initial validation of a virtual-reality robotic simulator.

    PubMed

    Lendvay, Thomas S; Casale, Pasquale; Sweet, Robert; Peters, Craig

    2008-09-01

    Robotic surgery is an accepted adjunct to minimally invasive surgery, but training is restricted to console time. Virtual-reality (VR) simulation has been shown to be effective for laparoscopic training, and so we sought to validate a novel VR robotic simulator. The American Urological Association (AUA) Office of Education approved this study. Subjects enrolled in a robotics training course at the 2007 AUA annual meeting underwent skills training in a da Vinci dry-lab module and a virtual-reality robotics module which included a three-dimensional (3D) VR robotic simulator. Demographic and acceptability data were obtained, and performance metrics from the simulator were compared between experienced and nonexperienced roboticists for a ring transfer task. Fifteen subjects participated: four with previous robotic surgery experience and 11 without. Nine subjects were still in urology training, and nearly half of the group reported playing video games. Overall performance of the da Vinci system and the simulator was deemed acceptable by a Likert scale (0-6) rating of 5.23 versus 4.69, respectively. Experienced subjects outperformed nonexperienced subjects on the simulator on three metrics: total task time (96 s versus 159 s, P < 0.02), economy of motion (1,301 mm versus 2,095 mm, P < 0.04), and time the telemanipulators spent outside of the center of the platform's workspace (4 s versus 35 s, P < 0.02). This is the first demonstration of face and construct validity of a virtual-reality robotic simulator. Further studies assessing predictive validity are ultimately required to support incorporation of VR robotic simulation into training curricula.

  8. Use of 3-dimensional printing technology and silicone modeling in surgical simulation: development and face validation in pediatric laparoscopic pyeloplasty.

    PubMed

    Cheung, Carling L; Looi, Thomas; Lendvay, Thomas S; Drake, James M; Farhat, Walid A

    2014-01-01

    Pediatric laparoscopy poses unique training challenges owing to smaller workspaces, finer sutures, and potentially more delicate tissues that require increased surgical dexterity when compared with adult analogs. We describe the development and face validation of a pediatric pyeloplasty simulator using a low-cost laparoscopic dry-laboratory model developed with 3-dimensional (3D) printing and silicone modeling. The organs (the kidney, renal pelvis, and ureter) were created in a 3-step process in which molds were created with 3D modeling software, printed with a Spectrum Z510 3D printer, and cast with Dragon Skin 30 silicone rubber. The model was secured in a laparoscopy box trainer. A pilot study was conducted at a Canadian Urological Association meeting. A total of 24 pediatric urology fellows and 3 experienced faculty members then assessed our skills module during a minimally invasive surgery training course. Participants had 60 minutes to perform a right-side pyeloplasty using laparoscopic tools and 5-0 VICRYL suture. Face validity was demonstrated on a 5-point Likert scale. The dry-laboratory model consists of a kidney, a replaceable dilated renal pelvis and ureter with an obstructed ureteropelvic junction, and an overlying peritoneum with an inscribed Fundamentals of Laparoscopic Surgery pattern-cutting exercise. During initial validation at the Canadian Urological Association, participants rated (out of 5) 4.75 ± 0.29 for overall impression, 4.50 ± 0.41 for realism, and 4.38 ± 0.48 for handling. During the minimally invasive surgery course, 22 of 24 fellows and all the faculty members completed the scoring. Usability was rated 4 or 5 by 14 participants (overall, 3.6 ± 1.22 by novices and 3.7 ± 0.58 by experts), indicating that they would use the model in their own training and teaching. Esthetically, the model was rated 3.5 ± 0.74 (novices) and 3.3 ± 0.58 (experts). We developed a pediatric pyeloplasty simulator by applying a low-cost reusable model

  9. Validation of a Novel Laparoscopic Adjustable Gastric Band Simulator

    PubMed Central

    Sankaranarayanan, Ganesh; Adair, James D.; Halic, Tansel; Gromski, Mark A.; Lu, Zhonghua; Ahn, Woojin; Jones, Daniel B.; De, Suvranu

    2011-01-01

    Background: Morbid obesity accounts for more than 90,000 deaths per year in the United States. Laparoscopic adjustable gastric banding (LAGB) is the second most common weight loss procedure performed in the US and the most common in Europe and Australia. Simulation in surgical training is a rapidly advancing field that has been adopted by many to prepare surgeons for surgical techniques and procedures. Study Aim: The aim of our study was to determine face, construct and content validity for a novel virtual reality laparoscopic adjustable gastric band simulator. Methods: Twenty-eight subjects were categorized into two groups (Expert and Novice), determined by their skill level in laparoscopic surgery. Experts consisted of subjects who had at least four years of laparoscopic training and operative experience. Novices consisted of subjects with medical training, but with less than four years of laparoscopic training. The subjects performed tasks on the virtual reality laparoscopic adjustable gastric band simulator and were automatically scored on the various tasks. The subjects then completed a questionnaire to evaluate face and content validity. Results: On a 5-point Likert scale (1 – lowest score, 5 – highest score), the mean score for visual realism was 4.00 ± 0.67 and the mean score for realism of the interface and tool movements was 4.07 ± 0.77 [Face Validity]. There were significant differences in the performance of the two subject groups (Expert and Novice), based on total scores (p<0.001) [Construct Validity]. The mean score for utility of the simulator, as assessed by the Expert group, was 4.50 ± 0.71 [Content Validity]. Conclusion: We created a virtual reality laparoscopic adjustable gastric band simulator. Our initial results demonstrate excellent face, construct and content validity findings. To our knowledge, this is the first virtual reality simulator with haptic feedback for training residents and surgeons in laparoscopic adjustable gastric banding

  10. Groundwater Model Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmed E. Hassan

    2006-01-24

    Models have an inherent uncertainty. The difficulty in fully characterizing the subsurface environment makes uncertainty an integral component of groundwater flow and transport models, which dictates the need for continuous monitoring and improvement. Building and sustaining confidence in closure decisions and monitoring networks based on models of subsurface conditions requires developing confidence in the models through an iterative process. The definition of model validation is postulated as a confidence building and long-term iterative process (Hassan, 2004a). Model validation should be viewed as a process, not an end result. Following Hassan (2004b), an approach is proposed for the validation process of stochastic groundwater models. The approach is briefly summarized herein, and detailed analyses of acceptance criteria for stochastic realizations and of using validation data to reduce input parameter uncertainty are presented and applied to two case studies. During the validation process for stochastic models, a question arises as to the sufficiency of the number of acceptable model realizations (in terms of conformity with validation data). A hierarchical approach to making this determination is proposed. This approach is based on computing five measures or metrics and following a decision tree to determine if a sufficient number of realizations attain satisfactory scores regarding how they represent the field data used for calibration (old) and used for validation (new). The first two of these measures are applied to hypothetical scenarios using the first case study and assuming field data consistent with the model or significantly different from the model results. In both cases it is shown how the two measures would lead to the appropriate decision about the model performance. Standard statistical tests are used to evaluate these measures with the results indicating they are appropriate measures for evaluating model realizations. The use of
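
    The first of the decision steps described above, checking how many stochastic realizations conform to the validation data, can be sketched as follows. The tolerance, the RMS-error metric, and the synthetic arrays are illustrative assumptions, not the report's actual measures:

        import numpy as np

        def acceptable_fraction(realizations, observed, rel_tol=0.25):
            """Score each stochastic realization against validation data.

            realizations -- (n_real, n_wells) simulated values at validation wells
            observed     -- (n_wells,) measured values at the same wells
            A realization is 'acceptable' if its RMS relative error is below rel_tol.
            """
            rel_err = (realizations - observed) / observed
            rms = np.sqrt(np.mean(rel_err**2, axis=1))
            return (rms < rel_tol).mean(), rms

        # Decision step: proceed only if enough realizations conform
        rng = np.random.default_rng(2)
        frac, scores = acceptable_fraction(rng.normal(10.0, 2.0, (500, 8)),
                                           np.full(8, 10.0))
        print(f"{frac:.0%} of realizations within tolerance")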

  11. Coupling of geochemical and multiphase flow processes for validation of the MUFITS reservoir simulator against TOUGHREACT

    NASA Astrophysics Data System (ADS)

    De Lucia, Marco; Kempka, Thomas; Afanasyev, Andrey; Melnik, Oleg; Kühn, Michael

    2016-04-01

    Coupled reactive transport simulations, especially in heterogeneous settings considering multiphase flow, are extremely time consuming and suffer from significant numerical issues compared to purely hydrodynamic simulations. This represents a major hurdle in the assessment of geological subsurface utilization, since it constrains the practical application of reactive transport modelling to coarse spatial discretizations or oversimplified geological settings. In order to overcome such limitations, De Lucia et al. [1] developed and validated a one-way coupling approach between geochemistry and hydrodynamics which is particularly well suited for CO2 storage simulations, while being of general validity. In the present study, the models used for the validation of the one-way coupling approach, originally run with the TOUGHREACT simulator, are transferred to and benchmarked against the multiphase reservoir simulator MUFITS [2]. The geological model is loosely inspired by an existing CO2 storage site. Its grid comprises 2,950 elements enclosed in a single layer but reflecting a realistic three-dimensional anticline geometry. For the purpose of this comparison, homogeneous and heterogeneous scenarios in terms of porosity and permeability were investigated. In both cases, the results of the MUFITS simulator are in excellent agreement with those produced with the fully coupled TOUGHREACT simulator, while profiting from significantly higher computational performance. This study demonstrates how a computationally efficient simulator such as MUFITS can be successfully included in a coupled process simulation framework, and also suggests improvements and specific strategies for the coupling of chemical processes with hydrodynamics and heat transport, aiming at tackling geoscientific problems beyond the storage of CO2. References [1] De Lucia, M., Kempka, T., and Kühn, M. A coupling alternative to reactive transport simulations

  12. Development and validation of the simulation-based learning evaluation scale.

    PubMed

    Hung, Chang-Chiao; Liu, Hsiu-Chen; Lin, Chun-Chih; Lee, Bih-O

    2016-05-01

    Existing instruments that evaluate students' perceptions of simulation-based training are English-language versions that have not been tested for reliability or validity. The aim of this study was to develop and validate a Chinese-version Simulation-Based Learning Evaluation Scale (SBLES). Four stages were conducted to develop and validate the SBLES. First, specific desired competencies were identified according to the National League for Nursing and Taiwan Nursing Accreditation Council core competencies. Next, an initial item pool of 50 items related to simulation was drawn from the literature on core competencies. Content validity was established by use of an expert panel. Finally, exploratory factor analysis and confirmatory factor analysis were conducted for construct validity, and Cronbach's coefficient alpha determined the scale's internal consistency reliability. Two hundred and fifty students who had experienced simulation-based learning were invited to participate in this study. Two hundred and twenty-five students completed and returned questionnaires (response rate=90%). Six items were deleted from the initial item pool and one was added after the expert panel review. Exploratory factor analysis with varimax rotation revealed 37 items remaining in five factors which accounted for 67% of the variance. The construct validity of the SBLES was substantiated by a confirmatory factor analysis that revealed a good fit of the hypothesized factor structure. The findings meet the criteria of convergent and discriminant validity. The internal consistency of the five subscales ranged from .90 to .93. Items were rated on a 5-point scale from 1 (strongly disagree) to 5 (strongly agree). The results of this study indicate that the SBLES is valid and reliable. The authors recommend that the scale be applied in nursing schools to evaluate the effectiveness of simulation-based learning curricula. Copyright © 2016 Elsevier Ltd. All rights reserved.
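
    The reliability figures reported above (alpha of .90 to .93 per subscale) come from the standard Cronbach's alpha formula; a minimal sketch on synthetic Likert responses:

        import numpy as np

        def cronbach_alpha(items):
            """items -- (n_respondents, n_items) matrix of Likert responses."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
            total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
            return k / (k - 1) * (1.0 - item_var / total_var)

        # Example: 5 items rated 1-5 by 225 respondents (synthetic data)
        rng = np.random.default_rng(3)
        latent = rng.normal(size=(225, 1))
        responses = np.clip(np.round(3 + latent
                                     + rng.normal(scale=0.7, size=(225, 5))), 1, 5)
        print(cronbach_alpha(responses))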

  13. Mathematical Capture of Human Crowd Behavioral Data for Computational Model Building, Verification, and Validation

    DTIC Science & Technology

    2011-03-21

    throughout the experimental runs. Reliable and validated measures of anxiety (Spielberger, 1983), as well as custom-constructed questionnaires about...Crowd modeling and simulation technologies. Transactions on Modeling and Computer Simulation, 20(4). Spielberger, C. D. (1983

  14. On the validation of cloud parametrization schemes in numerical atmospheric models with satellite data from ISCCP

    NASA Astrophysics Data System (ADS)

    Meinke, I.

    2003-04-01

    A new method is presented to validate cloud parametrization schemes in numerical atmospheric models with satellite data from scanning radiometers. This method is applied to the regional atmospheric model HRM (High Resolution Regional Model) using satellite data from the ISCCP (International Satellite Cloud Climatology Project). The limited reliability of earlier validations created the need for a new validation method: up to now, differences between simulated and measured cloud properties have mostly been declared deficiencies of the cloud parametrization scheme without further investigation, and other uncertainties connected with the model or with the measurements have not been taken into account. Changes made to a cloud parametrization scheme on the basis of such validations might therefore not be realistic. The new method estimates the uncertainties of the model and the measurements. Criteria for comparisons of simulated and measured data are derived to localize deficiencies in the model. For a better specification of these deficiencies, simulated clouds are classified according to their parametrization. With this classification, the localized model deficiencies are allocated to a certain parametrization scheme. Applying this method to the regional model HRM, the quality of forecasting cloud properties is estimated in detail. The overestimation of simulated clouds at low emissivity heights, especially during the night, is localized as a model deficiency caused by subscale cloudiness. As the simulation of subscale clouds in the regional model HRM is described by a relative humidity parametrization, these deficiencies are connected with this parametrization.

  15. An ice sheet model validation framework for the Greenland ice sheet

    NASA Astrophysics Data System (ADS)

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P.; Evans, Katherine J.; Kennedy, Joseph H.; Lenaerts, Jan; Lipscomb, William H.; Perego, Mauro; Salinger, Andrew G.; Tuminaro, Raymond S.; van den Broeke, Michiel R.; Nowicki, Sophie M. J.

    2017-01-01

    We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013, using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin-scale and whole-ice-sheet-scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of < 1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate a predictive skill with respect to observed dynamic changes that have occurred on
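
    The simplest of the whole-ice-sheet metrics quoted above (mean elevation difference of < 1 m) amounts to differencing modeled and observed surfaces over a common grid. A sketch of such a score, with hypothetical arrays rather than the CmCt implementation:

        import numpy as np

        def elevation_score(model_dem, obs_dem, mask):
            """Mean and RMS elevation difference over valid ice-sheet cells.

            model_dem, obs_dem -- 2-D surface elevation grids [m] on a common grid
            mask               -- boolean grid of cells with valid observations
            """
            diff = model_dem[mask] - obs_dem[mask]
            return {"mean_diff_m": float(diff.mean()),
                    "rmse_m": float(np.sqrt(np.mean(diff**2)))}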

  16. An ice sheet model validation framework for the Greenland ice sheet

    PubMed Central

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P.; Evans, Katherine J.; Kennedy, Joseph H.; Lenaerts, Jan; Lipscomb, William H.; Perego, Mauro; Salinger, Andrew G.; Tuminaro, Raymond S.; van den Broeke, Michiel R.; Nowicki, Sophie M. J.

    2018-01-01

    We propose a new ice sheet model validation framework – the Cryospheric Model Comparison Tool (CmCt) – that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013 using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin- and whole-ice-sheet scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of <1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate predictive skill with respect to observed dynamic changes occurring on Greenland over the

  17. An ice sheet model validation framework for the Greenland ice sheet.

    PubMed

    Price, Stephen F; Hoffman, Matthew J; Bonin, Jennifer A; Howat, Ian M; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P; Evans, Katherine J; Kennedy, Joseph H; Lenaerts, Jan; Lipscomb, William H; Perego, Mauro; Salinger, Andrew G; Tuminaro, Raymond S; van den Broeke, Michiel R; Nowicki, Sophie M J

    2017-01-01

    We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013 using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin- and whole-ice-sheet scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of <1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate predictive skill with respect to observed dynamic changes occurring on Greenland over the past

  18. An Ice Sheet Model Validation Framework for the Greenland Ice Sheet

    NASA Technical Reports Server (NTRS)

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas A.; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey R.; Chambers, Don P.; Evans, Katherine J.; hide

    2017-01-01

    We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013, using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin-scale and whole-ice-sheet-scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of less than 1 meter). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate a predictive skill with respect to observed dynamic changes that have occurred

  19. Large Eddy Simulation Modeling of Flashback and Flame Stabilization in Hydrogen-Rich Gas Turbines Using a Hierarchical Validation Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clemens, Noel

    This project was a combined computational and experimental effort to improve predictive capability for boundary layer flashback of premixed swirl flames relevant to gas-turbine power plants operating with high-hydrogen-content fuels. During the course of this project, significant progress in modeling was made on four major fronts: 1) use of direct numerical simulation of turbulent flames to understand the coupling between the flame and the turbulent boundary layer; 2) improved modeling capability for flame propagation in stratified pre-mixtures; 3) improved portability of computer codes using the OpenFOAM platform to facilitate transfer to industry and other researchers; and 4) application of LES to flashback in swirl combustors, and a detailed assessment of its capabilities and limitations for predictive purposes. A major component of the project was an experimental program that focused on developing a rich experimental database of boundary layer flashback in swirl flames. Both methane and high-hydrogen fuels, including effects of elevated pressure (1 to 5 atm), were explored. For this project, a new model swirl combustor was developed. Kilohertz-rate stereoscopic PIV and chemiluminescence imaging were used to investigate the flame propagation dynamics. In addition to the planar measurements, a technique capable of detecting the instantaneous, time-resolved 3D flame front topography was developed and applied successfully to investigate the flow-flame interaction. The UT measurements and legacy data were used in a hierarchical validation approach where flows with increasingly complex physics were used for validation. First, component models were validated with DNS and literature data in simplified configurations; this was followed by validation with the UT 1-atm flashback cases, and then the UT high-pressure flashback cases. The new models and portable code represent a major improvement over what was available before this project was initiated.

  20. Integration and Validation of Hysteroscopy Simulation in the Surgical Training Curriculum.

    PubMed

    Elessawy, Mohamed; Skrzipczyk, Moritz; Eckmann-Scholz, Christel; Maass, Nicolai; Mettler, Liselotte; Guenther, Veronika; van Mackelenbergh, Marion; Bauerschlag, Dirk O; Alkatout, Ibrahim

    The primary objective of our study was to test the construct validity of the HystSim hysteroscopic simulator to determine whether simulation training can improve the acquisition of hysteroscopic skills regardless of the previous levels of experience of the participants. The secondary objective was to analyze the performance of a selected task, using specially designed scoring charts to help reduce the learning curve for both novices and experienced surgeons. The teaching of hysteroscopic intervention has received only scant attention, focusing mainly on the development of physical models and box simulators. This encouraged our working group to search for a suitable hysteroscopic simulator module and to test its validity. We decided to use the HystSim hysteroscopic simulator, which is one of the few such simulators that has already completed a validation process, with high ratings for both realism and training capacity. As the testing tool for our study, we selected the myoma resection task. We analyzed the results using the multimetric score system suggested by HystSim, allowing a more precise interpretation of the results. Between June 2014 and May 2015, our group collected data on 57 participants of minimally invasive surgical training courses at the Kiel School of Gynecological Endoscopy, Department of Gynecology and Obstetrics, University Hospitals Schleswig-Holstein, Campus Kiel. The novice group consisted of 42 medical students and residents with no prior experience in hysteroscopy, whereas the expert group consisted of 15 participants with more than 2 years of experience of advanced hysteroscopy operations. The overall results demonstrated that all participants attained significant improvement from pretest to posttest, independent of their previous levels of experience (p < 0.002). Those in the expert group attained statistically significant superior scores in both the pretest and the posttest (p = 0.001, p = 0.006). Regarding visualization and

  1. Cross-validation to select Bayesian hierarchical models in phylogenetics.

    PubMed

    Duchêne, Sebastián; Duchêne, David A; Di Giallonardo, Francesca; Eden, John-Sebastian; Geoghegan, Jemma L; Holt, Kathryn E; Ho, Simon Y W; Holmes, Edward C

    2016-05-26

    Recent developments in Bayesian phylogenetic models have increased the range of inferences that can be drawn from molecular sequence data. Accordingly, model selection has become an important component of phylogenetic analysis. Methods of model selection generally consider the likelihood of the data under the model in question. In the context of Bayesian phylogenetics, the most common approach involves estimating the marginal likelihood, which is typically done by integrating the likelihood across model parameters, weighted by the prior. Although this method is accurate, it is sensitive to the presence of improper priors. We explored an alternative approach based on cross-validation that is widely used in evolutionary analysis. This involves comparing models according to their predictive performance. We analysed simulated data and a range of viral and bacterial data sets using a cross-validation approach to compare a variety of molecular clock and demographic models. Our results show that cross-validation can be effective in distinguishing between strict- and relaxed-clock models and in identifying demographic models that allow growth in population size over time. In most of our empirical data analyses, the model selected using cross-validation was able to match that selected using marginal-likelihood estimation. The accuracy of cross-validation appears to improve with longer sequence data, particularly when distinguishing between relaxed-clock models. Cross-validation is a useful method for Bayesian phylogenetic model selection. This method can be readily implemented even when considering complex models where selecting an appropriate prior for all parameters may be difficult.
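
    Stripped of the phylogenetic machinery, the idea being tested above, ranking models by held-out predictive performance rather than marginal likelihood, can be sketched generically. Simple Gaussian models stand in for the clock and demographic models; everything here is an illustrative assumption:

        import numpy as np
        from scipy import stats
        from sklearn.model_selection import KFold

        def cv_log_predictive(data, fit, logpdf, k=5, seed=0):
            """Mean held-out log predictive density for one model.

            fit    -- function: training data -> parameter estimates
            logpdf -- function: (held-out data, parameters) -> log densities
            """
            score = 0.0
            for train, test in KFold(k, shuffle=True, random_state=seed).split(data):
                params = fit(data[train])
                score += logpdf(data[test], params).sum()
            return score / len(data)

        # Two candidate models for the same data: fixed mean vs estimated mean
        data = np.random.default_rng(4).normal(0.3, 1.0, 200)
        m0 = cv_log_predictive(data, lambda d: (0.0, d.std()),
                               lambda d, p: stats.norm.logpdf(d, *p))
        m1 = cv_log_predictive(data, lambda d: (d.mean(), d.std()),
                               lambda d, p: stats.norm.logpdf(d, *p))
        print("held-out lpd:", m0, m1)   # the better-predicting model scores higher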

  2. Simulation models in population breast cancer screening: A systematic review.

    PubMed

    Koleva-Kolarova, Rositsa G; Zhan, Zhuozhao; Greuter, Marcel J W; Feenstra, Talitha L; De Bock, Geertruida H

    2015-08-01

    The aim of this review was to critically evaluate published simulation models for breast cancer screening of the general population and provide a direction for future modeling. A systematic literature search was performed to identify simulation models with more than one application. A framework for qualitative assessment was developed, incorporating model type; input parameters; modeling approach, transparency of input data sources/assumptions, sensitivity analyses and risk of bias; validation; and outcomes. Predicted mortality reduction (MR) and cost-effectiveness (CE) were compared to estimates from meta-analyses of randomized controlled trials (RCTs) and to acceptability thresholds. Seven original simulation models were distinguished, all sharing common input parameters. The modeling approach was based on tumor progression in all but one model, with internal and cross validation of the resulting models, but without any external validation. Differences in lead times for invasive or non-invasive tumors, and the option for cancers not to progress, were not explicitly modeled. The models tended to overestimate the MR due to screening (11-24%) as compared to the 10% (95% CI - 2-21%) MR from optimal RCTs. Potential harms due to regular breast cancer screening have been reported only recently. Most scenarios resulted in acceptable cost-effectiveness estimates given current thresholds. The selected models have been repeatedly applied in various settings to inform decision making, and the critical analysis revealed a high risk of bias in their outcomes. Given the importance of the models, there is a need for externally validated models which use systematic evidence for input data to allow for more critical evaluation of breast cancer screening. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Minimum-complexity helicopter simulation math model

    NASA Technical Reports Server (NTRS)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

    An example of a minimum-complexity helicopter simulation math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling qualities features which are apparent to the simulator pilot. The technical approach begins with specification of features which are to be modeled, followed by a build-up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinements are discussed. Math model computer programs are defined and listed.

  4. Development and Validation of a Novel Robotic Procedure Specific Simulation Platform: Partial Nephrectomy.

    PubMed

    Hung, Andrew J; Shah, Swar H; Dalag, Leonard; Shin, Daniel; Gill, Inderbir S

    2015-08-01

    We developed a novel procedure-specific simulation platform for robotic partial nephrectomy. In this study we prospectively evaluate its face, content, construct and concurrent validity. This hybrid platform features augmented reality and virtual reality. Augmented reality involves 3-dimensional robotic partial nephrectomy surgical videos overlaid with virtual instruments to teach surgical anatomy, technical skills and operative steps. Advanced technical skills are assessed with an embedded full virtual reality renorrhaphy task. Participants were classified as novice (no surgical training; n = 15), intermediate (less than 100 robotic cases; n = 13) or expert (100 or more robotic cases; n = 14) and prospectively assessed. Cohort performance was compared with the Kruskal-Wallis test (construct validity). A post-study questionnaire was used to assess the realism of the simulation (face validity) and its usefulness for training (content validity). Concurrent validity evaluated the correlation between the virtual reality renorrhaphy task and live porcine robotic partial nephrectomy performance (Spearman's analysis). Experts rated the augmented reality content as realistic (median 8/10) and helpful for resident/fellow training (8.0-8.2/10). Experts rated the platform highly for teaching anatomy (9/10) and operative steps (8.5/10) but moderately for technical skills (7.5/10). Experts and intermediates outperformed novices (construct validity) in efficiency (p=0.0002) and accuracy (p=0.002). For virtual reality renorrhaphy, experts outperformed intermediates on GEARS metrics (p=0.002). Virtual reality renorrhaphy and in vivo porcine robotic partial nephrectomy performance correlated significantly (r=0.8, p <0.0001) (concurrent validity). This augmented reality simulation platform displayed face, content and construct validity. Performance in the procedure-specific virtual reality task correlated highly with the porcine model (concurrent validity). Future efforts will integrate procedure specific
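
    As a minimal illustration of the statistics named above, the sketch below runs a Kruskal-Wallis test across three experience cohorts (construct validity) and a Spearman rank correlation between a simulator task score and a live-performance score (concurrent validity). All scores are synthetic placeholders, not study data.

    ```python
    # Illustrative validation statistics on fabricated scores.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    novice = rng.normal(50, 10, 15)        # toy simulator efficiency scores
    intermediate = rng.normal(65, 10, 13)
    expert = rng.normal(75, 10, 14)

    # Construct validity: do the three experience cohorts score differently?
    h, p = stats.kruskal(novice, intermediate, expert)
    print(f"Kruskal-Wallis H={h:.2f}, p={p:.4f}")

    # Concurrent validity: does the VR task score track live performance?
    vr_task = rng.normal(70, 8, 14)
    porcine = vr_task * 0.9 + rng.normal(0, 3, 14)  # correlated toy measure
    rho, p = stats.spearmanr(vr_task, porcine)
    print(f"Spearman rho={rho:.2f}, p={p:.4f}")
    ```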

  5. Model Validation Status Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E.L. Hardin

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural

  6. Validity evidence and reliability of a simulated patient feedback instrument

    PubMed Central

    2012-01-01

    Background In the training of healthcare professionals, one of the advantages of communication training with simulated patients (SPs) is the SP's ability to provide direct feedback to students after a simulated clinical encounter. The quality of SP feedback must be monitored, especially because it is well known that feedback can have a profound effect on student performance. Due to the current lack of valid and reliable instruments to assess the quality of SP feedback, our study examined the validity and reliability of one potential instrument, the 'modified Quality of Simulated Patient Feedback Form' (mQSF). Methods Content validity of the mQSF was assessed by inviting experts in the area of simulated clinical encounters to rate the importance of the mQSF items. Moreover, generalizability theory was used to examine the reliability of the mQSF. Our data came from videotapes of clinical encounters between six simulated patients and six students and the ensuing feedback from the SPs to the students. Ten faculty members judged the SP feedback according to the items on the mQSF. Three weeks later, this procedure was repeated with the same faculty members and recordings. Results All but two items of the mQSF received importance ratings of > 2.5 on a four-point rating scale. A generalizability coefficient of 0.77 was established with two judges observing one encounter. Conclusions The findings for content validity and reliability with two judges suggest that the mQSF is a valid and reliable instrument to assess the quality of feedback provided by simulated patients. PMID:22284898

  7. Validity evidence and reliability of a simulated patient feedback instrument.

    PubMed

    Schlegel, Claudia; Woermann, Ulrich; Rethans, Jan-Joost; van der Vleuten, Cees

    2012-01-27

    In the training of healthcare professionals, one of the advantages of communication training with simulated patients (SPs) is the SP's ability to provide direct feedback to students after a simulated clinical encounter. The quality of SP feedback must be monitored, especially because it is well known that feedback can have a profound effect on student performance. Due to the current lack of valid and reliable instruments to assess the quality of SP feedback, our study examined the validity and reliability of one potential instrument, the 'modified Quality of Simulated Patient Feedback Form' (mQSF). Content validity of the mQSF was assessed by inviting experts in the area of simulated clinical encounters to rate the importance of the mQSF items. Moreover, generalizability theory was used to examine the reliability of the mQSF. Our data came from videotapes of clinical encounters between six simulated patients and six students and the ensuing feedback from the SPs to the students. Ten faculty members judged the SP feedback according to the items on the mQSF. Three weeks later, this procedure was repeated with the same faculty members and recordings. All but two items of the mQSF received importance ratings of > 2.5 on a four-point rating scale. A generalizability coefficient of 0.77 was established with two judges observing one encounter. The findings for content validity and reliability with two judges suggest that the mQSF is a valid and reliable instrument to assess the quality of feedback provided by simulated patients.

  8. Improvements and validation of the erythropoiesis control model for bed rest simulation

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1977-01-01

    The most significant improvement in the model is the explicit formulation of separate elements representing erythropoietin production and red cell production. Other modifications include bone marrow time-delays, capability to shift oxyhemoglobin affinity and an algorithm for entering experimental data as time-varying driving functions. An area of model development is suggested by applying the model to simulating onset, diagnosis and treatment of a hematologic disorder. Recommendations for further improvements in the model and suggestions for experimental application are also discussed. A detailed analysis of the hematologic response to bed rest including simulation of the recent Baylor Medical College bed rest studies is also presented.

  9. Coupling of Large Eddy Simulations with Meteorological Models to simulate Methane Leaks from Natural Gas Storage Facilities

    NASA Astrophysics Data System (ADS)

    Prasad, K.

    2017-12-01

    Atmospheric transport is usually performed with weather models, e.g., the Weather Research and Forecasting (WRF) model, which employs a parameterized turbulence model and does not resolve the fine-scale dynamics generated by the flow around buildings and features comprising a large city. The NIST Fire Dynamics Simulator (FDS) is a computational fluid dynamics model that utilizes large eddy simulation methods to model flow around buildings at length scales much smaller than is practical with models like WRF. FDS has the potential to evaluate the impact of complex topography on near-field dispersion and mixing that is difficult to simulate with a mesoscale atmospheric model. A methodology has been developed to couple the FDS model with WRF mesoscale transport models. The coupling is based on nudging the FDS flow field towards that computed by WRF, and is currently limited to one-way coupling performed in an off-line mode. This approach allows the FDS model to operate as a sub-grid scale model within a WRF simulation. To test and validate the coupled FDS-WRF model, the methane leak from the Aliso Canyon underground storage facility was simulated. Large eddy simulations were performed over the complex topography of various natural gas storage facilities, including Aliso Canyon, Honor Rancho and MacDonald Island, at 10 m horizontal and vertical resolution. The goals of these simulations included improving and validating transport models as well as testing leak hypotheses. Forward simulation results were compared with aircraft- and tower-based in-situ measurements as well as methane plumes observed using the NASA Airborne Visible InfraRed Imaging Spectrometer (AVIRIS) and the next-generation instrument AVIRIS-NG. Comparison of simulation results with measurement data demonstrates the capability of the coupled FDS-WRF models to accurately simulate the transport and dispersion of methane plumes over urban domains. Simulated integrated methane enhancements will be presented and
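
    A minimal sketch of the nudging idea follows, under the assumption that it reduces to relaxing the fine-scale field toward the mesoscale field over a chosen timescale; the actual FDS-WRF coupling is more involved, and the arrays, time step, and relaxation time here are illustrative.

    ```python
    # Toy one-way nudging: du/dt = (u_meso - u_les) / tau.
    import numpy as np

    def nudge(u_les: np.ndarray, u_meso: np.ndarray, dt: float, tau: float) -> np.ndarray:
        """One nudging step relaxing the LES field toward the mesoscale field."""
        return u_les + (dt / tau) * (u_meso - u_les)

    u_les = np.full((64, 64), 2.0)   # LES u-velocity slice, m/s (toy)
    u_wrf = np.full((64, 64), 5.0)   # WRF field interpolated onto the LES grid
    for _ in range(600):             # 600 steps of dt = 1 s
        u_les = nudge(u_les, u_wrf, dt=1.0, tau=120.0)
    print(u_les[0, 0])  # approaches 5.0 as the LES field is pulled toward WRF
    ```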

  10. An ice sheet model validation framework for the Greenland ice sheet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.

    We propose a new ice sheet model validation framework, the Cryospheric Model Comparison Tool (CMCT), which takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013 using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin- and whole-ice-sheet scale metrics, the model initial condition as well as output from idealized and dynamic models all provide an equally reasonable representation of the ice sheet surface (mean elevation differences of <1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CMCT, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate predictive skill with respect to observed dynamic changes occurring on Greenland over the past few

  11. An ice sheet model validation framework for the Greenland ice sheet

    DOE PAGES

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; ...

    2017-01-17

    We propose a new ice sheet model validation framework, the Cryospheric Model Comparison Tool (CMCT), which takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013 using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin- and whole-ice-sheet scale metrics, the model initial condition as well as output from idealized and dynamic models all provide an equally reasonable representation of the ice sheet surface (mean elevation differences of <1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CMCT, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate predictive skill with respect to observed dynamic changes occurring on Greenland over the past few

  12. Validation of tsunami inundation model TUNA-RP using OAR-PMEL-135 benchmark problem set

    NASA Astrophysics Data System (ADS)

    Koh, H. L.; Teh, S. Y.; Tan, W. K.; Kh'ng, X. Y.

    2017-05-01

    A standard set of benchmark problems, known as OAR-PMEL-135, is developed by the US National Tsunami Hazard Mitigation Program for tsunami inundation model validation. Any tsunami inundation model must be tested for its accuracy and capability using this standard set of benchmark problems before it can be gainfully used for inundation simulation. The authors have previously developed an in-house tsunami inundation model known as TUNA-RP. This inundation model solves the two-dimensional nonlinear shallow water equations coupled with a wet-dry moving boundary algorithm. This paper presents the validation of TUNA-RP against the solutions provided in the OAR-PMEL-135 benchmark problem set. This benchmark validation testing shows that TUNA-RP can indeed perform inundation simulation with accuracy consistent with that in the tested benchmark problem set.
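
    A full 2D solver with a wet-dry moving boundary is beyond a short example, but a toy 1D nonlinear shallow-water step conveys the ingredients such a model must handle: conservative flux updates plus a dry-cell threshold. The sketch below is didactic Python, not TUNA-RP; the scheme (Lax-Friedrichs), grid, and dam-break setup are invented.

    ```python
    # Toy 1D nonlinear shallow-water step with a simple wet/dry threshold.
    import numpy as np

    g, H_DRY = 9.81, 1e-6

    def flux(h, hu):
        u = np.where(h > H_DRY, hu / np.maximum(h, H_DRY), 0.0)
        return np.array([hu, hu * u + 0.5 * g * h * h])

    def lf_step(h, hu, dx, dt):
        q = np.array([h, hu])
        f = flux(h, hu)
        qn = q.copy()
        # Lax-Friedrichs update on interior points
        qn[:, 1:-1] = 0.5 * (q[:, 2:] + q[:, :-2]) - dt / (2 * dx) * (f[:, 2:] - f[:, :-2])
        qn[0] = np.maximum(qn[0], 0.0)       # no negative depths
        qn[1][qn[0] <= H_DRY] = 0.0          # dry cells carry no momentum
        return qn[0], qn[1]

    # Dam-break onto a dry bed: wet on the left, dry on the right.
    x = np.linspace(0, 100, 401)
    h = np.where(x < 50, 2.0, 0.0)
    hu = np.zeros_like(h)
    for _ in range(200):
        h, hu = lf_step(h, hu, dx=x[1] - x[0], dt=0.01)
    print(f"wet front has advanced to x = {x[h > H_DRY].max():.1f} m")
    ```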

  13. Validation of a novel virtual reality simulator for robotic surgery.

    PubMed

    Schreuder, Henk W R; Persson, Jan E U; Wolswijk, Richard G H; Ihse, Ingmar; Schijven, Marlies P; Verheijen, René H M

    2014-01-01

    With the increase in robotic-assisted laparoscopic surgery there is a concomitant rising demand for training methods. The objective was to establish face and construct validity of a novel virtual reality simulator (dV-Trainer, Mimic Technologies, Seattle, WA) for the use in training of robot-assisted surgery. A comparative cohort study was performed. Participants (n = 42) were divided into three groups according to their robotic experience. To determine construct validity, participants performed three different exercises twice. Performance parameters were measured. To determine face validity, participants filled in a questionnaire after completion of the exercises. Experts outperformed novices in most of the measured parameters. The most discriminative parameters were "time to complete" and "economy of motion" (P < 0.001). The training capacity of the simulator was rated 4.6 ± 0.5 SD on a 5-point Likert scale. The realism of the simulator in general, visual graphics, movements of instruments, interaction with objects, and the depth perception were all rated as being realistic. The simulator is considered to be a very useful training tool for residents and medical specialist starting with robotic surgery. Face and construct validity for the dV-Trainer could be established. The virtual reality simulator is a useful tool for training robotic surgery.

  14. The SCEC Broadband Platform: A Collaborative Open-Source Software Package for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Silva, F.; Maechling, P. J.; Goulet, C. A.; Somerville, P.; Jordan, T. H.

    2014-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving geoscientists, earthquake engineers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform (BBP) is open-source scientific software that can generate broadband (0-100 Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low- and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms for a well-observed historical earthquake. Then, the BBP calculates a number of goodness-of-fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. Our latest release includes 5 simulation methods, 7 simulation regions covering California, Japan, and Eastern North America, the ability to compare simulation results

  15. PSPICE Hybrid Modeling and Simulation of Capacitive Micro-Gyroscopes

    PubMed Central

    Su, Yan; Tong, Xin; Liu, Nan; Han, Guowei; Si, Chaowei; Ning, Jin; Li, Zhaofeng; Yang, Fuhua

    2018-01-01

    With an aim to reduce the cost of prototype development, this paper establishes a PSPICE hybrid model for the simulation of capacitive microelectromechanical systems (MEMS) gyroscopes. This is achieved by modeling gyroscopes in different modules, then connecting them in accordance with the corresponding principle diagram. Systematic simulations of this model are implemented along with a consideration of details of MEMS gyroscopes, including a capacitance model without approximation, mechanical thermal noise, and the effect of ambient temperature. The temperature compensation scheme and optimization of interface circuits are achieved based on the hybrid closed-loop simulation of MEMS gyroscopes. The simulation results show that the final output voltage is proportional to the angular rate input, which verifies the validity of this model. PMID:29597284

  16. Solar power plant performance evaluation: simulation and experimental validation

    NASA Astrophysics Data System (ADS)

    Natsheh, E. M.; Albarbar, A.

    2012-05-01

    In this work the performance of a solar power plant is evaluated based on a developed model comprising a photovoltaic array, battery storage, controller and converters. The model is implemented using the MATLAB/SIMULINK software package. A perturb and observe (P&O) algorithm is used for maximizing the generated power through a maximum power point tracker (MPPT) implementation. The outcomes of the developed model are validated and supported by a case study carried out on an operational 28.8 kW grid-connected solar power plant located in central Manchester. Measurements were taken over a 21-month period using hourly averaged irradiance and cell temperature. It was found that system degradation could be clearly monitored by determining the residual (the difference) between the output power predicted by the model and the actual measured power. It was found that the residual exceeded the healthy threshold, 1.7 kW, due to heavy snow in Manchester last winter. More importantly, the developed performance evaluation technique could be adopted to detect any other factors that may degrade the performance of the PV panels, such as shading and dirt. Repeatability and reliability of the developed system were validated during this period. Good agreement was achieved between the theoretical simulation and the real-time measurements taken from the online grid-connected solar power plant.
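
    The perturb and observe logic mentioned above is simple enough to sketch: perturb the operating voltage, observe whether power rose or fell, and keep stepping in the direction that increases power, reversing otherwise. The PV curve below is a toy stand-in, not the plant model from the paper.

    ```python
    # Minimal perturb-and-observe (P&O) MPPT sketch on a toy PV curve.
    def pv_power(v):
        """Toy PV power curve, P = v*(8 - 0.003*v^2), with its maximum near 30 V."""
        return max(0.0, v * (8.0 - 0.003 * v * v))

    def perturb_and_observe(v, p_prev, direction, step=0.5):
        p = pv_power(v)
        if p < p_prev:
            direction = -direction   # power fell, so reverse the perturbation
        return v + direction * step, p, direction

    v, p, d = 20.0, 0.0, +1
    for _ in range(100):
        v, p, d = perturb_and_observe(v, p, d)
    print(f"settled near v = {v:.1f} V, P = {p:.0f} W")  # oscillates about the MPP
    ```

    The characteristic steady-state behavior of P&O, visible if the loop is traced, is a small oscillation around the maximum power point rather than exact convergence; step-size tuning trades tracking speed against that ripple.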

  17. Tidal simulation using regional ocean modeling systems (ROMS)

    NASA Technical Reports Server (NTRS)

    Wang, Xiaochun; Chao, Yi; Li, Zhijin; Dong, Changming; Farrara, John; McWilliams, James C.; Shum, C. K.; Wang, Yu; Matsumoto, Koji; Rosenfeld, Leslie K.

    2006-01-01

    The purpose of our research is to test the capability of ROMS in simulating tides. The research also serves as a necessary exercise to implement tides in an operational ocean forecasting system. In this paper, we emphasize the validation of the model tide simulation. The characteristics and energetics of tides of the region will be reported in separate publications.

  18. Dynamic vehicle-track interaction in switches and crossings and the influence of rail pad stiffness - field measurements and validation of a simulation model

    NASA Astrophysics Data System (ADS)

    Pålsson, Björn A.; Nielsen, Jens C. O.

    2015-06-01

    A model for simulation of dynamic interaction between a railway vehicle and a turnout (switch and crossing, S&C) is validated versus field measurements. In particular, the implementation and accuracy of viscously damped track models with different complexities are assessed. The validation data come from full-scale field measurements of dynamic track stiffness and wheel-rail contact forces in a demonstrator turnout that was installed as part of the INNOTRACK project with funding from the European Union Sixth Framework Programme. Vertical track stiffness at nominal wheel loads, in the frequency range up to 20 Hz, was measured using a rolling stiffness measurement vehicle (RSMV). Vertical and lateral wheel-rail contact forces were measured by an instrumented wheelset mounted in a freight car featuring Y25 bogies. The measurements were performed for traffic in both the through and diverging routes, and in the facing and trailing moves. The full set of test runs was repeated with different types of rail pad to investigate the influence of rail pad stiffness on track stiffness and contact forces. It is concluded that impact loads on the crossing can be reduced by using more resilient rail pads. To allow for vehicle dynamics simulations at low computational cost, the track models are discretised space-variant mass-spring-damper models that move with each wheelset of the vehicle model. Acceptable agreement between simulated and measured vertical contact forces at the crossing can be obtained when the standard GENSYS track model is extended with one ballast/subgrade mass under each rail. This model can be tuned to capture the large phase delay in dynamic track stiffness at low frequencies, as measured by the RSMV, while remaining sufficiently resilient at higher frequencies.

  19. Composite Cure Process Modeling and Simulations using COMPRO(Registered Trademark) and Validation of Residual Strains using Fiber Optics Sensors

    NASA Technical Reports Server (NTRS)

    Sreekantamurthy, Thammaiah; Hudson, Tyler B.; Hou, Tan-Hung; Grimsley, Brian W.

    2016-01-01

    Composite cure-process-induced residual strains and warping deformations in composite components present significant challenges in the manufacturing of advanced composite structures. As a part of the Manufacturing Process and Simulation initiative of the NASA Advanced Composite Project (ACP), research is being conducted on the composite cure process by developing an understanding of the fundamental mechanisms by which process-induced factors influence the residual responses. In this regard, analytical studies have been conducted on the cure process modeling of composite structural parts with varied physical, thermal, and resin flow process characteristics. The cure process simulation results were analyzed to interpret the cure response predictions based on the underlying physics incorporated into the modeling tool. In the cure-kinetic analysis, the model predictions of the degree of cure, resin viscosity and modulus were interpreted with reference to the temperature distribution in the composite panel part and tool setup during autoclave or hot-press curing cycles. In the fiber-bed compaction simulation, the pore pressure and resin flow velocity in the porous media models, and the compaction strain responses under applied pressure, were studied to interpret the fiber volume fraction distribution predictions. In the structural simulation, the effect of temperature on the resin and ply moduli, and of thermal coefficient changes during curing on predicted mechanical strains and chemical cure shrinkage strains, was studied to understand the residual strain and stress response predictions. In addition to computational analysis, experimental studies were conducted to measure strains during the curing of laminated panels by means of optical fiber Bragg grating sensors (FBGs) embedded in the resin-impregnated panels. The residual strain measurements from laboratory tests were then compared with the analytical model predictions. The paper describes the cure process
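
    As context for the cure-kinetic analysis mentioned above, a common form for resin cure kinetics is an autocatalytic rate law, dα/dt = A·exp(-E/RT)·α^m·(1-α)^n, integrated over the cure cycle. The sketch below uses invented constants and a fixed dwell temperature purely for illustration; it is not the COMPRO formulation.

    ```python
    # Toy autocatalytic cure-kinetics integration (placeholder constants).
    import numpy as np

    A, E, R = 1.5e5, 6.0e4, 8.314   # pre-exponential (1/s), activation energy (J/mol)
    m, n = 0.5, 1.5                 # reaction orders (illustrative)

    def cure_rate(alpha, T):
        return A * np.exp(-E / (R * T)) * (alpha ** m) * ((1.0 - alpha) ** n)

    alpha, dt = 0.01, 1.0           # small seed conversion; 1 s steps
    for _ in range(7200):           # 2 h hold at 450 K (toy autoclave dwell)
        alpha = min(1.0, alpha + dt * cure_rate(alpha, T=450.0))
    print(f"degree of cure after dwell: {alpha:.3f}")
    ```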

  20. Temperature field simulation and phantom validation of a Two-armed Spiral Antenna for microwave thermotherapy.

    PubMed

    Du, Yongxing; Zhang, Lingze; Sang, Lulu; Wu, Daocheng

    2016-04-29

    In this paper, an Archimedean planar spiral antenna for the application of thermotherapy was designed. This type of antenna was chosen for its compact structure, flexible application and wide heating area. The temperature field generated by the use of this two-armed spiral antenna in a muscle-equivalent phantom was simulated and subsequently validated by experimentation. First, the specific absorption rate (SAR) of the field was calculated using the Finite Element Method (FEM) in Ansoft's High Frequency Structure Simulator (HFSS). Then, the temperature elevation in the phantom was simulated by an explicit finite difference approximation of the bioheat equation (BHE). The temperature distribution was then validated by a phantom heating experiment. The results showed that this antenna had a good heating ability and a wide heating area. A comparison between the calculation and the measurement showed fair agreement in the temperature elevation. The validated model could be applied to the analysis of electromagnetic-temperature distribution in phantoms during the process of antenna design or thermotherapy experimentation.
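
    The explicit finite-difference bioheat update described above can be sketched in one dimension: Pennes' equation balances conduction, blood perfusion toward arterial temperature, and the SAR source. The tissue properties, SAR profile and grid below are generic placeholders, not the paper's values.

    ```python
    # 1D explicit finite-difference step for the Pennes bioheat equation:
    # rho*c * dT/dt = k * d2T/dx2 + w_b*c_b*(T_a - T) + Q_sar
    import numpy as np

    rho, c, k = 1000.0, 3600.0, 0.5      # density, specific heat, conductivity
    wb, cb, Ta = 0.5, 3600.0, 37.0       # perfusion (kg/m^3/s), blood heat cap., arterial T
    nx, dx, dt = 101, 1e-3, 0.05         # 10 cm of tissue, 0.05 s steps
    x = np.arange(nx) * dx
    T = np.full(nx, 37.0)
    sar = 5e4 * np.exp(-((x - 0.01) / 0.005) ** 2)   # toy SAR peak 1 cm deep (W/m^3)

    for _ in range(int(60 / dt)):        # one minute of heating
        lap = np.zeros_like(T)
        lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
        T += dt / (rho * c) * (k * lap + wb * cb * (Ta - T) + sar)
        T[0] = T[-1] = 37.0              # fixed-temperature boundaries

    print(f"peak temperature after 60 s: {T.max():.2f} C")
    ```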

  1. High fidelity, low cost moulage as a valid simulation tool to improve burns education.

    PubMed

    Pywell, M J; Evgeniou, E; Highway, K; Pitt, E; Estela, C M

    2016-06-01

    Simulation allows the opportunity for repeated practice in controlled, safe conditions. Moulage uses materials such as makeup to simulate clinical presentations. Moulage fidelity can be assessed by face validity (realism) and content validity (appropriateness). The aim of this project was to compare the fidelity of professional moulage to non-professional moulage in the context of a burns management course. Four actors were randomly assigned to a professional make-up artist or a course faculty member for moulage preparation, such that two actors were in each group. Participants completed the actor-based burn management scenarios and answered a ten-question Likert-scale questionnaire on face and content validity. Mean scores and a linear mixed-effects model were used to compare professional and non-professional moulage. Cronbach's alpha assessed internal consistency. Twenty participants experienced three out of four scenarios and, at the end of the course, completed a total of 60 questionnaires. Professional moulage had higher average ratings for face (4.30 vs. 3.80; p=0.11) and content (4.30 vs. 4.00; p=0.06) validity. Internal consistency of the face (α=0.91) and content (α=0.85) validity questions was very good. The fidelity of professionally prepared moulage, as assessed by content validity, was higher than that of non-professionally prepared moulage. We have shown that using professional techniques and low-cost materials we can prepare high-quality, high-fidelity moulage simulations. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.
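
    For reference, Cronbach's alpha as used above compares the sum of the item variances with the variance of the total score. The sketch below computes it on fabricated Likert responses (respondents x items); it is illustrative only.

    ```python
    # Cronbach's alpha on fabricated Likert-scale responses.
    import numpy as np

    def cronbach_alpha(scores: np.ndarray) -> float:
        """scores: rows = respondents, columns = questionnaire items."""
        n_items = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1).sum()
        total_var = scores.sum(axis=1).var(ddof=1)
        return n_items / (n_items - 1) * (1.0 - item_vars / total_var)

    rng = np.random.default_rng(2)
    latent = rng.normal(4.0, 0.6, size=(20, 1))            # each rater's overall impression
    items = np.clip(np.rint(latent + rng.normal(0, 0.4, (20, 5))), 1, 5)
    print(f"alpha = {cronbach_alpha(items):.2f}")          # high when items co-vary
    ```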

  2. IMPACT: a generic tool for modelling and simulating public health policy.

    PubMed

    Ainsworth, J D; Carruthers, E; Couch, P; Green, N; O'Flaherty, M; Sperrin, M; Williams, R; Asghar, Z; Capewell, S; Buchan, I E

    2011-01-01

    Populations are under-served by local health policies and management of resources. This partly reflects a lack of realistically complex models to enable appraisal of a wide range of potential options. Rising computing power coupled with advances in machine learning and healthcare information now enables such models to be constructed and executed. However, such models are not generally accessible to public health practitioners who often lack the requisite technical knowledge or skills. To design and develop a system for creating, executing and analysing the results of simulated public health and healthcare policy interventions, in ways that are accessible and usable by modellers and policy-makers. The system requirements were captured and analysed in parallel with the statistical method development for the simulation engine. From the resulting software requirement specification the system architecture was designed, implemented and tested. A model for Coronary Heart Disease (CHD) was created and validated against empirical data. The system was successfully used to create and validate the CHD model. The initial validation results show concordance between the simulation results and the empirical data. We have demonstrated the ability to connect health policy-modellers and policy-makers in a unified system, thereby making population health models easier to share, maintain, reuse and deploy.

  3. Influence of drug-light-interval on photodynamic therapy of port wine stains--simulation and validation of mathematic models.

    PubMed

    Huang, Naiyan; Cheng, Gang; Li, Xiaosong; Gu, Ying; Liu, Fanguang; Zhong, Qiuhai; Wang, Ying; Zen, Jin; Qiu, Haixia; Chen, Hongxia

    2008-06-01

    We established mathematical models of photodynamic therapy (PDT) on port wine stains (PWS) to observe the effect of drug-light interval (DLI) and optimize light dose. The mathematical simulations included determining (1) the distribution of laser light by a Monte Carlo model, (2) the change of photosensitizer concentration in PWS vessels by a pharmacokinetics equation, (3) the change of photosensitizer distribution in tissue outside the vessels by a diffusion equation and a photobleaching equation, and (4) the change of tissue oxygen concentration by Fick's law with a consideration of the oxygen consumption during PDT. The concentration of singlet oxygen in the tissue model was calculated by the finite difference method. To validate these models, a PWS lesion of the same patient was divided into two areas that were subjected to different DLIs and treated with different energy densities. The color of the lesion was assessed 8-12 weeks later. The simulation indicated the singlet oxygen concentration of the second treatment area (DLI = 40 min) was lower than that of the first treatment area (DLI = 0 min). However, it would increase to a level similar to that of the first treatment area if the light irradiation time of the second treatment area was prolonged from 40 min to 55 min. Clinical results were consistent with the results predicted by the mathematical models. The mathematical models established in this study are helpful for optimizing clinical protocols.

  4. Full immersion simulation: validation of a distributed simulation environment for technical and non-technical skills training in Urology.

    PubMed

    Brewin, James; Tang, Jessica; Dasgupta, Prokar; Khan, Muhammad S; Ahmed, Kamran; Bello, Fernando; Kneebone, Roger; Jaye, Peter

    2015-07-01

    To evaluate the face, content and construct validity of the distributed simulation (DS) environment for technical and non-technical skills training in endourology. To evaluate the educational impact of DS for urology training. DS offers a portable, low-cost simulated operating room environment that can be set up in any open space. A prospective mixed methods design using established validation methodology was conducted in this simulated environment with 10 experienced and 10 trainee urologists. All participants performed a simulated prostate resection in the DS environment. Outcome measures included surveys to evaluate the DS, as well as comparative analyses of experienced and trainee urologist's performance using real-time and 'blinded' video analysis and validated performance metrics. Non-parametric statistical methods were used to compare differences between groups. The DS environment demonstrated face, content and construct validity for both non-technical and technical skills. Kirkpatrick level 1 evidence for the educational impact of the DS environment was shown. Further studies are needed to evaluate the effect of simulated operating room training on real operating room performance. This study has shown the validity of the DS environment for non-technical, as well as technical skills training. DS-based simulation appears to be a valuable addition to traditional classroom-based simulation training. © 2014 The Authors BJU International © 2014 BJU International Published by John Wiley & Sons Ltd.

  5. Towards Application of NASA Standard for Models and Simulations in Aeronautical Design Process

    NASA Astrophysics Data System (ADS)

    Vincent, Luc; Dunyach, Jean-Claude; Huet, Sandrine; Pelissier, Guillaume; Merlet, Joseph

    2012-08-01

    Even powerful computational techniques like simulation have limitations in their validity domain. Consequently, using simulation models requires caution to avoid making biased design decisions for new aeronautical products on the basis of inadequate simulation results. Thus the fidelity, accuracy and validity of simulation models shall be monitored in context all along the design phases to build confidence in achieving the goals of modelling and simulation. In the CRESCENDO project, we adapt the Credibility Assessment Scale method from the NASA standard for models and simulations, developed for the space programme, to aircraft design in order to assess the quality of simulations. The proposed eight quality assurance metrics aggregate information to indicate the levels of confidence in results. They are displayed in a management dashboard and can secure design trade-off decisions at programme milestones. The application of this technique is illustrated in an aircraft design context with a specific thermal Finite Elements Analysis. This use case shows how to judge the fitness-for-purpose of simulation as a virtual testing means and then green-light the continuation of the Simulation Lifecycle Management (SLM) process.

  6. Validation and upgrading of physically based mathematical models

    NASA Technical Reports Server (NTRS)

    Duval, Ronald

    1992-01-01

    The validation of the results of physically-based mathematical models against experimental results was discussed. Systematic techniques are used for: (1) isolating subsets of the simulator mathematical model and comparing the response of each subset to its experimental response for the same input conditions; (2) evaluating the response error to determine whether it is the result of incorrect parameter values, incorrect structure of the model subset, or unmodeled external effects of cross coupling; and (3) modifying and upgrading the model and its parameter values to determine the most physically appropriate combination of changes.

  7. Simulation and Modeling Capability for Standard Modular Hydropower Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Kevin M.; Smith, Brennan T.; Witt, Adam M.

    Grounded in the stakeholder-validated framework established in Oak Ridge National Laboratory’s SMH Exemplary Design Envelope Specification, this report on Simulation and Modeling Capability for Standard Modular Hydropower (SMH) Technology provides insight into the concepts, use cases, needs, gaps, and challenges associated with modeling and simulating SMH technologies. The SMH concept envisions a network of generation, passage, and foundation modules that achieve environmentally compatible, cost-optimized hydropower using standardization and modularity. The development of standardized modeling approaches and simulation techniques for SMH (as described in this report) will pave the way for reliable, cost-effective methods for technology evaluation, optimization, and verification.

  8. WEC-SIM Phase 1 Validation Testing -- Numerical Modeling of Experiments: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruehl, Kelley; Michelen, Carlos; Bosma, Bret

    2016-08-01

    The Wave Energy Converter Simulator (WEC-Sim) is an open-source code jointly developed by Sandia National Laboratories and the National Renewable Energy Laboratory. It is used to model wave energy converters subjected to operational and extreme waves. In order for the WEC-Sim code to be beneficial to the wave energy community, code verification and physical model validation are necessary. This paper describes numerical modeling of the wave tank testing for the 1:33-scale experimental testing of the floating oscillating surge wave energy converter. The comparison between WEC-Sim and the Phase 1 experimental data set serves as code validation. This paper is a follow-up to the WEC-Sim paper on experimental testing, and describes the WEC-Sim numerical simulations for the floating oscillating surge wave energy converter.

  9. Simulation, Model Verification and Controls Development of Brayton Cycle PM Alternator: Testing and Simulation of 2 KW PM Generator with Diode Bridge Output

    NASA Technical Reports Server (NTRS)

    Stankovic, Ana V.

    2003-01-01

    Professor Stankovic will be developing and refining Simulink-based models of the PM alternator and comparing the simulation results with experimental measurements taken from the unit. Her first task is to validate the models using the experimental data. Her next task is to develop alternative control techniques for the application of the Brayton Cycle PM Alternator in a nuclear electric propulsion vehicle. The control techniques will first be simulated using the validated models, then tried experimentally with hardware available at NASA. Testing and simulation of a 2 kW PM synchronous generator with diode bridge output are described. The parameters of a synchronous PM generator have been measured and used in simulation. Test procedures have been developed to verify the PM generator model with diode bridge output. Experimental and simulation results are in excellent agreement.

  10. Validation of a Novel Virtual Reality Simulator for Robotic Surgery

    PubMed Central

    Schreuder, Henk W. R.; Persson, Jan E. U.; Wolswijk, Richard G. H.; Ihse, Ingmar; Schijven, Marlies P.; Verheijen, René H. M.

    2014-01-01

    Objective. With the increase in robotic-assisted laparoscopic surgery there is a concomitant rising demand for training methods. The objective was to establish face and construct validity of a novel virtual reality simulator (dV-Trainer, Mimic Technologies, Seattle, WA) for the use in training of robot-assisted surgery. Methods. A comparative cohort study was performed. Participants (n = 42) were divided into three groups according to their robotic experience. To determine construct validity, participants performed three different exercises twice. Performance parameters were measured. To determine face validity, participants filled in a questionnaire after completion of the exercises. Results. Experts outperformed novices in most of the measured parameters. The most discriminative parameters were “time to complete” and “economy of motion” (P < 0.001). The training capacity of the simulator was rated 4.6 ± 0.5 SD on a 5-point Likert scale. The realism of the simulator in general, visual graphics, movements of instruments, interaction with objects, and the depth perception were all rated as being realistic. The simulator is considered to be a very useful training tool for residents and medical specialist starting with robotic surgery. Conclusions. Face and construct validity for the dV-Trainer could be established. The virtual reality simulator is a useful tool for training robotic surgery. PMID:24600328

  11. The Validity of Quasi-Steady-State Approximations in Discrete Stochastic Simulations

    PubMed Central

    Kim, Jae Kyoung; Josić, Krešimir; Bennett, Matthew R.

    2014-01-01

    In biochemical networks, reactions often occur on disparate timescales and can be characterized as either fast or slow. The quasi-steady-state approximation (QSSA) utilizes timescale separation to project models of biochemical networks onto lower-dimensional slow manifolds. As a result, fast elementary reactions are not modeled explicitly, and their effect is captured by nonelementary reaction-rate functions (e.g., Hill functions). The accuracy of the QSSA applied to deterministic systems depends on how well timescales are separated. Recently, it has been proposed to use the nonelementary rate functions obtained via the deterministic QSSA to define propensity functions in stochastic simulations of biochemical networks. In this approach, termed the stochastic QSSA, fast reactions that are part of nonelementary reactions are not simulated, greatly reducing computation time. However, it is unclear when the stochastic QSSA provides an accurate approximation of the original stochastic simulation. We show that, unlike the deterministic QSSA, the validity of the stochastic QSSA does not follow from timescale separation alone, but also depends on the sensitivity of the nonelementary reaction rate functions to changes in the slow species. The stochastic QSSA becomes more accurate when this sensitivity is small. Different types of QSSAs result in nonelementary functions with different sensitivities, and the total QSSA results in less sensitive functions than the standard or the prefactor QSSA. We prove that, as a result, the stochastic QSSA becomes more accurate when nonelementary reaction functions are obtained using the total QSSA. Our work provides an apparently novel condition for the validity of the QSSA in stochastic simulations of biochemical reaction networks with disparate timescales. PMID:25099817
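
    The stochastic QSSA idea discussed above can be sketched with a Gillespie simulation in which a fast promoter-binding step is not simulated explicitly but is folded into a nonelementary Hill-type production propensity. The rate constants and the one-species birth-death network below are invented for illustration.

    ```python
    # Gillespie SSA with a nonelementary (Hill-type) production propensity.
    import numpy as np

    rng = np.random.default_rng(3)
    beta, K, n_hill, gamma = 10.0, 20.0, 2.0, 0.1   # max rate, Hill constant/exponent, decay

    def propensities(x):
        production = beta * K**n_hill / (K**n_hill + x**n_hill)  # repressive Hill function
        return np.array([production, gamma * x])                  # [make one X, degrade one X]

    x, t, t_end = 0, 0.0, 500.0
    while t < t_end:
        a = propensities(x)
        a0 = a.sum()
        t += rng.exponential(1.0 / a0)               # time to next reaction
        x += 1 if rng.random() < a[0] / a0 else -1   # which reaction fired
    print(f"copy number at t={t_end}: {x}")
    ```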

  12. Face validity, construct validity and training benefits of a virtual reality TURP simulator.

    PubMed

    Bright, Elizabeth; Vine, Samuel; Wilson, Mark R; Masters, Rich S W; McGrath, John S

    2012-01-01

    To assess face validity, construct validity and the training benefits of a virtual reality TURP simulator. 11 novices (no TURP experience) and 7 experts (>200 TURPs) completed a virtual reality median lobe prostate resection task on the TURPsim™ (Simbionix USA Corp., Cleveland, OH). Performance indicators (percentage of prostate resected (PR), percentage of capsular resection (CR), and time diathermy loop active without tissue contact (TAWC)) were recorded via the TURPsim™ and compared between novices and experts to assess construct validity. Verbal comments provided by experts following task completion were used to assess face validity. Repeated attempts of the task by the novices were analysed to assess the training benefits of the TURPsim™. Experts resected a significantly greater percentage of prostate per minute (p < 0.01) and had significantly less active diathermy time without tissue contact (p < 0.01) than novices. After practice, novices were able to perform the simulation more effectively, with significant improvement in all measured parameters. Improvement in performance was noted in novices following repetitive training, as evidenced by improved TAWC scores that were not significantly different from the expert group (p = 0.18). This study has established face and construct validity for the TURPsim™. The potential benefit of using this tool to train novices has also been demonstrated. Copyright © 2012 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.

  13. Simulation Model of A Ferroelectric Field Effect Transistor

    NASA Technical Reports Server (NTRS)

    MacLeod, Todd C.; Ho, Fat Duen; Russell, Larry W. (Technical Monitor)

    2002-01-01

    An electronic simulation model has been developed for a ferroelectric field effect transistor (FFET). This model can be used in standard electrical circuit simulation programs to simulate the main characteristics of the FFET. The model uses a previously developed algorithm that incorporates partial polarization as a basis for the design. The model captures the main characteristics of the FFET: the current hysteresis under different gate voltages and the decay of the drain current when the gate voltage is off. The drain current takes values matching those of actual FFETs, which were measured experimentally. The input and output resistance in the model is similar to that of the FFET. The model is valid for all frequencies below RF levels. A variety of different ferroelectric material characteristics can be modeled. The model can be used to design circuits using FFETs with standard electrical simulation packages. It can be used in designing non-volatile memory circuits and logic circuits and is compatible with all SPICE-based circuit analysis programs. The model is a drop-in library that integrates seamlessly into a SPICE simulation. A comparison is made between the model and experimental data measured from an actual FFET.

  14. Validation Testing of a Peridynamic Impact Damage Model Using NASA's Micro-Particle Gun

    NASA Technical Reports Server (NTRS)

    Baber, Forrest E.; Zelinski, Brian J.; Guven, Ibrahim; Gray, Perry

    2017-01-01

    Through a collaborative effort between the Virginia Commonwealth University and Raytheon, a peridynamic model for sand impact damage has been developed [1-3]. Model development has focused on simulating impacts of sand particles on ZnS traveling at velocities consistent with aircraft take-off and landing speeds. The model reproduces common features of impact damage including pit and radial cracks, and, under some conditions, lateral cracks. This study focuses on a preliminary validation exercise in which simulation results from the peridynamic model are compared to a limited experimental data set generated by NASA's recently developed micro-particle gun (MPG). The MPG facility measures the dimensions and incoming and rebound velocities of the impact particles. It also links each particle to a specific impact site and its associated damage. In this validation exercise parameters of the peridynamic model are adjusted to fit the experimentally observed pit diameter, average length of radial cracks and rebound velocities for 4 impacts of 300 µm glass beads on ZnS. Results indicate that a reasonable fit of these impact characteristics can be obtained by suitable adjustment of the peridynamic input parameters, demonstrating that the MPG can be used effectively as a validation tool for impact modeling and that the peridynamic sand impact model described herein possesses not only a qualitative but also a quantitative ability to simulate sand impact events.

  15. In silico simulations of experimental protocols for cardiac modeling.

    PubMed

    Carro, Jesus; Rodriguez, Jose Felix; Pueyo, Esther

    2014-01-01

    A mathematical model of the action potential (AP) involves the sum of different transmembrane ionic currents and the balance of intracellular ionic concentrations. To each ionic current corresponds an equation involving several effects. There are a number of model parameters that must be identified using specific experimental protocols in which the effects are considered as independent. However, when the model complexity grows, the interaction between effects becomes increasingly important. Therefore, model parameters identified by considering the different effects as independent might be misleading. In this work, a novel methodology consisting of performing in silico simulations of the experimental protocol and then comparing experimental and simulated outcomes is proposed for model parameter identification and validation. The potential of the methodology is demonstrated by validating voltage-dependent L-type calcium current (ICaL) inactivation in recently proposed human ventricular AP models with different formulations. Our results show large differences between ICaL inactivation as calculated from the model equation and ICaL inactivation from the in silico simulations, due to the interaction between effects and/or to the experimental protocol. Our results suggest that, when proposing any new model formulation, consistency between such formulation and the corresponding experimental data that it aims to reproduce needs to be verified first, considering all involved factors.
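
    The methodology can be sketched with a generic inactivation gate: simulate the finite-duration conditioning pulses of a steady-state inactivation protocol and compare the protocol-derived availability with the model equation's steady state. The gate kinetics below are invented, not a published ICaL formulation.

    ```python
    # In silico voltage-clamp protocol vs. the model equation's steady state.
    import numpy as np

    def f_inf(v):                       # model's steady-state inactivation equation
        return 1.0 / (1.0 + np.exp((v + 30.0) / 6.0))

    def tau_f(v):                       # voltage-dependent time constant (ms), slow near -30 mV
        return 30.0 + 200.0 * np.exp(-((v + 30.0) / 15.0) ** 2)

    def protocol_availability(v_cond, hold_ms=500.0):
        """Hold at v_cond for a *finite* time from f = 1, then read the gate."""
        f, dt = 1.0, 1.0
        for _ in range(int(hold_ms / dt)):
            f += dt * (f_inf(v_cond) - f) / tau_f(v_cond)
        return f

    for v in (-50.0, -30.0, -10.0):
        print(f"V={v:+.0f} mV  equation: {f_inf(v):.3f}  protocol: {protocol_availability(v):.3f}")
    # With a 500 ms conditioning pulse the gate under-inactivates where tau_f is
    # large, so the protocol-derived curve deviates from the model equation.
    ```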

  16. Development of NASA's Models and Simulations Standard

    NASA Technical Reports Server (NTRS)

    Bertch, William J.; Zang, Thomas A.; Steele, Martin J.

    2008-01-01

    From the Space Shuttle Columbia Accident Investigation, there were several NASA-wide actions that were initiated. One of these actions was to develop a standard for development, documentation, and operation of Models and Simulations. Over the course of two-and-a-half years, a team of NASA engineers, representing nine of the ten NASA Centers developed a Models and Simulation Standard to address this action. The standard consists of two parts. The first is the traditional requirements section addressing programmatics, development, documentation, verification, validation, and the reporting of results from both the M&S analysis and the examination of compliance with this standard. The second part is a scale for evaluating the credibility of model and simulation results using levels of merit associated with 8 key factors. This paper provides an historical account of the challenges faced by and the processes used in this committee-based development effort. This account provides insights into how other agencies might approach similar developments. Furthermore, we discuss some specific applications of models and simulations used to assess the impact of this standard on future model and simulation activities.

  17. Simulation Model Development for Icing Effects Flight Training

    NASA Technical Reports Server (NTRS)

    Barnhart, Billy P.; Dickes, Edward G.; Gingras, David R.; Ratvasky, Thomas P.

    2003-01-01

    A high-fidelity simulation model for icing effects flight training was developed from wind tunnel data for the DeHavilland DHC-6 Twin Otter aircraft. First, a flight model of the un-iced airplane was developed and then modifications were generated to model the icing conditions. The models were validated against data records from the NASA Twin Otter Icing Research flight test program with only minimal refinements being required. The goals of this program were to demonstrate the effectiveness of such a simulator for training pilots to recognize and recover from icing situations and to establish a process for modeling icing effects to be used for future training devices.

  18. Folding free energy surfaces of three small proteins under crowding: validation of the postprocessing method by direct simulation

    NASA Astrophysics Data System (ADS)

    Qin, Sanbo; Mittal, Jeetain; Zhou, Huan-Xiang

    2013-08-01

    We have developed a ‘postprocessing’ method for modeling biochemical processes such as protein folding under crowded conditions (Qin and Zhou 2009 Biophys. J. 97 12-19). In contrast to the direct simulation approach, in which the protein undergoing folding is simulated along with crowders, the postprocessing method requires only the folding simulation without crowders. The influence of the crowders is then obtained by taking conformations from the crowder-free simulation and calculating their free energies of transfer into the crowded solution. This postprocessing yields the folding free energy surface of the protein under crowding. Here the postprocessing results for the folding of three small proteins under ‘repulsive’ crowding are validated against those obtained previously by the direct simulation approach (Mittal and Best 2010 Biophys. J. 98 315-20). This validation confirms the accuracy of the postprocessing approach and highlights its distinct advantages in modeling biochemical processes under cell-like crowded conditions, such as enabling an atomistic representation of the test proteins.
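
    The essence of the postprocessing step can be sketched in a few lines: each conformation from the crowder-free simulation is reweighted by the Boltzmann factor of its transfer free energy into the crowded solution, and the free energy surface is rebuilt from the reweighted histogram. The sketch below uses toy data and a hypothetical linear relation between compactness and transfer free energy; it illustrates the bookkeeping only, not the atomistic transfer-free-energy calculation of the actual method.

      import numpy as np

      KT = 0.593  # kcal/mol at ~298 K

      def free_energy_surface(q, weights, bins=30):
          """F(Q) = -kT ln P(Q) from (possibly reweighted) samples."""
          hist, edges = np.histogram(q, bins=bins, weights=weights, density=True)
          centers = 0.5 * (edges[:-1] + edges[1:])
          f = -KT * np.log(hist)
          return centers, f - np.nanmin(f)

      rng = np.random.default_rng(0)
      q = rng.uniform(0.0, 1.0, 20000)   # folding coordinate samples (toy)
      # Hypothetical per-conformation transfer free energies: compact (high-Q)
      # conformations are favored by repulsive crowders (less excluded volume).
      dG_transfer = 2.0 * (1.0 - q)      # kcal/mol, illustrative only

      c0, f0 = free_energy_surface(q, np.ones_like(q))            # no crowders
      cc, fc = free_energy_surface(q, np.exp(-dG_transfer / KT))  # under crowding
      print(np.round(f0[:5], 2), np.round(fc[:5], 2))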

  19. Extension and Validation of a Hybrid Particle-Finite Element Method for Hypervelocity Impact Simulation. Chapter 2

    NASA Technical Reports Server (NTRS)

    Fahrenthold, Eric P.; Shivarama, Ravishankar

    2004-01-01

    The hybrid particle-finite element method of Fahrenthold and Horban, developed for the simulation of hypervelocity impact problems, has been extended to include new formulations of the particle-element kinematics, additional constitutive models, and an improved numerical implementation. The extended formulation has been validated in three-dimensional simulations of published impact experiments. The test cases demonstrate good agreement with experiment, good parallel speedup, and numerical convergence of the simulation results.

  20. Standing wave design and experimental validation of a tandem simulated moving bed process for insulin purification.

    PubMed

    Xie, Yi; Mun, Sungyong; Kim, Jinhyun; Wang, Nien-Hwa Linda

    2002-01-01

    A tandem simulated moving bed (SMB) process for insulin purification has been proposed and validated experimentally. The mixture to be separated consists of insulin, high molecular weight proteins, and zinc chloride. A systematic approach based on the standing wave design, rate model simulations, and experiments was used to develop this multicomponent separation process. The standing wave design was applied to specify the SMB operating conditions of a lab-scale unit with 10 columns. The design was validated with rate model simulations prior to experiments. The experimental results show 99.9% purity and 99% yield, which closely agree with the model predictions and the standing wave design targets. The agreement proves that the standing wave design can ensure high purity and high yield for the tandem SMB process. Compared to a conventional batch size-exclusion chromatography (SEC) process, the tandem SMB has 10% higher yield, 400% higher throughput, and 72% lower eluant consumption. In contrast, a design that ignores the effects of mass transfer and nonideal flow cannot meet the purity requirement and gives less than 96% yield.

  1. Validation of the Electromagnetic Code FACETS for Numerical Simulation of Radar Target Images

    DTIC Science & Technology

    2009-12-01

    Validation of the electromagnetic code FACETS for numerical simulation of radar target images. S. Wong, DRDC Ottawa. Validation of FACETS for simulating radar images of a target is obtained through direct simulation-to-measurement comparisons. A 3-dimensional computer-aided design...

  2. Modeling and Simulation of Ceramic Arrays to Improve Ballistic Performance

    DTIC Science & Technology

    2013-10-01

    Projectiles (.30 cal AP M2 and 7.62x39 PS) with impact velocities in the range 700 m/s to 1000 m/s are modeled using SPH elements. Model validation runs with monolithic SiC tiles are conducted based on the DoP (depth of penetration) experiments described in the reference... Subject terms: .30 cal AP M2 projectile; 7.62x39 PS projectile; SPH; Aluminum 5083; SiC; DoP experiments; AutoDyn simulations; tile gap.

  3. The validity of a professional competence tool for physiotherapy students in simulation-based clinical education: a Rasch analysis.

    PubMed

    Judd, Belinda K; Scanlan, Justin N; Alison, Jennifer A; Waters, Donna; Gordon, Christopher J

    2016-08-05

    Despite the recent widespread adoption of simulation in clinical education in physiotherapy, there is a lack of validated tools for assessment in this setting. The Assessment of Physiotherapy Practice (APP) is a comprehensive tool used in clinical placement settings in Australia to measure the professional competence of physiotherapy students. The aim of the study was to evaluate the validity of the APP for student assessment in simulation settings. A total of 1260 APPs were collected, 971 from students in simulation and 289 from students in clinical placements. Rasch analysis was used to examine the construct validity of the APP tool in three different simulation assessment formats: longitudinal assessment over 1 week of simulation; longitudinal assessment over 2 weeks; and a short-form (25 min) assessment of a single simulation scenario. Comparisons with APPs from 5-week clinical placements in hospital and clinic-based settings were also conducted. The APP demonstrated acceptable fit to the expectations of the Rasch model for the 1- and 2-week clinical simulations, exhibiting unidimensional properties that were able to distinguish different levels of student performance. For the short-form simulation, nine of the 20 items recorded more than 25% of scores as 'not assessed' by clinical educators, which affected the suitability of the APP tool in this simulation format. The APP was a valid assessment tool when used in longitudinal simulation formats. A revised APP may be required for assessment in short-form simulation scenarios.

  4. Validating Lung Models Using the ASL 5000 Breathing Simulator.

    PubMed

    Dexter, Amanda; McNinch, Neil; Kaznoch, Destiny; Volsko, Teresa A

    2018-04-01

    This study sought to validate pediatric models with normal and altered pulmonary mechanics. PubMed and CINAHL databases were searched for studies directly measuring the pulmonary mechanics of healthy infants and children, infants with severe bronchopulmonary dysplasia, and children with neuromuscular disease. The ASL 5000 was used to construct models using tidal volume (VT), inspiratory time (TI), respiratory rate, resistance, compliance, and esophageal pressure gleaned from the literature. Data were collected for a 1-minute period and repeated three times for each model. t tests compared modeled data with data abstracted from the literature. Repeated measures analyses evaluated model performance over multiple iterations. Statistical significance was established at a P value of less than 0.05. Maximum differences of means (experimental iteration mean - clinical standard mean) for TI and VT were as follows: term infant without lung disease (TI = 0.09 s, VT = 0.29 mL), severe bronchopulmonary dysplasia (TI = 0.08 s, VT = 0.17 mL), child without lung disease (TI = 0.10 s, VT = 0.17 mL), and child with neuromuscular disease (TI = 0.09 s, VT = 0.57 mL). One-sample testing demonstrated statistically significant differences between clinical controls and the VT and TI values produced by the ASL 5000 for each iteration and model (P < 0.01). The greatest magnitude of differences was negligible (VT < 1.6%, TI = 18%) and not clinically relevant. Although inconsistencies occurred with the models constructed on the ASL 5000, it was deemed accurate for the study purposes. It is therefore essential to test models and evaluate the magnitude of differences before use.
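
    A minimal sketch of the validation statistic described above: a one-sample t test comparing repeated simulator measurements against a clinical-standard value from the literature, together with the percent-difference check that decides clinical relevance. The numbers are illustrative, not the study's data.

      import numpy as np
      from scipy import stats

      vt_simulated = np.array([6.05, 6.08, 6.03, 6.07, 6.06])  # mL, toy data
      vt_clinical_standard = 6.00                               # mL, literature value

      t, p = stats.ttest_1samp(vt_simulated, vt_clinical_standard)
      diff_pct = 100 * abs(vt_simulated.mean() - vt_clinical_standard) / vt_clinical_standard
      # A tiny p value flags a statistically significant offset, but the
      # magnitude check (here ~1%) decides whether it is clinically relevant.
      print(f"t={t:.2f}, p={p:.4f}, difference={diff_pct:.1f}%")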

  5. Development and initial validation of a novel smoothed-particle hydrodynamics-based simulation model of trabecular bone penetration by metallic implants.

    PubMed

    Kulper, Sloan A; Fang, Christian X; Ren, Xiaodan; Guo, Margaret; Sze, Kam Y; Leung, Frankie K L; Lu, William W

    2018-04-01

    A novel computational model of implant migration in trabecular bone was developed using smoothed-particle hydrodynamics (SPH), and an initial validation was performed via correlation with experimental data. Six fresh-frozen human cadaveric specimens measuring 10 × 10 × 20 mm were extracted from the proximal femurs of female donors (mean age 82 years, range 75-90, BV/TV ratios between 17.88% and 30.49%). These specimens were then penetrated under axial loading to a depth of 10 mm with 5 mm diameter cylindrical indenters bearing either flat or sharp/conical tip designs similar to blunt and self-tapping cancellous screws, assigned randomly. SPH models were constructed based on microCT scans (17.33 µm) of the cadaveric specimens. Two initial specimens were used for calibration of the material model parameters. The remaining four specimens were then simulated in silico using identical material model parameters. Peak forces varied between 92.0 and 365.0 N in the experiments, and 115.5-352.2 N in the SPH simulations. The concordance correlation coefficient between experimental and simulated pairs was 0.888, with a 95% CI of 0.8832-0.8926, a Pearson ρ (precision) value of 0.9396, and a bias correction factor Cb (accuracy) value of 0.945. Patterns of bone compaction were qualitatively similar; both experimental and simulated flat-tipped indenters produced dense regions of compacted material adjacent to the advancing face of the indenter, while sharp-tipped indenters deposited compacted material along their peripheries. Simulations based on SPH can produce accurate predictions of trabecular bone penetration that are useful for characterizing implant performance under high-strain loading conditions. © 2017 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 36:1114-1123, 2018.
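
    The agreement metric reported above, Lin's concordance correlation coefficient (CCC), decomposes into a precision term (Pearson correlation) and an accuracy term (the bias correction factor Cb), as in the sketch below. The force values are illustrative stand-ins, not the study's data.

      import numpy as np

      def concordance_ccc(x, y):
          """CCC = 2*cov(x,y) / (var(x) + var(y) + (mean(x) - mean(y))**2)."""
          mx, my = x.mean(), y.mean()
          vx, vy = x.var(), y.var()
          cov = ((x - mx) * (y - my)).mean()
          rho = cov / np.sqrt(vx * vy)               # precision (Pearson rho)
          ccc = 2 * cov / (vx + vy + (mx - my) ** 2)
          cb = ccc / rho                             # accuracy (bias correction)
          return ccc, rho, cb

      experimental = np.array([92.0, 180.0, 240.0, 365.0])  # peak forces (N), toy
      simulated = np.array([115.5, 175.0, 250.0, 352.2])    # toy values
      print(concordance_ccc(experimental, simulated))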

  6. Hydrological and water quality processes simulation by the integrated MOHID model

    NASA Astrophysics Data System (ADS)

    Epelde, Ane; Antiguedad, Iñaki; Brito, David; Eduardo, Jauch; Neves, Ramiro; Sauvage, Sabine; Sánchez-Pérez, José Miguel

    2016-04-01

    Different modelling approaches have been used in recent decades to study the water quality degradation caused by non-point source pollution. In this study, the fully distributed, physics-based MOHID model has been employed to simulate hydrological processes and nitrogen dynamics in a nitrate vulnerable zone: the Alegria River watershed (Basque Country, Northern Spain). The results of this study indicate that the MOHID code is suitable for simulating hydrological processes at the watershed scale, as the model shows satisfactory performance at simulating the discharge (NSE: 0.74 and 0.76 during the calibration and validation periods, respectively). The agronomical component of the code allowed the simulation of agricultural practices, which led to adequate crop yield simulation in the model. Furthermore, the nitrogen exportation also shows satisfactory performance (NSE: 0.64 and 0.69 during the calibration and validation periods, respectively). While the lack of field measurements does not allow an in-depth evaluation of the nutrient cycling processes, the MOHID model simulates annual denitrification within the general ranges established for agricultural watersheds (in this study, 9 kg N ha-1 year-1). In addition, the model coherently simulated the spatial distribution of the denitrification process, which is directly linked to the simulated hydrological conditions: the highest rates are located near the discharge zone of the aquifer and where the aquifer thickness is low. These results demonstrate the strength of this model in simulating watershed-scale hydrological processes as well as crop production and the water quality degradation derived from agricultural activity (considering both nutrient exportation and nutrient cycling processes).
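
    The performance scores quoted above are Nash-Sutcliffe efficiencies (NSE), which compare the simulation's squared errors against the variance of the observations; a minimal sketch with illustrative discharge values follows.

      import numpy as np

      def nse(observed, simulated):
          """NSE = 1 - sum((sim - obs)^2) / sum((obs - mean(obs))^2).
          1.0 is a perfect fit; 0.0 means no better than the observed mean."""
          observed, simulated = np.asarray(observed), np.asarray(simulated)
          return 1.0 - np.sum((simulated - observed) ** 2) / np.sum(
              (observed - observed.mean()) ** 2)

      obs = [1.2, 0.9, 2.4, 3.1, 1.8]   # discharge (m3/s), illustrative values
      sim = [1.1, 1.0, 2.1, 3.3, 1.6]
      print(f"NSE = {nse(obs, sim):.2f}")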

  7. Numerical Simulation of Tuff Dissolution and Precipitation Experiments: Validation of Thermal-Hydrologic-Chemical (THC) Coupled-Process Modeling

    NASA Astrophysics Data System (ADS)

    Dobson, P. F.; Kneafsey, T. J.

    2001-12-01

    As part of an ongoing effort to evaluate THC effects on flow in fractured media, we performed a laboratory experiment and numerical simulations to investigate mineral dissolution and precipitation. To replicate mineral dissolution by condensate in fractured tuff, deionized water equilibrated with carbon dioxide was flowed for 1,500 hours through crushed Yucca Mountain tuff at 94 °C. The reacted water was collected and sampled for major dissolved species, total alkalinity, electrical conductivity, and pH. The resulting steady-state fluid composition had a total dissolved solids content of about 140 mg/L; silica was the dominant dissolved constituent. A portion of the steady-state reacted water was flowed at 10.8 mL/hr into a 31.7-cm tall, 16.2-cm wide vertically oriented planar fracture with a hydraulic aperture of 31 microns in a block of welded Topopah Spring tuff that was maintained at 80 °C at the top and 130 °C at the bottom. The fracture began to seal within five days. A 1-D plug-flow model using the TOUGHREACT code developed at Berkeley Lab was used to simulate mineral dissolution, and a 2-D model was developed to simulate the flow of mineralized water through a planar fracture, where boiling conditions led to mineral precipitation. Predicted concentrations of the major dissolved constituents for the tuff dissolution were within a factor of 2 of the measured average steady-state compositions. The fracture-plugging simulations result in the precipitation of amorphous silica at the base of the boiling front, leading to a hundred-fold decrease in fracture permeability in less than 6 days, consistent with the laboratory experiment. These results help validate the use of the TOUGHREACT code for THC modeling of the Yucca Mountain system. The experiment and simulations indicate that boiling and concomitant precipitation of amorphous silica could cause significant reductions in fracture porosity and permeability on a local scale. The TOUGHREACT code will be used

  8. Model Validation for Propulsion - On the TFNS and LES Subgrid Models for a Bluff Body Stabilized Flame

    NASA Technical Reports Server (NTRS)

    Wey, Thomas

    2017-01-01

    This paper summarizes the reacting-flow results of simulating a bluff-body-stabilized flame experiment, the Volvo validation rig, using a releasable edition of the National Combustion Code (NCC). The turbulence models selected to investigate the configuration are the subgrid-scale kinetic energy coupled large eddy simulation (K-LES) and the time-filtered Navier-Stokes (TFNS) simulation. The turbulence-chemistry interaction model used is linear eddy mixing (LEM).

  9. Validating Human Performance Models of the Future Orion Crew Exploration Vehicle

    NASA Technical Reports Server (NTRS)

    Wong, Douglas T.; Walters, Brett; Fairey, Lisa

    2010-01-01

    NASA's Orion Crew Exploration Vehicle (CEV) will provide transportation for crew and cargo to and from destinations in support of the Constellation Architecture Design Reference Missions. Discrete Event Simulation (DES) is one of the design methods NASA employs for evaluating crew performance in the CEV. During the early development of the CEV, NASA and its prime Orion contractor Lockheed Martin (LM) strove to find an effective, low-cost method for developing and validating human performance DES models. This paper focuses on the method developed while creating a DES model for the CEV Rendezvous, Proximity Operations, and Docking (RPOD) task to the International Space Station. Our approach to validation was to attack the problem from several fronts. First, we began the development of the model early in the CEV design stage. Second, we adhered strictly to M&S development standards. Third, we involved the stakeholders, NASA astronauts, subject matter experts, and NASA's modeling and simulation development community throughout. Fourth, we applied standard and easy-to-conduct methods to ensure the model's accuracy. Lastly, we reviewed the data from an earlier human-in-the-loop RPOD simulation that had different objectives, which provided us an additional means to estimate the model's confidence level. The results revealed that a majority of the DES model was a reasonable representation of the current CEV design.
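
    As a flavor of what a DES model of a crew task timeline looks like, the sketch below uses the SimPy library to run three sequential, stochastically timed phases. The phase names and durations are hypothetical illustrations, not taken from the actual RPOD model.

      import random
      import simpy

      def crew_task(env, name, mean_duration):
          """One crew task with an exponentially distributed duration (min)."""
          start = env.now
          yield env.timeout(random.expovariate(1.0 / mean_duration))
          print(f"{name}: started {start:.1f} min, finished {env.now:.1f} min")

      def rpod_timeline(env):
          # Sequential phases; each completes before the next begins.
          for name, dur in [("align to docking axis", 10.0),
                            ("proximity operations", 25.0),
                            ("final approach and docking", 15.0)]:
              yield env.process(crew_task(env, name, dur))

      random.seed(1)
      env = simpy.Environment()
      env.process(rpod_timeline(env))
      env.run()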

  10. Numerical Modeling Studies of Wake Vortices: Real Case Simulations

    NASA Technical Reports Server (NTRS)

    Shen, Shao-Hua; Ding, Feng; Han, Jongil; Lin, Yuh-Lang; Arya, S. Pal; Proctor, Fred H.

    1999-01-01

    A three-dimensional large-eddy simulation model, TASS, is used to simulate the behavior of aircraft wake vortices in a real atmosphere. The purpose of this study is to validate the use of TASS for simulating the decay and transport of wake vortices. Three simulations are performed and the results are compared with observed data from the 1994-1995 Memphis field experiments. The selected cases have an atmospheric environment of weak turbulence and stable stratification. The model simulations are initialized with appropriate meteorological conditions and a post roll-up vortex system. The behavior of wake vortices as they descend within the atmospheric boundary layer and interact with the ground is discussed.

  11. Irrigated plantations and their effect on energy fluxes in a semi-arid region of Israel - a validated 3-D model simulation

    NASA Astrophysics Data System (ADS)

    Branch, O.; Warrach-Sagi, K.; Wulfmeyer, V.; Cohen, S.

    2013-11-01

    A large irrigated biomass plantation was simulated in an arid region of Israel within the WRF-NOAH coupled atmosphere/land surface model in order to assess land surface-atmosphere feedbacks. Simulations were carried out for the 2012 summer season (JJA). The irrigated plantations were simulated by prescribing tailored land surface and soil/plant parameters, and by implementing a newly devised, controllable sub-surface irrigation scheme within NOAH. Two model case studies were considered and compared: Impact and Control. Impact simulates a hypothetical 10 km × 10 km irrigated plantation. Control represents a baseline and uses the existing land surface data, where the predominant land surface type in the area is bare desert soil. Central to the study is model validation against observations collected for the study over the same period. Surface meteorological and soil observations were made at a desert site and from a 400 ha Simmondsia chinensis (Jojoba) plantation. Control was validated with data from the desert, and Impact from the Jojoba. Finally, estimations of the energy balance were made by applying two Penman-Monteith based methods along with observed meteorological data; these estimations were compared with the simulated energy fluxes. Control simulates the daytime desert surface 2 m air temperature (T2) with less than 0.2 °C deviation and the vapour pressure deficit (VPD) to within 0.25 hPa. Desert wind speed (U) is simulated to within 0.5 m s-1 and the net surface radiation (Rn) to 25 W m-2. Soil heat flux (G) is less accurately simulated by Control (up to 30 W m-2 deviation), and 5 cm soil temperatures (ST5) are simulated to within 1.5 °C. Impact simulates daytime T2 over irrigated vegetation to within 1-1.5 °C, the VPD to 0.5 hPa, Rn to 50 W m-2 and ST5 to within 2 °C. Simulated Impact G deviates by up to 40 W m-2, highlighting a need for re-parameterisation or better soil classification, but the overall contribution to the energy balance is small (5

  12. Multi-scale modelling of supercapacitors: From molecular simulations to a transmission line model

    NASA Astrophysics Data System (ADS)

    Pean, C.; Rotenberg, B.; Simon, P.; Salanne, M.

    2016-09-01

    We perform molecular dynamics simulations of a typical nanoporous-carbon based supercapacitor. The organic electrolyte consists of 1-ethyl-3-methylimidazolium and hexafluorophosphate ions dissolved in acetonitrile. We simulate systems at equilibrium for various applied voltages. This allows us to determine the relevant thermodynamic (capacitance) and transport (in-pore resistivity) properties. These quantities are then injected into a transmission line model to test its ability to predict the charging properties of the device. The results from this macroscopic model are in good agreement with non-equilibrium molecular dynamics simulations, which validates its use for interpreting electrochemical impedance experiments.
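
    A transmission line model of a porous electrode is, in discrete form, an RC ladder: the total capacitance and in-pore resistance (which in the work above come from the equilibrium MD simulations) are spread over sections representing pore depth, and the charging response is obtained by integrating the ladder in time. The sketch below uses toy parameter values.

      import numpy as np

      N = 50                      # ladder sections discretizing the pore depth
      R_TOT, C_TOT = 10.0, 1.0    # total in-pore resistance (Ohm), capacitance (F)
      r, c = R_TOT / N, C_TOT / N
      V_APP, DT, STEPS = 1.0, 2e-4, 100000

      v = np.zeros(N)             # node voltages along the pore
      for _ in range(STEPS):
          # current into node i from node i-1 (node -1 is the applied potential)
          upstream = np.concatenate(([V_APP], v[:-1]))
          i_in = (upstream - v) / r
          i_out = np.concatenate(((v[:-1] - v[1:]) / r, [0.0]))  # closed pore end
          v += DT * (i_in - i_out) / c

      print(f"charge stored: {np.sum(c * v):.3f} C (max {C_TOT * V_APP:.3f} C)")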

  13. Development and evaluation of a calibration and validation procedure for microscopic simulation models.

    DOT National Transportation Integrated Search

    2004-01-01

    Microscopic traffic simulation models have been widely accepted and applied in transportation engineering and planning practice over the past decades because simulation is cost-effective, safe, and fast. To achieve high fidelity and credibility for a ...

  14. UAS-Systems Integration, Validation, and Diagnostics Simulation Capability

    NASA Technical Reports Server (NTRS)

    Buttrill, Catherine W.; Verstynen, Harry A.

    2014-01-01

    As part of the Phase 1 efforts of NASA's UAS-in-the-NAS Project, a task was initiated to explore the merits of developing a system simulation capability for UAS to address airworthiness certification requirements. The core of the capability would be a software representation of an unmanned vehicle, including all of the relevant avionics and flight control system components. Specific system elements could be replaced with hardware representations to provide a Hardware-in-the-Loop (HWITL) test and evaluation capability. The UAS Systems Integration and Validation Laboratory (UAS-SIVL) was created to provide a UAS systems integration, validation, and diagnostics hardware-in-the-loop simulation capability. This paper discusses how UAS-SIVL provides a robust and flexible simulation framework that permits the study of failure modes, effects, propagation paths, criticality, and mitigation strategies to help develop safety, reliability, and design data that can assist with the development of certification standards, means of compliance, and design best practices for civil UAS.

  15. Excavator Design Validation

    NASA Technical Reports Server (NTRS)

    Pholsiri, Chalongrath; English, James; Seberino, Charles; Lim, Yi-Je

    2010-01-01

    The Excavator Design Validation tool verifies excavator designs by automatically generating control systems and modeling their performance in an accurate simulation of their expected environment. Part of this software design includes interfacing with human operations that can be included in simulation-based studies and validation. This is essential for assessing productivity, versatility, and reliability. This software combines automatic control system generation from CAD (computer-aided design) models, rapid validation of complex mechanism designs, and detailed models of the environment including soil, dust, temperature, remote supervision, and communication latency to create a system of high value. Unique algorithms have been created for controlling and simulating complex robotic mechanisms automatically from just a CAD description. These algorithms are implemented as a commercial cross-platform C++ software toolkit that is configurable using the Extensible Markup Language (XML). The algorithms work with virtually any mobile robotic mechanisms using module descriptions that adhere to the XML standard. In addition, high-fidelity, real-time physics-based simulation algorithms have also been developed that include models of internal forces and the forces produced when a mechanism interacts with the outside world. This capability is combined with an innovative organization for simulation algorithms, new regolith simulation methods, and a unique control and study architecture to make powerful tools with the potential to transform the way NASA verifies and compares excavator designs. Energid's Actin software has been leveraged for this design validation. The architecture includes parametric and Monte Carlo studies tailored for validation of excavator designs and their control by remote human operators. It also includes the ability to interface with third-party software and human-input devices. Two types of simulation models have been adapted: high-fidelity discrete

  16. Experimentally validated finite element model of electrocaloric multilayer ceramic structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, N. A. S. (nadia.smith@npl.co.uk); Correia, T. M. (tatiana.correia@npl.co.uk); Rokosz, M. K. (maciej.rokosz@npl.co.uk)

    2014-07-28

    A novel finite element model to simulate the electrocaloric response of a multilayer ceramic capacitor (MLCC) under real environmental and operational conditions has been developed. The two-dimensional transient conductive heat transfer model presented includes the electrocaloric effect as a source term, as well as accounting for radiative and convective effects. The model has been validated with experimental data obtained from direct imaging of the MLCC transient temperature variation under application of an electric field. The good agreement between simulated and experimental data suggests that the novel experimental direct measurement methodology and the finite element model could be used to support the design of optimised electrocaloric units and operating conditions.

  17. Flight Simulator and Training Human Factors Validation

    NASA Technical Reports Server (NTRS)

    Glaser, Scott T.; Leland, Richard

    2009-01-01

    Loss of control has been identified as the leading cause of aircraft accidents in recent years. Efforts have been made to better equip pilots to deal with these types of events, commonly referred to as upsets. A major challenge in these endeavors has been recreating the motion environments found in flight, as the majority of upsets take place well beyond the normal operating envelope of large aircraft. The Environmental Tectonics Corporation has developed a simulator motion base, called GYROLAB, that is capable of recreating the sustained accelerations, or G-forces, and motions of flight. A two-part research study was accomplished that coupled NASA's Generic Transport Model with a GYROLAB device. The goal of the study was to characterize the physiological effects of the upset environment and to demonstrate that a sustained-motion simulator can be an effective means for upset recovery training. Two groups of 25 Air Transport Pilots participated in the study. The results showed reliable signs of pilot arousal at specific stages of similar upsets. Further validation also demonstrated that sustained motion technology was successful in improving pilot performance during recovery following an extensive training program using GYROLAB technology.

  18. Educational Validity of Business Gaming Simulation: A Research Methodology Framework

    ERIC Educational Resources Information Center

    Stainton, Andrew J.; Johnson, Johnnie E.; Borodzicz, Edward P.

    2010-01-01

    Many past educational validity studies of business gaming simulation, and more specifically total enterprise simulation, have been inconclusive. Studies have focused on the weaknesses of business gaming simulation, which is often regarded as an educational medium that has limitations regarding learning effectiveness. However, no attempts have been…

  19. Process Modeling and Dynamic Simulation for EAST Helium Refrigerator

    NASA Astrophysics Data System (ADS)

    Lu, Xiaofei; Fu, Peng; Zhuang, Ming; Qiu, Lilong; Hu, Liangbing

    2016-06-01

    In this paper, the process modeling and dynamic simulation of the EAST helium refrigerator have been completed. The cryogenic process model is described and the main components are customized in detail. The process model is controlled by the PLC simulator, and real-time communication between the process model and the controllers is achieved by a customized interface. The process model has been validated against EAST experimental data from the 300-80 K cooldown. Simulation results indicate that this process simulator reproduces the dynamic behavior of the EAST helium refrigerator very well for the operation of long-pulse plasma discharges. The cryogenic process simulator based on the control architecture is available for operation optimization and control design of EAST cryogenic systems to cope with long-pulse heat loads in the future. Supported by the National Natural Science Foundation of China (No. 51306195) and the Key Laboratory of Cryogenics, Technical Institute of Physics and Chemistry, CAS (No. CRYO201408).

  20. Training safer orthopedic surgeons. Construct validation of a virtual-reality simulator for hip fracture surgery.

    PubMed

    Akhtar, Kashif; Sugand, Kapil; Sperrin, Matthew; Cobb, Justin; Standfield, Nigel; Gupte, Chinmay

    2015-01-01

    Virtual-reality (VR) simulation in orthopedic training is still in its infancy, and much of the work has been focused on arthroscopy. We evaluated the construct validity of a new VR trauma simulator for performing dynamic hip screw (DHS) fixation of a trochanteric femoral fracture. 30 volunteers were divided into 3 groups according to the number of postgraduate (PG) years and the amount of clinical experience: novice (1-4 PG years; less than 10 DHS procedures); intermediate (5-12 PG years; 10-100 procedures); expert (> 12 PG years; > 100 procedures). Each participant performed a DHS procedure, and objective performance metrics were recorded. These data were analyzed with each performance metric taken as the dependent variable in 3 regression models. There were statistically significant differences in performance between groups for (1) number of attempts at guide-wire insertion, (2) total fluoroscopy time, (3) tip-apex distance, (4) probability of screw cutout, and (5) overall simulator score. The intermediate group performed the procedure most quickly, with the lowest fluoroscopy time, the lowest tip-apex distance, the lowest probability of cutout, and the highest simulator score, which correlated with their frequency of exposure to running the trauma lists for hip fracture surgery. This study demonstrates the construct validity of a haptic VR trauma simulator, with the surgeons who undertake the procedure most frequently performing best on it. VR simulation may be a means of addressing restrictions on working hours and allows trainees to practice technical tasks without putting patients at risk. The VR DHS simulator evaluated in this study may provide valid assessment of technical skill.

  1. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.

    2008-09-01

    Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high-consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering- and physics-oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model evaluation in situations

  2. Statistical validation of predictive TRANSP simulations of baseline discharges in preparation for extrapolation to JET D-T

    NASA Astrophysics Data System (ADS)

    Kim, Hyun-Tae; Romanelli, M.; Yuan, X.; Kaye, S.; Sips, A. C. C.; Frassinetti, L.; Buchanan, J.; Contributors, JET

    2017-06-01

    This paper presents for the first time a statistical validation of predictive TRANSP simulations of plasma temperature using two transport models, GLF23 and TGLF, over a database of 80 baseline H-mode discharges in JET-ILW. While the accuracy of the predicted Te with TRANSP-GLF23 is affected by plasma collisionality, the dependency of the predictions on collisionality is less significant when using TRANSP-TGLF, indicating that the latter model has a broader applicability across plasma regimes. TRANSP-TGLF also shows good matching of the predicted Ti with experimental measurements, allowing for a more accurate prediction of the neutron yields. The impact of the input data and assumptions prescribed in the simulations is also investigated in this paper. The statistical validation and the assessment of the uncertainty level in predictive TRANSP simulations for JET-ILW-DD will constitute the basis for the extrapolation to JET-ILW-DT experiments.

  3. Validity of Cognitive Load Measures in Simulation-Based Training: A Systematic Review.

    PubMed

    Naismith, Laura M; Cavalcanti, Rodrigo B

    2015-11-01

    Cognitive load theory (CLT) provides a rich framework to inform instructional design. Despite the applicability of CLT to simulation-based medical training, findings from multimedia learning have not been consistently replicated in this context. This lack of transferability may be related to issues in measuring cognitive load (CL) during simulation. The authors conducted a review of CLT studies across simulation training contexts to assess the validity evidence for different CL measures. PRISMA standards were followed. For 48 studies selected from a search of MEDLINE, EMBASE, PsycInfo, CINAHL, and ERIC databases, information was extracted about study aims, methods, validity evidence of measures, and findings. Studies were categorized on the basis of findings and prevalence of validity evidence collected, and statistical comparisons between measurement types and research domains were pursued. CL during simulation training has been measured in diverse populations including medical trainees, pilots, and university students. Most studies (71%; 34 of 48) used self-report measures; others included secondary task performance, physiological indices, and observer ratings. Correlations between CL and learning varied from positive to negative. Overall validity evidence for CL measures was low (mean score 1.55/5). Studies reporting greater validity evidence were more likely to report that high CL impaired learning. The authors found evidence that inconsistent correlations between CL and learning may be related to issues of validity in CL measures. Further research would benefit from rigorous documentation of validity and from triangulating measures of CL. This can better inform CLT instructional design for simulation-based medical training.

  4. Integrated corridor management (ICM) analysis, modeling, and simulation (AMS) for Minneapolis site : model calibration and validation report.

    DOT National Transportation Integrated Search

    2010-02-01

    This technical report documents the calibration and validation of the baseline (2008) mesoscopic model for the I-394 Minneapolis, Minnesota, Pioneer Site. DynusT was selected as the mesoscopic model for analyzing operating conditions in the I-394 cor...

  5. Simulation and analysis of a model dinoflagellate predator-prey system

    NASA Astrophysics Data System (ADS)

    Mazzoleni, M. J.; Antonelli, T.; Coyne, K. J.; Rossi, L. F.

    2015-12-01

    This paper analyzes the dynamics of a model dinoflagellate predator-prey system and uses simulations to validate theoretical and experimental studies. A simple model for predator-prey interactions is derived by drawing upon analogies from chemical kinetics. This model is then modified to account for inefficiencies in predation. Simulation results are shown to closely match the model predictions. Additional simulations, based on experimental observations of predatory dinoflagellate behavior, are then run; this part of the study specifically investigates how the predatory dinoflagellate Karlodinium veneficum uses toxins to immobilize its prey and increase its feeding rate. These simulations account for complex dynamics that were not included in the basic models, and the results from these computational simulations closely match the experimentally observed predatory behavior of K. veneficum, reinforcing the notion that predatory dinoflagellates utilize toxins to increase their feeding rate.
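
    A minimal sketch of the chemical-kinetics analogy: treating predator-prey encounters as mass-action reaction terms yields Lotka-Volterra-type rate equations, integrated here with a simple Euler step. Parameter values are illustrative, not those of the paper.

      def step(prey, pred, dt, r=0.8, a=0.5, eps=0.3, m=0.4):
          """Mass-action (Lotka-Volterra-type) rates:
          prey grow at rate r and are consumed at a*prey*pred (encounter-limited);
          predators convert captures with efficiency eps and die at rate m."""
          d_prey = r * prey - a * prey * pred
          d_pred = eps * a * prey * pred - m * pred
          return prey + dt * d_prey, pred + dt * d_pred

      prey, pred, dt = 1.0, 0.5, 0.001
      for _ in range(40000):
          prey, pred = step(prey, pred, dt)
      print(f"after {40000 * dt:.0f} time units: prey={prey:.3f}, predator={pred:.3f}")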

  6. The SCEC Broadband Platform: A Collaborative Open-Source Software Package for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Silva, F.; Maechling, P. J.; Goulet, C.; Somerville, P.; Jordan, T. H.

    2013-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving SCEC researchers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform is open-source scientific software that can generate broadband (0-100 Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low- and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Broadband Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms of a historical earthquake for which observed strong ground motion data are available. Also in validation mode, the Broadband Platform calculates a number of goodness-of-fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a given event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. During the past year, we have modified the software to enable the addition of a large number of historical events, and we are now adding validation simulation inputs and observational data for 23 historical events covering the Eastern and Western United States, Japan, Taiwan, Turkey, and Italy. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. By establishing an interface between scientific modules with a common set of input and output files, the Broadband

  7. Validation of coupled atmosphere-fire behavior models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bossert, J.E.; Reisner, J.M.; Linn, R.R.

    1998-12-31

    Recent advances in numerical modeling and computer power have made it feasible to simulate the dynamical interaction and feedback between the heat and turbulence induced by wildfires and the local atmospheric wind and temperature fields. At Los Alamos National Laboratory, the authors have developed a modeling system that includes this interaction by coupling a high-resolution atmospheric dynamics model, HIGRAD, with a fire behavior model, BEHAVE, to predict the spread of wildfires. The HIGRAD/BEHAVE model is run at very high resolution to properly resolve the fire/atmosphere interaction. At present, these coupled wildfire model simulations are computationally intensive. The additional complexity of these models requires sophisticated methods for assuring their reliability in real-world applications. With this in mind, a substantial part of the research effort is directed at model validation. Several instrumented prescribed fires have been conducted with multi-agency support and participation from chaparral, marsh, and scrub environments in coastal areas of Florida and inland California. In this paper, the authors first describe the data required to initialize the components of the wildfire modeling system. Then they present results from one of the Florida fires, and discuss a strategy for further testing and improvement of coupled weather/wildfire models.

  8. Implementation of WirelessHART in the NS-2 Simulator and Validation of Its Correctness

    PubMed Central

    Zand, Pouria; Mathews, Emi; Havinga, Paul; Stojanovski, Spase; Sisinni, Emiliano; Ferrari, Paolo

    2014-01-01

    One of the first standards in the wireless sensor networks domain, WirelessHART (HART (Highway Addressable Remote Transducer)), was introduced to address industrial process automation and control requirements. This standard can be used as a reference point to evaluate other wireless protocols in the domain of industrial monitoring and control. This makes it worthwhile to set up a reliable WirelessHART simulator in order to achieve that reference point in a relatively easy manner. Moreover, it offers an alternative to expensive testbeds for testing and evaluating the performance of WirelessHART. This paper explains our implementation of WirelessHART in the NS-2 network simulator. To our knowledge, this is the first implementation that supports the WirelessHART network manager, as well as the whole stack (all OSI (Open Systems Interconnection model) layers) of the WirelessHART standard. It also explains our effort to validate the correctness of our implementation, namely through validation of the implementation of the WirelessHART stack protocol and of the network manager. We use sniffed traffic from a real WirelessHART testbed installed in the Idrolab plant for these validations, which confirms the validity of our simulator. Empirical analysis shows that the simulated results closely match the results obtained from real networks. We also demonstrate the versatility and usability of our implementation by providing further evaluation results in diverse scenarios. For example, we evaluate the performance of the WirelessHART network by applying incremental interference in a multi-hop network. PMID:24841245

  9. Validating Pseudo-dynamic Source Models against Observed Ground Motion Data at the SCEC Broadband Platform, Ver 16.5

    NASA Astrophysics Data System (ADS)

    Song, S. G.

    2016-12-01

    Simulation-based ground motion prediction approaches have several benefits over empirical ground motion prediction equations (GMPEs). For instance, full 3-component waveforms can be produced, and site-specific hazard analysis is also possible. However, it is important to validate them against observed ground motion data to confirm their efficiency and validity before practical use. There have been community efforts for these purposes, supported by the Broadband Platform (BBP) project at the Southern California Earthquake Center (SCEC). In simulation-based ground motion prediction, preparing a plausible range of scenario rupture models is a critical element. I developed a pseudo-dynamic source model for Mw 6.5-7.0 by analyzing a number of dynamic rupture models, based on 1-point and 2-point statistics of earthquake source parameters (Song et al. 2014; Song 2016). In this study, the developed pseudo-dynamic source models were tested against observed ground motion data at the SCEC BBP, Ver 16.5. The validation was performed in two stages. In the first stage, simulated ground motions were validated against observed ground motion data for past events such as the 1992 Landers and 1994 Northridge, California, earthquakes. In the second stage, they were validated against the latest version of the empirical GMPEs, NGA-West2. The validation results show that the simulated ground motions produce ground motion intensities compatible with observed ground motion data at both stages. The compatibility of the pseudo-dynamic source models with the omega-square spectral decay and the standard deviation of the simulated ground motion intensities are also discussed in the study.

  10. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems

    PubMed Central

    Silva, Lenardo C.; Almeida, Hyggo O.; Perkusich, Angelo; Perkusich, Mirko

    2015-01-01

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage. PMID:26528982

  11. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems.

    PubMed

    Silva, Lenardo C; Almeida, Hyggo O; Perkusich, Angelo; Perkusich, Mirko

    2015-10-30

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage.

  12. Simulation of a tethered microgravity robot pair and validation on a planar air bearing

    NASA Astrophysics Data System (ADS)

    Mantellato, R.; Lorenzini, E. C.; Sternberg, D.; Roascio, D.; Saenz-Otero, A.; Zachrau, H. J.

    2017-09-01

    A software model has been developed to simulate the on-orbit dynamics of a dual-mass tethered system where one or both of the tethered spacecraft are able to produce propulsive thrust. The software simulates translations and rotations of both spacecraft, with the visco-elastic tether represented by a lumped-mass model. Thanks to this feature, the tether's longitudinal and lateral modes of vibration and the tether tension can be accurately assessed, and the way the spacecraft motion responds to sudden tether tension spikes can be studied in detail. The code enables the simulation of different scenarios, including space tug missions for deorbit maneuvers in a debris mitigation context and general-purpose tethered formation flight missions. This study aims to validate the software through a representative test campaign performed with the MIT Synchronized Position Hold Engage and Reorient Experimental Satellites (SPHERES) planar air bearing system. Results obtained with the numerical simulator are compared with data from direct measurements in different testing setups. The studied cases take into account different initial conditions of the spacecraft velocities and relative attitudes, and different thrust forces. Data analysis is presented comparing the results of the simulations with direct measurements of acceleration and azimuth rate of the two bodies in the planar air bearing test facility using a Nylon tether. A microgravity test campaign using the SPHERES satellites aboard the International Space Station is also planned in the near future in order to further validate the simulation with data from the relevant operational environment of extended microgravity with full six-degree-of-freedom (per body) motion.
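
    The lumped-mass tether idea can be sketched compactly: the tether is discretized into point masses joined by spring-damper segments that carry tension only (a slack tether pushes nothing), with the heavier end bodies closing the chain. The one-dimensional sketch below uses illustrative parameters, not those of the SPHERES campaign.

      import numpy as np

      N_NODES = 10                  # internal tether nodes
      L0, EA, C = 10.0, 500.0, 2.0  # rest length (m), stiffness (N), damping (N s/m)
      seg = L0 / (N_NODES + 1)      # segment rest length
      m_node, m_end = 0.01, 4.0     # node and end-body masses (kg)

      # State: positions/velocities of [body A, nodes..., body B] along one axis.
      x = np.linspace(0.0, L0 * 1.05, N_NODES + 2)   # start 5% stretched
      v = np.zeros_like(x)
      masses = np.full_like(x, m_node)
      masses[0] = masses[-1] = m_end

      def tension(length, rate):
          """Spring-damper segment force; a slack tether carries no load."""
          t = (EA / seg) * (length - seg) + C * rate
          return max(t, 0.0)

      dt = 1e-4
      for _ in range(50000):        # 5 s of simulated time
          f = np.zeros_like(x)
          for i in range(len(x) - 1):
              t = tension(x[i + 1] - x[i], v[i + 1] - v[i])
              f[i] += t             # pulled toward the next node
              f[i + 1] -= t         # equal and opposite reaction
          v += dt * f / masses      # semi-implicit Euler
          x += dt * v

      print(f"end-body separation: {x[-1] - x[0]:.3f} m (rest length {L0} m)")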

  13. Development and Validation of a Mobile Device-based External Ventricular Drain Simulator.

    PubMed

    Morone, Peter J; Bekelis, Kimon; Root, Brandon K; Singer, Robert J

    2017-10-01

    Multiple external ventricular drain (EVD) simulators have been created, yet their cost, bulky size, and nonreusable components limit their accessibility to residency programs. Our aim was to create and validate an animated EVD simulator that is accessible on a mobile device. We developed a mobile-based EVD simulator that is compatible with iOS (Apple Inc., Cupertino, California) and Android-based devices (Google, Mountain View, California) and can be downloaded from the Apple App and Google Play Store. Our simulator consists of a learn mode, which teaches users the procedure, and a test mode, which assesses users' procedural knowledge. Twenty-eight participants, who were divided into expert and novice categories, completed the simulator in test mode and answered a postmodule survey. This was graded using a 5-point Likert scale, with 5 representing the highest score. Using the survey results, we assessed the module's face and content validity, whereas construct validity was evaluated by comparing the expert and novice test scores. Participants rated individual survey questions pertaining to face and content validity a median score of 4 out of 5. When comparing the test scores generated by the participants completing the test mode, the experts scored higher than the novices (mean, 71.5; 95% confidence interval, 69.2 to 73.8 vs mean, 48; 95% confidence interval, 44.2 to 51.6; P < .001). We created a mobile-based EVD simulator that is inexpensive, reusable, and accessible. Our results demonstrate that this simulator has face, content, and construct validity. Copyright © 2017 by the Congress of Neurological Surgeons

  14. Face and content validity of the virtual reality simulator 'ScanTrainer®'.

    PubMed

    Alsalamah, Amal; Campo, Rudi; Tanos, Vasilios; Grimbizis, Gregoris; Van Belle, Yves; Hood, Kerenza; Pugh, Neil; Amso, Nazar

    2017-01-01

    Ultrasonography is a first-line imaging modality in the investigation of women's irregular bleeding and other gynaecological pathologies, e.g. ovarian cysts and early pregnancy problems. However, teaching ultrasound, especially transvaginal scanning, remains a challenge for health professionals. New technology such as simulation may potentially facilitate and expedite the process of learning ultrasound. Simulation may prove to be realistic, very close to the real patient scanning experience for the sonographer, and objectively able to assist the development of basic skills such as image manipulation, hand-eye coordination and examination technique. The aim of this study was to determine the face and content validity of a virtual reality simulator (ScanTrainer®, MedaPhor plc, Cardiff, Wales, UK) as reflective of real transvaginal ultrasound (TVUS) scanning. A questionnaire with 14 simulator-related statements was distributed to a number of participants with differing levels of sonography experience in order to determine the level of agreement between the use of the simulator in training and real practice. There were 36 participants: novices (n = 25) and experts (n = 11) who rated the simulator. Median scores of face validity statements between experts and non-experts using 10-point visual analogue scale (VAS) ratings ranged between 7.5 and 9.0 (p > 0.05), indicating a high level of agreement. Experts' median scores of content validity statements ranged from 8.4 to 9.0. The findings confirm that the simulator has the feel and look of real-time scanning, with high face validity. Similarly, its tutorial structures and learning steps confirm the content validity.

  15. Smooth particle hydrodynamic modeling and validation for impact bird substitution

    NASA Astrophysics Data System (ADS)

    Babu, Arun; Prasad, Ganesh

    2018-04-01

    Bird strike events occur incidentally and can at times be fatal for airframe structures. Federal Aviation Regulations (FAR) and similar standards mandate that aircraft be designed to withstand various levels of bird-hit damage. The subject matter of this paper is the numerical modeling of a soft-body geometry for realistically substituting an actual bird when simulating bird hits on target structures. The evolution of such a numerical code to reproduce actual bird behavior through impact is much desired for making use of state-of-the-art computational facilities in simulating bird strike events. The validity of simulations depicting bird hits is largely dependent on the correctness of the bird model. In an impact, a set of complex and coupled dynamic interactions exists between the target and the impactor. To simplify this problem, the impactor response needs to be decoupled from that of the target, which can be done by assuming and modeling the target as non-compliant. The bird is assumed to behave as a fluid during impact, since the stresses generated in the bird body greatly exceed its yield stress; hydrodynamic theory is therefore ideal for describing this problem. The impactor flows steadily over the target for most of the event: the impact starts with an initial shock, falls into a radial release shock regime, and subsequently a steady flow is established in the bird body that continues until the whole length of the bird body has turned around. The initial shock pressure and steady-state pressure are ideal variables for comparing and validating the bird model. Spatial discretization of the bird is done using the Smooth Particle Hydrodynamic (SPH) approach. This discrete element model (DEM) offers significant advantages over other contemporary approaches. Thermodynamic state variable relations are established using a polynomial equation of state (EOS). ANSYS AUTODYN is used to perform the explicit dynamic simulation of the impact event. Validation of the shock and steady
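
    The two validation pressures named above follow from standard hydrodynamic impact relations: the initial shock pressure from the Rankine-Hugoniot jump condition with a linear shock-velocity fit, and the steady-state pressure from stagnation-point flow. A sketch with illustrative water-like parameters:

      RHO0 = 950.0          # effective bird density (kg/m^3), porous gelatin-like
      C0, K = 1480.0, 2.0   # sound speed (m/s) and Hugoniot slope (water-like)

      def hugoniot_pressure(v):
          """P_H = rho0 * u_s * u_p with linear Hugoniot u_s = c0 + k*u_p,
          taking the particle velocity u_p equal to the impact velocity v."""
          u_s = C0 + K * v
          return RHO0 * u_s * v

      def stagnation_pressure(v):
          """Steady-flow stagnation pressure P_s = 0.5 * rho0 * v^2."""
          return 0.5 * RHO0 * v * v

      v = 150.0  # illustrative impact velocity (m/s)
      print(f"shock: {hugoniot_pressure(v) / 1e6:.0f} MPa, "
            f"steady: {stagnation_pressure(v) / 1e6:.1f} MPa")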

  16. Advanced simulation model for IPM motor drive with considering phase voltage and stator inductance

    NASA Astrophysics Data System (ADS)

    Lee, Dong-Myung; Park, Hyun-Jong; Lee, Ju

    2016-10-01

    This paper proposes an advanced simulation model of the driving system for Interior Permanent Magnet (IPM) BrushLess Direct Current (BLDC) motors driven by the 120-degree conduction method (two-phase conduction method, TPCM), which is widely used for sensorless control of BLDC motors. BLDC motors can be classified as SPM (Surface-mounted Permanent Magnet) and IPM motors. The simulation model of a driving system with SPM motors is simple because the stator inductance is constant regardless of the rotor position, and such models have been proposed in many studies. On the other hand, simulation models for IPM driving systems built with graphic-based simulation tools such as Matlab/Simulink have not been proposed. Simulating the driving system of an IPM with TPCM is complex because the stator inductances of an IPM vary with the rotor position, as the permanent magnets are embedded in the rotor. To develop sensorless schemes or improve control performance, development of control algorithms through simulation studies is essential, and a simulation model that accurately reflects the characteristics of the IPM is required. Therefore, this paper presents an advanced simulation model of an IPM driving system, which takes into account the unique characteristic of the IPM due to its position-dependent inductances. The proposed simulation model is validated by comparison to experimental and simulation results using an IPM with the TPCM control scheme.
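
    The core difficulty named above can be shown in one function: for an IPM, the phase self-inductance varies with twice the electrical rotor angle because the buried magnets create magnetic saliency, whereas for an SPM the saliency term is essentially zero. The functional form and parameter values below are a common textbook-style approximation, used here for illustration only.

      import numpy as np

      L_S0, L_S2 = 1.2e-3, 0.3e-3   # mean and saliency inductance components (H)

      def phase_self_inductance(theta_e, phase=0):
          """L_aa = Ls0 + Ls2*cos(2*(theta_e - shift)), shift = phase*2*pi/3.
          For an SPM, Ls2 ~ 0 and the inductance is constant."""
          shift = phase * 2.0 * np.pi / 3.0
          return L_S0 + L_S2 * np.cos(2.0 * (theta_e - shift))

      theta = np.linspace(0, 2 * np.pi, 7)
      print(np.round(phase_self_inductance(theta) * 1e3, 3))  # mH over one rev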

  17. LADAR Performance Simulations with a High Spectral Resolution Atmospheric Transmittance and Radiance Model-LEEDR

    DTIC Science & Technology

    2012-03-01

    such as FASCODE is accomplished. The assessment is limited by the correctness of the models used; validating the models is beyond the scope of this...comparisons with other models and validation against data sets (Snell et al. 2000). 2.3.2 Previous Research Several LADAR simulations have been produced...performance models would better capture the atmosphere physics and climatological effects on these systems. Also, further validation needs to be performed

  18. Experimental Validation of Various Temperature Models for Semi-Physical Tyre Model Approaches

    NASA Astrophysics Data System (ADS)

    Hackl, Andreas; Scherndl, Christoph; Hirschberg, Wolfgang; Lex, Cornelia

    2017-10-01

    With the increasing level of complexity and automation in the area of automotive engineering, the simulation of safety-relevant Advanced Driver Assistance Systems (ADAS) leads to increasing accuracy demands in the description of tyre contact forces. In recent years, with improvements in tyre simulation, the need to cope with tyre temperatures and the resulting changes in tyre characteristics has risen significantly. Therefore, experimental validation of three different temperature model approaches is carried out, discussed and compared in the scope of this article. To evaluate the range of application of the presented approaches with respect to further implementation in semi-physical tyre models, the main focus lies on a physical parameterisation. Aside from good modelling accuracy, attention is paid to computational time and the complexity of the parameterisation process. To evaluate this process and discuss the results, measurements of a Hoosier racing tyre 6.0 / 18.0 10 LCO C2000 from an industrial flat test bench are used. Finally, the simulation results are compared with the measurement data.

  19. Verification and Validation of EnergyPlus Phase Change Material Model for Opaque Wall Assemblies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tabares-Velasco, P. C.; Christensen, C.; Bianchi, M.

    2012-08-01

    Phase change materials (PCMs) represent a technology that may reduce peak loads and HVAC energy consumption in buildings. A few building energy simulation programs have the capability to simulate PCMs, but their accuracy has not been completely tested. This study shows the procedure used to verify and validate the PCM model in EnergyPlus using a similar approach as dictated by ASHRAE Standard 140, which consists of analytical verification, comparative testing, and empirical validation. This process was valuable, as two bugs were identified and fixed in the PCM model, and version 7.1 of EnergyPlus will have a validated PCM model. Preliminary results using whole-building energy analysis show that careful analysis should be done when designing PCMs in homes, as their thermal performance depends on several variables such as PCM properties and location in the building envelope.

  20. Validation studies of the DOE-2 Building Energy Simulation Program. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, R.; Winkelmann, F.

    1998-06-01

    This report documents many of the validation studies (Table 1) of the DOE-2 building energy analysis simulation program that have taken place since 1981. Results for several versions of the program are presented, with the most recent study conducted in 1996 on version DOE-2.1E and the earliest conducted in 1981 on version DOE-1.3. This work is part of an effort related to the continued development of DOE-2, particularly its use as a simulation engine for new specialized versions of the program, such as the recently released RESFEN 3.1, a program specifically for analyzing the energy performance of windows in residential buildings. The intent in providing the results of these validation studies is to give potential users of the program a high degree of confidence in the calculated results. Validation studies in which calculated simulation data are compared to measured data have been conducted throughout the development of the DOE-2 program. Discrepancies discovered during the course of such work have resulted in improvements in the simulation algorithms. Table 2 provides a listing of additions and modifications that have been made to various versions of the program since version DOE-2.1A. One of the most significant recent changes occurred with version DOE-2.1E, in which an improved algorithm for calculating the outside surface film coefficient was implemented. In addition, integration of the WINDOW 4 program was accomplished, resulting in an improved ability to analyze window energy performance. Validation and verification of a program as sophisticated as DOE-2 must necessarily be limited because of the approximations inherent in the program. For example, the most accurate model of the heat transfer processes in a building would include a three-dimensional analysis. To justify such detailed algorithmic procedures would correspondingly require detailed information describing the building and/or HVAC system and energy plant

  1. Modeling Heat Loss through Piston and Effects of Thermal Boundary Coatings in Diesel Engine Simulations using Conjugate Heat Transfer models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kundu, Prithwish; Scarcelli, Riccardo; Som, Sibendu

    Heat loss through wall boundaries plays a dominant role in the overall performance and efficiency of internal combustion engines. Typical engine simulations use constant-temperature wall boundary conditions, which cannot be estimated accurately from experiments due to the complexities involved in engine combustion. As a result, they introduce a large uncertainty into engine simulations and serve as a tuning parameter. Modeling the process of heat transfer through the solid walls in an unsteady engine computational fluid dynamics (CFD) simulation can lead to the development of higher-fidelity engine calculations. Such models can be used to study the impact of heat loss on engine efficiency and explore new design methodologies that can reduce heat losses. In this work, a single-cylinder diesel engine is modeled along with the solid piston coupled to the fluid domain. Conjugate heat transfer (CHT) modeling techniques were implemented to model heat losses for a full cycle of a Navistar diesel engine. The CFD model is validated against experimental data from thermocouples embedded inside the piston surface, and the overall predictions from the model match the experimental observations closely. The validated model is further used to explore the benefits of thermal barrier coatings (TBC) on piston bowls. The effect of TBC coatings was modeled as a thermal resistance in the heat transfer models. Full-cycle 3D engine simulations provide quantitative insights into heat loss and thus quantify the efficiency gain from the use of TBC coatings. The work establishes a validated modeling framework for CHT modeling in reciprocating engine simulations.
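
    Modelling a TBC as a thermal resistance amounts to adding one conduction term to the series heat path from gas to coolant. Assuming a simple one-dimensional steady-state view (the paper's transient CHT formulation is more elaborate), the wall heat flux takes the form

        \[ q'' = \frac{T_{gas} - T_{cool}}{\dfrac{1}{h_g} + \dfrac{L_{tbc}}{k_{tbc}} + \dfrac{L_{wall}}{k_{wall}} + \dfrac{1}{h_c}} \]

    where \(L_{tbc}/k_{tbc}\) is the added coating resistance: a thin, low-conductivity coating raises the gas-side surface temperature and lowers \(q''\), which is the mechanism behind the efficiency gain discussed above.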

  2. Using stroboscopic flow imaging to validate large-scale computational fluid dynamics simulations

    NASA Astrophysics Data System (ADS)

    Laurence, Ted A.; Ly, Sonny; Fong, Erika; Shusteff, Maxim; Randles, Amanda; Gounley, John; Draeger, Erik

    2017-02-01

    The utility and accuracy of computational modeling often requires direct validation against experimental measurements. The work presented here is motivated by taking a combined experimental and computational approach to determine the ability of large-scale computational fluid dynamics (CFD) simulations to understand and predict the dynamics of circulating tumor cells in clinically relevant environments. We use stroboscopic light sheet fluorescence imaging to track the paths and measure the velocities of fluorescent microspheres throughout a human aorta model. Performed over complex physiologically realistic 3D geometries, large data sets are acquired with microscopic resolution over macroscopic distances.

  3. Statistical Validation of a New Python-based Military Workforce Simulation Model

    DTIC Science & Technology

    2014-12-30

    also having a straightforward syntax that is accessible to non-programmers. Furthermore, it is supported by an impressive variety of scientific... accessed by a given element of model logic or line of code. For example, in Arena, data arrays, queues and the simulation clock are part of the...global scope and are therefore accessible anywhere in the model. The disadvantage of scopes is that all names in a scope must be unique. If more than

  4. Modeling pedestrian shopping behavior using principles of bounded rationality: model comparison and validation

    NASA Astrophysics Data System (ADS)

    Zhu, Wei; Timmermans, Harry

    2011-06-01

    Models of geographical choice behavior have been dominantly based on rational choice models, which assume that decision makers are utility-maximizers. Rational choice models may be less appropriate as behavioral models when modeling decisions in complex environments in which decision makers may simplify the decision problem using heuristics. Pedestrian behavior in shopping streets is an example. We therefore propose a modeling framework for pedestrian shopping behavior incorporating principles of bounded rationality. We extend three classical heuristic rules (conjunctive, disjunctive and lexicographic rule) by introducing threshold heterogeneity. The proposed models are implemented using data on pedestrian behavior in Wang Fujing Street, the city center of Beijing, China. The models are estimated and compared with multinomial logit models and mixed logit models. Results show that the heuristic models are the best for all the decisions that are modeled. Validation tests are carried out through multi-agent simulation by comparing simulated spatio-temporal agent behavior with the observed pedestrian behavior. The predictions of heuristic models are slightly better than those of the multinomial logit models.
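
    As an illustration of the extended heuristics, the Python sketch below implements conjunctive and disjunctive screening with individual-level threshold heterogeneity. The attribute matrix and threshold ranges are hypothetical, and the lexicographic rule and the estimation machinery of the paper are omitted.

        import numpy as np

        # Hypothetical choice set: one row per store, columns are attributes
        # (e.g. proximity, assortment, price attractiveness) scaled to [0, 1].
        stores = np.array([
            [0.9, 0.6, 0.7],
            [0.4, 0.8, 0.9],
            [0.7, 0.3, 0.5],
        ])

        def conjunctive(x, thresholds):
            """Acceptable only if EVERY attribute clears its threshold."""
            return np.all(x >= thresholds, axis=1)

        def disjunctive(x, thresholds):
            """Acceptable if ANY attribute clears its threshold."""
            return np.any(x >= thresholds, axis=1)

        # Threshold heterogeneity: each simulated pedestrian draws personal
        # thresholds rather than sharing one population-wide cutoff.
        rng = np.random.default_rng(0)
        thresholds = rng.uniform(0.3, 0.7, size=3)
        print(conjunctive(stores, thresholds), disjunctive(stores, thresholds))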

  5. Validation and optimization of SST k-ω turbulence model for pollutant dispersion within a building array

    NASA Astrophysics Data System (ADS)

    Yu, Hesheng; Thé, Jesse

    2016-11-01

    The prediction of the dispersion of air pollutants in urban areas is of great importance to public health, homeland security, and environmental protection. Computational Fluid Dynamics (CFD) has emerged as an effective tool for pollutant dispersion modelling. This paper reports and quantitatively validates the shear stress transport (SST) k-ω turbulence closure model and its transitional variant for pollutant dispersion in a complex urban environment for the first time. A sensitivity analysis is performed to establish recommendations for the proper use of turbulence models in urban settings. The current SST k-ω simulation is validated rigorously against extensive experimental data using the hit rate for velocity components, and the "factor of two" of observations (FAC2) and fractional bias (FB) for the concentration field. The simulation results show that the current SST k-ω model can predict the flow field well, with an overall hit rate of 0.870, and the concentration dispersion with FAC2 = 0.721 and FB = 0.045. The flow simulation of the current SST k-ω model is slightly inferior to that of a detached eddy simulation (DES), but better than that of the standard k-ε model. However, the current model performs best among the three approaches when validated against measurements of pollutant dispersion in the atmosphere. This work aims to provide recommendations for the proper use of CFD to predict pollutant dispersion in urban environments.
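
    The validation metrics named above have standard definitions in the dispersion-modelling literature. The Python sketch below follows those conventional forms; the hit-rate thresholds are illustrative, since the abstract does not state the values used.

        import numpy as np

        def hit_rate(obs, pred, rel=0.25, abs_tol=0.01):
            """Fraction of points within a relative deviation `rel` OR an
            absolute deviation `abs_tol` of the observation (illustrative
            thresholds)."""
            err = np.abs(pred - obs)
            return np.mean((err <= rel * np.abs(obs)) | (err <= abs_tol))

        def fac2(obs, pred):
            """Fraction of predictions within a factor of two of observations."""
            ratio = pred / obs
            return np.mean((ratio >= 0.5) & (ratio <= 2.0))

        def fractional_bias(obs, pred):
            """FB = (mean_obs - mean_pred) / (0.5 * (mean_obs + mean_pred));
            0 means no overall bias."""
            co, cp = np.mean(obs), np.mean(pred)
            return (co - cp) / (0.5 * (co + cp))

    On these scales, FAC2 = 0.721 means that 72.1% of the predicted concentrations fall within a factor of two of the measurements.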

  6. Models Robustness for Simulating Drainage and NO3-N Fluxes

    NASA Astrophysics Data System (ADS)

    Jabro, Jay; Jabro, Ann

    2013-04-01

    Computer models simulate and forecast appropriate agricultural practices to reduce environmental impact. The objectives of this study were to assess and compare the robustness and performance of three models (LEACHM, NCSWAP, and SOIL-SOILN) for simulating drainage and NO3-N leaching fluxes in an intense pasture system without recalibration. A 3-yr study was conducted on a Hagerstown silt loam to measure drainage and NO3-N fluxes below 1 m depth from N-fertilized orchardgrass using intact core lysimeters. Five N-fertilizer treatments were replicated five times in a randomized complete block experimental design. The models were validated under orchardgrass using soil, water and N transformation rate parameters and C pool fractionation derived from a previous study conducted on similar soils under corn. The model efficiencies (MEF) for drainage and NO3-N fluxes were 0.53 and 0.69 for LEACHM; 0.75 and 0.39 for NCSWAP; and 0.94 and 0.91 for SOIL-SOILN. The models failed to produce reasonable simulations of drainage and NO3-N fluxes in January, February and March due to limited water movement associated with frozen soil and with snow accumulation and melt. The differences between simulated and measured NO3-N leaching, and among the models' performances, may also be related to the soil N and C transformation processes embedded in the models. These results are a substantial step in the validation of computer models and will support their continued diffusion across diverse stakeholders.
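
    The model efficiency (MEF) statistic quoted for each model is conventionally the Nash-Sutcliffe form; assuming that convention,

        \[ \mathrm{MEF} = 1 - \frac{\sum_{i}(O_i - S_i)^2}{\sum_{i}(O_i - \bar{O})^2} \]

    where \(O_i\) are the measured fluxes, \(S_i\) the simulated fluxes, and \(\bar{O}\) the mean of the observations. MEF = 1 indicates a perfect match and MEF ≤ 0 a model no better than predicting the observed mean, so the SOIL-SOILN scores of 0.94 and 0.91 indicate close agreement.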

  7. Criterion for evaluating the predictive ability of nonlinear regression models without cross-validation.

    PubMed

    Kaneko, Hiromasa; Funatsu, Kimito

    2013-09-23

    We propose predictive performance criteria for nonlinear regression models that do not require cross-validation. The proposed criteria are the determination coefficient and the root-mean-square error evaluated at the midpoints between k-nearest-neighbor data points. These criteria can be used to evaluate predictive ability after the regression models are updated, a situation in which cross-validation cannot be performed. The proposed method is effective and helpful in handling big data when cross-validation cannot be applied. By analyzing data from numerical simulations and from quantitative structure-activity relationships, we confirm that the proposed criteria enable the predictive ability of nonlinear regression models to be appropriately quantified.
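
    A sketch of the midpoint idea in Python follows. Pairing each midpoint prediction with the average of the two endpoint responses is one plausible reading of the criterion; the paper's exact weighting may differ.

        import numpy as np
        from sklearn.neighbors import NearestNeighbors

        def midpoint_criteria(X, y, predict, k=3):
            """r^2 and RMSE at midpoints between k-nearest-neighbor pairs,
            scored against the mean of the two endpoint responses (sketch)."""
            _, idx = NearestNeighbors(n_neighbors=k + 1).fit(X).kneighbors(X)
            mids, targets = [], []
            for i, nbrs in enumerate(idx):
                for j in nbrs[1:]:                  # skip the point itself
                    mids.append((X[i] + X[j]) / 2.0)
                    targets.append((y[i] + y[j]) / 2.0)
            mids, targets = np.asarray(mids), np.asarray(targets)
            resid = predict(mids) - targets
            rmse = float(np.sqrt(np.mean(resid ** 2)))
            r2 = 1.0 - np.sum(resid ** 2) / np.sum((targets - targets.mean()) ** 2)
            return r2, rmse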

  8. Validation of a virtual reality-based simulator for shoulder arthroscopy.

    PubMed

    Rahm, Stefan; Germann, Marco; Hingsammer, Andreas; Wieser, Karl; Gerber, Christian

    2016-05-01

    The aim of this study was to determine the face and construct validity of a new virtual reality-based shoulder arthroscopy simulator that uses passive haptic feedback. Fifty-one participants, including 25 novices (<20 shoulder arthroscopies) and 26 experts (>100 shoulder arthroscopies), completed two tests. For assessment of face validity, a questionnaire was filled out concerning the quality of the simulated reality and the training potential, using a 7-point Likert scale (range 1-7). Construct validity was tested by comparing simulator metrics (operation time in seconds, camera and grasper path length in centimetres, and grasper openings) between novice and expert test results. Overall simulated reality was rated high, with a median value of 5.5 (range 2.8-7) points. Training capacity scored a median value of 5.8 (range 3-7) points. Experts were significantly faster than novices in the diagnostic test, with a median of 91 (range 37-208) s versus 177 (range 81-383) s (p < 0.0001), and in the therapeutic test, with 102 (range 58-283) s versus 229 (range 114-399) s (p < 0.0001). Similar results were seen in the other metrics except the camera path length in the therapeutic test. The tested simulator achieved high scores in terms of realism and training capability, and it reliably discriminated between novices and experts. Further improvements of the simulator, especially in the field of therapeutic arthroscopy, might improve its value as a training and assessment tool for shoulder arthroscopy skills. Level of evidence II.
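
    Construct validity of this kind is usually established with a nonparametric comparison of simulator metrics across expertise groups. The abstract does not name the statistical test used, so the Python sketch below shows a typical Mann-Whitney U comparison on hypothetical operation times.

        import numpy as np
        from scipy.stats import mannwhitneyu

        # Hypothetical diagnostic-test times (seconds); real values would
        # come from the simulator's logged metrics.
        novice_times = np.array([177, 210, 155, 190, 240, 205])
        expert_times = np.array([91, 88, 120, 75, 102, 95])

        # Construct validity: the metric should separate experts from novices.
        stat, p = mannwhitneyu(expert_times, novice_times, alternative="less")
        print(f"U = {stat}, p = {p:.4f}")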

  9. Challenges of NDE Simulation Tool

    NASA Technical Reports Server (NTRS)

    Leckey, Cara A. C.; Juarez, Peter D.; Seebo, Jeffrey P.; Frank, Ashley L.

    2015-01-01

    Realistic nondestructive evaluation (NDE) simulation tools enable inspection optimization and predictions of inspectability for new aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of advanced aerospace components, potentially shortening the time from material development to implementation by industry and government. Furthermore, modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation cannot rapidly simulate damage detection techniques for large scale, complex geometry composite components/vehicles with realistic damage types. This paper discusses some of the challenges of model development and validation for composites, such as the level of realism and scale of simulation needed for NASA's applications. Ongoing model development work is described along with examples of model validation studies. The paper also discusses examples of the use of simulation tools at NASA to develop new damage characterization methods, and the associated challenges of validating those methods.

  10. Application of constrained k-means clustering in ground motion simulation validation

    NASA Astrophysics Data System (ADS)

    Khoshnevis, N.; Taborda, R.

    2017-12-01

    The validation of ground motion synthetics has received increased attention over the last few years due to advances in physics-based deterministic and hybrid simulation methods. Unlike low-frequency simulations (f ≤ 0.5 Hz), for which it has become reasonable to expect a good match between synthetics and data, high-frequency simulations (f ≥ 1 Hz) cannot be matched on a wiggle-by-wiggle basis, mostly because of the various complexities and uncertainties involved in earthquake ground motion modeling. Therefore, in order to compare synthetics with data, we turn to different time series metrics, which are used as a means to characterize how well the synthetics match the data in a qualitative and statistical sense. In general, these metrics provide goodness-of-fit (GOF) scores that measure the level of similarity in the time and frequency domains. It is common for these scores to be scaled from 0 to 10, with 10 representing a perfect match. Although using individual metrics for particular applications is considered more adequate, there is no consensus or unified method to classify the comparison between a set of synthetic and recorded seismograms when the various metrics offer different scores. We study the relationship among these metrics through a constrained k-means clustering approach. We define 4 hypothetical stations with scores of 3, 5, 7, and 9 for all metrics and place these stations under cannot-link constraints. We generate the dataset through the validation of results from a deterministic (physics-based) ground motion simulation for a moderate-magnitude earthquake in the greater Los Angeles basin using three velocity models. The maximum frequency of the simulation is 4 Hz. The dataset involves over 300 stations and 11 metrics, or features, as they are understood in the clustering process, where the metrics form a multi-dimensional space. We address the high-dimensional feature effects with a subspace-clustering analysis
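
    A simplified Python sketch of the clustering step follows. It seeds one centroid per hypothetical anchor station (uniform scores of 3, 5, 7, and 9), which keeps the anchors in distinct clusters and approximates the cannot-link constraints of full COP-k-means; the random GOF matrix stands in for the real 300-station, 11-metric dataset.

        import numpy as np

        def seeded_kmeans(scores, anchors=(3.0, 5.0, 7.0, 9.0), n_iter=50):
            """k-means with one centroid seeded per anchor score level."""
            n_features = scores.shape[1]
            centroids = np.array([[a] * n_features for a in anchors])
            for _ in range(n_iter):
                # assign each station to its nearest centroid
                d = np.linalg.norm(scores[:, None, :] - centroids[None], axis=2)
                labels = d.argmin(axis=1)
                # update centroids, leaving empty clusters at their seed
                for c in range(len(anchors)):
                    if np.any(labels == c):
                        centroids[c] = scores[labels == c].mean(axis=0)
            return labels, centroids

        # Stand-in GOF matrix: 300 stations x 11 metrics, scores in [0, 10].
        rng = np.random.default_rng(1)
        scores = rng.uniform(0.0, 10.0, size=(300, 11))
        labels, centroids = seeded_kmeans(scores)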

  11. Validation of hydrogen gas stratification and mixing models

    DOE PAGES

    Wu, Hsingtzu; Zhao, Haihua

    2015-05-26

    Two validation benchmarks confirm that the BMIX++ code is capable of simulating unintended hydrogen release scenarios efficiently. The BMIX++ (UC Berkeley mechanistic MIXing code in C++) code has been developed to accurately and efficiently predict the fluid mixture distribution and heat transfer in large stratified enclosures for accident analyses and design optimizations. The BMIX++ code uses a scaling-based one-dimensional method to achieve a large reduction in computational effort compared to a 3-D computational fluid dynamics (CFD) simulation. Two BMIX++ benchmark models have been developed: one for a single buoyant jet in an open space and another for a large sealed enclosure with both a jet source and a vent near the floor. Both have been validated by comparisons with experimental data, and excellent agreement is observed. Entrainment coefficients of 0.09 and 0.08 are found to best fit the experimental data for hydrogen leaks with Froude numbers of 99 and 268, respectively. In addition, the BMIX++ simulation results for the average helium concentration in an enclosure with a vent and a single jet agree with the experimental data within a margin of about 10% for jet flow rates ranging from 1.21 × 10⁻⁴ to 3.29 × 10⁻⁴ m³/s. In conclusion, the computing time for each BMIX++ model on a normal desktop computer is less than 5 min.
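
    The Froude numbers of 99 and 268 quoted above are presumably densimetric, following the conventional definition for a buoyant gas release (the paper may normalize slightly differently):

        \[ \mathrm{Fr} = \frac{u_0}{\sqrt{g \,\dfrac{\rho_a - \rho_j}{\rho_j}\, d_0}} \]

    where \(u_0\) and \(d_0\) are the jet exit velocity and diameter, and \(\rho_j\) and \(\rho_a\) are the jet and ambient gas densities.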

  12. Fracture simulation of restored teeth using a continuum damage mechanics failure model.

    PubMed

    Li, Haiyan; Li, Jianying; Zou, Zhenmin; Fok, Alex Siu-Lun

    2011-07-01

    The aim of this paper is to validate the use of a finite-element (FE) based continuum damage mechanics (CDM) failure model to simulate the debonding and fracture of restored teeth. Fracture testing of plastic model teeth, with or without a standard Class-II MOD (mesial-occlusal-distal) restoration, was carried out to investigate their fracture behavior. In parallel, 2D FE models of the teeth are constructed and analyzed using the commercial FE software ABAQUS. A CDM failure model, implemented into ABAQUS via the user element subroutine (UEL), is used to simulate the debonding and/or final fracture of the model teeth under a compressive load. The material parameters needed for the CDM model to simulate fracture are obtained through separate mechanical tests. The predicted results are then compared with the experimental data of the fracture tests to validate the failure model. The failure processes of the intact and restored model teeth are successfully reproduced by the simulation. However, the fracture parameters obtained from testing small specimens need to be adjusted to account for the size effect. The results indicate that the CDM model is a viable model for the prediction of debonding and fracture in dental restorations. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  13. New Monte Carlo model of cylindrical diffusing fibers illustrates axially heterogeneous fluorescence detection: simulation and experimental validation

    PubMed Central

    Baran, Timothy M.; Foster, Thomas H.

    2011-01-01

    We present a new Monte Carlo model of cylindrical diffusing fibers that is implemented with a graphics processing unit. Unlike previously published models that approximate the diffuser as a linear array of point sources, this model is based on the construction of these fibers. This allows for accurate determination of fluence distributions and modeling of fluorescence generation and collection. We demonstrate that our model generates fluence profiles similar to a linear array of point sources, but reveals axially heterogeneous fluorescence detection. With axially homogeneous excitation fluence, approximately 90% of detected fluorescence is collected by the proximal third of the diffuser for μs'/μa = 8 in the tissue and 70 to 88% is collected in this region for μs'/μa = 80. Increased fluorescence detection by the distal end of the diffuser relative to the center section is also demonstrated. Validation of these results was performed by creating phantoms consisting of layered fluorescent regions. Diffusers were inserted into these layered phantoms and fluorescence spectra were collected. Fits to these spectra show quantitative agreement between simulated fluorescence collection sensitivities and experimental results. These results will be applicable to the use of diffusers as detectors for dosimetry in interstitial photodynamic therapy. PMID:21895311

  14. Facebook's personal page modelling and simulation

    NASA Astrophysics Data System (ADS)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    In this paper we attempt to define the utility of Facebook's Personal Page marketing method. This tool that Facebook provides is modelled and simulated using iThink in the context of a Facebook marketing agency. The paper leverages the system dynamics paradigm to model Facebook marketing tools and methods, using the iThink™ system to implement them, and uses the design science research methodology for the proof of concept of the models and modelling processes. The model has been developed for a social media marketing agent/company, oriented to the Facebook platform, and tested in real circumstances. It was finalized through a number of revisions and iterations of the design, development, simulation, testing and evaluation processes. The validity and usefulness of this Facebook marketing model for day-to-day decision making are confirmed by the management of the company. Facebook's Personal Page method can be adjusted, depending on the situation, to maximize the total benefit to the company: bringing in new customers, keeping the interest of existing customers, and delivering traffic to its website.

  15. Using Dynamic Interface Modeling and Simulation to Develop a Launch and Recovery Flight Simulation for a UH-60A Blackhawk

    NASA Technical Reports Server (NTRS)

    Sweeney, Christopher; Bunnell, John; Chung, William; Giovannetti, Dean; Mikula, Julie; Nicholson, Bob; Roscoe, Mike

    2001-01-01

    The Joint Shipboard Helicopter Integration Process (JSHIP) is a Joint Test and Evaluation (JT&E) program sponsored by the Office of the Secretary of Defense (OSD). Under the JSHIP program is a simulation effort referred to as the Dynamic Interface Modeling and Simulation System (DIMSS). The purpose of DIMSS is to develop and test the processes and mechanisms that facilitate ship-helicopter interface testing via man-in-the-loop ground-based flight simulators. Specifically, the DIMSS charter is to develop an accredited process for using a flight simulator to determine the wind-over-the-deck (WOD) launch and recovery flight envelope for the UH-60A ship/helicopter combination. DIMSS is a collaborative effort between the NASA Ames Research Center and OSD. OSD determines the T&E and warfighter training requirements, provides the programmatics and dynamic interface T&E experience, and conducts ship/aircraft interface tests for validating the simulation. NASA provides the research and development element, the simulation facility, and simulation technical experience. This paper highlights the benefits of the NASA/JSHIP collaboration and details the achievements of the project in terms of modeling and simulation. The Vertical Motion Simulator (VMS) at NASA Ames Research Center offers the capability to simulate a wide range of simulation cueing configurations, including visual, aural, and body-force cueing devices. The system flexibility enables switching configurations to allow back-to-back evaluation and comparison of different levels of cueing fidelity in determining minimum training requirements. The investigation required the development and integration of several major simulation systems at the VMS. A new UH-60A BlackHawk interchangeable cab that provides an out-the-window (OTW) field-of-view (FOV) of 220 degrees in azimuth and 70 degrees in elevation was built. Modeling efforts involved integrating Computational Fluid Dynamics (CFD) generated data of an LHA ship airwake and

  16. Recovery Act. Development and Validation of an Advanced Stimulation Prediction Model for Enhanced Geothermal System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutierrez, Marte

    The research project aims to develop and validate an advanced computer model that can be used in the planning and design of stimulation techniques to create engineered reservoirs for Enhanced Geothermal Systems. The specific objectives of the proposal are to: 1) Develop a true three-dimensional hydro-thermal fracturing simulator that is particularly suited for EGS reservoir creation. 2) Perform laboratory scale model tests of hydraulic fracturing and proppant flow/transport using a polyaxial loading device, and use the laboratory results to test and validate the 3D simulator. 3) Perform discrete element/particulate modeling of proppant transport in hydraulic fractures, and use the results to improve understanding of proppant flow and transport. 4) Test and validate the 3D hydro-thermal fracturing simulator against case histories of EGS energy production. 5) Develop a plan to commercialize the 3D fracturing and proppant flow/transport simulator. The project is expected to yield several specific results and benefits. Major technical products from the proposal include: 1) A true-3D hydro-thermal fracturing computer code that is particularly suited to EGS, 2) Documented results of scale model tests on hydro-thermal fracturing and fracture propping in an analogue crystalline rock, 3) Documented procedures and results of discrete element/particulate modeling of flow and transport of proppants for EGS applications, and 4) A database of monitoring data, with focus on Acoustic Emissions (AE) from lab scale modeling and field case histories of EGS reservoir creation.

  17. Recovery Act. Development and Validation of an Advanced Stimulation Prediction Model for Enhanced Geothermal Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutierrez, Marte

    2013-12-31

    This research project aims to develop and validate an advanced computer model that can be used in the planning and design of stimulation techniques to create engineered reservoirs for Enhanced Geothermal Systems. The specific objectives of the proposal are to: develop a true three-dimensional hydro-thermal fracturing simulator that is particularly suited for EGS reservoir creation; perform laboratory scale model tests of hydraulic fracturing and proppant flow/transport using a polyaxial loading device, and use the laboratory results to test and validate the 3D simulator; perform discrete element/particulate modeling of proppant transport in hydraulic fractures, and use the results to improve understanding of proppant flow and transport; test and validate the 3D hydro-thermal fracturing simulator against case histories of EGS energy production; and develop a plan to commercialize the 3D fracturing and proppant flow/transport simulator. The project is expected to yield several specific results and benefits. Major technical products from the proposal include: a true-3D hydro-thermal fracturing computer code that is particularly suited to EGS; documented results of scale model tests on hydro-thermal fracturing and fracture propping in an analogue crystalline rock; documented procedures and results of discrete element/particulate modeling of flow and transport of proppants for EGS applications; and a database of monitoring data, with focus on Acoustic Emissions (AE) from lab scale modeling and field case histories of EGS reservoir creation.

  18. Validation of a Monte Carlo simulation of the Philips Allegro/GEMINI PET systems using GATE

    NASA Astrophysics Data System (ADS)

    Lamare, F.; Turzo, A.; Bizais, Y.; Cheze LeRest, C.; Visvikis, D.

    2006-02-01

    A newly developed simulation toolkit, GATE (Geant4 Application for Tomographic Emission), was used to develop a Monte Carlo simulation of a fully three-dimensional (3D) clinical PET scanner. The Philips Allegro/GEMINI PET systems were simulated in order to (a) allow a detailed study of the parameters affecting the system's performance under various imaging conditions, (b) study the optimization and quantitative accuracy of emission acquisition protocols for dynamic and static imaging, and (c) further validate the potential of GATE for the simulation of clinical PET systems. A model of the detection system and its geometry was developed. The accuracy of the developed detection model was tested through the comparison of simulated and measured results obtained with the Allegro/GEMINI systems for a number of NEMA NU2-2001 performance protocols including spatial resolution, sensitivity and scatter fraction. In addition, an approximate model of the system's dead time at the level of detected single events and coincidences was developed in an attempt to simulate the count rate related performance characteristics of the scanner. The developed dead-time model was assessed under different imaging conditions using the count rate loss and noise equivalent count rates performance protocols of standard and modified NEMA NU2-2001 (whole body imaging conditions) and NEMA NU2-1994 (brain imaging conditions) comparing simulated with experimental measurements obtained with the Allegro/GEMINI PET systems. Finally, a reconstructed image quality protocol was used to assess the overall performance of the developed model. An agreement of <3% was obtained in scatter fraction, with a difference between 4% and 10% in the true and random coincidence count rates respectively, throughout a range of activity concentrations and under various imaging conditions, resulting in <8% differences between simulated and measured noise equivalent count rates performance. Finally, the image quality
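
    The noise equivalent count rate referred to throughout is the standard NEMA quantity

        \[ \mathrm{NECR} = \frac{T^2}{T + S + kR} \]

    where \(T\), \(S\), and \(R\) are the true, scattered, and random coincidence rates and \(k\) is 1 or 2 depending on the randoms-estimation method. The <8% agreement quoted above is the difference between this quantity computed from simulated and from measured rates.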

  19. Critical evaluation of mechanistic two-phase flow pipeline and well simulation models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dhulesia, H.; Lopez, D.

    1996-12-31

    Mechanistic steady-state simulation models, rather than empirical correlations, are used for the design of multiphase production systems, including wells, pipelines, and downstream installations. Among the available models, PEPITE, WELLSIM, OLGA, TACITE and TUFFP are widely used for this purpose, and consequently a critical evaluation of these models is needed. An extensive validation methodology is proposed, consisting of two distinct steps: first, validating the hydrodynamic point model using test loop data, and second, validating the overall simulation model using data from real pipelines and wells. The test loop databank used in this analysis contains about 5952 data sets originating from four different test loops, and the majority of these data were obtained at high pressures (up to 90 bars) with real hydrocarbon fluids. Before performing the model evaluation, physical analysis of the test loop data is required to eliminate non-coherent data. The evaluation of the point models demonstrates that the TACITE and OLGA models can be applied to any configuration of pipes. The TACITE model performs better than the OLGA model because it uses the most appropriate closure laws from the literature, validated on a large number of data. The comparison of predicted and measured pressure drops for various real pipelines and wells demonstrates that the TACITE model is a reliable tool.

  20. Development and validation of a virtual reality simulator: human factors input to interventional radiology training.

    PubMed

    Johnson, Sheena Joanne; Guediri, Sara M; Kilkenny, Caroline; Clough, Peter J

    2011-12-01

    This study developed and validated a virtual reality (VR) simulator for use by interventional radiologists. Research in the area of skill acquisition identifies practice as essential to becoming a task expert, and studies of simulation show that skills learned in VR can be successfully transferred to a real-world task. Recently, with improvements in technology, VR simulators have been developed that allow complex medical procedures to be practiced without risking the patient. Three studies are reported. In Study 1, 35 consultant interventional radiologists took part in a cognitive task analysis to empirically establish the key competencies of the Seldinger procedure. In Study 2, 62 participants performed one simulated procedure, and their performance was compared by expertise. In Study 3, the transferability of simulator training to a real-world procedure was assessed with 14 trainees. Study 1 produced 23 key competencies that were implemented as performance measures in the simulator. Study 2 showed the simulator had both face and construct validity, although some issues were identified. Study 3 showed that the group that had undergone simulator training received significantly higher mean performance ratings on a subsequent patient procedure. The findings of this study support the centrality of validation in the successful design of simulators and show the utility of simulators as a training device. The studies show the key elements of a validation program for a simulator. In addition to task analysis and face and construct validities, the authors highlight the importance of transfer of training in validation studies.

  1. Using Numerical Modeling to Simulate Space Capsule Ground Landings

    NASA Technical Reports Server (NTRS)

    Heymsfield, Ernie; Fasanella, Edwin L.

    2009-01-01

    Experimental work is being conducted at the National Aeronautics and Space Administration's (NASA) Langley Research Center (LaRC) to investigate ground landing capabilities of the Orion crew exploration vehicle (CEV). The Orion capsule is NASA's replacement for the Space Shuttle; it will service the International Space Station and be used for future space missions to the Moon and to Mars. To evaluate the feasibility of Orion ground landings, a series of capsule impact tests is being performed at the NASA Langley Landing and Impact Research Facility (LandIR). The experimental results derived at LandIR provide the means to validate and calibrate nonlinear dynamic finite element models, which are also being developed during this study. Because of the high cost and time intrinsic to full-scale testing, numerical simulations are favored over experimental work; once a numerical model has been validated against actual test responses, impact simulations can be conducted for multiple impact scenarios not practical to test. Twenty-one swing tests using the LandIR gantry were conducted from June through October 2007 to evaluate the Orion's impact response. Results for two capsule initial pitch angles, 0 deg and -15 deg, along with their computer simulations using LS-DYNA, are presented in this article. A soil-vehicle friction coefficient of 0.45 was determined by comparing the test stopping distance with computer simulations. In addition, soil modeling accuracy is assessed by comparing vertical penetrometer impact tests with computer simulations for the soil model used during the swing tests.
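
    The reported friction coefficient can be sanity-checked with the idealized rigid-body sliding relation, which ignores soil deformation, pitch dynamics, and vertical load variation:

        \[ d = \frac{v_0^2}{2\,\mu\, g} \quad\Longrightarrow\quad \mu \approx \frac{v_0^2}{2\, g\, d} \]

    Matching the simulated stopping distance \(d\) to the measured one for a known horizontal touchdown speed \(v_0\) pins down an effective \(\mu\), here 0.45.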

  2. Simulation on Poisson and negative binomial models of count road accident modeling

    NASA Astrophysics Data System (ADS)

    Sapuan, M. S.; Razali, A. M.; Zamzuri, Z. H.; Ibrahim, K.

    2016-11-01

    Accident count data have often been shown to exhibit overdispersion, and the data might also contain excess zero counts. A simulation study was conducted to create scenarios in which accidents happen at a T-junction, with the assumption that the dependent variable of the generated data follows a given distribution, namely the Poisson or the negative binomial distribution, for sample sizes from n = 30 to n = 500. The study objective was accomplished by fitting Poisson regression, negative binomial regression and hurdle negative binomial models to the simulated data. Model validity was compared, and the simulation results show that, for each sample size, not all models fit the data well even when the data were generated from the model's own distribution, especially when the sample size is large. Furthermore, larger sample sizes yield more zero accident counts in the dataset.
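
    The simulation design can be reproduced in a few lines. The Python sketch below generates overdispersed counts through a gamma-Poisson mixture and compares Poisson and negative binomial fits by AIC; the coefficients and dispersion are illustrative assumptions, and the hurdle component is omitted.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(42)
        n = 200                                   # one of the studied sample sizes
        X = sm.add_constant(rng.normal(size=(n, 1)))
        mu = np.exp(0.5 + 0.8 * X[:, 1])          # assumed true mean structure

        # Overdispersed counts: Poisson rate mixed with a mean-1 gamma.
        alpha = 0.6                               # assumed dispersion parameter
        y = rng.poisson(mu * rng.gamma(1.0 / alpha, alpha, size=n))

        poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
        negbin_fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=alpha)).fit()
        print("Poisson AIC:", poisson_fit.aic, " NegBin AIC:", negbin_fit.aic)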

  3. Development and validation of P-MODTRAN7 and P-MCScene, 1D and 3D polarimetric radiative transfer models

    NASA Astrophysics Data System (ADS)

    Hawes, Frederick T.; Berk, Alexander; Richtsmeier, Steven C.

    2016-05-01

    A validated, polarimetric 3-dimensional simulation capability, P-MCScene, is being developed by generalizing Spectral Sciences' Monte Carlo-based synthetic scene simulation model, MCScene, to include calculation of all 4 Stokes components. P-MCScene polarimetric optical databases will be generated by a new version (MODTRAN7) of the government-standard MODTRAN radiative transfer algorithm. The conversion of MODTRAN6 to a polarimetric model is being accomplished by (1) introducing polarimetric data, (2) vectorizing the MODTRAN radiation calculations, and (3) integrating the newly revised and validated vector discrete ordinate model VDISORT3. Early results, presented here, demonstrate a clear pathway to the long-term goal of fully validated polarimetric models.

  4. Integrated Turbine-Based Combined Cycle Dynamic Simulation Model

    NASA Technical Reports Server (NTRS)

    Haid, Daniel A.; Gamble, Eric J.

    2011-01-01

    A Turbine-Based Combined Cycle (TBCC) dynamic simulation model has been developed to demonstrate all modes of operation, including mode transition, for a turbine-based combined cycle propulsion system. The High Mach Transient Engine Cycle Code (HiTECC) is a highly integrated tool comprised of modules for modeling each of the TBCC systems whose interactions and controllability affect the TBCC propulsion system thrust and operability during its modes of operation. By structuring the simulation modeling tools around the major TBCC functional modes of operation (Dry Turbojet, Afterburning Turbojet, Transition, and Dual Mode Scramjet) the TBCC mode transition and all necessary intermediate events over its entire mission may be developed, modeled, and validated. The reported work details the use of the completed model to simulate a TBCC propulsion system as it accelerates from Mach 2.5, through mode transition, to Mach 7. The completion of this model and its subsequent use to simulate TBCC mode transition significantly extends the state-of-the-art for all TBCC modes of operation by providing a numerical simulation of the systems, interactions, and transient responses affecting the ability of the propulsion system to transition from turbine-based to ramjet/scramjet-based propulsion while maintaining constant thrust.

  5. A mathematical simulation model of the CH-47B helicopter, volume 2

    NASA Technical Reports Server (NTRS)

    Weber, J. M.; Liu, T. Y.; Chung, W.

    1984-01-01

    A nonlinear simulation model of the CH-47B helicopter was adapted for use in a simulation facility. The model represents the specific configuration of the variable-stability CH-47B helicopter. The helicopter is modeled using a total-force approach in six rigid-body degrees of freedom. Rotor dynamics are simulated using the Wheatley-Bailey equations with steady-state flapping dynamics, and the model includes an option for simulating an external suspension with slung-load equations of motion. Validation of the model was accomplished using static and dynamic data from the original Boeing Vertol mathematical model and from flight tests. The model is appropriate for use in real-time piloted simulation and is implemented on the ARC Sigma IX computer, where it may be operated with a digital cycle time of 0.03 sec.

  6. Common modeling system for digital simulation

    NASA Technical Reports Server (NTRS)

    Painter, Rick

    1994-01-01

    The Joint Modeling and Simulation System (J-MASS) is a tri-service investigation into a common modeling framework for the development of digital models. The basis for the success of this framework is an X-window-based, open systems architecture, object-based/oriented methodology, and a standard interface approach to digital model construction, configuration, execution, and post-processing. For years, Department of Defense (DOD) agencies have produced various weapon systems/technologies and, typically, digital representations of those systems/technologies. These digital representations (models) have also been developed for other reasons, such as studies and analysis or Cost and Operational Effectiveness Analysis (COEA) tradeoffs. Unfortunately, there have been no Modeling and Simulation (M&S) standards, guidelines, or efforts toward commonality in DOD M&S. The typical scenario is that an organization hires a contractor to build hardware, and in doing so a digital model may be constructed. Until recently, this model was not even obtained by the organization. Even when it was procured, it was on a unique platform, in a unique language, with unique interfaces, with the result that unique maintenance was required. Additionally, the constructors of the model expended more effort in writing the 'infrastructure' of the model/simulation (e.g. user interface, database/database management system, data journaling/archiving, graphical presentations, environment characteristics, other components in the simulation, etc.) than in producing the model of the desired system. Other side effects include duplication of effort, varying assumptions, lack of credibility/validation, and decentralization in policy and execution. J-MASS provides the infrastructure, standards, toolset, and architecture to permit M&S developers and analysts to concentrate on their area of interest.

  7. Validation of educational assessments: a primer for simulation and beyond.

    PubMed

    Cook, David A; Hatala, Rose

    2016-01-01

    Simulation plays a vital role in health professions assessment. This review provides a primer on assessment validation for educators and education researchers. We focus on simulation-based assessment of health professionals, but the principles apply broadly to other assessment approaches and topics. Validation refers to the process of collecting validity evidence to evaluate the appropriateness of the interpretations, uses, and decisions based on assessment results. Contemporary frameworks view validity as a hypothesis, and validity evidence is collected to support or refute the validity hypothesis (i.e., that the proposed interpretations and decisions are defensible). In validation, the educator or researcher defines the proposed interpretations and decisions, identifies and prioritizes the most questionable assumptions in making these interpretations and decisions (the "interpretation-use argument"), empirically tests those assumptions using existing or newly-collected evidence, and then summarizes the evidence as a coherent "validity argument." A framework proposed by Messick identifies potential evidence sources: content, response process, internal structure, relationships with other variables, and consequences. Another framework proposed by Kane identifies key inferences in generating useful interpretations: scoring, generalization, extrapolation, and implications/decision. We propose an eight-step approach to validation that applies to either framework: Define the construct and proposed interpretation, make explicit the intended decision(s), define the interpretation-use argument and prioritize needed validity evidence, identify candidate instruments and/or create/adapt a new instrument, appraise existing evidence and collect new evidence as needed, keep track of practical issues, formulate the validity argument, and make a judgment: does the evidence support the intended use? Rigorous validation first prioritizes and then empirically evaluates key

  8. Power Plant Model Validation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The PPMV is used to validate generator models using disturbance recordings. The PPMV tool contains a collection of power plant models and model validation studies, as well as disturbance recordings from a number of historic grid events. The user can import data from a new disturbance into the database, which converts PMU and SCADA data into GE PSLF format, and then run the tool to validate (or invalidate) the model for a specific power plant against its actual performance. The PNNL PPMV tool enables the automation of the power plant model validation process using disturbance recordings. The tool uses PMU and SCADA measurements as input information, automatically adjusts all required EPCL scripts, and interacts with GE PSLF in batch mode. The main tool features include: interaction with GE PSLF; use of the GE PSLF Play-In Function for generator model validation; a database of projects (model validation studies); a database of historic events; a database of power plants; advanced visualization capabilities; and automatic report generation.

  9. ADVANCED UTILITY SIMULATION MODEL, REPORT OF SENSITIVITY TESTING, CALIBRATION, AND MODEL OUTPUT COMPARISONS (VERSION 3.0)

    EPA Science Inventory

    The report gives results of activities relating to the Advanced Utility Simulation Model (AUSM): sensitivity testing. comparison with a mature electric utility model, and calibration to historical emissions. The activities were aimed at demonstrating AUSM's validity over input va...

  10. Simulation-based training for prostate surgery.

    PubMed

    Khan, Raheej; Aydin, Abdullatif; Khan, Muhammad Shamim; Dasgupta, Prokar; Ahmed, Kamran

    2015-10-01

    To identify and review the currently available simulators for prostate surgery and to explore the evidence supporting their validity for training purposes. A review of the literature between 1999 and 2014 was performed. The search terms included a combination of urology, prostate surgery, robotic prostatectomy, laparoscopic prostatectomy, transurethral resection of the prostate (TURP), simulation, virtual reality, animal model, human cadavers, training, assessment, technical skills, validation and learning curves. Furthermore, relevant abstracts from the American Urological Association, European Association of Urology, British Association of Urological Surgeons and World Congress of Endourology meetings, between 1999 and 2013, were included. Only studies related to prostate surgery simulators were included; studies regarding other urological simulators were excluded. A total of 22 validation studies were identified. Five validated models and/or simulators were identified for TURP, one for photoselective vaporisation of the prostate, two for holmium enucleation of the prostate, three for laparoscopic radical prostatectomy (LRP) and four for robot-assisted surgery. Of the TURP simulators, all five have demonstrated content validity, three face validity and four construct validity. The GreenLight laser simulator has demonstrated face, content and construct validities. The Kansai HoLEP Simulator has demonstrated face and content validity, whilst the UroSim HoLEP Simulator has demonstrated face, content and construct validity. All three animal models for LRP have been shown to have construct validity, whilst the chicken skin model was also content valid. Only two robotic simulators were identified with relevance to robot-assisted laparoscopic prostatectomy, both of which demonstrated construct validity. A wide range of different simulators are available for prostate surgery, including synthetic bench models, virtual-reality platforms, animal

  11. Thermo-Oxidative Induced Damage in Polymer Composites: Microstructure Image-Based Multi-Scale Modeling and Experimental Validation

    NASA Astrophysics Data System (ADS)

    Hussein, Rafid M.; Chandrashekhara, K.

    2017-11-01

    A multi-scale modeling approach is presented to simulate and validate the thermo-oxidation shrinkage and cracking damage of a high-temperature polymer composite. The multi-scale approach couples transient diffusion-reaction and static structural analyses from the macro- to the micro-scale. The micro-scale shrinkage deformation and cracking damage are simulated and validated using 2D and 3D simulations. Localized shrinkage displacement boundary conditions for the micro-scale simulations are determined from the respective meso- and macro-scale simulations, conducted for a cross-ply laminate. The meso-scale geometrical domain and the micro-scale geometry and mesh are developed using the object-oriented finite element (OOF) software. The macro-scale shrinkage and weight loss are measured using unidirectional coupons and used to build the macro-shrinkage model. The cross-ply coupons are used to validate the macro-shrinkage model against shrinkage profiles acquired from scanning electron images at the cracked surface. The macro-shrinkage model deformation shows a discrepancy when the micro-scale image-based cracking is computed; the local maximum shrinkage strain is assumed to be 13 times the maximum macro-shrinkage strain of 2.5 × 10⁻⁵, upon which the discrepancy is minimized. The microcrack damage of the composite is modeled using a static elastic analysis with the extended finite element method and cohesive surfaces, considering the spatial evolution of the modulus. The 3D shrinkage displacements are fed to the model using node-wise boundary/domain conditions of the respective oxidized region. The simulated microcrack length, meander, and opening closely match the crack in the area of interest in the scanning electron images.

  12. Predictive modeling of infrared radiative heating in tomato dry-peeling process: Part II. Model validation and sensitivity analysis

    USDA-ARS?s Scientific Manuscript database

    A predictive mathematical model was developed to simulate heat transfer in a tomato undergoing double sided infrared (IR) heating in a dry-peeling process. The aims of this study were to validate the developed model using experimental data and to investigate different engineering parameters that mos...

  13. Modelling and simulation of wood chip combustion in a hot air generator system.

    PubMed

    Rajika, J K A T; Narayana, Mahinsasa

    2016-01-01

    This study focuses on the modelling and simulation of a horizontal moving bed/grate wood chip combustor. A standalone, finite-volume-based, 2-D steady-state Euler-Euler Computational Fluid Dynamics (CFD) model was developed for packed bed combustion. Packed bed combustion in a medium-scale biomass combustor, which was retrofitted from wood log to wood chip feeding for tea drying in Sri Lanka, was evaluated in a CFD simulation study. The model was validated against experimental results from an industrial biomass combustor for a hot air generation system in the tea industry. The open-source CFD tool OpenFOAM was used to generate the source code for the packed bed combustion model, which was run along with an available solver for modelling the freeboard region. The packed bed is about 20 cm high, and the biomass particles are assumed to be spherical with a constant surface-area-to-volume ratio. Temperature measurements from the combustor agree well with the simulation results, while the gas-phase compositions show discrepancies. The combustion efficiency of the validated hot air generator is around 52.2 %.

  14. Modeling and numerical simulations of the influenced Sznajd model

    NASA Astrophysics Data System (ADS)

    Karan, Farshad Salimi Naneh; Srinivasan, Aravinda Ramakrishnan; Chakraborty, Subhadeep

    2017-08-01

    This paper investigates the effects of independent nonconformists or influencers on the behavioral dynamic of a population of agents interacting with each other based on the Sznajd model. The system is modeled on a complete graph using the master equation. The acquired equation has been numerically solved. Accuracy of the mathematical model and its corresponding assumptions have been validated by numerical simulations. Regions of initial magnetization have been found from where the system converges to one of two unique steady-state PDFs, depending on the distribution of influencers. The scaling property and entropy of the stationary system in presence of varying level of influence have been presented and discussed.
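
    An agent-based Monte Carlo check of this setup (equally applicable to the duplicate record 15 below) can be written compactly. The update rule in the Python sketch below is a standard Sznajd-type "agreeing pair convinces a third" move on a complete graph, with a fixed fraction of unwavering influencers; it is an illustrative stand-in for the paper's master-equation treatment, and all parameter values are assumptions.

        import numpy as np

        def sznajd_complete_graph(n=1000, m0=0.1, frac_influencers=0.05,
                                  steps=200_000, seed=0):
            """Final magnetization of Sznajd-type dynamics on a complete
            graph; influencers never change opinion."""
            rng = np.random.default_rng(seed)
            opinions = rng.choice([1, -1], size=n,
                                  p=[(1 + m0) / 2, (1 - m0) / 2])
            influencer = rng.random(n) < frac_influencers
            for _ in range(steps):
                i, j, k = rng.choice(n, size=3, replace=False)
                # an agreeing pair convinces a third agent, unless that
                # agent is an influencer
                if opinions[i] == opinions[j] and not influencer[k]:
                    opinions[k] = opinions[i]
            return opinions.mean()

        print(sznajd_complete_graph())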

  15. Modeling and numerical simulations of the influenced Sznajd model.

    PubMed

    Karan, Farshad Salimi Naneh; Srinivasan, Aravinda Ramakrishnan; Chakraborty, Subhadeep

    2017-08-01

    This paper investigates the effects of independent nonconformists or influencers on the behavioral dynamic of a population of agents interacting with each other based on the Sznajd model. The system is modeled on a complete graph using the master equation. The acquired equation has been numerically solved. Accuracy of the mathematical model and its corresponding assumptions have been validated by numerical simulations. Regions of initial magnetization have been found from where the system converges to one of two unique steady-state PDFs, depending on the distribution of influencers. The scaling property and entropy of the stationary system in presence of varying level of influence have been presented and discussed.

  16. Twitter's tweet method modelling and simulation

    NASA Astrophysics Data System (ADS)

    Sarlis, Apostolos S.; Sakas, Damianos P.; Vlachos, D. S.

    2015-02-01

    This paper proposes a concept for Twitter marketing methods. The tools that Twitter provides are modelled and simulated using iThink in the context of a Twitter media-marketing agency. The paper leverages the system dynamics paradigm to model Twitter marketing tools and methods, using the iThink™ system to implement them, and follows the design science research methodology for the proof of concept of the models and modelling processes. The models were developed for a Twitter marketing agent/company, tested in real circumstances and with real numbers, and finalized through a number of revisions and iterations of design, development, simulation, testing and evaluation. The paper also addresses the methods best suited to organized, targeted promotion through the Twitter social media service. The validity and usefulness of these Twitter marketing method models for day-to-day decision making are confirmed by the management of the company. The work applies system dynamics concepts to Twitter marketing methods and produces models of various Twitter marketing situations. The Tweet method that Twitter provides can be adjusted, depending on the situation, to maximize the profit of the company/agent.

  17. Volcano Modelling and Simulation gateway (VMSg): A new web-based framework for collaborative research in physical modelling and simulation of volcanic phenomena

    NASA Astrophysics Data System (ADS)

    Esposti Ongaro, T.; Barsotti, S.; de'Michieli Vitturi, M.; Favalli, M.; Longo, A.; Nannipieri, L.; Neri, A.; Papale, P.; Saccorotti, G.

    2009-12-01

    Physical and numerical modelling is becoming of increasing importance in volcanology and volcanic hazard assessment. However, new interdisciplinary problems arise when dealing with complex mathematical formulations, numerical algorithms and their implementations on modern computer architectures. Therefore new frameworks are needed for sharing knowledge, software codes, and datasets among scientists. Here we present the Volcano Modelling and Simulation gateway (VMSg, accessible at http://vmsg.pi.ingv.it), a new electronic infrastructure for promoting knowledge growth and transfer in the field of volcanological modelling and numerical simulation. The new web portal, developed in the framework of former and ongoing national and European projects, is based on a dynamic Content Manager System (CMS) and was developed to host and present numerical models of the main volcanic processes and relationships including magma properties, magma chamber dynamics, conduit flow, plume dynamics, pyroclastic flows, lava flows, etc. Model applications, numerical code documentation, simulation datasets as well as model validation and calibration test-cases are also part of the gateway material.

  18. Rationality Validation of a Layered Decision Model for Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Huaqiang; Alves-Foss, James; Zhang, Du

    2007-08-31

    We propose a cost-effective network defense strategy built on three key decision layers: security policies, defense strategies, and real-time defense tactics for countering immediate threats. A layered decision model (LDM) can be used to capture this decision process. The LDM helps decision-makers gain insight into the hierarchical relationships among inter-connected entities and decision types, and supports the selection of cost-effective defense mechanisms to safeguard computer networks. To be effective as a business tool, it is first necessary to validate the rationality of the model before applying it to real-world business cases. This paper describes our efforts in validating the LDM rationality through simulation.

  19. Building the evidence on simulation validity: comparison of anesthesiologists' communication patterns in real and simulated cases.

    PubMed

    Weller, Jennifer; Henderson, Robert; Webster, Craig S; Shulruf, Boaz; Torrie, Jane; Davies, Elaine; Henderson, Kaylene; Frampton, Chris; Merry, Alan F

    2014-01-01

    Effective teamwork is important for patient safety, and verbal communication underpins many dimensions of teamwork. The validity of the simulated environment would be supported if it elicited similar verbal communications to the real setting. The authors hypothesized that anesthesiologists would exhibit similar verbal communication patterns in routine operating room (OR) cases and routine simulated cases. The authors further hypothesized that anesthesiologists would exhibit different communication patterns in routine cases (real or simulated) and simulated cases involving a crisis. Key communications relevant to teamwork were coded from video recordings of anesthesiologists in the OR, routine simulation and crisis simulation and percentages were compared. The authors recorded comparable videos of 20 anesthesiologists in the two simulations, and 17 of these anesthesiologists in the OR, generating 400 coded events in the OR, 683 in the routine simulation, and 1,419 in the crisis simulation. The authors found no significant differences in communication patterns in the OR and the routine simulations. The authors did find significant differences in communication patterns between the crisis simulation and both the OR and the routine simulations. Participants rated team communication as realistic and considered their communications occurred with a similar frequency in the simulations as in comparable cases in the OR. The similarity of teamwork-related communications elicited from anesthesiologists in simulated cases and the real setting lends support for the ecological validity of the simulation environment and its value in teamwork training. Different communication patterns and frequencies under the challenge of a crisis support the use of simulation to assess crisis management skills.

  20. The development, validation and application of a multi-detector CT (MDCT) scanner model for assessing organ doses to the pregnant patient and the fetus using Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Gu, J.; Bednarz, B.; Caracappa, P. F.; Xu, X. G.

    2009-05-01

    The latest multiple-detector technologies have further increased the popularity of x-ray CT as a diagnostic imaging modality. There is a continuing need to assess the potential radiation risk associated with such rapidly evolving multi-detector CT (MDCT) modalities and scanning protocols. This need can be met by the use of CT source models that are integrated with patient computational phantoms for organ dose calculations. To this end, this work developed and validated an MDCT scanner model using the Monte Carlo method and integrated pregnant patient phantoms into the scanner model for assessment of the dose to the fetus as well as doses to the organs and tissues of the pregnant patient phantom. A Monte Carlo code, MCNPX, was used to simulate the x-ray source, including the energy spectrum, filter and scan trajectory. Detailed CT scanner components were specified using an iterative trial-and-error procedure for a GE LightSpeed CT scanner. The scanner model was validated by comparing simulated results against measured CTDI values and dose profiles reported in the literature. The source movement along the helical trajectory was simulated using pitches of 0.9375 and 1.375, respectively. The validated scanner model was then integrated with phantoms of a pregnant patient in three different gestational periods to calculate organ doses. It was found that the dose to the fetus of the 3 month pregnant patient phantom was 0.13 mGy/100 mAs and 0.57 mGy/100 mAs from the chest and kidney scan, respectively. For the chest scan of the 6 month patient phantom and the 9 month patient phantom, the fetal doses were 0.21 mGy/100 mAs and 0.26 mGy/100 mAs, respectively. The paper also discusses how these fetal dose values can be used to evaluate imaging procedures and to assess risk using recommendations of the report from AAPM Task Group 36. This work demonstrates the ability of modeling and validating an MDCT scanner by the Monte Carlo method, as well as
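
    Because the reported organ doses are normalized per 100 mAs, they scale approximately linearly with the tube current-time product of a given protocol. A minimal example of that conversion (the 250 mAs protocol value below is hypothetical):

        # Fetal dose estimates in the abstract are normalized per 100 mAs and
        # scale roughly linearly with the tube current-time product (mAs).
        dose_per_100mAs = 0.13   # mGy/100 mAs, 3-month phantom, chest scan
        scan_mAs = 250           # hypothetical protocol setting
        fetal_dose = dose_per_100mAs * scan_mAs / 100.0
        print(f"Estimated fetal dose: {fetal_dose:.2f} mGy")  # 0.33 mGy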

  1. A Method of Q-Matrix Validation for the Linear Logistic Test Model

    PubMed Central

    Baghaei, Purya; Hohensinn, Christine

    2017-01-01

    The linear logistic test model (LLTM) is a well-recognized psychometric model for examining the components of difficulty in cognitive tests and validating construct theories. The plausibility of the construct model, summarized in a matrix of weights known as the Q-matrix or weight matrix, is tested by (1) comparing the fit of the LLTM with the fit of the Rasch model (RM) using the likelihood ratio (LR) test and (2) examining the correlation between the Rasch model item parameters and the LLTM-reconstructed item parameters. The problem with the LR test is that it is almost always significant and, consequently, the LLTM is rejected. The drawback of examining the correlation coefficient is that there is no cut-off value or lower bound for its magnitude. In this article we suggest a simulation method to set a minimum benchmark for the correlation between item parameters from the Rasch model and those reconstructed by the LLTM. If the cognitive model is valid, then the correlation coefficient between the RM-based item parameters and the LLTM-reconstructed item parameters derived from the theoretical weight matrix should be greater than those derived from the simulated matrices. PMID:28611721
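
    A minimal sketch of how such a simulated benchmark could be computed, exploiting the LLTM's linear structure (item difficulties beta = Q eta) and using randomly generated Q-matrices as the reference distribution. The least-squares reconstruction and all variable names are illustrative assumptions, not the authors' exact procedure:

        import numpy as np

        rng = np.random.default_rng(1)

        def reconstruct(beta_rasch, Q):
            # LLTM is linear in the operation weights eta, so the
            # reconstructed difficulties are the projection Q @ eta.
            eta, *_ = np.linalg.lstsq(Q, beta_rasch, rcond=None)
            return Q @ eta

        n_items, n_ops = 30, 5
        Q_theory = rng.integers(0, 2, size=(n_items, n_ops)).astype(float)
        beta = rng.normal(size=n_items)   # stand-in for Rasch item estimates

        r_theory = np.corrcoef(beta, reconstruct(beta, Q_theory))[0, 1]
        # Null distribution: correlations obtained from random Q-matrices
        # of the same dimensions.
        r_null = [np.corrcoef(beta, reconstruct(beta,
                      rng.integers(0, 2, size=Q_theory.shape).astype(float)))[0, 1]
                  for _ in range(1000)]
        # Benchmark: the theoretical Q-matrix should beat the null quantile.
        print(r_theory, np.quantile(r_null, 0.95))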

  2. Numerical modeling and preliminary validation of drag-based vertical axis wind turbine

    NASA Astrophysics Data System (ADS)

    Krysiński, Tomasz; Buliński, Zbigniew; Nowak, Andrzej J.

    2015-03-01

    The main purpose of this article is to verify and validate the mathematical description of the airflow around a wind turbine with a vertical axis of rotation, which can be considered representative of this type of device. Mathematical modelling of the airflow around wind turbines, in particular those with a vertical axis, is a problematic matter due to the complex nature of this highly swirled flow. Moreover, the flow is turbulent and accompanied by rotor rotation and dynamic boundary layer separation. In such conditions, the key aspects of the mathematical model are an accurate turbulence description, the definition of circular motion along with accompanying effects like the centrifugal and Coriolis forces, and the parameters of spatial and temporal discretization. The paper presents the impact of the different simulation parameters on the obtained results of the wind turbine simulation. The analysed models have been validated against experimental data published in the literature.

  3. Interfacial characteristics of propylene carbonate and validation of simulation models for electrochemical applications

    NASA Astrophysics Data System (ADS)

    You, Xinli

    Supercapacitors occupy an indispensable role in today's energy storage systems due to their high power density and long life. The introduction of carbon nanotube (CNT) forests as electrodes offers the possibility of nano-scale design and high capacitance. We have performed molecular dynamics simulations of a CNT forest-based electrochemical double-layer capacitor (EDLC) and a widely used electrolyte solution (tetra-ethylammonium tetra-fluoroborate in propylene carbonate, TEABF4/PC). We compare the corresponding primitive model and atomically detailed model of TEABF4/PC, emphasizing the significance of ion clustering in electrolytes. The molecular dynamics simulation results suggest that the arrangement of closest neighbors leads to the formation of cation-anion chains or rings. Fuoss's ion-pairing approximation for a primitive model of a 1-1 electrolyte is not broadly satisfactory for either the primitive or the atomically detailed case. A more general Poisson statistical assumption is shown to be satisfactory when coordination numbers are low, as is likely to be the case when ion-pairing initiates. We examined the Poisson-based model over a range of concentrations for both models of TEABF4/PC, and the atomically detailed model results identified solvent-separated nearest-neighbor ion-pairs. Large surface areas play an essential role in nanomaterial properties, which calls for an accurate description of interfaces through modeling. We studied propylene carbonate, a widely used solvent in EDLC systems. PC wets graphite with a contact angle of 31°. The MD simulation model reproduced this contact angle after the strength of the graphite C-atom Lennard-Jones interactions with the solvent was reduced by 40%. The critical temperature of PC was accurately evaluated by extrapolating the PC liquid-vapor surface tensions. PC molecules tend to lie flat on the PC liquid-vapor surface and project the propyl carbon toward the vapor phase. Liquid PC

  4. Design and validation of diffusion MRI models of white matter

    NASA Astrophysics Data System (ADS)

    Jelescu, Ileana O.; Budde, Matthew D.

    2017-11-01

    Diffusion MRI is arguably the method of choice for characterizing white matter microstructure in vivo. Over the typical duration of diffusion encoding, the displacement of water molecules is conveniently on a length scale similar to that of the underlying cellular structures. Moreover, water molecules in white matter are largely compartmentalized which enables biologically-inspired compartmental diffusion models to characterize and quantify the true biological microstructure. A plethora of white matter models have been proposed. However, overparameterization and mathematical fitting complications encourage the introduction of simplifying assumptions that vary between different approaches. These choices impact the quantitative estimation of model parameters with potential detriments to their biological accuracy and promised specificity. First, we review biophysical white matter models in use and recapitulate their underlying assumptions and realms of applicability. Second, we present up-to-date efforts to validate parameters estimated from biophysical models. Simulations and dedicated phantoms are useful in assessing the performance of models when the ground truth is known. However, the biggest challenge remains the validation of the “biological accuracy” of estimated parameters. Complementary techniques such as microscopy of fixed tissue specimens have facilitated direct comparisons of estimates of white matter fiber orientation and densities. However, validation of compartmental diffusivities remains challenging, and complementary MRI-based techniques such as alternative diffusion encodings, compartment-specific contrast agents and metabolites have been used to validate diffusion models. Finally, white matter injury and disease pose additional challenges to modeling, which are also discussed. This review aims to provide an overview of the current state of models and their validation and to stimulate further research in the field to solve the remaining open

  5. Design and validation of diffusion MRI models of white matter

    PubMed Central

    Jelescu, Ileana O.; Budde, Matthew D.

    2018-01-01

    Diffusion MRI is arguably the method of choice for characterizing white matter microstructure in vivo. Over the typical duration of diffusion encoding, the displacement of water molecules is conveniently on a length scale similar to that of the underlying cellular structures. Moreover, water molecules in white matter are largely compartmentalized which enables biologically-inspired compartmental diffusion models to characterize and quantify the true biological microstructure. A plethora of white matter models have been proposed. However, overparameterization and mathematical fitting complications encourage the introduction of simplifying assumptions that vary between different approaches. These choices impact the quantitative estimation of model parameters with potential detriments to their biological accuracy and promised specificity. First, we review biophysical white matter models in use and recapitulate their underlying assumptions and realms of applicability. Second, we present up-to-date efforts to validate parameters estimated from biophysical models. Simulations and dedicated phantoms are useful in assessing the performance of models when the ground truth is known. However, the biggest challenge remains the validation of the “biological accuracy” of estimated parameters. Complementary techniques such as microscopy of fixed tissue specimens have facilitated direct comparisons of estimates of white matter fiber orientation and densities. However, validation of compartmental diffusivities remains challenging, and complementary MRI-based techniques such as alternative diffusion encodings, compartment-specific contrast agents and metabolites have been used to validate diffusion models. Finally, white matter injury and disease pose additional challenges to modeling, which are also discussed. This review aims to provide an overview of the current state of models and their validation and to stimulate further research in the field to solve the remaining open

  6. Analysis of a DNA simulation model through hairpin melting experiments.

    PubMed

    Linak, Margaret C; Dorfman, Kevin D

    2010-09-28

    We compare the predictions of a two-bead Brownian dynamics simulation model to melting experiments of DNA hairpins with complementary AT or GC stems and noninteracting loops in buffer A. This system emphasizes the role of stacking and hydrogen bonding energies, which are characteristics of DNA, rather than backbone bending, stiffness, and excluded volume interactions, which are generic characteristics of semiflexible polymers. By comparing high throughput data on the open-close transition of various DNA hairpins to the corresponding simulation data, we (1) establish a suitable metric to compare the simulations to experiments, (2) find a conversion between the simulation and experimental temperatures, and (3) point out several limitations of the model, including the lack of G-quartets and cross stacking effects. Our approach and experimental data can be used to validate similar coarse-grained simulation models.
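
    One plausible way to implement step (2), the temperature conversion, is to extract a melting temperature from each open-close curve and map the two scales onto each other. The sigmoid fit below is an assumed approach with hypothetical data, not the authors' published protocol:

        import numpy as np
        from scipy.optimize import curve_fit

        def frac_open(T, Tm, width):
            # two-state melting curve: fraction of open hairpins vs temperature
            return 1.0 / (1.0 + np.exp(-(T - Tm) / width))

        # hypothetical fraction-open data in reduced simulation units
        # (replace with simulation or experimental melting data)
        T_sim = np.linspace(0.08, 0.16, 20)
        f_sim = frac_open(T_sim, 0.12, 0.01) \
                + np.random.default_rng(2).normal(0, 0.02, 20)

        (Tm_sim, w_sim), _ = curve_fit(frac_open, T_sim, f_sim, p0=[0.12, 0.01])
        Tm_exp = 330.0                 # K, hypothetical experimental Tm
        scale = Tm_exp / Tm_sim        # simple proportional conversion
        print(f"Tm_sim = {Tm_sim:.4f} (sim units); 1 sim unit ≈ {scale:.0f} K")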

  7. A Note on Verification of Computer Simulation Models

    ERIC Educational Resources Information Center

    Aigner, Dennis J.

    1972-01-01

    Establishes an argument that questions the validity of one "test" of goodness-of-fit (the extent to which a series of obtained measures agrees with a series of theoretical measures) for the simulated time path of a simple endogenous (internally developed) variable in a simultaneous, perhaps dynamic, econometric model. (Author)

  8. Simulation Credibility: Advances in Verification, Validation, and Uncertainty Quantification

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B. (Editor); Eklund, Dean R.; Romero, Vicente J.; Pearce, Jeffrey A.; Keim, Nicholas S.

    2016-01-01

    Decision makers and other users of simulations need to know quantified simulation credibility to make simulation-based critical decisions and effectively use simulations, respectively. The credibility of a simulation is quantified by its accuracy in terms of uncertainty, and the responsibility of establishing credibility lies with the creator of the simulation. In this volume, we present some state-of-the-art philosophies, principles, and frameworks. The contributing authors involved in this publication have been dedicated to advancing simulation credibility. They detail and provide examples of key advances over the last 10 years in the processes used to quantify simulation credibility: verification, validation, and uncertainty quantification. The philosophies and assessment methods presented here are anticipated to be useful to other technical communities conducting continuum physics-based simulations; for example, issues related to the establishment of simulation credibility in the discipline of propulsion are discussed. We envision that simulation creators will find this volume very useful to guide and assist them in quantitatively conveying the credibility of their simulations.

  9. Earth as an Extrasolar Planet: Earth Model Validation Using EPOXI Earth Observations

    NASA Technical Reports Server (NTRS)

    Robinson, Tyler D.; Meadows, Victoria S.; Crisp, David; Deming, Drake; A'Hearn, Michael F.; Charbonneau, David; Livengood, Timothy A.; Seager, Sara; Barry, Richard; Hearty, Thomas

    2011-01-01

    The EPOXI Discovery Mission of Opportunity reused the Deep Impact flyby spacecraft to obtain spatially and temporally resolved visible photometric and moderate resolution near-infrared (NIR) spectroscopic observations of Earth. These remote observations provide a rigorous validation of whole disk Earth model simulations used to better understand remotely detectable extrasolar planet characteristics. We have used these data to upgrade, correct, and validate the NASA Astrobiology Institute's Virtual Planetary Laboratory three-dimensional line-by-line, multiple-scattering spectral Earth model (Tinetti et al., 2006a,b). This comprehensive model now includes specular reflectance from the ocean and explicitly includes atmospheric effects such as Rayleigh scattering, gas absorption, and temperature structure. We have used this model to generate spatially and temporally resolved synthetic spectra and images of Earth for the dates of EPOXI observation. Model parameters were varied to yield an optimum fit to the data. We found that a minimum spatial resolution of approx. 100 pixels on the visible disk, and four categories of water clouds, which were defined using observed cloud positions and optical thicknesses, were needed to yield acceptable fits. The validated model provides a simultaneous fit to the Earth's lightcurve, absolute brightness, and spectral data, with a root-mean-square error of typically less than 3% for the multiwavelength lightcurves, and residuals of approx. 10% for the absolute brightness throughout the visible and NIR spectral range. We extend our validation into the mid-infrared by comparing the model to high spectral resolution observations of Earth from the Atmospheric Infrared Sounder, obtaining a fit with residuals of approx. 7%, and temperature errors of less than 1 K in the atmospheric window. For the purpose of understanding the observable characteristics of the distant Earth at arbitrary viewing geometry and observing cadence, our validated

  10. Earth as an Extrasolar Planet: Earth Model Validation Using EPOXI Earth Observations

    NASA Astrophysics Data System (ADS)

    Robinson, Tyler D.; Meadows, Victoria S.; Crisp, David; Deming, Drake; A'Hearn, Michael F.; Charbonneau, David; Livengood, Timothy A.; Seager, Sara; Barry, Richard K.; Hearty, Thomas; Hewagama, Tilak; Lisse, Carey M.; McFadden, Lucy A.; Wellnitz, Dennis D.

    2011-06-01

    The EPOXI Discovery Mission of Opportunity reused the Deep Impact flyby spacecraft to obtain spatially and temporally resolved visible photometric and moderate resolution near-infrared (NIR) spectroscopic observations of Earth. These remote observations provide a rigorous validation of whole-disk Earth model simulations used to better understand remotely detectable extrasolar planet characteristics. We have used these data to upgrade, correct, and validate the NASA Astrobiology Institute's Virtual Planetary Laboratory three-dimensional line-by-line, multiple-scattering spectral Earth model. This comprehensive model now includes specular reflectance from the ocean and explicitly includes atmospheric effects such as Rayleigh scattering, gas absorption, and temperature structure. We have used this model to generate spatially and temporally resolved synthetic spectra and images of Earth for the dates of EPOXI observation. Model parameters were varied to yield an optimum fit to the data. We found that a minimum spatial resolution of ∼100 pixels on the visible disk, and four categories of water clouds, which were defined by using observed cloud positions and optical thicknesses, were needed to yield acceptable fits. The validated model provides a simultaneous fit to Earth's lightcurve, absolute brightness, and spectral data, with a root-mean-square (RMS) error of typically less than 3% for the multiwavelength lightcurves and residuals of ∼10% for the absolute brightness throughout the visible and NIR spectral range. We have extended our validation into the mid-infrared by comparing the model to high spectral resolution observations of Earth from the Atmospheric Infrared Sounder, obtaining a fit with residuals of ∼7% and brightness temperature errors of less than 1 K in the atmospheric window. For the purpose of understanding the observable characteristics of the distant Earth at arbitrary viewing geometry and observing cadence, our validated forward model can be

  11. Earth as an extrasolar planet: Earth model validation using EPOXI earth observations.

    PubMed

    Robinson, Tyler D; Meadows, Victoria S; Crisp, David; Deming, Drake; A'hearn, Michael F; Charbonneau, David; Livengood, Timothy A; Seager, Sara; Barry, Richard K; Hearty, Thomas; Hewagama, Tilak; Lisse, Carey M; McFadden, Lucy A; Wellnitz, Dennis D

    2011-06-01

    The EPOXI Discovery Mission of Opportunity reused the Deep Impact flyby spacecraft to obtain spatially and temporally resolved visible photometric and moderate resolution near-infrared (NIR) spectroscopic observations of Earth. These remote observations provide a rigorous validation of whole-disk Earth model simulations used to better understand remotely detectable extrasolar planet characteristics. We have used these data to upgrade, correct, and validate the NASA Astrobiology Institute's Virtual Planetary Laboratory three-dimensional line-by-line, multiple-scattering spectral Earth model. This comprehensive model now includes specular reflectance from the ocean and explicitly includes atmospheric effects such as Rayleigh scattering, gas absorption, and temperature structure. We have used this model to generate spatially and temporally resolved synthetic spectra and images of Earth for the dates of EPOXI observation. Model parameters were varied to yield an optimum fit to the data. We found that a minimum spatial resolution of ∼100 pixels on the visible disk, and four categories of water clouds, which were defined by using observed cloud positions and optical thicknesses, were needed to yield acceptable fits. The validated model provides a simultaneous fit to Earth's lightcurve, absolute brightness, and spectral data, with a root-mean-square (RMS) error of typically less than 3% for the multiwavelength lightcurves and residuals of ∼10% for the absolute brightness throughout the visible and NIR spectral range. We have extended our validation into the mid-infrared by comparing the model to high spectral resolution observations of Earth from the Atmospheric Infrared Sounder, obtaining a fit with residuals of ∼7% and brightness temperature errors of less than 1 K in the atmospheric window. For the purpose of understanding the observable characteristics of the distant Earth at arbitrary viewing geometry and observing cadence, our validated forward model can be

  12. Earth as an Extrasolar Planet: Earth Model Validation Using EPOXI Earth Observations

    PubMed Central

    Meadows, Victoria S.; Crisp, David; Deming, Drake; A'Hearn, Michael F.; Charbonneau, David; Livengood, Timothy A.; Seager, Sara; Barry, Richard K.; Hearty, Thomas; Hewagama, Tilak; Lisse, Carey M.; McFadden, Lucy A.; Wellnitz, Dennis D.

    2011-01-01

    The EPOXI Discovery Mission of Opportunity reused the Deep Impact flyby spacecraft to obtain spatially and temporally resolved visible photometric and moderate resolution near-infrared (NIR) spectroscopic observations of Earth. These remote observations provide a rigorous validation of whole-disk Earth model simulations used to better understand remotely detectable extrasolar planet characteristics. We have used these data to upgrade, correct, and validate the NASA Astrobiology Institute's Virtual Planetary Laboratory three-dimensional line-by-line, multiple-scattering spectral Earth model. This comprehensive model now includes specular reflectance from the ocean and explicitly includes atmospheric effects such as Rayleigh scattering, gas absorption, and temperature structure. We have used this model to generate spatially and temporally resolved synthetic spectra and images of Earth for the dates of EPOXI observation. Model parameters were varied to yield an optimum fit to the data. We found that a minimum spatial resolution of ∼100 pixels on the visible disk, and four categories of water clouds, which were defined by using observed cloud positions and optical thicknesses, were needed to yield acceptable fits. The validated model provides a simultaneous fit to Earth's lightcurve, absolute brightness, and spectral data, with a root-mean-square (RMS) error of typically less than 3% for the multiwavelength lightcurves and residuals of ∼10% for the absolute brightness throughout the visible and NIR spectral range. We have extended our validation into the mid-infrared by comparing the model to high spectral resolution observations of Earth from the Atmospheric Infrared Sounder, obtaining a fit with residuals of ∼7% and brightness temperature errors of less than 1 K in the atmospheric window. For the purpose of understanding the observable characteristics of the distant Earth at arbitrary viewing geometry and observing cadence, our validated forward

  13. One-month validation of the Space Weather Modeling Framework geospace model

    NASA Astrophysics Data System (ADS)

    Haiducek, J. D.; Welling, D. T.; Ganushkina, N. Y.; Morley, S.; Ozturk, D. S.

    2017-12-01

    The Space Weather Modeling Framework (SWMF) geospace model consists of a magnetohydrodynamic (MHD) simulation coupled to an inner magnetosphere model and an ionosphere model. This provides a predictive capability for magnetospheric dynamics, including ground-based and space-based magnetic fields, geomagnetic indices, currents and densities throughout the magnetosphere, cross-polar cap potential, and magnetopause and bow shock locations. The only inputs are solar wind parameters and F10.7 radio flux. We have conducted a rigorous validation effort consisting of a continuous simulation covering the month of January, 2005 using three different model configurations. This provides a relatively large dataset for assessment of the model's predictive capabilities. We find that the model does an excellent job of predicting the Sym-H index, and performs well at predicting Kp and CPCP during active times. Dayside magnetopause and bow shock positions are also well predicted. The model tends to over-predict Kp and CPCP during quiet times and under-predicts the magnitude of AL during disturbances. The model under-predicts the magnitude of night-side geosynchronous Bz, and over-predicts the radial distance to the flank magnetopause and bow shock. This suggests that the model over-predicts stretching of the magnetotail and the overall size of the magnetotail. With the exception of the AL index and the nightside geosynchronous magnetic field, we find the results to be insensitive to grid resolution.

  14. An accurate behavioral model for single-photon avalanche diode statistical performance simulation

    NASA Astrophysics Data System (ADS)

    Xu, Yue; Zhao, Tingchen; Li, Ding

    2018-01-01

    An accurate behavioral model is presented to simulate important statistical performance of single-photon avalanche diodes (SPADs), such as dark count and after-pulsing noise. The derived simulation model takes into account all important generation mechanisms of the two kinds of noise. For the first time, thermal agitation, trap-assisted tunneling and band-to-band tunneling mechanisms are simultaneously incorporated in the simulation model to evaluate the dark count behavior of SPADs fabricated in deep sub-micron CMOS technology. Meanwhile, a complete carrier trapping and de-trapping process is considered in the after-pulsing model, and a simple analytical expression is derived to estimate the after-pulsing probability. In particular, the key model parameters of avalanche triggering probability and the electric field dependence of excess bias voltage are extracted from Geiger-mode TCAD simulation, and the behavioral simulation model does not include any empirical parameters. The developed SPAD model is implemented in the Verilog-A behavioral hardware description language and successfully operated on the commercial Cadence Spectre simulator, showing good universality and compatibility. The model simulation results are in good agreement with the test data, validating the high accuracy of the simulation.
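
    For orientation, the sketch below illustrates the general structure of such a noise model: generation rates from the three mechanisms combine into a dark count rate, and a single-trap exponential de-trapping term gives an after-pulsing probability. All expressions and parameter values are generic placeholders, not the authors' derived model:

        import numpy as np

        def dark_count_rate(G_srh, G_tat, G_btbt, volume, P_trigger):
            # DCR = total carrier generation rate in the multiplication
            # volume times the avalanche triggering probability.
            return (G_srh + G_tat + G_btbt) * volume * P_trigger

        def afterpulse_prob(n_trapped, tau_trap, t_dead, P_trigger):
            # Probability that a carrier released after the dead time
            # retriggers an avalanche (single trap level, exponential release).
            return n_trapped * np.exp(-t_dead / tau_trap) * P_trigger

        # placeholder generation rates (cm^-3 s^-1) and volume (cm^3)
        dcr = dark_count_rate(1e14, 5e13, 2e13, volume=1e-12, P_trigger=0.3)
        pap = afterpulse_prob(n_trapped=0.5, tau_trap=100e-9,
                              t_dead=50e-9, P_trigger=0.3)
        print(f"DCR ≈ {dcr:.2e} counts/s, after-pulsing ≈ {pap:.2%}")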

  15. Experimental Validation of a Closed Brayton Cycle System Transient Simulation

    NASA Technical Reports Server (NTRS)

    Johnson, Paul K.; Hervol, David S.

    2006-01-01

    The Brayton Power Conversion Unit (BPCU) is a closed cycle system with an inert gas working fluid, located in Vacuum Facility 6 at NASA Glenn Research Center. It was used in previous solar dynamic technology efforts (SDGTD) and was modified to its present configuration by replacing the solar receiver with an electrical resistance heater, making it the first closed Brayton cycle to be coupled with an ion propulsion system and enabling examination of its mechanical dynamic characteristics and responses. The focus of this work was the validation of a computer model of the BPCU. The model was built using the Closed Cycle System Simulation (CCSS) design and analysis tool, and test conditions were then duplicated in CCSS: various steady-state points as well as transients involving changes in shaft rotational speed and heat input. Testing to date has shown that the BPCU is able to generate meaningful, repeatable data that can be used for computer model validation. Results generated by CCSS demonstrated that the model sufficiently reproduced the thermal transients exhibited by the BPCU system. CCSS was also used to match BPCU steady-state operating points. Cycle temperatures were within 4.1% of the data (most were within 1%), and cycle pressures were all within 3.2%. Error in alternator power (as much as 13.5%) was attributed to uncertainties in the compressor and turbine maps and in the alternator and bearing loss models. The acquired understanding of the BPCU behavior gives useful insight for improvements to be made to the CCSS model, as well as ideas for future testing and possible system modifications.

  16. Molecular simulation and experimental validation of resorcinol adsorption on Ordered Mesoporous Carbon (OMC).

    PubMed

    Ahmad, Zaki Uddin; Chao, Bing; Konggidinata, Mas Iwan; Lian, Qiyu; Zappi, Mark E; Gang, Daniel Dianchen

    2018-04-27

    Numerous research works in the adsorption area have relied on experimental approaches, which are based on trial and error and are extremely time consuming. Molecular simulation is a newer tool that can be used to design and predict the performance of an adsorbent, and this research proposes a simulation technique that can greatly reduce the time needed to design one. In this study, a new rhombic ordered mesoporous carbon (OMC) model is proposed and constructed with various pore sizes and oxygen contents using the Materials Visualizer module to optimize the structure of OMC for resorcinol adsorption. The specific surface area, pore volume, small-angle X-ray diffraction pattern, and resorcinol adsorption capacity were calculated with the Forcite and Sorption modules in the Materials Studio package. The simulation results were validated experimentally by synthesizing OMC with different pore sizes and oxygen contents via a hard template method employing an SBA-15 silica scaffold. Boric acid was used as the pore-expanding reagent to synthesize OMC with different pore sizes (from 4.6 to 11.3 nm) and varying oxygen contents (from 11.9% to 17.8%). Based on the simulation and experimental validation, the optimal pore size for maximum adsorption of resorcinol was found to be 6 nm. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Simulated Driving Assessment (SDA) for Teen Drivers: Results from a Validation Study

    PubMed Central

    McDonald, Catherine C.; Kandadai, Venk; Loeb, Helen; Seacrist, Thomas S.; Lee, Yi-Ching; Winston, Zachary; Winston, Flaura K.

    2015-01-01

    Background Driver error and inadequate skill are common critical reasons for novice teen driver crashes, yet few validated, standardized assessments of teen driving skills exist. The purpose of this study was to evaluate the construct and criterion validity of a newly developed Simulated Driving Assessment (SDA) for novice teen drivers. Methods The SDA's 35-minute simulated drive incorporates 22 variations of the most common teen driver crash configurations. Driving performance was compared for 21 inexperienced teens (age 16–17 years, provisional license ≤90 days) and 17 experienced adults (age 25–50 years, license ≥5 years, drove ≥100 miles per week, no collisions or moving violations ≤3 years). SDA driving performance (Error Score) was based on driving safety measures derived from simulator and eye-tracking data. Negative driving outcomes included simulated collisions or run-off-the-road incidents. A professional driving evaluator/instructor reviewed videos of SDA performance (DEI Score). Results The SDA demonstrated construct validity: 1.) Teens had a higher Error Score than adults (30 vs. 13, p=0.02); 2.) For each additional error committed, the relative risk of a participant's propensity for a simulated negative driving outcome increased by 8% (95% CI: 1.05–1.10, p<0.01). The SDA demonstrated criterion validity: Error Score was correlated with DEI Score (r=−0.66, p<0.001). Conclusions This study supports the concept of validated simulated driving tests like the SDA to assess novice driver skill in complex and hazardous driving scenarios. The SDA, as a standard protocol to evaluate teen driver performance, has the potential to facilitate screening and assessment of teen driving readiness and could be used to guide targeted skill training. PMID:25740939

  18. Simulated Driving Assessment (SDA) for teen drivers: results from a validation study.

    PubMed

    McDonald, Catherine C; Kandadai, Venk; Loeb, Helen; Seacrist, Thomas S; Lee, Yi-Ching; Winston, Zachary; Winston, Flaura K

    2015-06-01

    Driver error and inadequate skill are common critical reasons for novice teen driver crashes, yet few validated, standardised assessments of teen driving skills exist. The purpose of this study is to evaluate the construct and criterion validity of a newly developed Simulated Driving Assessment (SDA) for novice teen drivers. The SDA's 35 min simulated drive incorporates 22 variations of the most common teen driver crash configurations. Driving performance was compared for 21 inexperienced teens (age 16-17 years, provisional license ≤90 days) and 17 experienced adults (age 25-50 years, license ≥5 years, drove ≥100 miles per week, no collisions or moving violations ≤3 years). SDA driving performance (Error Score) was based on driving safety measures derived from simulator and eye-tracking data. Negative driving outcomes included simulated collisions or run-off-the-road incidents. A professional driving evaluator/instructor reviewed videos of SDA performance to produce a DEI Score. The SDA demonstrated construct validity: (1) teens had a higher Error Score than adults (30 vs. 13, p=0.02); (2) for each additional error committed, the relative risk (RR) of a participant's propensity for a simulated negative driving outcome increased by 8% (95% CI 1.05 to 1.10, p<0.01). The SDA demonstrated criterion validity: Error Score was correlated with DEI Score (r=-0.66, p<0.001). This study supports the concept of validated simulated driving tests like the SDA to assess novice driver skill in complex and hazardous driving scenarios. The SDA, as a standard protocol to evaluate teen driver performance, has the potential to facilitate screening and assessment of teen driving readiness and could be used to guide targeted skill training. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  19. Toward Development of a Stochastic Wake Model: Validation Using LES and Turbine Loads

    DOE PAGES

    Moon, Jae; Manuel, Lance; Churchfield, Matthew; ...

    2017-12-28

    Wind turbines within an array do not experience free-stream undisturbed flow fields. Rather, the flow fields on internal turbines are influenced by wakes generated by upwind units and exhibit different dynamic characteristics relative to the free stream. The International Electrotechnical Commission (IEC) standard 61400-1 for the design of wind turbines only considers a deterministic wake model for the design of a wind plant. This study is focused on the development of a stochastic model for waked wind fields. First, high-fidelity physics-based waked wind velocity fields are generated using Large-Eddy Simulation (LES). Stochastic characteristics of these LES waked wind velocity fields, including mean and turbulence components, are analyzed. Wake-related mean and turbulence field-related parameters are then estimated for use with a stochastic model, using Multivariate Multiple Linear Regression (MMLR) with the LES data. To validate the simulated wind fields based on the stochastic model, wind turbine tower and blade loads are generated using aeroelastic simulation for utility-scale wind turbine models and compared with those based directly on the LES inflow. The study's overall objective is to offer efficient and validated stochastic approaches that are computationally tractable for assessing the performance and loads of turbines operating in wakes.
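
    A minimal sketch of MMLR as the abstract describes it: several wake-field parameters (responses) are regressed on several inflow parameters (predictors) in one least-squares solve. The variable names, dimensions, and choice of predictors are illustrative assumptions:

        import numpy as np

        rng = np.random.default_rng(3)

        n_cases = 40                       # LES simulation cases
        X = rng.normal(size=(n_cases, 3))  # predictors, e.g. hub-height wind
                                           # speed, turbulence intensity,
                                           # downstream distance
        B_true = rng.normal(size=(3, 4))
        Y = X @ B_true + 0.1 * rng.normal(size=(n_cases, 4))
                                           # responses, e.g. wake deficit,
                                           # wake width, added TI, meander scale

        # One least-squares solve fits all response columns simultaneously.
        X1 = np.column_stack([np.ones(n_cases), X])       # add intercept
        B_hat, *_ = np.linalg.lstsq(X1, Y, rcond=None)

        x_new = np.array([1.0, 0.5, -0.2, 1.1])           # new inflow condition
        print(B_hat.T @ x_new)                            # predicted wake parameters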

  20. Validation of a FAST model of the Statoil-Hywind Demo floating wind turbine

    DOE PAGES

    Driscoll, Frederick; Jonkman, Jason; Robertson, Amy; ...

    2016-10-13

    To assess the accuracy of the National Renewable Energy Laboratory's (NREL's) FAST simulation tool for modeling the coupled response of floating offshore wind turbines under realistic open-ocean conditions, NREL developed a FAST model of the Statoil Hywind Demo floating offshore wind turbine, and validated simulation results against field measurements. Field data were provided by Statoil, which conducted a comprehensive test measurement campaign of its demonstration system, a 2.3-MW Siemens turbine mounted on a spar substructure deployed about 10 km off the island of Karmoy in Norway. A top-down approach was used to develop the FAST model, starting with modeling the blades and working down to the mooring system. Design data provided by Siemens and Statoil were used to specify the structural, aerodynamic, and dynamic properties. Measured wind speeds and wave spectra were used to develop the wind and wave conditions used in the model. The overall system performance and behavior were validated for eight sets of field measurements that span a wide range of operating conditions. The simulated controller response accurately reproduced the measured blade pitch and power. In conclusion, the structural and blade loads and spectra of platform motion agree well with the measured data.

  1. Toward Development of a Stochastic Wake Model: Validation Using LES and Turbine Loads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moon, Jae; Manuel, Lance; Churchfield, Matthew

    Wind turbines within an array do not experience free-stream undisturbed flow fields. Rather, the flow fields on internal turbines are influenced by wakes generated by upwind units and exhibit different dynamic characteristics relative to the free stream. The International Electrotechnical Commission (IEC) standard 61400-1 for the design of wind turbines only considers a deterministic wake model for the design of a wind plant. This study is focused on the development of a stochastic model for waked wind fields. First, high-fidelity physics-based waked wind velocity fields are generated using Large-Eddy Simulation (LES). Stochastic characteristics of these LES waked wind velocity fields, including mean and turbulence components, are analyzed. Wake-related mean and turbulence field-related parameters are then estimated for use with a stochastic model, using Multivariate Multiple Linear Regression (MMLR) with the LES data. To validate the simulated wind fields based on the stochastic model, wind turbine tower and blade loads are generated using aeroelastic simulation for utility-scale wind turbine models and compared with those based directly on the LES inflow. The study's overall objective is to offer efficient and validated stochastic approaches that are computationally tractable for assessing the performance and loads of turbines operating in wakes.

  2. Simulation of fMRI signals to validate dynamic causal modeling estimation

    NASA Astrophysics Data System (ADS)

    Anandwala, Mobin; Siadat, Mohamad-Reza; Hadi, Shamil M.

    2012-03-01

    Cognitive tasks activate certain brain areas, which also receive increased blood flow. This is modeled through a state system consisting of two separate parts: one that deals with the neural node stimulation and another that deals with the blood response during that stimulation. The rationale behind using this state system is to validate existing analysis methods, such as dynamic causal modeling (DCM), by seeing what levels of noise they can handle. Using the forward Euler method, this system was approximated by a series of difference equations. What was obtained was the hemodynamic response for each brain area, and this was used to test an analysis tool that estimates functional connectivity between brain areas under a given amount of noise. The importance of modeling this system is not only to have a model for the neural response but also to compare against actual data obtained through functional imaging scans.
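
    A minimal sketch of the two-part state system integrated with the forward Euler method, as the abstract describes. The specific neural and hemodynamic equations below are simplified stand-ins (a linear neural state driving a first-order blood-flow response), not the exact model used by the authors:

        import numpy as np

        dt, T = 0.01, 30.0                      # step size (s) and duration
        n = int(T / dt)
        t = np.arange(n) * dt

        u = (t % 10 < 2).astype(float)          # boxcar task stimulus
        z = np.zeros(n)                         # neural state
        f = np.ones(n)                          # normalized blood flow

        k_z, k_f = 1.0, 0.5                     # assumed rate constants
        for i in range(n - 1):
            # forward Euler: x[i+1] = x[i] + dt * (dx/dt at step i)
            z[i + 1] = z[i] + dt * (-k_z * z[i] + u[i])        # neural part
            f[i + 1] = f[i] + dt * (k_f * (1 + z[i] - f[i]))   # blood response

        print(f"peak neural state {z.max():.3f}, peak flow {f.max():.3f}")

    Adding noise to u or to the integrated states, and repeating for several coupled nodes, yields the synthetic hemodynamic responses against which a connectivity estimator such as DCM can be tested.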

  3. Validation of the filament winding process model

    NASA Technical Reports Server (NTRS)

    Calius, Emilo P.; Springer, George S.; Wilson, Brian A.; Hanson, R. Scott

    1987-01-01

    Tests were performed to validate the WIND model developed previously for simulating the filament winding of composite cylinders. In these tests two 24 in. long, 8 in. diameter, 0.285 in. thick cylinders, made of IM-6G fibers and HBRF-55 resin, were wound at a ±45 deg angle on steel mandrels. The temperatures on the inner and outer surfaces and inside the composite cylinders were recorded during oven cure. The temperatures inside the cylinders were also calculated by the WIND model, and the measured and calculated temperatures were then compared. In addition, the degree of cure and resin viscosity distributions inside the cylinders were calculated for the conditions which existed in the tests.

  4. Guidance and Control Systems Simulation and Validation Techniques

    DTIC Science & Technology

    1988-07-01

    AGARDograph No. 273, Guidance and Control Systems Simulation and Validation Techniques, edited by Dr William P. Albritton, Jr, AMTEC Corporation, 213 Ridgelawn Drive, Athens, AL 35611, USA. The record preserves only cover-page text, including the opening of a chapter on the research and development process for tactical guided weapons by the same author; the summary itself is truncated.

  5. Experimental validation of a direct simulation by Monte Carlo molecular gas flow model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shufflebotham, P.K.; Bartel, T.J.; Berney, B.

    1995-07-01

    The Sandia direct simulation Monte Carlo (DSMC) molecular/transition gas flow simulation code has significant potential as a computer-aided design tool for the design of vacuum systems in low pressure plasma processing equipment. The purpose of this work was to verify the accuracy of this code through direct comparison to experiment. To test the DSMC model, a fully instrumented, axisymmetric vacuum test cell was constructed, and spatially resolved pressure measurements made in N2 at flows from 50 to 500 sccm. In a "blind" test, the DSMC code was used to model the experimental conditions directly, and the results compared to the measurements. It was found that the model predicted all the experimental findings to a high degree of accuracy. Only one modeling issue was uncovered: the axisymmetric model showed localized low-pressure spots along the axis next to surfaces. Although this artifact did not significantly alter the accuracy of the results, it did add noise to the axial data. © 1995 American Vacuum Society.

  6. Validation of a numerical method for interface-resolving simulation of multicomponent gas-liquid mass transfer and evaluation of multicomponent diffusion models

    NASA Astrophysics Data System (ADS)

    Woo, Mino; Wörner, Martin; Tischer, Steffen; Deutschmann, Olaf

    2018-03-01

    The multicomponent model and the effective diffusivity model are well established diffusion models for numerical simulation of single-phase flows consisting of several components but have seldom been used for two-phase flows so far. In this paper, a specific numerical model for interfacial mass transfer by means of a continuous single-field concentration formulation is combined with the multicomponent model and the effective diffusivity model and is validated for multicomponent mass transfer. For this purpose, several test cases for one-dimensional physical or reactive mass transfer of ternary mixtures are considered. The numerical results are compared with analytical or numerical solutions of the Maxwell-Stefan equations and/or experimental data. The composition-dependent elements of the diffusivity matrix of the multicomponent and effective diffusivity models are found to differ substantially for non-dilute conditions. The species mole fraction or concentration profiles computed with both diffusion models are, however, very similar for all test cases and in good agreement with the analytical/numerical solutions or measurements. For practical computations, the effective diffusivity model is recommended due to its simplicity and lower computational costs.
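
    For reference, the two closures compared in this record are commonly written as follows; these are standard textbook forms and may differ in detail from the paper's exact formulation:

        % Maxwell-Stefan equations for an n-component mixture
        % (x_i: mole fractions, N_i: molar fluxes, c: total molar
        % concentration, D_ij: Maxwell-Stefan diffusivities):
        \nabla x_i = \sum_{j \neq i} \frac{x_i N_j - x_j N_i}{c\, D_{ij}},
            \qquad i = 1, \dots, n-1

        % Wilke-type effective diffusivity for species i diffusing
        % through the mixture, as used in effective diffusivity models:
        D_{i,\mathrm{eff}} = \frac{1 - x_i}{\sum_{j \neq i} x_j / D_{ij}}

    In the dilute limit the two descriptions coincide; the composition dependence of the effective diffusivity is what produces the differences in the diffusivity matrix reported above for non-dilute conditions.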

  7. Intestinal biomechanics simulator for robotic capsule endoscope validation.

    PubMed

    Slawinski, Piotr R; Oleynikov, Dmitry; Terry, Benjamin S

    2015-01-01

    This work describes the development and validation of a novel device which simulates important forces experienced by Robotic Capsule Endoscopes (RCE) in vivo in the small intestine. The purpose of the device is to expedite and lower the cost of RCE development. Currently, there is no accurate in vitro test method or apparatus to validate new RCE designs; therefore, RCEs are tested in vivo at a cost of ∼$1400 per swine test. The authors have developed an in vitro RCE testing device which generates two peristaltic waves to accurately simulate the two biomechanical actions of the human small intestine that are most relevant to RCE locomotion: traction force and contact force. The device was successfully calibrated to match human physiological ranges for traction force (4-40 gf), contact force (80-500 gf) and peristaltic wave propagation speed (0.08-2 cm s⁻¹) for a common RCE capsule geometry of 3.5 cm length and 1.5 cm diameter.

  8. Validation of the mean radiant temperature simulated by the RayMan software in urban environments.

    PubMed

    Lee, Hyunjung; Mayer, Helmut

    2016-11-01

    The RayMan software is applied worldwide in investigations of different issues in human-biometeorology. However, only the simulated mean radiant temperature (Tmrt) has been validated so far, in a few case studies. These are based on Tmrt values which were experimentally determined in urban environments by use of a globe thermometer or by applying the six-directional method. This study analyses previous Tmrt validations in a comparative manner. Their results are extended by a recent validation of Tmrt in an urban micro-environment in Freiburg (southwest Germany), which can be regarded as relatively heterogeneous due to different shading intensities by tree crowns. In addition, a validation of the physiologically equivalent temperature (PET) simulated by RayMan is conducted for the first time. The validations are based on experimentally determined Tmrt and PET values, which were calculated from measured meteorological variables in the daytime of a clear-sky summer day. In total, the validation results show that RayMan is capable of simulating Tmrt satisfactorily under relatively homogeneous site conditions. However, the inaccuracy of simulated Tmrt increases with lower sun elevation and growing heterogeneity of the simulation site. As Tmrt is the meteorological variable that most strongly governs PET in the daytime of clear-sky summer days, the accuracy of simulated Tmrt is mainly responsible for the accuracy of simulated PET. The Tmrt validations result in some recommendations, which concern an update of the physical principles applied in the RayMan software to simulate the short- and long-wave radiant flux densities, especially from vertical building walls and tree crowns.
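
    For context, the six-directional method mentioned above is commonly written in the following standard form (coefficients for a standing person; the exact constants used in the cited studies are not given in this record):

        % Mean radiant flux density from six directions (i = E, W, N, S, up,
        % down), with K_i and L_i the measured short- and long-wave flux
        % densities, a_k ≈ 0.7 the short-wave absorption coefficient,
        % ε_p ≈ 0.97 the emissivity of the human body, and W_i angular
        % weighting factors (≈0.22 horizontal, ≈0.06 up/down):
        S_{\mathrm{str}} = a_k \sum_{i=1}^{6} W_i K_i
                         + \varepsilon_p \sum_{i=1}^{6} W_i L_i

        % Mean radiant temperature (in °C) from the Stefan-Boltzmann law,
        % with σ the Stefan-Boltzmann constant:
        T_{\mathrm{mrt}} = \sqrt[4]{S_{\mathrm{str}} / (\varepsilon_p \sigma)} - 273.15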

  9. A Simulation Study of Threats to Validity in Quasi-Experimental Designs: Interrelationship between Design, Measurement, and Analysis.

    PubMed

    Holgado-Tello, Fco P; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana; Pérez-Gil, José A

    2016-01-01

    The Campbellian tradition provides a conceptual framework to assess threats to validity. On the other hand, different models of causal analysis have been developed to control estimation biases in different research designs. However, the link between design features, measurement issues, and concrete impact estimation analyses is weak. In order to provide an empirical solution to this problem, we use Structural Equation Modeling (SEM) as a first approximation to operationalize the analytical implications of threats to validity in quasi-experimental designs. Based on the analogies established between the Classical Test Theory (CTT) and causal analysis, we describe an empirical study based on SEM in which range restriction and statistical power have been simulated in two different models: (1) a multistate model in the control condition (pre-test); and (2) a single-trait-multistate model in the control condition (post-test), adding a new mediator latent exogenous (independent) variable that represents a threat to validity. Results show, empirically, how the differences between the two models can be partially or totally attributed to these threats. Therefore, SEM provides a useful tool to analyze the influence of potential threats to validity.

  10. A Simulation Study of Threats to Validity in Quasi-Experimental Designs: Interrelationship between Design, Measurement, and Analysis

    PubMed Central

    Holgado-Tello, Fco. P.; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana; Pérez-Gil, José A.

    2016-01-01

    The Campbellian tradition provides a conceptual framework to assess threats to validity. On the other hand, different models of causal analysis have been developed to control estimation biases in different research designs. However, the link between design features, measurement issues, and concrete impact estimation analyses is weak. In order to provide an empirical solution to this problem, we use Structural Equation Modeling (SEM) as a first approximation to operationalize the analytical implications of threats to validity in quasi-experimental designs. Based on the analogies established between the Classical Test Theory (CTT) and causal analysis, we describe an empirical study based on SEM in which range restriction and statistical power have been simulated in two different models: (1) a multistate model in the control condition (pre-test); and (2) a single-trait-multistate model in the control condition (post-test), adding a new mediator latent exogenous (independent) variable that represents a threat to validity. Results show, empirically, how the differences between the two models can be partially or totally attributed to these threats. Therefore, SEM provides a useful tool to analyze the influence of potential threats to validity. PMID:27378991

  11. Validation of a Three-Dimensional Ablation and Thermal Response Simulation Code

    NASA Technical Reports Server (NTRS)

    Chen, Yih-Kanq; Milos, Frank S.; Gokcen, Tahir

    2010-01-01

    The 3dFIAT code simulates pyrolysis, ablation, and shape change of thermal protection materials and systems in three dimensions. The governing equations, which include energy conservation, a three-component decomposition model, and a surface energy balance, are solved with a moving grid system to simulate the shape change due to surface recession. This work is the first part of a code validation study for new capabilities that were added to 3dFIAT. These expanded capabilities include a multi-block moving grid system and an orthotropic thermal conductivity model. This paper focuses on conditions with minimal shape change, in which fluid/solid coupling is not necessary. Two groups of test cases of 3dFIAT analyses of Phenolic Impregnated Carbon Ablator in an arc-jet are presented. In the first group, axisymmetric iso-q shaped models are studied to check the accuracy of the three-dimensional multi-block grid system. In the second group, similar models with various through-the-thickness conductivity directions are examined. In this group, the material thermal response is three-dimensional because of the carbon fiber orientation. Predictions from 3dFIAT are presented and compared with arc-jet test data. The 3dFIAT predictions agree very well with thermocouple data for both groups of test cases.

  12. Performance optimization and validation of ADM1 simulations under anaerobic thermophilic conditions.

    PubMed

    Atallah, Nabil M; El-Fadel, Mutasem; Ghanimeh, Sophia; Saikaly, Pascal; Abou-Najm, Majdi

    2014-12-01

    In this study, two experimental sets of data, each involving two thermophilic anaerobic digesters treating food waste, were simulated using the Anaerobic Digestion Model No. 1 (ADM1). A sensitivity analysis was conducted, using both data sets of one digester, for parameter optimization based on five measured performance indicators (methane generation, pH, acetate, total COD, and ammonia) as well as an equally weighted combination of the five indicators. The simulation results revealed that while optimization with respect to methane alone, a commonly adopted approach, succeeded in simulating the methane experimental results, it predicted the other intermediary outputs less accurately. The multi-objective optimization, on the other hand, has the advantage of providing better overall results than methane-only optimization, despite not capturing every intermediary output. The results from the parameter optimization were validated through their independent application to the data sets of the second digester. Copyright © 2014 Elsevier Ltd. All rights reserved.
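
    The equally weighted multi-objective criterion described above can be expressed as a single scalar loss over the five indicators. A minimal Python sketch, assuming a hypothetical run_adm1(params) wrapper that returns simulated series for each indicator; the wrapper name, the normalization, and the optimizer choice are illustrative, not the study's actual implementation:

        import numpy as np

        INDICATORS = ["methane", "pH", "acetate", "total_COD", "ammonia"]

        def normalized_rmse(sim, obs):
            """RMSE scaled by the observed range so indicators are comparable."""
            sim, obs = np.asarray(sim), np.asarray(obs)
            return np.sqrt(np.mean((sim - obs) ** 2)) / (obs.max() - obs.min())

        def multi_objective_loss(params, observations, run_adm1):
            """Equally weighted mean of normalized errors over the five indicators.

            run_adm1(params) -> dict of simulated series keyed like INDICATORS
            (hypothetical wrapper around an ADM1 implementation).
            """
            sim = run_adm1(params)
            return sum(normalized_rmse(sim[k], observations[k])
                       for k in INDICATORS) / len(INDICATORS)

        # A generic optimizer (e.g. scipy.optimize.minimize) can then search the
        # sensitive ADM1 parameters identified by a sensitivity analysis.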

  13. Sorbent, Sublimation, and Icing Modeling Methods: Experimental Validation and Application to an Integrated MTSA Subassembly Thermal Model

    NASA Technical Reports Server (NTRS)

    Bower, Chad; Padilla, Sebastian; Iacomini, Christie; Paul, Heather L.

    2010-01-01

    This paper details the validation of modeling methods for the three core components of a Metabolic heat regenerated Temperature Swing Adsorption (MTSA) subassembly, developed for use in a Portable Life Support System (PLSS). The first core component in the subassembly is a sorbent bed, used to capture and reject metabolically produced carbon dioxide (CO2). The sorbent bed performance can be augmented with a temperature swing driven by a liquid CO2 (LCO2) sublimation heat exchanger (SHX) for cooling the sorbent bed, and a condensing, icing heat exchanger (CIHX) for warming the sorbent bed. As part of the overall MTSA effort, scaled design validation test articles for each of these three components have been independently tested in laboratory conditions. Previously described modeling methodologies developed for implementation in Thermal Desktop and SINDA/FLUINT are reviewed and updated, their application in test article models is outlined, and the results of those model correlations are relayed. An assessment is given of the applicability of each modeling methodology to the challenge of simulating the response of the test articles, and of their extensibility to a full-scale integrated subassembly model. The independently verified and validated modeling methods are applied to the development of an MTSA subassembly prototype model, and predictions of the subassembly performance are given. These models and modeling methodologies capture simulation of several challenging and novel physical phenomena in the Thermal Desktop and SINDA/FLUINT software suite. Novel methodologies include CO2 adsorption front tracking and associated thermal response in the sorbent bed, heat transfer associated with sublimation of entrained solid CO2 in the SHX, and water mass transfer in the form of ice as low as 210 K in the CIHX.

  14. Root zone water quality model (RZWQM2): Model use, calibration and validation

    USGS Publications Warehouse

    Ma, Liwang; Ahuja, Lajpat; Nolan, B.T.; Malone, Robert; Trout, Thomas; Qi, Z.

    2012-01-01

    The Root Zone Water Quality Model (RZWQM2) has been used widely for simulating agricultural management effects on crop production and soil and water quality. Although it is a one-dimensional model, it has many desirable features for the modeling community. This article outlines the principles of calibrating the model component by component with one or more datasets and validating the model with independent datasets. Users should consult the RZWQM2 user manual distributed along with the model and a more detailed protocol on how to calibrate RZWQM2 provided in a book chapter. Two case studies (or examples) are included in this article. One is from an irrigated maize study in Colorado to illustrate the use of field and laboratory measured soil hydraulic properties on simulated soil water and crop production. It also demonstrates the interaction between soil and plant parameters in simulated plant responses to water stresses. The other is from a maize-soybean rotation study in Iowa to show a manual calibration of the model for crop yield, soil water, and N leaching in tile-drained soils. Although the commonly used trial-and-error calibration method works well for experienced users, as shown in the second example, an automated calibration procedure is more objective, as shown in the first example. Furthermore, the incorporation of the Parameter Estimation Software (PEST) into RZWQM2 made the calibration of the model more efficient than a grid (ordered) search of model parameters. In addition, PEST provides sensitivity and uncertainty analyses that should help users in selecting the right parameters to calibrate.

  15. Enhanced TCAS 2/CDTI traffic Sensor digital simulation model and program description

    NASA Technical Reports Server (NTRS)

    Goka, T.

    1984-01-01

    Digital simulation models of enhanced TCAS 2/CDTI traffic sensors are developed, based on actual or projected operational and performance characteristics. Two enhanced Traffic (or Threat) Alert and Collision Avoidance Systems are considered. A digital simulation program is developed in FORTRAN. The program contains an executive with a semireal time batch processing capability. The simulation program can be interfaced with other modules with a minimum requirement. Both the traffic sensor and CAS logic modules are validated by means of extensive simulation runs. Selected validation cases are discussed in detail, and capabilities and limitations of the actual and simulated systems are noted. The TCAS systems are not specifically intended for Cockpit Display of Traffic Information (CDTI) applications. These systems are sufficiently general to allow implementation of CDTI functions within the real systems' constraints.

  16. Validation of lower tropospheric carbon monoxide inferred from MOZART model simulation over India

    NASA Astrophysics Data System (ADS)

    Yarragunta, Y.; Srivastava, S.; Mitra, D.

    2017-02-01

    In the present study, a MOZART-4 (Model for Ozone and Related chemical Tracers, version 4) simulation has been made for 2003 to 2007 and compared with satellite and in-situ observations, with a specific focus on the Indian subcontinent, to illustrate the capabilities of the MOZART-4 model. The model-simulated CO has been compared with the latest version (version 6) of MOPITT (Measurements Of Pollution In The Troposphere) carbon monoxide (CO) retrievals at 900, 800 and 700 hPa. The model reproduces the major features present in the satellite observations. However, the model significantly overestimates CO over the entire Indian region at 900 hPa and moderately overestimates it at 800 hPa and 700 hPa. The frequency distribution of all simulated data points with respect to MOZART error shows a maximum in the error range of 10-20% at all pressure levels. Over the total Indian landmass, the percentages of gridded CO data overestimated in the range of 0-30% at 900 hPa, 800 hPa and 700 hPa are 58%, 62% and 66%, respectively. The study reflects very good correlation between the two datasets over Central India (CI) and Southern India (SI). The coefficient of determination (r2) is found to be 0.68-0.78 and 0.70-0.78 over CI and SI, respectively. Weak correlation is evident over Northern India (NI), with r2 values of 0.1-0.3. Over Eastern India (EI), good correlation is observed at 800 hPa (r2 = 0.72) and 700 hPa (r2 = 0.66), whereas moderately weak correlation is observed at 900 hPa (r2 = 0.48). In contrast, over Western India (WI), strong correlation is evident at 900 hPa (r2 = 0.64) and moderately weak association is present at 800 hPa and 700 hPa. The model fairly reproduces the seasonal cycle of CO in the lower troposphere over most of the Indian regions. However, during June to December, the model shows overestimation over NI; the magnitude of the overestimation increases linearly from the 900 hPa to the 700 hPa level. During April-June, model results coincide with observed CO concentrations over SI

  17. Fly's Eye GLM Simulator Preliminary Validation Analysis

    NASA Astrophysics Data System (ADS)

    Quick, M. G.; Christian, H. J., Jr.; Blakeslee, R. J.; Stewart, M. F.; Corredor, D.; Podgorny, S.

    2017-12-01

    As part of the validation effort for the Geostationary Lightning Mapper (GLM), an airborne radiometer array has been fabricated to observe lightning optical emission through the cloud top. The Fly's Eye GLM Simulator (FEGS) is a multi-spectral, photo-electric radiometer array with a nominal spatial resolution of 2 x 2 km and a spatial footprint of 10 x 10 km at cloud top. A main 25-pixel array observes the 777.4 nm oxygen emission triplet using an optical passband filter with a 10 nm FWHM, a sampling rate of 100 kHz, and 16-bit resolution. From March to May of 2017, FEGS was flown on the NASA ER-2 high-altitude aircraft during the GOES-R Validation Flight Campaign. Optical signatures of lightning were observed during a variety of thunderstorm scenarios while coincident measurements were obtained by GLM and ground-based antenna networks. This presentation will describe the preliminary analysis of the FEGS dataset in the context of GLM validation.

  18. Simulation of drift of pesticides: development and validation of a model.

    PubMed

    Brusselman, E; Spanoghe, P; Van der Meeren, P; Gabriels, D; Steurbaut, W

    2003-01-01

    Over the last decade, drift of pesticides has been recognized as a major problem for the environment. High fractions of pesticides can be transported through the air and deposited in neighbouring ecosystems during and after application. A new two-step computer drift model has been developed: FYDRIMO, or F(ph)Ysical DRift MOdel. In the first step, the droplet size spectrum of a nozzle is analysed, so that the volume percentage of droplets of a given size is known. In the second step, the model predicts the deposition of each droplet of a given size. This second part of the model runs in MATLAB and is grounded on a combination of two physical factors: gravity and friction forces. At this stage of development, corrections are included for evaporation and for wind forcing following a measured wind profile. For validation, wind tunnel experiments were performed. Salt solutions were sprayed at two wind velocities and at variable distances above the floor. Small gutters in the floor filled with filter paper were used to collect the sprayed droplets. After analysing and comparing the wind tunnel results with the model predictions, FYDRIMO appears to have good predictive capacity.
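
    The force balance in the model's second step can be illustrated with a simple explicit integration of a droplet's equation of motion under gravity and drag in a wind field. This Python sketch is a generic illustration under stated assumptions (spherical droplets, Stokes-regime drag, logarithmic wind profile, no evaporation); the constants and profile parameters are illustrative and are not FYDRIMO's:

        import numpy as np

        RHO_W = 1000.0    # droplet density, kg m^-3
        MU_A  = 1.8e-5    # dynamic viscosity of air, Pa s
        G     = 9.81      # gravity, m s^-2

        def wind_profile(z, u_star=0.4, z0=0.01):
            """Logarithmic wind profile (m/s) above a rough floor."""
            return (u_star / 0.41) * np.log(np.maximum(z, z0) / z0)

        def droplet_drift(d, release_height, dt=1e-4):
            """Horizontal drift distance (m) of one droplet of diameter d (m)."""
            x, z = 0.0, release_height
            vx, vz = 0.0, 0.0
            tau = RHO_W * d**2 / (18.0 * MU_A)   # Stokes relaxation time
            while z > 0.0:
                # Drag relaxes droplet velocity toward the local air velocity
                vx += dt * (wind_profile(z) - vx) / tau
                vz += dt * (-vz / tau - G)
                x, z = x + dt * vx, z + dt * vz
            return x

        print(f"drift of a 100 um droplet from 0.5 m: {droplet_drift(100e-6, 0.5):.2f} m")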

  19. The economics of improving medication adherence in osteoporosis: validation and application of a simulation model.

    PubMed

    Patrick, Amanda R; Schousboe, John T; Losina, Elena; Solomon, Daniel H

    2011-09-01

    Adherence to osteoporosis treatment is low. Although new therapies and behavioral interventions may improve medication adherence, questions are likely to arise regarding their cost-effectiveness. Our objectives were to develop and validate a model to simulate the clinical outcomes and costs arising from various osteoporosis medication adherence patterns among women initiating bisphosphonate treatment, and to estimate the cost-effectiveness of a hypothetical intervention to improve medication adherence. We constructed a computer simulation using estimates of fracture rates, bisphosphonate treatment effects, costs, and utilities for health states drawn from the published literature. Probabilities of transitioning on and off treatment were estimated from administrative claims data. Patients were women initiating bisphosphonate therapy from the general community. We evaluated a hypothetical behavioral intervention to improve medication adherence. Changes in 10-yr fracture rates and incremental cost-effectiveness ratios were evaluated. A hypothetical intervention with a one-time cost of $250 that reduced bisphosphonate discontinuation by 30% had an incremental cost-effectiveness ratio (ICER) of $29,571 per quality-adjusted life year in 65-yr-old women initiating bisphosphonates. Although the ICER depended on patient age, intervention effectiveness, and intervention cost, the ICERs were less than $50,000 per quality-adjusted life year for the majority of intervention cost and effectiveness scenarios evaluated. Results were sensitive to bisphosphonate cost and effectiveness and to assumptions about the rate at which intervention and treatment effects decline over time. Our results suggest that behavioral interventions to improve osteoporosis medication adherence will likely have favorable ICERs if their efficacy can be sustained.
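
    The headline figure follows directly from the definition of the incremental cost-effectiveness ratio. A minimal Python sketch; only the $250 intervention cost and the resulting $29,571/QALY are taken from the abstract, while the per-patient increments shown are hypothetical stand-ins chosen to reproduce that ratio:

        def icer(delta_cost, delta_qaly):
            """Incremental cost-effectiveness ratio, $ per QALY gained."""
            return delta_cost / delta_qaly

        # Hypothetical per-patient increments chosen so the ratio matches the
        # reported $29,571/QALY; the model's actual intermediates are not public.
        delta_cost = 207.0    # extra cost: $250 intervention minus $43 in averted care
        delta_qaly = 0.007    # QALYs gained per patient
        print(f"ICER = ${icer(delta_cost, delta_qaly):,.0f} per QALY")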

  20. Simulation of particle motion in a closed conduit validated against experimental data

    NASA Astrophysics Data System (ADS)

    Dolanský, Jindřich

    2015-05-01

    Motion of a number of spherical particles in a closed conduit is examined by means of both simulation and experiment. The bed of the conduit is covered by stationary spherical particles of the size of the moving particles. The flow is driven by experimentally measured velocity profiles, which are inputs to the simulation; altering the input velocity profiles generates various trajectory patterns. A simulation based on the lattice Boltzmann method (LBM) is developed to study the mutual interactions of the flow and the particles, enabling both the particle motion and the fluid flow to be modelled. The entropic LBM is employed to deal with the flow, which is characterized by a high Reynolds number. The entropic modification of the LBM, along with the enhanced refinement of the lattice grid, increases the demands on computational resources. Due to the inherently parallel nature of the LBM, this can be handled by employing the Parallel Computing Toolbox (MATLAB) and other transformations enabling use of the CUDA GPU computing technology. The trajectories of the particles determined within the LBM simulation are validated against data gained from the experiments, and the compatibility of the simulation results with the outputs of the experimental measurements is evaluated. The accuracy of the applied approach is assessed, and the stability and efficiency of the simulation are also considered.
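
    For orientation, the core of any lattice Boltzmann solver is the collide-and-stream update of the particle populations. The Python sketch below shows the standard D2Q9 BGK form with periodic boundaries; the entropic modification and the moving-particle coupling used in the study are deliberately omitted, so this is a generic starting point rather than the paper's scheme:

        import numpy as np

        # D2Q9 lattice: discrete velocities and weights
        c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                      [1, 1], [-1, 1], [-1, -1], [1, -1]])
        w = np.array([4/9] + [1/9]*4 + [1/36]*4)

        def equilibrium(rho, ux, uy):
            """Second-order Maxwell-Boltzmann equilibrium distributions."""
            cu = 3.0 * (c[:, 0, None, None] * ux + c[:, 1, None, None] * uy)
            usq = 1.5 * (ux**2 + uy**2)
            return rho * w[:, None, None] * (1.0 + cu + 0.5 * cu**2 - usq)

        def lbm_step(f, tau):
            """One BGK collide-and-stream update of the populations f[9, ny, nx]."""
            rho = f.sum(axis=0)
            ux = (c[:, 0, None, None] * f).sum(axis=0) / rho
            uy = (c[:, 1, None, None] * f).sum(axis=0) / rho
            f += -(f - equilibrium(rho, ux, uy)) / tau          # collision
            for i, (cx, cy) in enumerate(c):                    # periodic streaming
                f[i] = np.roll(np.roll(f[i], cy, axis=0), cx, axis=1)
            return f

        # Example: a uniform fluid (rho=1, u=0) stays at equilibrium
        f = equilibrium(np.ones((32, 32)), np.zeros((32, 32)), np.zeros((32, 32)))
        f = lbm_step(f, tau=0.6)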

  1. Filament winding cylinders. II - Validation of the process model

    NASA Technical Reports Server (NTRS)

    Calius, Emilio P.; Lee, Soo-Yong; Springer, George S.

    1990-01-01

    Analytical and experimental studies were performed to validate the model developed by Lee and Springer for simulating the manufacturing process of filament wound composite cylinders. First, results calculated by the Lee-Springer model were compared to results of the Calius-Springer thin cylinder model. Second, temperatures and strains calculated by the Lee-Springer model were compared to data. The data used in these comparisons were generated during the course of this investigation with cylinders made of Hercules IM-6G/HBRF-55 and Fiberite T-300/976 graphite-epoxy tows. Good agreement was found between the calculated and measured stresses and strains, indicating that the model is a useful representation of the winding and curing processes.

  2. Locomotive crashworthiness research : modeling, simulation, and validation

    DOT National Transportation Integrated Search

    2001-07-01

    A technique was developed to realistically simulate the dynamic, nonlinear structural behavior of moving rail vehicles and objects struck during a collision. A new approach considered the interdependence of the many vehicles connected in typical rail...

  3. Integrated Medical Model Verification, Validation, and Credibility

    NASA Technical Reports Server (NTRS)

    Walton, Marlei; Kerstman, Eric; Foy, Millennia; Shah, Ronak; Saile, Lynn; Boley, Lynn; Butler, Doug; Myers, Jerry

    2014-01-01

    The Integrated Medical Model (IMM) was designed to forecast relative changes for a specified set of crew health and mission success risk metrics by using a probabilistic (stochastic process) model based on historical data, cohort data, and subject matter expert opinion. A probabilistic approach is taken since exact (deterministic) results would not appropriately reflect the uncertainty in the IMM inputs. Once the IMM was conceptualized, a plan was needed to rigorously assess input information, framework and code, and output results of the IMM, and ensure that end user requests and requirements were considered during all stages of model development and implementation. METHODS: In 2008, the IMM team developed a comprehensive verification and validation (V&V) plan, which specified internal and external review criteria encompassing 1) verification of data and IMM structure to ensure proper implementation of the IMM, 2) several validation techniques to confirm that the simulation capability of the IMM appropriately represents occurrences and consequences of medical conditions during space missions, and 3) credibility processes to develop user confidence in the information derived from the IMM. When the NASA-STD-7009 (7009) was published, the IMM team updated their verification, validation, and credibility (VVC) project plan to meet 7009 requirements and include 7009 tools in reporting VVC status of the IMM. RESULTS: IMM VVC updates are compiled recurrently and include 7009 Compliance and Credibility matrices, IMM V&V Plan status, and a synopsis of any changes or updates to the IMM during the reporting period. Reporting tools have evolved over the lifetime of the IMM project to better communicate VVC status. This has included refining original 7009 methodology with augmentation from the NASA-STD-7009 Guidance Document. End user requests and requirements are being satisfied as evidenced by ISS Program acceptance of IMM risk forecasts, transition to an operational model and

  4. Fast Running Urban Dispersion Model for Radiological Dispersal Device (RDD) Releases: Model Description and Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gowardhan, Akshay; Neuscamman, Stephanie; Donetti, John

    Aeolus is an efficient three-dimensional computational fluid dynamics code based on the finite volume method, developed for predicting transport and dispersion of contaminants in a complex urban area. It solves the time-dependent incompressible Navier-Stokes equation on a regular Cartesian staggered grid using a fractional step method. It also solves a scalar transport equation for temperature using the Boussinesq approximation. The model also includes a Lagrangian dispersion model for predicting the transport and dispersion of atmospheric contaminants. The model can be run in an efficient Reynolds Average Navier-Stokes (RANS) mode with a run time of several minutes, or a more detailed Large Eddy Simulation (LES) mode with a run time of hours for a typical simulation. This report describes the model components, including details on the physics models used in the code, as well as several model validation efforts. Aeolus wind and dispersion predictions are compared to field data from the Joint Urban Field Trials 2003 conducted in Oklahoma City (Allwine et al 2004), including both continuous and instantaneous releases. Newly implemented Aeolus capabilities include a decay chain model and an explosive Radiological Dispersal Device (RDD) source term; these capabilities are described. Aeolus predictions using the buoyant explosive RDD source are validated against two experimental data sets: the Green Field explosive cloud rise experiments conducted in Israel (Sharon et al 2012) and the Full-Scale RDD Field Trials conducted in Canada (Green et al 2016).

  5. Numerical modeling and experimental validation of thermoplastic composites induction welding

    NASA Astrophysics Data System (ADS)

    Palmieri, Barbara; Nele, Luigi; Galise, Francesco

    2018-05-01

    In this work, a numerical simulation and an experimental test of the induction welding of continuous fibre-reinforced thermoplastic composites (CFRTPCs) are presented. The thermoplastic polyamide 66 (PA66) with carbon fiber fabric was used. Using dedicated software (JMag Designer), the influence of the fundamental process parameters, such as temperature, current and holding time, was investigated. In order to validate the results of the simulations, and therefore the numerical model used, experimental tests were carried out, and the temperature values measured during the tests with an optical pyrometer were compared with those provided by the numerical simulation. The mechanical properties of the welded joints were evaluated by single lap shear tests.

  6. Development and validation of a two-dimensional fast-response flood estimation model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judi, David R; Mcpherson, Timothy N; Burian, Steven J

    2009-01-01

    A finite difference formulation of the shallow water equations using an upwind differencing method was developed, maintaining computational efficiency and accuracy such that it can be used as a fast-response flood estimation tool. The model was validated using both laboratory controlled experiments and an actual dam breach. Through the laboratory experiments, the model was shown to give good estimations of depth and velocity when compared to the measured data, as well as when compared to a more complex two-dimensional model. Additionally, the model was compared to high water mark data obtained from the failure of the Taum Sauk dam. The simulated inundation extent agreed well with the observed extent, with the most notable differences resulting from the inability to model sediment transport. The results of these validation studies show that a relatively simple numerical scheme used to solve the complete shallow water equations can accurately estimate flood inundation. Future work will focus on further reducing the computation time needed to provide flood inundation estimates for fast-response analyses. This will be accomplished through the efficient use of multi-core, multi-processor computers coupled with an efficient domain-tracking algorithm, as well as an understanding of the impacts of grid resolution on model results.
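
    As a toy counterpart to the scheme described above, the Python sketch below advances the one-dimensional shallow water equations with a first-order finite-volume update using a Rusanov (upwind-type) interface flux. It is a generic illustration, not the paper's two-dimensional formulation, and the dam-break initial condition is illustrative:

        import numpy as np

        G = 9.81  # gravity, m s^-2

        def flux(h, hu):
            """Physical flux of the 1-D shallow water equations."""
            u = hu / np.maximum(h, 1e-8)
            return np.array([hu, hu * u + 0.5 * G * h**2])

        def swe_step(h, hu, dx, dt):
            """One first-order finite-volume update with a Rusanov interface flux."""
            u = hu / np.maximum(h, 1e-8)
            c = np.abs(u) + np.sqrt(G * h)              # local wave speed
            U = np.array([h, hu])
            F = flux(h, hu)
            # Interface fluxes between cells i and i+1
            a = np.maximum(c[:-1], c[1:])
            Fi = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * a * (U[:, 1:] - U[:, :-1])
            # Update interior cells; end cells held fixed (crude open boundaries)
            Unew = U.copy()
            Unew[:, 1:-1] -= dt / dx * (Fi[:, 1:] - Fi[:, :-1])
            return Unew[0], Unew[1]

        # Dam-break test: 2 m of water on the left, 1 m on the right
        h = np.where(np.arange(200) < 100, 2.0, 1.0)
        hu = np.zeros_like(h)
        for _ in range(100):
            h, hu = swe_step(h, hu, dx=1.0, dt=0.05)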

  7. Validation of Storm Water Management Model Storm Control Measures Modules

    NASA Astrophysics Data System (ADS)

    Simon, M. A.; Platz, M. C.

    2017-12-01

    EPA's Storm Water Management Model (SWMM) is a computational code heavily relied upon by industry for the simulation of wastewater and stormwater infrastructure performance. Many municipalities are relying on SWMM results to design multi-billion-dollar, multi-decade infrastructure upgrades. Since the 1970s, EPA and others have developed five major releases, the most recent ones containing storm control measures modules for green infrastructure. The main objective of this study was to quantify the accuracy with which SWMM v5.1.10 simulates the hydrologic activity of previously monitored low impact developments. Model performance was evaluated with a mathematical comparison of outflow hydrographs and total outflow volumes, using empirical data and a multi-event, multi-objective calibration method. The calibration methodology utilized PEST++ Version 3, a parameter estimation tool, which aided in the selection of unmeasured hydrologic parameters. From the validation study and sensitivity analysis, several model improvements were identified to advance SWMM LID Module performance for permeable pavements, infiltration units and green roofs, and these were performed and reported herein. Overall, it was determined that SWMM can successfully simulate low impact development controls given accurate model confirmation, parameter measurement, and model calibration.
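
    Hydrograph agreement of the kind used in such calibrations is commonly scored with an objective like the Nash-Sutcliffe efficiency. A minimal Python sketch; the metric choice and example values are illustrative, since the study's exact objective formulation is defined by its PEST++ setup:

        import numpy as np

        def nash_sutcliffe(simulated, observed):
            """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).

            1.0 is a perfect fit; values <= 0 mean the model is no better
            than simply predicting the mean observed outflow.
            """
            sim, obs = np.asarray(simulated), np.asarray(observed)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        # Example: score a simulated outflow hydrograph against monitoring data
        obs = np.array([0.0, 0.4, 1.2, 0.9, 0.5, 0.2])   # illustrative flows
        sim = np.array([0.0, 0.5, 1.0, 1.0, 0.4, 0.2])
        print(f"NSE = {nash_sutcliffe(sim, obs):.2f}")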

  8. A PFC3D-based numerical simulation of cutting load for lunar rock simulant and experimental validation

    NASA Astrophysics Data System (ADS)

    Li, Peng; Jiang, Shengyuan; Tang, Dewei; Xu, Bo

    2017-05-01

    To strike a balance between the need for drilling efficiency and the constraints of the power budget on the moon, the penetration per revolution of the drill bit is generally limited to around 0.1 mm, and the geometric angles of the cutting blade need to be well designed. This paper introduces a simulation approach based on PFC3D (Particle Flow Code in 3 Dimensions) for analyzing the cutting load on lunar rock simulant produced by blades of different geometric angles at a small cutting depth. The mean values of the cutting force of five blades in the survey region (four on the boundary points and one on the center point) are selected as the macroscopic responses of the model. Experimental design methods, including the Plackett-Burman (PB) design and the central composite design (CCD), are adopted in the procedure for matching the microparameters of the PFC model. Using enumeration as the optimization method, the optimum set of microparameters is acquired. Then, experimental validation is carried out using another twenty-five blades with different geometric angles, and the results from the simulations and the laboratory tests show fair agreement. Additionally, the rock-breaking processes produced by different blades are quantified through simulation analysis. This research provides theoretical support for refining the prediction of rock cutting loads and for the geometric design of the cutting blade on the drill bit.

  9. Quantitative validation of an air-coupled ultrasonic probe model by Interferometric laser tomography

    NASA Astrophysics Data System (ADS)

    Revel, G. M.; Pandarese, G.; Cavuto, A.

    2012-06-01

    The present paper describes the quantitative validation of a finite element (FE) model of the ultrasound beam generated by an air-coupled non-contact ultrasound transducer. The model boundary conditions are given by vibration velocities measured by laser vibrometry on the probe membrane. The proposed validation method is based on the comparison between the simulated 3D pressure field and the pressure data measured with the interferometric laser tomography technique. The model details and the experimental techniques are described in the paper. The analysis of results shows the effectiveness of the proposed approach and the possibility to quantitatively assess and predict the generated acoustic pressure field, with maximum discrepancies on the order of 20% due to uncertainty effects. This step is important for determining the real applicability of air-coupled probes in complex problems and for simulating the whole inspection procedure, even at the design stage of a component, so as to virtually verify its inspectability.

  10. Aerosol modelling and validation during ESCOMPTE 2001

    NASA Astrophysics Data System (ADS)

    Cousin, F.; Liousse, C.; Cachier, H.; Bessagnet, B.; Guillaume, B.; Rosset, R.

    The ESCOMPTE 2001 programme (Atmospheric Research 69(3-4) (2004) 241) has resulted in an exhaustive set of dynamical, radiative, gas and aerosol observations (surface and aircraft measurements). A previous paper (Atmospheric Research (2004), in press) dealt with dynamics and gas-phase chemistry. The present paper is an extension to aerosol formation, transport and evolution. To account for the important loadings of primary and secondary aerosols and their transformation processes in the ESCOMPTE domain, the ORganic and Inorganic Spectral Aerosol Module (ORISAM) (Atmospheric Environment 35 (2001) 4751) was implemented on-line in the air-quality Meso-NH-C model. Additional developments were introduced in ORISAM to improve the comparison between simulations and experimental surface and aircraft field data. This paper discusses this comparison for a simulation performed for one selected day, 24 June 2001, during the Intensive Observation Period IOP2b. Our work relies on BC and OCp emission inventories specifically developed for ESCOMPTE. This study confirms the need for a fine-resolution aerosol inventory with spectral chemical speciation. BC levels are satisfactorily reproduced, thus validating our emission inventory and its processing through Meso-NH-C. However, comparisons for reactive species generally indicate an underestimation of concentrations. Organic aerosol levels are rather well simulated, though with a trend toward underestimation in the afternoon. Inorganic aerosol species are underestimated for several reasons, some of which have been identified. For sulphates, primary emissions were introduced; improvement was also obtained for modelled nitrate and ammonium levels after introducing heterogeneous chemistry. However, the absence of terrigenous particle modelling is probably a major cause of the nitrate and ammonium underestimations. Particle numbers and size distributions are well reproduced, but only in the submicrometer range. Our work points out

  11. Validation of Extended MHD Models using MST RFP Plasmas

    NASA Astrophysics Data System (ADS)

    Jacobson, C. M.; Chapman, B. E.; Craig, D.; McCollam, K. J.; Sovinec, C. R.

    2016-10-01

    Significant effort has been devoted to improvement of computational models used in fusion energy sciences. Rigorous validation of these models is necessary in order to increase confidence in their ability to predict the performance of future devices. MST is a well diagnosed reversed-field pinch (RFP) capable of operation over a wide range of parameters. In particular, the Lundquist number S, a key parameter in resistive magnetohydrodynamics (MHD), can be varied over a wide range and provide substantial overlap with MHD RFP simulations. MST RFP plasmas are simulated using both DEBS, a nonlinear single-fluid visco-resistive MHD code, and NIMROD, a nonlinear extended MHD code, with S ranging from 10^4 to 5×10^4 for single-fluid runs, with the magnetic Prandtl number Pm = 1. Experiments with plasma current IP ranging from 60 kA to 500 kA result in S from 4×10^4 to 8×10^6. Validation metric comparisons are presented, focusing on how magnetic fluctuations b scale with S. Single-fluid NIMROD results give b ~ S^-0.21, and experiments give b ~ S^-0.28 for the dominant m = 1, n = 6 mode. Preliminary two-fluid NIMROD results are also presented. Work supported by US DOE.

  12. An RL10A-3-3A rocket engine model using the rocket engine transient simulator (ROCETS) software

    NASA Technical Reports Server (NTRS)

    Binder, Michael

    1993-01-01

    Steady-state and transient computer models of the RL10A-3-3A rocket engine have been created using the Rocket Engine Transient Simulation (ROCETS) code. These models were created for several purposes. The RL10 engine is a critical component of past, present, and future space missions; the model will give NASA an in-house capability to simulate the performance of the engine under various operating conditions and mission profiles. The RL10 simulation activity is also an opportunity to further validate the ROCETS program. The ROCETS code is an important tool for modeling rocket engine systems at NASA Lewis. ROCETS provides a modular and general framework for simulating the steady-state and transient behavior of any desired propulsion system. Although the ROCETS code is being used in a number of different analysis and design projects within NASA, it has not been extensively validated for any system using actual test data. The RL10A-3-3A has a ten year history of test and flight applications; it should provide sufficient data to validate the ROCETS program capability. The ROCETS models of the RL10 system were created using design information provided by Pratt & Whitney, the engine manufacturer. These models are in the process of being validated using test-stand and flight data. This paper includes a brief description of the models and comparison of preliminary simulation output against flight and test-stand data.

  13. Assessment of construct validity of a virtual reality laparoscopy simulator.

    PubMed

    Rosenthal, Rachel; Gantert, Walter A; Hamel, Christian; Hahnloser, Dieter; Metzger, Juerg; Kocher, Thomas; Vogelbach, Peter; Scheidegger, Daniel; Oertli, Daniel; Clavien, Pierre-Alain

    2007-08-01

    The aim of this study was to assess whether virtual reality (VR) can discriminate between the skills of novices and intermediate-level laparoscopic surgical trainees (construct validity), and whether the simulator assessment correlates with an expert's evaluation of performance. Three hundred and seven (307) participants of the 19th-22nd Davos International Gastrointestinal Surgery Workshops performed the clip-and-cut task on the Xitact LS 500 VR simulator (Xitact S.A., Morges, Switzerland). According to their previous experience in laparoscopic surgery, participants were assigned to the basic course (BC) or the intermediate course (IC). Objective performance parameters recorded by the simulator were compared to the standardized assessment by the course instructors during laparoscopic pelvitrainer and conventional surgery exercises. IC participants performed significantly better on the VR simulator than BC participants for the task completion time as well as the economy of movement of the right instrument, not the left instrument. Participants with maximum scores in the pelvitrainer cholecystectomy task performed the VR trial significantly faster, compared to those who scored less. In the conventional surgery task, a significant difference between those who scored the maximum and those who scored less was found not only for task completion time, but also for economy of movement of the right instrument. VR simulation provides a valid assessment of psychomotor skills and some basic aspects of spatial skills in laparoscopic surgery. Furthermore, VR allows discrimination between trainees with different levels of experience in laparoscopic surgery establishing construct validity for the Xitact LS 500 clip-and-cut task. Virtual reality may become the gold standard to assess and monitor surgical skills in laparoscopic surgery.

  14. Hydrological Validation of The Lpj Dynamic Global Vegetation Model - First Results and Required Actions

    NASA Astrophysics Data System (ADS)

    Haberlandt, U.; Gerten, D.; Schaphoff, S.; Lucht, W.

    Dynamic global vegetation models are developed with the main purpose to describe the spatio-temporal dynamics of vegetation at the global scale. Increasing concern about climate change impacts has put the focus of recent applications on the simulation of the global carbon cycle. Water is a prime driver of biogeochemical and biophysical processes, thus an appropriate representation of the water cycle is crucial for their proper simulation. However, these models usually lack thorough validation of the water balance they produce. Here we present a hydrological validation of the current version of the LPJ (Lund-Potsdam-Jena) model, a dynamic global vegetation model operating at daily time steps. Long-term simulated runoff and evapotranspiration are compared to literature values, results from three global hydrological models, and discharge observations from various macroscale river basins. It was found that the seasonal and spatial patterns of the LPJ-simulated average values correspond well both with the measurements and the results from the stand-alone hydrological models. However, a general underestimation of runoff occurs, which may be attributable to the low input dynamics of precipitation (equal distribution within a month), to the simulated vegetation pattern (potential vegetation without anthropogenic influence), and to some generalizations of the hydrological components in LPJ. Future research will focus on a better representation of the temporal variability of climate forcing, improved description of hydrological processes, and on the consideration of anthropogenic land use.

  15. High resolution regional climate simulation of the Hawaiian Islands - Validation of the historical run from 2003 to 2012

    NASA Astrophysics Data System (ADS)

    Xue, L.; Newman, A. J.; Ikeda, K.; Rasmussen, R.; Clark, M. P.; Monaghan, A. J.

    2016-12-01

    A high-resolution (a 1.5 km grid spacing domain nested within a 4.5 km grid spacing domain) 10-year regional climate simulation over the entire Hawaiian archipelago is being conducted at the National Center for Atmospheric Research (NCAR) using the Weather Research and Forecasting (WRF) model version 3.7.1. Numerical sensitivity simulations of the Hawaiian Rainband Project (HaRP, a field experiment from July to August 1990) showed that the simulated precipitation properties are sensitive to initial and lateral boundary conditions, sea surface temperature (SST), land surface models, vertical resolution and cloud droplet concentration. Validations of model-simulated statistics of the trade wind inversion, temperature, wind field, cloud cover, and precipitation over the islands against various observations from soundings, satellites, weather stations and rain gauges during the period from 2003 to 2012 will be presented at the meeting.

  16. Development and validation of a predictive model for the influences of selected product and process variables on ascorbic acid degradation in simulated fruit juice.

    PubMed

    Gabriel, Alonzo A; Cayabyab, Jochelle Elysse C; Tan, Athalie Kaye L; Corook, Mark Lester F; Ables, Errol John O; Tiangson-Bayaga, Cecile Leah P

    2015-06-15

    A predictive response surface model for the influences of product (soluble solids and titratable acidity) and process (temperature and heating time) parameters on the degradation of ascorbic acid (AA) in heated simulated fruit juices (SFJs) was established. Physicochemical property ranges of freshly squeezed and processed juices, and previously established decimal reduction times of Escherichia coli O157:H7 at different heating temperatures, were used in establishing a central composite design of experiment that determined the combinations of product and process variables used in the model building. Only the individual linear effects of temperature and heating time significantly (P<0.05) affected AA reduction (%AAr). Validating systems either over- or underestimated actual %AAr, with bias factors of 0.80-1.20. However, all validating systems still resulted in acceptable predictive efficacy, with accuracy factors of 1.00-1.26. The model may be useful in establishing unique process schedules for specific products, for the simultaneous control and improvement of food safety and quality. Copyright © 2015 Elsevier Ltd. All rights reserved.
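
    The bias and accuracy factors quoted above are the standard Ross-style measures of predictive performance in predictive microbiology and food modelling; a value of 1.0 indicates perfect agreement. A minimal Python sketch of how they are computed from paired predicted and observed values (the example numbers are illustrative, not the study's data):

        import numpy as np

        def bias_factor(pred, obs):
            """Bf = 10 ** mean(log10(pred/obs)); >1 over-, <1 underestimation."""
            return 10 ** np.mean(np.log10(np.asarray(pred) / np.asarray(obs)))

        def accuracy_factor(pred, obs):
            """Af = 10 ** mean(|log10(pred/obs)|); mean fold-deviation from obs."""
            return 10 ** np.mean(np.abs(np.log10(np.asarray(pred) / np.asarray(obs))))

        pred = [12.1, 25.3, 40.2, 55.0]   # predicted %AA reduction (illustrative)
        obs  = [10.5, 27.0, 38.8, 60.1]   # observed %AA reduction (illustrative)
        print(f"Bf = {bias_factor(pred, obs):.2f}, Af = {accuracy_factor(pred, obs):.2f}")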

  17. Validation of snow characteristics and snow albedo feedback in the Canadian Regional Climate Model simulations over North America

    NASA Astrophysics Data System (ADS)

    Fang, B.; Sushama, L.; Diro, G. T.

    2015-12-01

    Snow characteristics and snow albedo feedback (SAF) over North America, as simulated by the fifth-generation Canadian Regional Climate Model (CRCM5) when driven by ERA-40/ERA-Interim, CanESM2 and MPI-ESM-LR at the lateral boundaries, are analyzed in this study. Validation of snow characteristics is performed by comparing simulations against available observations from MODIS, ISCCP and CMC. Results show that the model is able to represent the main spatial distribution of snow characteristics, with some overestimation in snow mass and snow depth over the Canadian high Arctic. Some overestimation in surface albedo is also noted for the boreal region, which is believed to be related to the snow unloading parameterization as well as to the overestimation of snow albedo. SAF is assessed in both seasonal and climate change contexts when possible. The strength of SAF is quantified as the amount of additional net shortwave radiation at the top of the atmosphere as surface albedo decreases in association with a 1°C increase in surface temperature. Following Qu and Hall (2007), this is expressed as the product of the variation in planetary albedo with surface albedo and the change in surface albedo for a 1°C change in surface air temperature during the season, which in turn is determined by the strength of the snow cover and snowpack metamorphosis feedback loops. Analysis of the latter term in the seasonal cycle suggests that for CRCM5 simulations, the snow cover feedback loop is more dominant than the snowpack metamorphosis feedback loop, whereas for MODIS, the two feedback loops have more or less similar strength. Moreover, the SAF strength in the climate change context appears to be weaker than in the seasonal cycle and is sensitive to the driving GCM and the RCP scenario.
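
    In the Qu and Hall decomposition referenced above, the SAF strength can be written schematically as follows (notation follows common usage rather than the paper's exact symbols):

        \frac{\Delta Q_{net}}{\Delta T_s} \;=\; -\,\bar{I}\,\frac{\partial \alpha_p}{\partial \alpha_s}\,\frac{\Delta \alpha_s}{\Delta T_s}

    where \bar{I} is the incoming solar radiation at the top of the atmosphere, \alpha_p the planetary albedo, \alpha_s the surface albedo, and T_s the surface air temperature. Since \Delta\alpha_s/\Delta T_s is negative (surface albedo drops as temperature rises), the product is a positive radiative feedback; the last factor is the term governed by the snow cover and snowpack metamorphosis feedback loops discussed in the abstract.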

  18. Cloud radiative effects and changes simulated by the Coupled Model Intercomparison Project Phase 5 models

    NASA Astrophysics Data System (ADS)

    Shin, Sun-Hee; Kim, Ok-Yeon; Kim, Dongmin; Lee, Myong-In

    2017-07-01

    Using 32 CMIP5 (Coupled Model Intercomparison Project Phase 5) models, this study examines the veracity of the simulation of cloud amounts and their radiative effects (CREs) in the historical run driven by observed external radiative forcing for 1850-2005, and their future changes in the RCP (Representative Concentration Pathway) 4.5 scenario runs for 2006-2100. Validation metrics for the historical run are designed to examine the accuracy of the representation of spatial patterns for the climatological mean, and of the annual and interannual variations of clouds and CREs. The models show large spread in the simulation of cloud amounts, specifically in the low cloud amount. The observed relationship between cloud amount and the controlling large-scale environment is also reproduced with varying fidelity across the models. Based on the validation metrics, four models (ACCESS1.0, ACCESS1.3, HadGEM2-CC, and HadGEM2-ES) are selected as the best models, and the average of the four models performs more skillfully than the multimodel ensemble average. All models project global-mean SST warming as greenhouse gases increase, but the magnitude varies across the simulations between 1 and 2 K, which is largely attributable to differences in the change of cloud amount and distribution. The models that simulate more SST warming show a greater increase in the net CRE due to reduced low cloud and increased incoming shortwave radiation, particularly over the marine boundary layer regions in the subtropics. The selected best-performing models project a significant reduction in global-mean cloud amount of about -0.99% K^-1 and a net radiative warming of 0.46 W m^-2 K^-1, suggesting a role of positive cloud feedback in global warming.

  19. The Effect of Time-Advance Mechanism in Modeling and Simulation

    DTIC Science & Technology

    2011-09-01

    dissertation specifically covers the following issues: 1. The simulation field lacks studies that allow modelers to understand the impact of TAM on...question or issue being modeled when the comparison of two dissimilar models to address the same question or problem is prepared. Modern military forces...scenarios and specific performance data outcomes will be analyzed for validity and compared against one another. Critical issues that have major

  20. Validation of Survivability Validation Protocols

    DTIC Science & Technology

    1993-05-01

    simulation fidelity. Physical testing of P.i SOS, in either aboveground tests (AGTs) or underground tests (UGTs), will usually be impossible, due...with some simulation fidelity compromises) are possible in UGTs and/or AGTs. Hence proof tests, if done in statistically significant numbers, can...level. Simulation fidelity and AGT/UGT/threat correlation will be validation issues here. Extrapolation to threat environments will be done via modeling

  1. Climate Change Impacts for Conterminous USA: An Integrated Assessment Part 2. Models and Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomson, Allison M.; Rosenberg, Norman J.; Izaurralde, R Cesar C.

    As CO2 and other greenhouse gases accumulate in the atmosphere and contribute to rising global temperatures, it is important to examine how a changing climate may affect natural and managed ecosystems. In this series of papers, we study the impacts of climate change on agriculture, water resources and natural ecosystems in the conterminous United States using a suite of climate change predictions from General Circulation Models (GCMs), as described in Part 1. Here we describe the agriculture model EPIC and the HUMUS water model and validate them with historical crop yields and streamflow data. We compare EPIC-simulated grain and forage crop yields with historical crop yields from the US Department of Agriculture and find an acceptable level of agreement for this study. The validation of HUMUS-simulated streamflow with estimates of natural streamflow from the US Geological Survey shows that the model is able to reproduce significant relationships and capture major trends.

  2. Modelling of diesel engine fuelled with biodiesel using engine simulation software

    NASA Astrophysics Data System (ADS)

    Said, Mohd Farid Muhamad; Said, Mazlan; Aziz, Azhar Abdul

    2012-06-01

    This paper concerns the modelling of a diesel engine that operates on biodiesel fuels. The model is used to simulate and predict the performance and combustion of the engine by simplifying the geometry of the engine components in the software. The model is produced using one-dimensional (1D) engine simulation software called GT-Power. The fuel properties library in the software is expanded to include palm oil based biodiesel fuels. Experimental work is performed to investigate the effect of biodiesel fuels on the heat release profiles and the engine performance curves. The model is validated with experimental data and good agreement is observed. The simulation results show that combustion characteristics and engine performance differ when biodiesel fuels are used instead of No. 2 diesel fuel.

  3. Base Flow Model Validation

    NASA Technical Reports Server (NTRS)

    Sinha, Neeraj; Brinckman, Kevin; Jansen, Bernard; Seiner, John

    2011-01-01

    A method was developed for obtaining propulsive base flow data in both hot and cold jet environments, at Mach numbers and altitudes of relevance to NASA launcher designs. The base flow data were used to perform computational fluid dynamics (CFD) turbulence model assessments of base flow predictive capabilities, in order to provide increased confidence in base thermal and pressure load predictions obtained from computational modeling efforts. Predictive CFD analyses were used in the design of the experiments, available propulsive models were used to reduce program costs and increase success, and a wind tunnel facility was used. The data obtained allowed assessment of CFD/turbulence models in a complex flow environment, working within a building-block procedure to validation, where cold, non-reacting test data were first used for validation, followed by more complex reacting base flow validation.

  4. Validation of Monte Carlo simulation of mammography with TLD measurement and depth dose calculation with a detailed breast model

    NASA Astrophysics Data System (ADS)

    Wang, Wenjing; Qiu, Rui; Ren, Li; Liu, Huan; Wu, Zhen; Li, Chunyan; Li, Junli

    2017-09-01

    Mean glandular dose (MGD) is determined not only by the compressed breast thickness (CBT) and the glandular content, but also by the distribution of glandular tissues in the breast. Depth dose inside the breast in mammography has been of wide concern, as glandular dose decreases rapidly with increasing depth. In this study, an experiment using thermoluminescent dosimeters (TLDs) was carried out to validate Monte Carlo simulations of mammography. Percent depth doses (PDDs) at different depths were measured inside simple breast phantoms of different thicknesses. The experimental values were highly consistent with the values calculated by Geant4. Then a detailed breast model with a CBT of 4 cm and a glandular content of 50%, constructed in previous work, was used to study the effects of the distribution of glandular tissues in the breast with Geant4. The breast model was reversed in the direction of compression to obtain a reverse model with a different distribution of glandular tissues. Depth dose distributions and glandular tissue dose conversion coefficients were calculated. The results revealed that the conversion coefficients were about 10% larger when the breast model was reversed, because the glandular tissues in the reverse model are concentrated in the upper part of the model.

  5. Testing the Construct Validity of a Virtual Reality Hip Arthroscopy Simulator.

    PubMed

    Khanduja, Vikas; Lawrence, John E; Audenaert, Emmanuel

    2017-03-01

    To test the construct validity of the hip diagnostics module of a virtual reality hip arthroscopy simulator. Nineteen orthopaedic surgeons performed a simulated arthroscopic examination of a healthy hip joint using a 70° arthroscope in the supine position. Surgeons were categorized as either expert (those who had performed 250 hip arthroscopies or more) or novice (those who had performed fewer than this). Twenty-one specific targets were visualized within the central and peripheral compartments: 9 via the anterior portal, 9 via the anterolateral portal, and 3 via the posterolateral portal. This was immediately followed by a task testing basic probe examination of the joint, in which a series of 8 targets were probed via the anterolateral portal. During the tasks, the surgeon's performance was evaluated by the simulator using a set of predefined metrics including task duration, number of soft tissue and bone collisions, and distance travelled by instruments. No repeat attempts at the tasks were permitted. Construct validity was then evaluated by comparing novice and expert group performance metrics over the 2 tasks using the Mann-Whitney test, with a P value of less than .05 considered significant. On the visualization task, the expert group outperformed the novice group on time taken (P = .0003), number of collisions with soft tissue (P = .001), number of collisions with bone (P = .002), and distance travelled by the arthroscope (P = .02). On the probe examination, the 2 groups differed only in the time taken to complete the task (P = .025), with no significant difference in other metrics. Increased experience in hip arthroscopy was reflected by significantly better performance on the virtual reality simulator across 2 tasks, supporting its construct validity. This study validates a virtual reality hip arthroscopy simulator and supports its potential for developing basic arthroscopic skills. Level III. Copyright © 2016 Arthroscopy Association of North America.

  6. Modeling and Simulation of Quenching and Tempering Process in steels

    NASA Astrophysics Data System (ADS)

    Deng, Xiaohu; Ju, Dongying

    Quenching and tempering (Q&T) is a combined heat treatment process used to achieve maximum toughness and ductility at a specified hardness and strength. It is important to develop a mathematical model of the quenching and tempering process in order to satisfy mechanical property requirements at low cost. This paper presents a modified model to predict structural evolution and hardness distribution during the quenching and tempering of steels. The model takes into account tempering parameters, carbon content, and isothermal and non-isothermal transformations. Moreover, precipitation of transition carbides, decomposition of retained austenite, and precipitation of cementite can each be simulated. Hardness distributions of the quenched and tempered workpiece are predicted by an experimental regression equation. In order to validate the model, it is employed to predict the tempering of 80MnCr5 steel. The predicted precipitation dynamics of transition carbides and cementite are consistent with previous experimental and simulated results from the literature. The model is then implemented within the framework of the developed simulation code COSMAP to simulate microstructure, stress and distortion in the heat-treated component, and applied to simulate the Q&T process of J55 steel. The calculated results show good agreement with the experimental ones, indicating that the model is effective for simulation of the Q&T process of steels.

  7. Simulations and model of the nonlinear Richtmyer–Meshkov instability

    DOE PAGES

    Dimonte, Guy; Ramaprabhu, P.

    2010-01-21

    The nonlinear evolution of the Richtmyer-Meshkov (RM) instability is investigated using numerical simulations with the FLASH code in two dimensions (2D). The purpose of the simulations is to develop an empirical nonlinear model of the RM instability that is applicable to inertial confinement fusion (ICF) and ejecta formation, namely, at large Atwood number A and scaled initial amplitude kh_0 (k ≡ wavenumber, h_0 ≡ initial amplitude) of the perturbation. The FLASH code is first validated with a variety of RM experiments that evolve well into the nonlinear regime. They reveal that bubbles stagnate when they grow by an increment of 2/k and that spikes accelerate for A > 0.5 due to higher harmonics that focus them. These results are then compared with a variety of nonlinear models that are based on potential flow. We find that the models agree with simulations for moderate values of A < 0.9 and kh_0 < 1, but not for the larger values that characterize ICF and ejecta formation. We thus develop a new nonlinear empirical model that captures the simulation results consistent with potential flow for a broader range of A and kh_0. Our hope is that such empirical models concisely capture the RM simulations and inspire more rigorous solutions.

  8. Modeling and simulation of a 2-DOF bidirectional electrothermal microactuator

    NASA Astrophysics Data System (ADS)

    Topaloglu, N.; Elbuken, C.; Nieva, P. M.; Yavuz, M.; Huissoon, J. P.

    2008-03-01

    In this paper we present the modeling and simulation of a 2-degree-of-freedom (DOF) bidirectional electrothermal actuator. The four-arm microactuator was designed to move along both the horizontal and vertical axes. By tailoring the geometrical parameters of the design, the in-plane and out-of-plane motions were decoupled, resulting in enhanced mobility in both directions. The motion of the actuator was modeled analytically using an electro-thermo-mechanical analysis. To validate the analytical model, finite element simulations were performed using ANSYS. The microactuators were fabricated using the PolyMUMPS process, and experimental results show good agreement with both the analytical model and the simulations. We demonstrated that the 2-DOF bidirectional electrothermal actuator can achieve 3.7 μm in-plane and 13.3 μm out-of-plane deflections with an input voltage of 10 V.

  9. Modelling and validation of Proton exchange membrane fuel cell (PEMFC)

    NASA Astrophysics Data System (ADS)

    Mohiuddin, A. K. M.; Basran, N.; Khan, A. A.

    2018-01-01

    This paper is the outcome of a small-scale fuel cell project. A fuel cell is an electrochemical device that converts energy from a chemical reaction into electrical work. The Proton Exchange Membrane Fuel Cell (PEMFC) is one of several types of fuel cell; it is comparatively efficient, and its low operating temperature and fast start-up capability result in high energy density. In this study, a mathematical model of a 1.2 W PEMFC is developed and simulated using MATLAB software. The model describes the PEMFC behaviour under steady-state conditions and determines the polarization curve, the power generated, and the efficiency of the fuel cell. Simulation results were validated by comparison with experimental results obtained from the test of a single PEMFC driving a 3 V motor. The performance of the experimental PEMFC is slightly lower than that of the simulated PEMFC; however, the results are in good agreement. Experiments on hydrogen flow rate were also conducted to obtain the amount of hydrogen consumed to produce electrical work in the PEMFC.
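
    A minimal sketch of the steady-state polarization-curve calculation such a PEMFC model performs, written in Python rather than MATLAB; every constant below (Nernst potential, Tafel slope, exchange and limiting current densities, area-specific resistance) is an assumed placeholder, not a parameter from the paper.

```python
import numpy as np

# Illustrative PEMFC constants (assumed values, not the paper's):
E_nernst = 1.2    # V, open-circuit (Nernst) potential
A_tafel  = 0.06   # V, Tafel slope (natural-log form)
i0       = 1e-4   # A/cm^2, exchange current density
R_ohm    = 0.2    # ohm*cm^2, area-specific resistance
B_conc   = 0.05   # V, concentration-loss coefficient
i_lim    = 1.4    # A/cm^2, limiting current density

def cell_voltage(i):
    """Polarization curve: V = E - activation - ohmic - concentration losses."""
    eta_act  = A_tafel * np.log(i / i0)            # activation overpotential
    eta_ohm  = i * R_ohm                           # ohmic loss
    eta_conc = -B_conc * np.log(1.0 - i / i_lim)   # mass-transport loss
    return E_nernst - eta_act - eta_ohm - eta_conc

i = np.linspace(0.01, 1.3, 50)   # current density sweep, A/cm^2
V = cell_voltage(i)
P = i * V                        # power density, W/cm^2
```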

  10. Can We Study Autonomous Driving Comfort in Moving-Base Driving Simulators? A Validation Study.

    PubMed

    Bellem, Hanna; Klüver, Malte; Schrauf, Michael; Schöner, Hans-Peter; Hecht, Heiko; Krems, Josef F

    2017-05-01

    To lay the basis for studying autonomous driving comfort using driving simulators, we assessed the behavioral validity of two moving-base simulator configurations by contrasting them with a test-track setting. With increasing levels of automation, driving comfort becomes increasingly important. Simulators provide a safe environment to study perceived comfort in autonomous driving. To date, however, no studies had been conducted on comfort in autonomous driving to determine the extent to which results from simulator studies can be transferred to on-road driving conditions. Participants (N = 72) experienced six differently parameterized lane-change and deceleration maneuvers and subsequently rated the comfort of each scenario. One group of participants experienced the maneuvers in a test-track setting, whereas two other groups experienced them in one of two moving-base simulator configurations. We could demonstrate relative and absolute validity for one of the two simulator configurations. Subsequent analyses revealed that the validity of the simulator depends strongly on the parameterization of the motion system. Moving-base simulation can be a useful research tool to study driving comfort in autonomous vehicles. However, our results point to a preference for subunity scaling factors for both lateral and longitudinal motion cues, which might be explained by an underestimation of speed in virtual environments. In line with previous studies, we recommend lateral- and longitudinal-motion scaling factors of approximately 50% to 60% in order to obtain valid results for both active and passive driving tasks.

  11. Second-Moment RANS Model Verification and Validation Using the Turbulence Modeling Resource Website (Invited)

    NASA Technical Reports Server (NTRS)

    Eisfeld, Bernhard; Rumsey, Chris; Togiti, Vamshi

    2015-01-01

    The implementation of the SSG/LRR-omega differential Reynolds stress model into the NASA flow solvers CFL3D and FUN3D and the DLR flow solver TAU is verified by studying the grid convergence of the solution of three different test cases from the Turbulence Modeling Resource Website. The model's predictive capabilities are assessed based on four basic and four extended validation cases also provided on this website, involving attached and separated boundary layer flows, effects of streamline curvature and secondary flow. Simulation results are compared against experimental data and predictions by the eddy-viscosity models of Spalart-Allmaras (SA) and Menter's Shear Stress Transport (SST).
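
    Grid-convergence verification of this kind is conventionally quantified with the observed order of accuracy and Richardson extrapolation over three systematically refined grids; a minimal sketch follows, with hypothetical drag-coefficient values standing in for the actual test-case outputs.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy from solutions on three systematically
    refined grids with a constant refinement ratio r (e.g., r = 2)."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

def richardson_extrapolate(f_medium, f_fine, r, p):
    """Estimate of the grid-converged value from the two finest grids."""
    return f_fine + (f_fine - f_medium) / (r**p - 1.0)

# Hypothetical drag-coefficient values on coarse/medium/fine grids:
p = observed_order(0.02862, 0.02851, 0.02848, r=2.0)
f_converged = richardson_extrapolate(0.02851, 0.02848, r=2.0, p=p)
print(f"observed order {p:.2f}, extrapolated value {f_converged:.5f}")
```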

  12. Virtual milk for modelling and simulation of dairy processes.

    PubMed

    Munir, M T; Zhang, Y; Yu, W; Wilson, D I; Young, B R

    2016-05-01

    The modeling of dairy processing using a generic process simulator suffers from shortcomings, given that many simulators do not contain milk components in their component libraries. Recently, pseudo-milk components for a commercial process simulator were proposed for simulation and the current work extends this pseudo-milk concept by studying the effect of both total milk solids and temperature on key physical properties such as thermal conductivity, density, viscosity, and heat capacity. This paper also uses expanded fluid and power law models to predict milk viscosity over the temperature range from 4 to 75°C and develops a succinct regressed model for heat capacity as a function of temperature and fat composition. The pseudo-milk was validated by comparing the simulated and actual values of the physical properties of milk. The milk thermal conductivity, density, viscosity, and heat capacity showed differences of less than 2, 4, 3, and 1.5%, respectively, between the simulated results and actual values. This work extends the capabilities of the previously proposed pseudo-milk and of a process simulator to model dairy processes, processing different types of milk (e.g., whole milk, skim milk, and concentrated milk) with different intrinsic compositions, and to predict correct material and energy balances for dairy processes. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
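
    As an illustration of the functional forms such property models take (the actual regressed coefficients are not reproduced in the abstract), the sketch below uses an Arrhenius-type viscosity-temperature law and a linear heat-capacity regression in temperature and fat fraction, all with placeholder constants.

```python
import numpy as np

def milk_viscosity(T_celsius, mu_ref=2.0e-3, T_ref=20.0, E_over_R=2000.0):
    """Arrhenius-type temperature dependence of milk viscosity (Pa*s).
    All coefficients are placeholders for illustration, not the paper's
    regressed expanded-fluid / power-law parameters."""
    T = T_celsius + 273.15
    return mu_ref * np.exp(E_over_R * (1.0 / T - 1.0 / (T_ref + 273.15)))

def milk_heat_capacity(T_celsius, fat_frac):
    """Sketch of a regressed cp(T, fat) model, J/(kg*K); coefficients are
    hypothetical and only illustrate the functional form."""
    return 3900.0 - 1500.0 * fat_frac + 2.0 * T_celsius

print(milk_viscosity(np.array([4.0, 40.0, 75.0])))   # Pa*s over the 4-75 C range
print(milk_heat_capacity(40.0, fat_frac=0.035))      # whole-milk example
```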

  13. Assessing performance and validating finite element simulations using probabilistic knowledge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dolin, Ronald M.; Rodriguez, E. A.

    Two probabilistic approaches for assessing performance are presented. The first approach assesses probability of failure by simultaneously modeling all likely events. The probability each event causes failure along with the event's likelihood of occurrence contribute to the overall probability of failure. The second assessment method is based on stochastic sampling using an influence diagram. Latin-hypercube sampling is used to stochastically assess events. The overall probability of failure is taken as the maximum probability of failure of all the events. The Likelihood of Occurrence simulation suggests failure does not occur while the Stochastic Sampling approach predicts failure. The Likelihood of Occurrence results are used to validate finite element predictions.
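
    A minimal sketch of the stochastic-sampling approach described above: Latin-hypercube samples of uncertain inputs drive a per-event failure check, and the overall probability of failure is taken as the maximum over events. The three events, their parameter bounds, and the failure criterion are all hypothetical.

```python
import numpy as np
from scipy.stats import qmc

# Latin-hypercube samples of three uncertain inputs (bounds are hypothetical):
sampler = qmc.LatinHypercube(d=3, seed=42)
u = sampler.random(n=1000)                                   # uniform [0,1) samples
samples = qmc.scale(u, l_bounds=[10, 0.5, 300], u_bounds=[50, 2.0, 900])

def event_fails(sample):
    load, factor, temperature = sample
    return load * factor > 0.05 * temperature                # hypothetical criterion

p_fail_events = []
for event_shift in (0.0, 5.0, 10.0):                         # three hypothetical events
    fails = np.array([event_fails(s + [event_shift, 0, 0]) for s in samples])
    p_fail_events.append(fails.mean())                       # per-event failure probability

p_fail_overall = max(p_fail_events)                          # maximum over all events
```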

  14. Validation of a 2.5D CFD model for cylindrical gas–solids fluidized beds

    DOE PAGES

    Li, Tingwen

    2015-09-25

    The 2.5D model recently proposed by Li et al. (Li, T., Benyahia, S., Dietiker, J., Musser, J., and Sun, X., 2015. A 2.5D computational method to simulate cylindrical fluidized beds. Chemical Engineering Science. 123, 236-246.) was validated for two cylindrical gas-solids bubbling fluidized bed systems. Different types of particles tested under various flow conditions were simulated using the traditional 2D model and the 2.5D model. Detailed comparisons against experimental measurements of solids concentration and velocity were conducted. Compared to the traditional Cartesian 2D flow simulation, the 2.5D model yielded better agreement with the experimental data, especially for the solids velocity prediction in the column wall region.

  15. Development and Validation of a Polarimetric-MCScene 3D Atmospheric Radiation Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berk, Alexander; Hawes, Frederick; Fox, Marsha

    2016-03-15

    Polarimetric measurements can substantially enhance the ability of both spectrally resolved and single-band imagery to detect the proliferation of weapons of mass destruction, providing data for locating and identifying facilities, materials, and processes of undeclared and proliferant nuclear weapons programs worldwide. Unfortunately, no models exist that efficiently and accurately predict spectral polarized signatures for the materials of interest embedded in complex 3D environments. Such a model would enable one to test hypotheses and optimize both the enhancement of scene contrast and the signal processing for spectral signature extraction. Phase I set the groundwork for development of fully validated polarimetric spectral signature and scene simulation models. This has been accomplished 1. by (a) identifying and downloading state-of-the-art surface and atmospheric polarimetric data sources, (b) implementing tools for generating custom polarimetric data, and (c) identifying and requesting US Government funded field measurement data for use in validation; 2. by formulating an approach for upgrading the radiometric spectral signature model MODTRAN to generate polarimetric intensities through (a) ingestion of the polarimetric data, (b) polarimetric vectorization of existing MODTRAN modules, and (c) integration of a newly developed algorithm for computing polarimetric multiple scattering contributions; 3. by generating an initial polarimetric model that demonstrates calculation of polarimetric solar and lunar single-scatter intensities arising from the interaction of incoming irradiances with molecules and aerosols; 4. by developing a design and implementation plan to (a) automate polarimetric scene construction and (b) efficiently sample polarimetric scattering and reflection events, for use in a to-be-developed polarimetric version of the existing first-principles synthetic scene simulation model, MCScene; and 5. by planning a validation

  16. In Vitro Simulation and Validation of the Circulation with Congenital Heart Defects

    PubMed Central

    Figliola, Richard S.; Giardini, Alessandro; Conover, Tim; Camp, Tiffany A.; Biglino, Giovanni; Chiulli, John; Hsia, Tain-Yen

    2010-01-01

    Despite the recent advances in computational modeling, experimental simulation of the circulation with congenital heart defects using mock flow circuits remains an important tool for device testing, and for detailing the probable flow consequences resulting from surgical and interventional corrections. Validated mock circuits can be applied to qualify the results from novel computational models. New mathematical tools, coupled with advanced clinical imaging methods, allow for improved assessment of experimental circuit performance relative to human function, as well as the potential for patient-specific adaptation. In this review, we address the development of three in vitro mock circuits specific for studies of congenital heart defects. Performance of an in vitro right heart circulation circuit through a series of verification and validation exercises is described, including correlations with animal studies and quantification of the effects of circuit inertiance on test results. We present our experience in the design of mock circuits suitable for investigations of the characteristics of the Fontan circulation. We use one such mock circuit to evaluate the accuracy of Doppler predictions in the presence of aortic coarctation. PMID:21218147

  17. Failure analysis of parameter-induced simulation crashes in climate models

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y.

    2013-01-01

    Simulations using IPCC-class climate models can fail or crash for a variety of reasons. Quantitative analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at combinations of POP2 parameter values. We apply support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicts model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve metric (AUC > 0.96). The causes of the simulation failures are determined through a global sensitivity analysis. Combinations of 8 parameters related to ocean mixing and viscosity from three different POP2 parameterizations are the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.
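
    A minimal sketch of the classification step, using scikit-learn in place of whatever toolchain the study used: an RBF-kernel SVM is trained on parameter vectors labeled crash/no-crash and scored with the ROC AUC on held-out runs. The 18-dimensional inputs and the labels below are synthetic stand-ins for the POP2 ensemble data.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(500, 18))          # scaled parameter values (synthetic)
y = (X[:, 0] + 0.5 * X[:, 3] > 1.1).astype(int)    # synthetic crash labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_tr)

clf = SVC(kernel="rbf", probability=True, random_state=0)
clf.fit(scaler.transform(X_tr), y_tr)

p_fail = clf.predict_proba(scaler.transform(X_te))[:, 1]   # predicted failure probability
print("validation AUC:", roc_auc_score(y_te, p_fail))
```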

  18. Validation of the second-generation Olympus colonoscopy simulator for skills assessment.

    PubMed

    Haycock, A V; Bassett, P; Bladen, J; Thomas-Gibson, S

    2009-11-01

    Simulators have potential value in providing objective evidence of technical skill for procedures within medicine. The aim of this study was to determine face and construct validity for the Olympus colonoscopy simulator and to establish which assessment measures map to clinical benchmarks of expertise. Thirty-four participants were recruited: 10 novices with no prior colonoscopy experience, 13 intermediate (trainee) endoscopists with fewer than 1000 previous colonoscopies, and 11 experienced endoscopists with more than 1000 previous colonoscopies. All participants completed three standardized cases on the simulator, and the experts gave feedback regarding the realism of the simulator. Forty metrics recorded automatically by the simulator were analyzed for their ability to distinguish between the groups. The simulator discriminated participants by experience level on 22 different parameters. Completion rates were lower for novices than for trainees and experts (37% vs. 79% and 88%, respectively, P < 0.001), and both novices and trainees took significantly longer to reach all major landmarks than the experts. Several technical aspects of competency were discriminatory: pushing with an embedded tip (P = 0.03), correct use of the variable stiffness function (P = 0.004), number of sigmoid N-loops (P = 0.02), size of sigmoid N-loops (P = 0.01), and time to remove alpha loops (P = 0.004). Out of 10, experts rated the realism of movement at 6.4, force feedback at 6.6, looping at 6.6, and loop resolution at 6.8. The Olympus colonoscopy simulator has good face validity and excellent construct validity. It provides an objective assessment of colonoscopic skill on multiple measures, and benchmarks have been set to allow its use as both a formative and a summative assessment tool. Georg Thieme Verlag KG Stuttgart. New York.

  19. Genomic prediction in animals and plants: simulation of data, validation, reporting, and benchmarking.

    PubMed

    Daetwyler, Hans D; Calus, Mario P L; Pong-Wong, Ricardo; de Los Campos, Gustavo; Hickey, John M

    2013-02-01

    The genomic prediction of phenotypes and breeding values in animals and plants has developed rapidly into its own research field. Results of genomic prediction studies are often difficult to compare because data simulation varies, real or simulated data are not fully described, and not all relevant results are reported. In addition, some new methods have been compared only in limited genetic architectures, leading to potentially misleading conclusions. In this article we review simulation procedures, discuss validation and reporting of results, and apply benchmark procedures for a variety of genomic prediction methods in simulated and real example data. Plant and animal breeding programs are being transformed by the use of genomic data, which are becoming widely available and cost-effective to predict genetic merit. A large number of genomic prediction studies have been published using both simulated and real data. The relative novelty of this area of research has made the development of scientific conventions difficult with regard to description of the real data, simulation of genomes, validation and reporting of results, and forward in time methods. In this review article we discuss the generation of simulated genotype and phenotype data, using approaches such as the coalescent and forward in time simulation. We outline ways to validate simulated data and genomic prediction results, including cross-validation. The accuracy and bias of genomic prediction are highlighted as performance indicators that should be reported. We suggest that a measure of relatedness between the reference and validation individuals be reported, as its impact on the accuracy of genomic prediction is substantial. A large number of methods were compared in example simulated and real (pine and wheat) data sets, all of which are publicly available. In our limited simulations, most methods performed similarly in traits with a large number of quantitative trait loci (QTL), whereas in traits
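
    A minimal sketch of the cross-validation and reporting conventions discussed above, with synthetic genotypes and phenotypes standing in for real data: ridge regression plays the role of a GBLUP-like shrinkage predictor, accuracy is the correlation between predicted and observed phenotypes in the validation fold, and the slope of observed on predicted values indicates bias.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

rng = np.random.default_rng(1)
n_ind, n_snp, n_qtl = 400, 2000, 50
M = rng.binomial(2, 0.3, size=(n_ind, n_snp)).astype(float)    # SNP genotypes (synthetic)
beta = np.zeros(n_snp)
beta[rng.choice(n_snp, n_qtl, replace=False)] = rng.normal(0, 1, n_qtl)
g = M @ beta                                                   # true breeding values
y = g + rng.normal(0, g.std(), n_ind)                          # phenotypes, h2 ~ 0.5

acc, slope = [], []
for train, test in KFold(n_splits=5, shuffle=True, random_state=1).split(M):
    model = Ridge(alpha=100.0).fit(M[train], y[train])         # GBLUP-like shrinkage
    y_hat = model.predict(M[test])
    acc.append(np.corrcoef(y_hat, y[test])[0, 1])              # prediction accuracy
    slope.append(np.polyfit(y_hat, y[test], 1)[0])             # bias: slope of obs on pred
print(f"accuracy {np.mean(acc):.2f}, regression slope {np.mean(slope):.2f}")
```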

  1. Validation of the ROMI-RIP rough mill simulator

    Treesearch

    Edward R. Thomas; Urs Buehlmann

    2002-01-01

    The USDA Forest Service's ROMI-RIP rough mill rip-first simulation program is a popular tool for analyzing rough mill conditions, determining more efficient rough mill practices, and finding optimal lumber board cut-up patterns. However, until now, the results generated by ROMI-RIP have not been rigorously compared to those of an actual rough mill. Validating the...

  2. Development and validation of real-time simulation of X-ray imaging with respiratory motion.

    PubMed

    Vidal, Franck P; Villard, Pierre-Frédéric

    2016-04-01

    We present a framework that combines evolutionary optimisation, soft-tissue modelling and ray tracing on the GPU to simultaneously compute the respiratory motion and X-ray imaging in real time. Our aim is to provide validated building blocks with high fidelity to closely match both the human physiology and the physics of X-rays. A CPU-based set of algorithms is presented to model organ behaviours during respiration. Soft-tissue deformation is computed with an extension of the Chain Mail method. Rigid elements move according to kinematic laws. A GPU-based surface rendering method is proposed to compute the X-ray image using the Beer-Lambert law. It is provided as an open-source library. A quantitative validation study is provided to objectively assess the accuracy of both components: (i) the respiration against anatomical data, and (ii) the X-ray against the Beer-Lambert law and the results of Monte Carlo simulations. Our implementation can be used in various applications, such as an interactive medical virtual environment for training percutaneous transhepatic cholangiography in interventional radiology, 2D/3D registration, computation of digitally reconstructed radiographs, and simulation of 4D sinograms to test tomography reconstruction tools. Copyright © 2015 Elsevier Ltd. All rights reserved.
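
    The X-ray component rests on the Beer-Lambert law, which for a ray crossing several materials multiplies the exponential attenuation factors along the path; a minimal sketch (with illustrative attenuation coefficients, not a validated dataset) follows.

```python
import numpy as np

def transmitted_intensity(I0, mu, path_lengths):
    """Beer-Lambert law for a ray crossing several materials:
    I = I0 * exp(-sum_i mu_i * d_i), with mu_i the linear attenuation
    coefficient (1/cm) and d_i the path length (cm) in material i."""
    mu = np.asarray(mu, dtype=float)
    d = np.asarray(path_lengths, dtype=float)
    return I0 * np.exp(-np.sum(mu * d))

# Hypothetical ray through 3 cm soft tissue, 0.5 cm bone, 2 cm soft tissue
# (coefficients are illustrative values only):
I = transmitted_intensity(1.0, mu=[0.2, 0.5, 0.2], path_lengths=[3.0, 0.5, 2.0])
print(f"transmitted fraction: {I:.3f}")
```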

  3. DES Y1 Results: Validating Cosmological Parameter Estimation Using Simulated Dark Energy Surveys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacCrann, N.; et al.

    We use mock galaxy survey simulations designed to resemble the Dark Energy Survey Year 1 (DES Y1) data to validate and inform cosmological parameter estimation. When similar analysis tools are applied to both simulations and real survey data, they provide powerful validation tests of the DES Y1 cosmological analyses presented in companion papers. We use two suites of galaxy simulations produced using different methods, which therefore provide independent tests of our cosmological parameter inference. The cosmological analysis we aim to validate is presented in DES Collaboration et al. (2017) and uses angular two-point correlation functions of galaxy number counts and weak lensing shear, as well as their cross-correlation, in multiple redshift bins. While our constraints depend on the specific set of simulated realizations available, for both suites of simulations we find that the input cosmology is consistent with the combined constraints from multiple simulated DES Y1 realizations in the $\Omega_m$-$\sigma_8$ plane. For one of the suites, we are able to show with high confidence that any biases in the inferred $S_8=\sigma_8(\Omega_m/0.3)^{0.5}$ and $\Omega_m$ are smaller than the DES Y1 $1\sigma$ uncertainties. For the other suite, for which we have fewer realizations, we are unable to be this conclusive; we infer a roughly 70% probability that systematic biases in the recovered $\Omega_m$ and $S_8$ are sub-dominant to the DES Y1 uncertainty. As cosmological analyses of this kind become increasingly more precise, validation of parameter inference using survey simulations will be essential to demonstrate robustness.

  4. Numerical Validation of Chemical Compositional Model for Wettability Alteration Processes

    NASA Astrophysics Data System (ADS)

    Bekbauov, Bakhbergen; Berdyshev, Abdumauvlen; Baishemirov, Zharasbek; Bau, Domenico

    2017-12-01

    Chemical compositional simulation of enhanced oil recovery and surfactant-enhanced aquifer remediation processes is a complex task that involves solving dozens of equations for all grid blocks representing a reservoir. In the present work, we perform a numerical validation of a newly developed mathematical formulation which satisfies the conservation laws of mass and energy and allows applying a sequential solution approach to solve the governing equations separately and implicitly. Through its application to a numerical experiment using a wettability alteration model, and through comparisons with an existing chemical compositional model's numerical results, the new formulation has proven to be practical, reliable and stable.

  5. Development and validation of a building design waste reduction model.

    PubMed

    Llatas, C; Osmani, M

    2016-10-01

    Reduction in construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process to achieve waste reduction at source, which enables an informed prediction of their wastage reduction levels. However the lack of quantitative methods linking design strategies to waste reduction hinders designing out waste practice in building projects. Therefore, this paper addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that aims to investigate the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprises three stages. Firstly, design waste causes were analyzed. Secondly, design strategies were applied leading to several alternative low waste building elements. Finally, their potential source reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool to simulate designing out strategies in building projects. The knowledge provided by the model could help project stakeholders to better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Climate simulations and projections with a super-parameterized climate model

    DOE PAGES

    Stan, Cristiana; Xu, Li

    2014-07-01

    The mean climate and its variability are analyzed in a suite of numerical experiments with a fully coupled general circulation model in which subgrid-scale moist convection is explicitly represented through embedded 2D cloud-system-resolving models. Control simulations forced by the present-day, fixed atmospheric carbon dioxide concentration are conducted at two horizontal resolutions and validated against observations and reanalyses. The mean state simulated by the higher-resolution configuration has smaller biases. Climate variability also shows some sensitivity to resolution, though not as uniformly as the mean state: the interannual and seasonal variability are better represented in the simulation at lower resolution, whereas the subseasonal variability is more accurate in the higher-resolution simulation. The equilibrium climate sensitivity of the model is estimated from a simulation forced by an abrupt quadrupling of the atmospheric carbon dioxide concentration; it is 2.77 °C, slightly smaller than the mean value (3.37 °C) of contemporary models using conventional representations of cloud processes. The climate change simulation forced by the representative concentration pathway 8.5 scenario projects an increase in the frequency of severe droughts over most of North America.
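
    Equilibrium climate sensitivity is commonly estimated from an abrupt-4xCO2 run with a Gregory-style regression of top-of-atmosphere imbalance on surface warming; the sketch below shows that standard calculation on synthetic annual means (the abstract does not state which estimator the study used).

```python
import numpy as np

def gregory_ecs(delta_T, net_toa_flux):
    """Gregory-style ECS estimate from an abrupt 4xCO2 run: regress net TOA
    imbalance N on surface warming dT; the x-intercept is the 4xCO2
    equilibrium warming, halved for 2xCO2 (assumes forcing scales
    logarithmically with CO2)."""
    slope, intercept = np.polyfit(delta_T, net_toa_flux, 1)
    dT_eq_4x = -intercept / slope        # warming at which N crosses zero
    return dT_eq_4x / 2.0

# Synthetic annual means from a hypothetical abrupt-4xCO2 experiment:
dT = np.array([1.5, 2.5, 3.2, 3.8, 4.3, 4.8])   # surface warming, K
N  = np.array([5.2, 4.0, 3.1, 2.4, 1.8, 1.2])   # TOA imbalance, W m-2
print(f"ECS ~ {gregory_ecs(dT, N):.2f} K")
```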

  7. Beyond Corroboration: Strengthening Model Validation by Looking for Unexpected Patterns

    PubMed Central

    Chérel, Guillaume; Cottineau, Clémentine; Reuillon, Romain

    2015-01-01

    Models of emergent phenomena are designed to provide an explanation to global-scale phenomena from local-scale processes. Model validation is commonly done by verifying that the model is able to reproduce the patterns to be explained. We argue that robust validation must not only be based on corroboration, but also on attempting to falsify the model, i.e. making sure that the model behaves soundly for any reasonable input and parameter values. We propose an open-ended evolutionary method based on Novelty Search to look for the diverse patterns a model can produce. The Pattern Space Exploration method was tested on a model of collective motion and compared to three common a priori sampling experiment designs. The method successfully discovered all known qualitatively different kinds of collective motion, and performed much better than the a priori sampling methods. The method was then applied to a case study of city system dynamics to explore the model’s predicted values of city hierarchisation and population growth. This case study showed that the method can provide insights on potential predictive scenarios as well as falsifiers of the model when the simulated dynamics are highly unrealistic. PMID:26368917
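
    A minimal sketch of the novelty-scoring idea at the core of such a method: each candidate parameter set is mapped to a behaviour descriptor, and its novelty is the mean distance to its k nearest neighbours in descriptor space. The toy `run_model` descriptor and the population settings are hypothetical, not the Pattern Space Exploration implementation.

```python
import numpy as np

def novelty_scores(behaviors, k=15):
    """Novelty of each individual = mean distance to its k nearest
    neighbours in behaviour (pattern) space."""
    b = np.asarray(behaviors)
    d = np.linalg.norm(b[:, None, :] - b[None, :, :], axis=-1)  # pairwise distances
    d.sort(axis=1)
    return d[:, 1:k + 1].mean(axis=1)    # skip column 0 (distance to self)

def run_model(params):
    """Hypothetical stand-in for the simulation model under test: maps
    parameters to an output-pattern descriptor."""
    x, y = params
    return np.array([np.sin(3 * x) * y, x * y**2])

rng = np.random.default_rng(2)
pop = rng.uniform(-1, 1, size=(200, 2))                 # candidate parameter sets
descriptors = np.array([run_model(p) for p in pop])
scores = novelty_scores(descriptors)
archive = pop[np.argsort(scores)[-20:]]                 # keep the most novel candidates
```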

  8. Computer simulation of Cerebral Arteriovenous Malformation-validation analysis of hemodynamics parameters.

    PubMed

    Kumar, Y Kiran; Mehta, Shashi Bhushan; Ramachandra, Manjunath

    2017-01-01

    The purpose of this work is to provide validation methods for evaluating the hemodynamic assessment of Cerebral Arteriovenous Malformation (CAVM). This article emphasizes the importance of validating noninvasive measurements for CAVM patients, which are designed using lumped models for complex vessel structures. The validation of the hemodynamic assessment is based on invasive clinical measurements and on cross-validation against the Philips proprietary validated software packages Qflow and 2D Perfusion. The modeling results are validated for 30 CAVM patients across 150 vessel locations. Mean flow, diameter, and pressure were compared between modeling results and clinical/cross-validation measurements using an independent two-tailed Student t test. Exponential regression analysis was used to assess the relationships among blood flow, vessel diameter, and pressure. Univariate analysis with linear or exponential regression was used to assess the relationships between vessel diameter, vessel cross-sectional area, AVM volume, AVM pressure, and AVM flow. Modeling results were compared with clinical measurements from vessel locations of cerebral regions, and the model was cross-validated against Qflow and 2D Perfusion. Our results show that the modeling results closely match the clinical results, with small deviations. In this article, we have validated our modeling results with clinical measurements, and a new approach for cross-validation is proposed by demonstrating the accuracy of our results against a validated product in a clinical environment.

  9. Simulation of a G-tolerance curve using the pulsatile cardiovascular model

    NASA Technical Reports Server (NTRS)

    Solomon, M.; Srinivasan, R.

    1985-01-01

    A computer simulation study, performed to assess the ability of the cardiovascular model to reproduce the G-tolerance curve (G level versus tolerance time), is reported. A composite strength-duration curve derived from experimental data obtained in human centrifugation studies was used for comparison. The effects of abolishing autonomic control and of blood volume loss on G tolerance were also simulated. The results provide additional validation of the model. The need for the presence of autonomic reflexes even at low levels of G is pointed out. The low margin of safety with a loss of blood volume indicated by the simulation results underscores the necessity for protective measures during Shuttle reentry.

  10. Validation of design procedure and performance modeling of a heat and fluid transport field experiment in the unsaturated zone

    NASA Astrophysics Data System (ADS)

    Nir, A.; Doughty, C.; Tsang, C. F.

    Validation methods developed in the context of the deterministic concepts of past generations often cannot be directly applied to environmental problems, which may be characterized by limited reproducibility of results and highly complex models. Instead, validation is interpreted here as a series of activities, including both theoretical and experimental tests, designed to enhance our confidence in the capability of a proposed model to describe some aspect of reality. We examine the validation process applied to a project concerned with heat and fluid transport in porous media, in which mathematical modeling, simulation, and results of field experiments are evaluated in order to determine the feasibility of a system for seasonal thermal energy storage in shallow unsaturated soils. Technical details of the field experiments are not included, but appear in previous publications. Validation activities are divided into three stages. The first stage, carried out prior to the field experiments, is concerned with modeling the relevant physical processes, optimization of the heat-exchanger configuration and the shape of the storage volume, and multi-year simulation. Subjects requiring further theoretical and experimental study are identified at this stage. The second stage encompasses the planning and evaluation of the initial field experiment. Simulations are made to determine the experimental time scale and optimal sensor locations. Soil thermal parameters and temperature boundary conditions are estimated using an inverse method. Then results of the experiment are compared with model predictions using different parameter values and modeling approximations. In the third stage, results of an experiment performed under different boundary conditions are compared to predictions made by the models developed in the second stage. Various aspects of this theoretical and experimental field study are described as examples of the verification and validation procedure. There is no

  11. An ocular biomechanic model for dynamic simulation of different eye movements.

    PubMed

    Iskander, J; Hossny, M; Nahavandi, S; Del Porto, L

    2018-04-11

    Simulating and analysing eye movement is useful for assessing the visual system's contribution to discomfort with respect to body movements, especially in virtual environments where simulation sickness might occur. It can also be used in the design of an eye prosthesis or a humanoid robot eye. In this paper, we present two biomechanic ocular models that are easily integrated into available musculoskeletal models. The model was previously used to simulate eye-head coordination. The models are used to simulate and analyse eye movements. The proposed models are based on physiological and kinematic properties of the human eye. They incorporate an eye-globe, orbital suspension tissues and six muscles with their connective tissues (pulleys). Pulleys were incorporated in the rectus and inferior oblique muscles. The two proposed models are the passive-pulley and the active-pulley models. Dynamic simulations of different eye movements, including fixation, saccade and smooth pursuit, are performed to validate both models. The resultant force-length curves of the models were similar to the experimental data. The simulation results show that the proposed models are suitable for generating eye movement simulations with results comparable to other musculoskeletal models. The maximum kinematic root-mean-square error (RMSE) is 5.68° and 4.35° for the passive and active pulley models, respectively. The analysis of the muscle forces showed realistic muscle activation, with increased muscle synergy in the active pulley model. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Validation and Calibration of Nuclear Thermal Hydraulics Multiscale Multiphysics Models - Subcooled Flow Boiling Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anh Bui; Nam Dinh; Brian Williams

    In addition to the validation data plan, development of advanced techniques for calibration and validation of complex multiscale, multiphysics nuclear reactor simulation codes is a main objective of the CASL VUQ plan. Advanced modeling of LWR systems normally involves a range of physico-chemical models describing multiple interacting phenomena, such as thermal hydraulics, reactor physics, coolant chemistry, etc., which occur over a wide range of spatial and temporal scales. To a large extent, the accuracy of (and uncertainty in) overall model predictions is determined by the correctness of various sub-models, which are not conservation-law based but empirically derived from measurement data. Such sub-models normally require extensive calibration before the models can be applied to the analysis of real reactor problems. This work demonstrates a case study of calibration of a common model of subcooled flow boiling, which is an important multiscale, multiphysics phenomenon in LWR thermal hydraulics. The calibration process is based on a new strategy of model-data integration, in which all sub-models are simultaneously analyzed and calibrated using multiple sets of data of different types. Specifically, both data on large-scale distributions of void fraction and fluid temperature and data on small-scale physics of wall evaporation were used simultaneously in this work's calibration. In a departure from the traditional practice of tuning complex models, a modern calibration technique based on statistical modeling and Bayesian inference was employed, which allowed simultaneous calibration of multiple sub-models (and related parameters) using different datasets. Quality of data (relevancy, scalability, and uncertainty) could be taken into consideration in the calibration process. This work presents a step forward in the development and realization of the "CIPS Validation Data Plan" at the Consortium for Advanced Simulation of LWRs to
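
    A minimal sketch of simultaneous Bayesian calibration against two datasets of different types, using a plain Metropolis random walk; the one-parameter "model", both synthetic datasets, and the noise levels are stand-ins for the sub-models and the void-fraction and wall-evaporation data used in the study.

```python
import numpy as np

rng = np.random.default_rng(3)
theta_true = 0.7
x = np.linspace(0, 1, 20)
data_a = theta_true * x + rng.normal(0, 0.05, 20)        # dataset of type A (synthetic)
data_b = theta_true**2 + rng.normal(0, 0.02, 10)         # dataset of type B (synthetic)

def log_posterior(theta):
    """Uniform prior on (0, 2); Gaussian likelihoods for both data types."""
    if not 0.0 < theta < 2.0:
        return -np.inf
    resid_a = data_a - theta * x
    resid_b = data_b - theta**2
    return -0.5 * (np.sum((resid_a / 0.05)**2) + np.sum((resid_b / 0.02)**2))

theta, lp = 1.0, -np.inf
chain = []
for _ in range(5000):                                    # Metropolis random walk
    prop = theta + rng.normal(0, 0.05)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
posterior = np.array(chain[1000:])                       # discard burn-in
print(f"posterior mean {posterior.mean():.3f} +/- {posterior.std():.3f}")
```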

  13. Validation of a small-animal PET simulation using GAMOS: a GEANT4-based framework

    NASA Astrophysics Data System (ADS)

    Cañadas, M.; Arce, P.; Rato Mendes, P.

    2011-01-01

    Monte Carlo-based modelling is a powerful tool to help in the design and optimization of positron emission tomography (PET) systems. The performance of these systems depends on several parameters, such as detector physical characteristics, shielding or electronics, whose effects can be studied on the basis of realistic simulated data. The aim of this paper is to validate a comprehensive study of the Raytest ClearPET small-animal PET scanner using a new Monte Carlo simulation platform which has been developed at CIEMAT (Madrid, Spain), called GAMOS (GEANT4-based Architecture for Medicine-Oriented Simulations). This toolkit, based on the GEANT4 code, was originally designed to cover multiple applications in the field of medical physics from radiotherapy to nuclear medicine, but has since been applied by some of its users in other fields of physics, such as neutron shielding, space physics, high energy physics, etc. Our simulation model includes the relevant characteristics of the ClearPET system, namely, the double layer of scintillator crystals in phoswich configuration, the rotating gantry, the presence of intrinsic radioactivity in the crystals or the storage of single events for an off-line coincidence sorting. Simulated results are contrasted with experimental acquisitions including studies of spatial resolution, sensitivity, scatter fraction and count rates in accordance with the National Electrical Manufacturers Association (NEMA) NU 4-2008 protocol. Spatial resolution results showed a discrepancy between simulated and measured values equal to 8.4% (with a maximum FWHM difference over all measurement directions of 0.5 mm). Sensitivity results differ less than 1% for a 250-750 keV energy window. Simulated and measured count rates agree well within a wide range of activities, including under electronic saturation of the system (the measured peak of total coincidences, for the mouse-sized phantom, was 250.8 kcps reached at 0.95 MBq mL-1 and the simulated peak was

  14. A model of fluid and solute exchange in the human: validation and implications.

    PubMed

    Bert, J L; Gyenge, C C; Bowen, B D; Reed, R K; Lund, T

    2000-11-01

    In order to better understand the complex, dynamic behaviour of the redistribution and exchange of fluid and solutes administered to normal individuals or to those with acute hypovolemia, mathematical models are used in addition to direct experimental investigation. Initial validation of a model developed by our group involved data from animal experiments (Gyenge, C.C., Bowen, B.D., Reed, R.K. & Bert, J.L. 1999b. Am J Physiol 277 (Heart Circ Physiol 46), H1228-H1240). For a first validation involving humans, we compare the results of simulations with a wide range of different types of data from two experimental studies. These studies involved administration of normal saline or hypertonic saline with Dextran to both normal and 10% haemorrhaged subjects. We compared simulations with data including the dynamic changes in plasma and interstitial fluid volumes (V_PL and V_IT, respectively), plasma and interstitial colloid osmotic pressures (π_PL and π_IT, respectively), haematocrit (Hct), plasma solute concentrations and transcapillary flow rates. The model predictions were overall in very good agreement with the wide range of experimental results considered. Based on the conditions investigated, the model was also validated for humans. We used the model both to investigate mechanisms associated with the redistribution and transport of fluid and solutes administered following a mild haemorrhage and to speculate on the relationship between the timing and amount of fluid infusions and subsequent blood volume expansion.
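
    A minimal sketch of the kind of compartmental fluid-exchange dynamics such a model integrates: a Starling filtration term moves fluid between plasma and interstitium, opposed by lymph return. All parameter values and the colloid-osmotic-pressure closure are assumed for illustration, not the calibrated values of the model above.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (assumed, not the published model's values):
Kf = 0.005            # L/(mmHg*min), filtration coefficient
Pc, Pit = 11.0, -2.0  # capillary / interstitial hydrostatic pressure, mmHg
sigma = 0.9           # osmotic reflection coefficient
Jlymph = 0.002        # L/min, lymph return (held constant here)

def colloid_pressure(V, V_ref, pi_ref):
    """Crude assumed closure: colloid osmotic pressure falls as the
    compartment dilutes."""
    return pi_ref * V_ref / V

def rhs(t, y):
    V_pl, V_it = y
    pi_pl = colloid_pressure(V_pl, 3.0, 25.0)
    pi_it = colloid_pressure(V_it, 8.4, 10.0)
    Jv = Kf * ((Pc - Pit) - sigma * (pi_pl - pi_it))   # Starling filtration, L/min
    return [-Jv + Jlymph, Jv - Jlymph]

# A 1 L saline infusion approximated as an instantaneous plasma-volume step:
sol = solve_ivp(rhs, (0.0, 240.0), y0=[3.0 + 1.0, 8.4], max_step=1.0)
print(sol.y[:, -1])   # compartment volumes after 4 h
```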

  15. Use of soft data for multi-criteria calibration and validation of APEX: Impact on model simulations

    USDA-ARS?s Scientific Manuscript database

    It is widely known that the use of soft data and multiple model performance criteria in model calibration and validation is critical to ensuring that the model captures major hydrologic and water quality processes. The Agricultural Policy/Environmental eXtender (APEX) is a hydrologic and water quality mod...

  16. Modeling and simulation for space medicine operations: preliminary requirements considered

    NASA Technical Reports Server (NTRS)

    Dawson, D. L.; Billica, R. D.; McDonald, P. V.

    2001-01-01

    The NASA Space Medicine program is now developing plans for more extensive use of high-fidelity medical simulation systems. The use of simulation is seen as a means to use the limited time available for astronaut medical training more effectively. Training systems should be adaptable for use in a variety of training environments, including classrooms or laboratories, space vehicle mockups, analog environments, and microgravity. Modeling and simulation can also provide the space medicine development program with a mechanism for evaluating other medical technologies under operationally realistic conditions. Systems and procedures need preflight verification with ground-based testing. Traditionally, component testing has been accomplished, but practical means for "human in the loop" verification of patient care systems have been lacking. Medical modeling and simulation technology offer a potential means to accomplish such validation work. Initial considerations in the development of functional requirements and design standards for simulation systems for space medicine are discussed.

  17. Development and user validation of driving tasks for a power wheelchair simulator.

    PubMed

    Archambault, Philippe S; Blackburn, Émilie; Reid, Denise; Routhier, François; Miller, William C

    2017-07-01

    Mobility is important for participation in daily activities, and a power wheelchair (PW) can improve the quality of life of individuals with mobility impairments. A virtual reality simulator may be helpful in complementing PW skills training, which is generally seen as insufficient by both clinicians and PW users. To this end, specific, ecologically valid activities, such as entering an elevator and navigating through a shopping mall crowd, have been added to the McGill wheelchair (miWe) simulator through a user-centred approach. The objective of this study was to validate the choice of simulated activities in a group of newly trained PW users. We recruited 17 new PW users, who practiced with the miWe simulator at home for two weeks. They then related their experience through the Short Feedback Questionnaire, the perceived Ease of Use Questionnaire, and semi-structured interviews. Participants in general greatly appreciated their experience with the simulator. During the interviews, this group made similar comments about the activities as our previous group of expert PW users had done. They also insisted on the importance of realism in the miWe activities for their use in training. A PW simulator may be helpful if it supports the practice of activities in specific contexts (such as a bathroom or supermarket), to complement the basic skills training received in the clinic (such as driving forward, backward, turning, and avoiding obstacles). Implications for Rehabilitation New power wheelchair users appreciate practicing on a virtual reality simulator and find the experience useful when the simulated driving activities are realistic and ecologically valid. User-centred development can lead to simulated power wheelchair activities that adequately capture everyday driving challenges experienced in various environmental contexts.

  18. The SCEC Broadband Platform: Open-Source Software for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Silva, F.; Goulet, C. A.; Maechling, P. J.; Callaghan, S.; Jordan, T. H.

    2016-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform (BBP) is a carefully integrated collection of open-source scientific software programs that can simulate broadband (0-100 Hz) ground motions for earthquakes at regional scales. The BBP can run earthquake rupture and wave propagation modeling software to simulate ground motions for well-observed historical earthquakes and to quantify how well the simulated broadband seismograms match the observed seismograms. The BBP can also run simulations for hypothetical earthquakes. In this case, users input an earthquake location and magnitude description, a list of station locations, and a 1D velocity model for the region of interest, and the BBP software then calculates ground motions for the specified stations. The BBP scientific software modules implement kinematic rupture generation, low- and high-frequency seismogram synthesis using wave propagation through 1D layered velocity structures, several ground motion intensity measure calculations, and various ground motion goodness-of-fit tools. These modules are integrated into a software system that provides user-defined, repeatable, calculation of ground-motion seismograms, using multiple alternative ground motion simulation methods, and software utilities to generate tables, plots, and maps. The BBP has been developed over the last five years in a collaborative project involving geoscientists, earthquake engineers, graduate students, and SCEC scientific software developers. The SCEC BBP software released in 2016 can be compiled and run on recent Linux and Mac OS X systems with GNU compilers. It includes five simulation methods, seven simulation regions covering California, Japan, and Eastern North America, and the ability to compare simulation results against empirical ground motion models (aka GMPEs). The latest version includes updated ground motion simulation methods, a suite of new validation metrics and a simplified command line user interface.

  19. Simulations of an Offshore Wind Farm Using Large-Eddy Simulation and a Torque-Controlled Actuator Disc Model

    NASA Astrophysics Data System (ADS)

    Creech, Angus; Früh, Wolf-Gerrit; Maguire, A. Eoghan

    2015-05-01

    We present here a computational fluid dynamics (CFD) simulation of Lillgrund offshore wind farm, which is located in the Øresund Strait between Sweden and Denmark. The simulation combines a dynamic representation of wind turbines embedded within a large-eddy simulation CFD solver and uses hr-adaptive meshing to increase or decrease mesh resolution where required. This allows the resolution of both large-scale flow structures around the wind farm, and the local flow conditions at individual turbines; consequently, the response of each turbine to local conditions can be modelled, as well as the resulting evolution of the turbine wakes. This paper provides a detailed description of the turbine model which simulates the interaction between the wind, the turbine rotors, and the turbine generators by calculating the forces on the rotor, the body forces on the air, and instantaneous power output. This model was used to investigate a selection of key wind speeds and directions, investigating cases where a row of turbines would be fully aligned with the wind or at specific angles to the wind. Results shown here include presentations of the spin-up of turbines, the observation of eddies moving through the turbine array, meandering turbine wakes, and an extensive wind farm wake several kilometres in length. The key measurement available for cross-validation with operational wind farm data is the power output from the individual turbines, where the effect of unsteady turbine wakes on the performance of downstream turbines was a main point of interest. The results from the simulations were compared to the performance measurements from the real wind farm to provide a firm quantitative validation of this methodology. Having achieved good agreement between the model results and actual wind farm measurements, the potential of the methodology to provide a tool for further investigations of engineering and atmospheric science problems is outlined.

  20. A highly coarse-grained model to simulate entangled polymer melts.

    PubMed

    Zhu, You-Liang; Liu, Hong; Lu, Zhong-Yuan

    2012-04-14

    We introduce a highly coarse-grained model to simulate entangled polymer melts. In this model, a polymer chain is taken as a single coarse-grained particle, and the creation and annihilation of entanglements are regarded as stochastic events occurring in proper time intervals according to certain rules and probabilities. We build the relationship between the probability that an entanglement appears between any pair of neighboring chains in a given time interval and the rate of variation of entanglements, which describes the concurrent birth and death of entanglements. The probability of disappearance of entanglements is tuned to keep the total entanglement number around the target value. This model can reflect many characteristics of entanglements and macroscopic properties of polymer melts. As an illustration, we apply this model to simulate a polyethylene melt of C(1000)H(2002) at 450 K and further validate it by comparing to experimental data and other simulation results.
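
    A minimal sketch of the stochastic bookkeeping described above: in each time interval, entanglements appear between chain pairs with a fixed creation probability, while the annihilation probability is tuned so the expected total settles at a target count. All probabilities and sizes are illustrative, not the paper's rules.

```python
import numpy as np

rng = np.random.default_rng(4)
n_pairs, target, p_create = 5000, 2000, 0.01

# Annihilation probability chosen so expected creations and destructions
# balance exactly when the entanglement count sits at the target value:
#   (n_pairs - target) * p_create = target * p_destroy
p_destroy = p_create * (n_pairs - target) / target

entangled = np.zeros(n_pairs, dtype=bool)
counts = []
for step in range(1000):
    create = (~entangled) & (rng.random(n_pairs) < p_create)    # birth events
    destroy = entangled & (rng.random(n_pairs) < p_destroy)     # death events
    entangled |= create
    entangled &= ~destroy
    counts.append(entangled.sum())        # fluctuates around `target`

print("mean entanglement count:", np.mean(counts[200:]))
```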

  1. Advanced Reactors-Intermediate Heat Exchanger (IHX) Coupling: Theoretical Modeling and Experimental Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Utgikar, Vivek; Sun, Xiaodong; Christensen, Richard

    2016-12-29

    The overall goal of the research project was to model the behavior of the advanced reactor-intermediate heat exchanger system and to develop advanced control techniques for off-normal conditions. The specific objectives defined for the project were: 1. To develop the steady-state thermal hydraulic design of the intermediate heat exchanger (IHX); 2. To develop mathematical models to describe the advanced nuclear reactor-IHX-chemical process/power generation coupling during normal and off-normal operations, and to simulate the models using multiphysics software; 3. To develop control strategies using genetic algorithm or neural network techniques and couple these techniques with the multiphysics software; 4. To validate the models experimentally. The project objectives were accomplished by defining and executing four different tasks corresponding to these specific objectives. The first task involved selecting IHX candidates and developing steady-state designs for them. The second task involved modeling the transient and off-normal operation of the reactor-IHX system. The subsequent task dealt with the development of control strategies and involved algorithm development and simulation. The last task involved experimental validation of the thermal hydraulic performance of the two prototype heat exchangers designed and fabricated for the project, at steady-state and transient conditions, to simulate the coupling of the reactor-IHX-process plant system. The experimental work utilized two test facilities at The Ohio State University (OSU): the existing High-Temperature Helium Test Facility (HTHF) and a newly developed high-temperature molten salt facility.

  2. Validation of PV-RPM Code in the System Advisor Model.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Geoffrey Taylor; Lavrova, Olga; Freeman, Janine

    2017-04-01

    This paper describes efforts made by Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) to validate the SNL-developed PV Reliability Performance Model (PV-RPM) algorithm as implemented in the NREL System Advisor Model (SAM). The PV-RPM model is a library of functions that estimates component failure and repair in a photovoltaic system over a desired simulation period. The failure and repair distributions in this paper are probabilistic representations of component failure and repair based on data collected by SNL for a PV power plant operating in Arizona. The validation effort focuses on whether the failure and repair distributions used in the SAM implementation result in estimated failures that match the expected failures developed in the proof-of-concept implementation. Results indicate that the SAM implementation of PV-RPM provides the same results as the proof-of-concept implementation, indicating the algorithms were reproduced successfully.
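
    A minimal sketch of the failure/repair sampling that such a reliability-performance simulation performs: component times-to-failure and repair durations are drawn from probability distributions and accumulated over the simulation period. The Weibull and lognormal parameters below are illustrative assumptions, not the SNL plant data.

```python
import numpy as np

rng = np.random.default_rng(5)
n_inverters, years = 20, 25
horizon = years * 8760.0                       # simulation period in hours

failures = 0
for _ in range(n_inverters):
    t = 0.0
    while True:
        t += rng.weibull(1.5) * 80000.0        # time to failure, hours (assumed Weibull)
        if t > horizon:
            break
        failures += 1
        t += rng.lognormal(np.log(72.0), 0.5)  # repair duration, hours (assumed lognormal)

print(f"sampled ~{failures} inverter failures over {years} years")
```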

  3. Catchment-scale Validation of a Physically-based, Post-fire Runoff and Erosion Model

    NASA Astrophysics Data System (ADS)

    Quinn, D.; Brooks, E. S.; Robichaud, P. R.; Dobre, M.; Brown, R. E.; Wagenbrenner, J.

    2017-12-01

    The cascading consequences of fire-induced ecological changes have profound impacts on both natural and managed forest ecosystems. Forest managers tasked with implementing post-fire mitigation strategies need robust tools to evaluate the effectiveness of their decisions, particularly those affecting hydrological recovery. Various hillslope-scale interfaces of the physically-based Water Erosion Prediction Project (WEPP) model have been successfully validated for this purpose using fire-affected plot experiments; however, these interfaces are explicitly designed to simulate single hillslopes. Spatially distributed, catchment-scale WEPP interfaces have been developed over the past decade, but none have been validated for post-fire simulations, posing a barrier to adoption by forest managers. In this validation study, we compare WEPP simulations with pre- and post-fire hydrological records for three forested catchments (W. Willow, N. Thomas, and S. Thomas) that burned in the 2011 Wallow Fire in Northeastern Arizona, USA. Simulations were conducted using two approaches: the first using automatically created inputs from an online, spatial, post-fire WEPP interface, and the second using manually created inputs which incorporate the spatial variability of fire effects observed in the field. Both approaches were compared to five years of observed post-fire sediment and flow data to assess goodness of fit.
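
    Goodness of fit for simulated versus observed flows is often summarized with the Nash-Sutcliffe efficiency; a minimal sketch follows (the abstract does not state which statistic the study used), with hypothetical event flows.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 = perfect fit, 0 = no better than the
    mean of the observations; a common goodness-of-fit score for
    hydrologic simulations."""
    o = np.asarray(observed, dtype=float)
    s = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((o - s)**2) / np.sum((o - o.mean())**2)

# Hypothetical post-fire event flows (mm) for one catchment:
q_obs = [0.0, 2.1, 8.4, 3.3, 0.9]
q_sim = [0.1, 1.8, 7.6, 4.0, 1.2]
print(f"NSE = {nash_sutcliffe(q_obs, q_sim):.2f}")
```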

  4. Atomistic simulations of ultra-short pulse laser ablation of aluminum: validity of the Lambert-Beer law

    NASA Astrophysics Data System (ADS)

    Eisfeld, Eugen; Roth, Johannes

    2018-05-01

    Based on hybrid molecular dynamics/two-temperature simulations, we study the validity of the application of Lambert-Beer's law, which is conveniently used in various modeling approaches of ultra-short pulse laser ablation of metals. The method is compared to a more rigorous treatment, which involves solving the Helmholtz wave equation for different pulse durations ranging from 100 fs to 5 ps and a wavelength of 800 nm. Our simulations show a growing agreement with increasing pulse durations, and we provide appropriate optical parameters for all investigated pulse durations.

  5. Consequence modeling using the fire dynamics simulator.

    PubMed

    Ryder, Noah L; Sutula, Jason A; Schemel, Christopher F; Hamer, Andrew J; Van Brunt, Vincent

    2004-11-11

    The use of Computational Fluid Dynamics (CFD) and in particular Large Eddy Simulation (LES) codes to model fires provides an efficient tool for the prediction of large-scale effects that include plume characteristics, combustion product dispersion, and heat effects to adjacent objects. This paper illustrates the strengths of the Fire Dynamics Simulator (FDS), an LES code developed by the National Institute of Standards and Technology (NIST), through several small and large-scale validation runs and process safety applications. The paper presents two fire experiments: a small room fire and a large (15 m diameter) pool fire. The model results are compared to experimental data and demonstrate good agreement between the models and data. The validation work is then extended to demonstrate applicability to process safety concerns by detailing a model of a tank farm fire and a model of the ignition of a gaseous fuel in a confined space. In this simulation, a room was filled with propane, given time to disperse, and was then ignited. The model yields accurate results of the dispersion of the gas throughout the space. This information can be used to determine flammability and explosive limits in a space and can be used in subsequent models to determine the pressure and temperature waves that would result from an explosion. The model dispersion results were compared to an experiment performed by Factory Mutual. Using the above examples, this paper will demonstrate that FDS is ideally suited to build realistic models of process geometries in which large-scale explosion and fire failure risks can be evaluated with several distinct advantages over more traditional CFD codes. Namely, transient solutions to fire and explosion growth can be produced with less sophisticated hardware (lower cost) than needed for traditional CFD codes (PC-type computer versus UNIX workstation) and can be solved for longer time histories (on the order of hundreds of seconds of computed time) with

  6. An eleven-year validation of a physically-based distributed dynamic ecohydrological model tRIBS+VEGGIE: Walnut Gulch Experimental Watershed

    NASA Astrophysics Data System (ADS)

    Sivandran, G.; Bisht, G.; Ivanov, V. Y.; Bras, R. L.

    2008-12-01

    A coupled, dynamic vegetation and hydrologic model, tRIBS+VEGGIE, was applied to the semiarid Walnut Gulch Experimental Watershed in Arizona. The physically-based, distributed nature of the coupled model allows for parameterization and simulation of watershed vegetation-water-energy dynamics on timescales varying from hourly to interannual. The model also allows for explicit spatial representation of processes that vary due to complex topography, such as lateral redistribution of moisture and partitioning of radiation with respect to aspect and slope. Model parameterization and forcing were conducted using readily available databases for topography, soil types, and land use cover as well as data from a network of meteorological stations located within the Walnut Gulch watershed. In order to test the performance of the model, three sets of simulations were conducted over an 11-year period from 1997 to 2007. Two simulations focus on heavily instrumented nested watersheds within the Walnut Gulch basin: (i) the Kendall watershed, which is dominated by annual grasses; and (ii) the Lucky Hills watershed, which is dominated by a mixture of deciduous and evergreen shrubs. The third set of simulations covers the entire Walnut Gulch watershed. Model validation and performance were evaluated in relation to three broad categories: (i) energy balance components: the network of meteorological stations was used to validate the key energy fluxes; (ii) water balance components: the network of flumes, rain gauges and soil moisture stations installed within the watershed was utilized to validate the manner in which the model partitions moisture; and (iii) vegetation dynamics: remote sensing products from MODIS were used to validate spatial and temporal vegetation dynamics. Model results demonstrate satisfactory spatial and temporal agreement with observed data, giving confidence that key ecohydrological processes can be adequately represented for future applications of tRIBS+VEGGIE in

  7. Face and construct validity of a computer-based virtual reality simulator for ERCP.

    PubMed

    Bittner, James G; Mellinger, John D; Imam, Toufic; Schade, Robert R; Macfadyen, Bruce V

    2010-02-01

    Currently, little evidence supports computer-based simulation for ERCP training. To determine face and construct validity of a computer-based simulator for ERCP and assess its perceived utility as a training tool. Novice and expert endoscopists completed 2 simulated ERCP cases by using the GI Mentor II. Virtual Education and Surgical Simulation Laboratory, Medical College of Georgia. Outcomes included times to complete the procedure, reach the papilla, and use fluoroscopy; attempts to cannulate the papilla, pancreatic duct, and common bile duct; and number of contrast injections and complications. Subjects assessed simulator graphics, procedural accuracy, difficulty, haptics, overall realism, and training potential. Only when performance data from cases A and B were combined did the GI Mentor II differentiate novices and experts based on times to complete the procedure, reach the papilla, and use fluoroscopy. Across skill levels, overall opinions were similar regarding graphics (moderately realistic), accuracy (similar to clinical ERCP), difficulty (similar to clinical ERCP), overall realism (moderately realistic), and haptics. Most participants (92%) claimed that the simulator has definite training potential or should be required for training. Small sample size, single institution. The GI Mentor II demonstrated construct validity for ERCP based on select metrics. Most subjects thought that the simulated graphics, procedural accuracy, and overall realism exhibit face validity. Subjects deemed it a useful training tool. Study repetition involving more participants and cases may help confirm results and establish the simulator's ability to differentiate skill levels based on ERCP-specific metrics.

  8. Geometric modeling of the temporal bone for cochlea implant simulation

    NASA Astrophysics Data System (ADS)

    Todd, Catherine A.; Naghdy, Fazel; O'Leary, Stephen

    2004-05-01

    The first stage in the development of a clinically valid surgical simulator for training otologic surgeons in performing cochlear implantation is presented. For this purpose, a geometric model of the temporal bone has been derived from a cadaver specimen using the biomedical image processing software package Analyze (AnalyzeDirect, Inc) and its three-dimensional reconstruction is examined. Simulator construction begins with registration and processing of a Computed Tomography (CT) medical image sequence. Important anatomical structures of the middle and inner ear are identified and segmented from each scan in a semi-automated, threshold-based approach. Linear interpolation between image slices produces a three-dimensional volume dataset: the geometric model. Artefacts are effectively eliminated using a semi-automatic seeded region-growing algorithm and unnecessary bony structures are removed. Once validated by an Ear, Nose and Throat (ENT) specialist, the model may be imported into the Reachin Application Programming Interface (API) (Reachin Technologies AB) for visual and haptic rendering associated with a virtual mastoidectomy. Interaction with the model is realized through a haptics interface, providing the user with accurate torque and force feedback. Electrode array insertion into the cochlea will be introduced in the final stage of design.

  9. Testing and validating environmental models

    USGS Publications Warehouse

    Kirchner, J.W.; Hooper, R.P.; Kendall, C.; Neal, C.; Leavesley, G.

    1996-01-01

    Generally accepted standards for testing and validating ecosystem models would benefit both modellers and model users. Universally applicable test procedures are difficult to prescribe, given the diversity of modelling approaches and the many uses for models. However, the generally accepted scientific principles of documentation and disclosure provide a useful framework for devising general standards for model evaluation. Adequately documenting model tests requires explicit performance criteria, and explicit benchmarks against which model performance is compared. A model's validity, reliability, and accuracy can be most meaningfully judged by explicit comparison against the available alternatives. In contrast, current practice is often characterized by vague, subjective claims that model predictions show 'acceptable' agreement with data; such claims provide little basis for choosing among alternative models. Strict model tests (those that invalid models are unlikely to pass) are the only ones capable of convincing rational skeptics that a model is probably valid. However, 'false positive' rates as low as 10% can substantially erode the power of validation tests, making them insufficiently strict to convince rational skeptics. Validation tests are often undermined by excessive parameter calibration and overuse of ad hoc model features. Tests are often also divorced from the conditions under which a model will be used, particularly when it is designed to forecast beyond the range of historical experience. In such situations, data from laboratory and field manipulation experiments can provide particularly effective tests, because one can create experimental conditions quite different from historical data, and because experimental data can provide a more precisely defined 'target' for the model to hit. We present a simple demonstration showing that the two most common methods for comparing model predictions to environmental time series (plotting model time series
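
    The call for explicit benchmarks can be made concrete with a skill score that judges a model's mean-squared error against a stated alternative; here the alternative is a persistence forecast, and all numbers are illustrative:

    ```python
    import numpy as np

    def skill_score(obs, model, benchmark):
        """Mean-squared-error skill relative to an explicit benchmark:
        1 = perfect, 0 = no better than the benchmark, < 0 = worse."""
        obs, model, benchmark = map(np.asarray, (obs, model, benchmark))
        mse_model = np.mean((obs - model) ** 2)
        mse_bench = np.mean((obs - benchmark) ** 2)
        return 1.0 - mse_model / mse_bench

    # Illustrative: the benchmark is a persistence forecast of the series.
    obs = np.array([2.0, 2.3, 2.8, 2.6, 3.1, 3.0])
    model = np.array([2.1, 2.4, 2.6, 2.7, 3.0, 3.2])
    persistence = np.concatenate(([obs[0]], obs[:-1]))
    print(f"skill vs persistence: {skill_score(obs, model, persistence):.2f}")
    ```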

  10. Hierarchical calibration and validation of computational fluid dynamics models for solid sorbent-based carbon capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lai, Canhai; Xu, Zhijie; Pan, Wenxiao

    2016-01-01

    To quantify the predictive confidence of a solid sorbent-based carbon capture design, a hierarchical validation methodology, consisting of basic unit problems with increasing physical complexity coupled with filtered model-based geometric upscaling, has been developed and implemented. This paper describes the computational fluid dynamics (CFD) multi-phase reactive flow simulations and the associated data flows among different unit problems performed within the said hierarchical validation approach. The bench-top experiments used in this calibration and validation effort were carefully designed to follow the desired simple-to-complex unit problem hierarchy, with corresponding data acquisition to support model parameter calibrations at each unit problem level. A Bayesian calibration procedure is employed and the posterior model parameter distributions obtained at one unit-problem level are used as prior distributions for the same parameters in the next-tier simulations. Overall, the results have demonstrated that the multiphase reactive flow models within MFIX can be used to capture the bed pressure, temperature, CO2 capture capacity, and kinetics with quantitative accuracy. The CFD modeling methodology and associated uncertainty quantification techniques presented herein offer a solid framework for estimating the predictive confidence in the virtual scale-up of a larger carbon capture device.
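
    A minimal sketch of the tiered calibration idea, using a one-parameter grid approximation in Python: the posterior from the first unit problem becomes the prior for the next tier. The forward models, observations, and noise levels are invented for illustration and bear no relation to the MFIX parameters:

    ```python
    import numpy as np

    # Grid over one model parameter (a hypothetical kinetic rate constant).
    theta = np.linspace(0.0, 2.0, 401)
    prior = np.ones_like(theta)           # flat prior at the first tier
    prior /= np.trapz(prior, theta)

    def update(prior, predict, observed, sigma):
        """One calibration tier: Gaussian likelihood of the observed value
        given each candidate parameter, multiplied by the incoming prior."""
        like = np.exp(-0.5 * ((observed - predict(theta)) / sigma) ** 2)
        post = like * prior
        return post / np.trapz(post, theta)

    # Tier 1 (simple unit problem), then tier 2 reuses the tier-1 posterior.
    post1 = update(prior, lambda t: 3.0 * t, observed=2.4, sigma=0.3)
    post2 = update(post1, lambda t: t ** 2 + 1.0, observed=1.6, sigma=0.2)
    print(f"tier-2 posterior mean: {np.trapz(theta * post2, theta):.3f}")
    ```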

  11. Non-local electron transport validation using 2D DRACO simulations

    NASA Astrophysics Data System (ADS)

    Cao, Duc; Chenhall, Jeff; Moll, Eli; Prochaska, Alex; Moses, Gregory; Delettrez, Jacques; Collins, Tim

    2012-10-01

    Comparison of 2D DRACO simulations, using a modified version (private communications with M. Marinak and G. Zimmerman, LLNL) of the Schurtz, Nicolai and Busquet (SNB) algorithm for non-local electron transport (Schurtz, Nicolai and Busquet, "A nonlocal electron conduction model for multidimensional radiation hydrodynamics codes," Phys. Plasmas 7, 4238 (2000)), with direct drive shock timing experiments (T. Boehly et al., "Multiple spherically converging shock waves in liquid deuterium," Phys. Plasmas 18, 092706 (2011)) and with the Goncharov non-local model (V. Goncharov et al., "Early stage of implosion in inertial confinement fusion: shock timing and perturbation evolution," Phys. Plasmas 13, 012702 (2006)) in 1D LILAC will be presented. Addition of an improved SNB non-local electron transport algorithm in DRACO allows direct drive simulations with no need for an electron conduction flux limiter. Validation with shock timing experiments that mimic the laser pulse profile of direct drive ignition targets gives a higher confidence level in the predictive capability of the DRACO code. This research was supported by the University of Rochester Laboratory for Laser Energetics.

  12. Validation and uncertainty analysis of a pre-treatment 2D dose prediction model

    NASA Astrophysics Data System (ADS)

    Baeza, Jose A.; Wolfs, Cecile J. A.; Nijsten, Sebastiaan M. J. J. G.; Verhaegen, Frank

    2018-02-01

    Independent verification of complex treatment delivery with megavolt photon beam radiotherapy (RT) has been effectively used to detect and prevent errors. This work presents the validation and uncertainty analysis of a model that predicts 2D portal dose images (PDIs) without a patient or phantom in the beam. The prediction model is based on an exponential point dose model with separable primary and secondary photon fluence components. The model includes a scatter kernel, off-axis ratio map, transmission values and penumbra kernels for beam-delimiting components. These parameters were derived through a model fitting procedure supplied with point dose and dose profile measurements of radiation fields. The model was validated against a treatment planning system (TPS; Eclipse) and radiochromic film measurements for complex clinical scenarios, including volumetric modulated arc therapy (VMAT). Confidence limits on fitted model parameters were calculated based on simulated measurements. A sensitivity analysis was performed to evaluate the effect of the parameter uncertainties on the model output. For the maximum uncertainty, the maximum-deviating measurement sets were propagated through the fitting procedure and the model. The overall uncertainty was assessed using all simulated measurements. The validation of the prediction model against the TPS and the film showed good agreement, with on average 90.8% and 90.5% of pixels passing a (2%, 2 mm) global gamma analysis, respectively, with a low-dose threshold of 10%. The maximum and overall uncertainty of the model depend on the type of clinical plan used as input. The results can be used to study the robustness of the model. A model for predicting accurate 2D pre-treatment PDIs in complex RT scenarios can be used clinically and its uncertainties can be taken into account.
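
    A brute-force sketch of the (2%, 2 mm) global gamma analysis used for these comparisons; the test images are synthetic, and a clinical implementation would interpolate between pixels rather than search only on the grid:

    ```python
    import numpy as np

    def gamma_pass_rate(ref, evl, pixel_mm=1.0, dd=0.02, dta_mm=2.0, cutoff=0.10):
        """Global 2D gamma: for each reference pixel above the low-dose
        cutoff, find the minimum over nearby evaluated pixels of
        (dose diff / (dd * Dmax))^2 + (distance / dta)^2; gamma <= 1 passes."""
        dmax = ref.max()
        search = int(np.ceil(2 * dta_mm / pixel_mm))
        ny, nx = ref.shape
        passed, total = 0, 0
        for iy in range(ny):
            for ix in range(nx):
                if ref[iy, ix] < cutoff * dmax:
                    continue
                total += 1
                best = np.inf
                for jy in range(max(0, iy - search), min(ny, iy + search + 1)):
                    for jx in range(max(0, ix - search), min(nx, ix + search + 1)):
                        dist = pixel_mm * np.hypot(jy - iy, jx - ix)
                        ddiff = evl[jy, jx] - ref[iy, ix]
                        best = min(best, (ddiff / (dd * dmax)) ** 2 + (dist / dta_mm) ** 2)
                passed += best <= 1.0
        return passed / total

    ref = np.outer(np.hanning(40), np.hanning(40))
    noisy = ref * (1.0 + 0.01 * np.random.default_rng(1).standard_normal(ref.shape))
    print(f"pass rate: {100 * gamma_pass_rate(ref, noisy):.1f}%")
    ```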

  13. FDA Benchmark Medical Device Flow Models for CFD Validation.

    PubMed

    Malinauskas, Richard A; Hariharan, Prasanna; Day, Steven W; Herbertson, Luke H; Buesen, Martin; Steinseifer, Ulrich; Aycock, Kenneth I; Good, Bryan C; Deutsch, Steven; Manning, Keefe B; Craven, Brent A

    Computational fluid dynamics (CFD) is increasingly being used to develop blood-contacting medical devices. However, the lack of standardized methods for validating CFD simulations and blood damage predictions limits its use in the safety evaluation of devices. Through a U.S. Food and Drug Administration (FDA) initiative, two benchmark models of typical device flow geometries (nozzle and centrifugal blood pump) were tested in multiple laboratories to provide experimental velocities, pressures, and hemolysis data to support CFD validation. In addition, computational simulations were performed by more than 20 independent groups to assess current CFD techniques. The primary goal of this article is to summarize the FDA initiative and to report recent findings from the benchmark blood pump model study. Discrepancies between CFD predicted velocities and those measured using particle image velocimetry most often occurred in regions of flow separation (e.g., downstream of the nozzle throat, and in the pump exit diffuser). For the six pump test conditions, 57% of the CFD predictions of pressure head were within one standard deviation of the mean measured values. Notably, only 37% of all CFD submissions contained hemolysis predictions. This project aided in the development of an FDA Guidance Document on factors to consider when reporting computational studies in medical device regulatory submissions. There is an accompanying podcast available for this article. Please visit the journal's Web site (www.asaiojournal.com) to listen.

  14. Validation of computer simulation training for esophagogastroduodenoscopy: Pilot study.

    PubMed

    Sedlack, Robert E

    2007-08-01

    Little is known regarding the value of esophagogastroduodenoscopy (EGD) simulators in education. The purpose of the present paper was to validate the use of computer simulation in novice EGD training. In phase 1, expert endoscopists evaluated various aspects of simulation fidelity as compared to live endoscopy. Additionally, computer-recorded performance metrics were assessed by comparing the recorded scores from users of three different experience levels. In phase 2, the transfer of simulation-acquired skills to the clinical setting was assessed in a two-group, randomized pilot study. The setting was a large gastroenterology (GI) fellowship training program; in phase 1, 21 subjects (seven each of expert, intermediate, and novice endoscopists) made up the three experience groups. In phase 2, eight novice GI fellows were involved in the two-group, randomized portion of the study examining the transfer of simulation skills to the clinical setting. During the initial validation phase, each of the 21 subjects completed two standardized EGD scenarios on a computer simulator and their performance scores were recorded for seven parameters. Following this, staff participants completed a questionnaire evaluating various aspects of the simulator's fidelity. Finally, four novice GI fellows were randomly assigned to receive 6 h of simulator-augmented training (SAT group) in EGD prior to beginning 1 month of patient-based EGD training. The remaining fellows experienced 1 month of patient-based training alone (PBT group). Results of the seven measured performance parameters were compared among the three groups of varying experience using a Wilcoxon rank sum test. The staff's simulator fidelity survey used a 7-point Likert scale (1, very unrealistic; 4, neutral; 7, very realistic) for each of the parameters examined. During the second phase of this study, supervising staff rated both SAT and PBT fellows' patient-based performance daily. Scoring in each skill was completed using a 7-point

  15. Multiphase flow modeling and simulation of explosive volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Neri, Augusto

    Recent worldwide volcanic activity, such as the eruptions at Mt. St. Helens, Washington, in 1980 and Mt. Pinatubo, Philippines, in 1991, as well as the ongoing eruption at Montserrat, West Indies, has highlighted again the complex nature of explosive volcanic eruptions and the tremendous risk associated with them. By the year 2000, about 500 million people were expected to live under the shadow of an active volcano. The understanding of pyroclastic dispersion processes produced by explosive eruptions is, therefore, of primary interest, not only from the scientific point of view, but also because of the huge worldwide risk they pose. The thesis deals with interdisciplinary research aimed at the modeling and simulation of explosive volcanic eruptions by using multiphase thermo-fluid-dynamic models. The first part of the work was dedicated to the understanding and validation of the recently developed kinetic theory of two-phase flow. The hydrodynamics of fluid catalytic cracking particles in the IIT riser were simulated and compared with lab experiments. Simulation results confirm the validity of the kinetic theory approach. Transport of solids in the riser is due to dense clusters. On a time-average basis the bottom of the riser and the walls are dense, in agreement with IIT experimental data. The low frequency of oscillation (about 0.2 Hz) is also in agreement with data. The second part of the work was devoted to the development of transient two-dimensional multiphase and multicomponent flow models of pyroclastic dispersion processes. In particular, the dynamics of ground-hugging, high-speed and high-temperature pyroclastic flows generated by the collapse of volcanic columns or by impulsive discrete explosions was investigated. The model accounts for the mechanical and thermal non-equilibrium between a multicomponent gas phase and N different solid phases representative of pyroclastic particles of different sizes. Pyroclastic dispersion dynamics describes the formation

  16. Conceptual Design of Simulation Models in an Early Development Phase of Lunar Spacecraft Simulator Using SMP2 Standard

    NASA Astrophysics Data System (ADS)

    Lee, Hoon Hee; Koo, Cheol Hea; Moon, Sung Tae; Han, Sang Hyuck; Ju, Gwang Hyeok

    2013-08-01

    A conceptual study for a Korean lunar orbiter/lander prototype has been performed at the Korea Aerospace Research Institute (KARI). Across diverse space programs in European countries, a variety of simulation applications have been developed using the SMP2 (Simulation Model Portability 2) standard, which addresses the portability and reuse of simulation models by various model users. KARI has not only first-hand experience in developing an SMP-compatible simulation environment but also an ongoing study to apply the SMP2 development process to a simulator development project for lunar missions. KARI has tried to extend the coverage of the development domain based on the SMP2 standard across the whole simulation model life-cycle, from software design to its validation, through a lunar exploration project. Figure 1 shows a snapshot from a visualization tool for the simulation of lunar lander motion. In reality, a demonstrator prototype shown on the right-hand side of the image was made and tested in 2012. In an early phase of simulator development, prior to a kick-off in the near future, the targeted hardware to be modelled was investigated and identified at the end of 2012. The architectural breakdown of the lunar simulator at system level was performed, and an architecture with a hierarchical tree of models from the system down to parts at lower levels was established. Finally, SMP documents such as the Catalogue, Assembly, and Schedule were converted using an XML (eXtensible Markup Language) converter. To obtain the benefits of the approaches and design mechanisms suggested in the SMP2 standard as far as possible, object-oriented and component-based design concepts were strictly followed throughout the whole model development process.
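
    The document-generation step can be pictured with a small Python sketch that emits an SMP2-style Catalogue as XML. The element names, attributes, and model list are invented for illustration; real SMP2 Catalogue files follow the ECSS schema and carry much more structure:

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical model names and attributes; real SMP2 Catalogues carry
    # namespaces, types, and interfaces not shown here.
    catalogue = ET.Element("Catalogue", Name="LunarLanderSim")
    models = ET.SubElement(catalogue, "Models")
    for name in ("ThrusterModel", "PropellantTankModel", "TerrainSensorModel"):
        ET.SubElement(models, "Model", Name=name, Uuid=f"uuid-{name.lower()}")

    ET.indent(catalogue)  # pretty-print; available in Python 3.9+
    ET.ElementTree(catalogue).write("catalogue.xml", encoding="utf-8",
                                    xml_declaration=True)
    ```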

  17. Spiral computed tomography phase-space source model in the BEAMnrc/EGSnrc Monte Carlo system: implementation and validation.

    PubMed

    Kim, Sangroh; Yoshizumi, Terry T; Yin, Fang-Fang; Chetty, Indrin J

    2013-04-21

    Currently, the BEAMnrc/EGSnrc Monte Carlo (MC) system does not provide a spiral CT source model for the simulation of spiral CT scanning. We developed and validated a spiral CT phase-space source model in the BEAMnrc/EGSnrc system. The spiral phase-space source model was implemented in the DOSXYZnrc user code of the BEAMnrc/EGSnrc system by analyzing the geometry of a spiral CT scan (scan range, initial angle, rotational direction, pitch, slice thickness, etc.). Table movement was simulated by changing the coordinates of the isocenter as a function of beam angle. Some parameters such as pitch, slice thickness and translation per rotation were also incorporated into the model to make the new phase-space source model, designed specifically for spiral CT scan simulations. The source model was hard-coded by modifying the 'ISource = 8: Phase-Space Source Incident from Multiple Directions' option in the srcxyznrc.mortran and dosxyznrc.mortran files in the DOSXYZnrc user code. In order to verify the implementation, spiral CT scans were simulated in a CT dose index phantom using the validated x-ray tube model of a commercial CT simulator for both the original multi-direction source (ISOURCE = 8) and the new phase-space source model in the DOSXYZnrc system. The acquired 2D and 3D dose distributions were then analyzed with respect to the input parameters for various pitch values. In addition, surface-dose profiles were measured for a patient CT scan protocol using radiochromic film and were compared with the MC simulations. The new phase-space source model was found to simulate spiral CT scanning accurately in a single simulation run. It also produced a dose distribution equivalent to that of the ISOURCE = 8 model for the same CT scan parameters. The MC-simulated surface profiles matched the film measurements overall within 10%. The new spiral CT phase-space source model was implemented in the BEAMnrc/EGSnrc system. This work will be beneficial in estimating the spiral
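
    The table-movement idea (isocenter coordinates as a function of accumulated beam angle) reduces to a linear helix. A small sketch follows, with parameter names that are illustrative rather than actual DOSXYZnrc inputs:

    ```python
    import numpy as np

    def helical_source(angles_deg, start_angle_deg=0.0, pitch=1.0,
                       collimation_mm=10.0, direction=+1):
        """Source angle and isocenter z-offset for a spiral scan.

        The table advances by pitch * collimation per full rotation, so the
        isocenter z-coordinate is a linear function of the accumulated beam
        angle. All parameter names here are illustrative placeholders.
        """
        angles = np.asarray(angles_deg, float)
        rotations = direction * (angles - start_angle_deg) / 360.0
        z_mm = pitch * collimation_mm * rotations
        return angles % 360.0, z_mm

    ang, z = helical_source(np.arange(0.0, 1081.0, 90.0), pitch=1.5)
    for a, zz in zip(ang, z):
        print(f"beam angle {a:6.1f} deg -> isocenter shift {zz:7.2f} mm")
    ```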

  18. Spiral computed tomography phase-space source model in the BEAMnrc/EGSnrc Monte Carlo system: implementation and validation

    NASA Astrophysics Data System (ADS)

    Kim, Sangroh; Yoshizumi, Terry T.; Yin, Fang-Fang; Chetty, Indrin J.

    2013-04-01

    Currently, the BEAMnrc/EGSnrc Monte Carlo (MC) system does not provide a spiral CT source model for the simulation of spiral CT scanning. We developed and validated a spiral CT phase-space source model in the BEAMnrc/EGSnrc system. The spiral phase-space source model was implemented in the DOSXYZnrc user code of the BEAMnrc/EGSnrc system by analyzing the geometry of spiral CT scan—scan range, initial angle, rotational direction, pitch, slice thickness, etc. Table movement was simulated by changing the coordinates of the isocenter as a function of beam angles. Some parameters such as pitch, slice thickness and translation per rotation were also incorporated into the model to make the new phase-space source model, designed specifically for spiral CT scan simulations. The source model was hard-coded by modifying the ‘ISource = 8: Phase-Space Source Incident from Multiple Directions’ in the srcxyznrc.mortran and dosxyznrc.mortran files in the DOSXYZnrc user code. In order to verify the implementation, spiral CT scans were simulated in a CT dose index phantom using the validated x-ray tube model of a commercial CT simulator for both the original multi-direction source (ISOURCE = 8) and the new phase-space source model in the DOSXYZnrc system. Then the acquired 2D and 3D dose distributions were analyzed with respect to the input parameters for various pitch values. In addition, surface-dose profiles were also measured for a patient CT scan protocol using radiochromic film and were compared with the MC simulations. The new phase-space source model was found to simulate the spiral CT scanning in a single simulation run accurately. It also produced the equivalent dose distribution of the ISOURCE = 8 model for the same CT scan parameters. The MC-simulated surface profiles were well matched to the film measurement overall within 10%. The new spiral CT phase-space source model was implemented in the BEAMnrc/EGSnrc system. This work will be beneficial in estimating the

  19. Simulated ventriculostomy training with conventional neuronavigational equipment used clinically in the operating room: prospective validation study.

    PubMed

    Kirkman, Matthew A; Muirhead, William; Sevdalis, Nick; Nandi, Dipankar

    2015-01-01

    Simulation is gaining increasing interest as a method of delivering high-quality, time-effective, and safe training to neurosurgical residents. However, most current simulators are purpose-built for simulation, being relatively expensive and inaccessible to many residents. The purpose of this study was to provide the first comprehensive validity assessment of ventriculostomy performance metrics from the Medtronic StealthStation S7 Surgical Navigation System, a neuronavigational tool widely used in the clinical setting, as a training tool for simulated ventriculostomy, while concomitantly reporting on stress measures. This was a prospective study in which participants performed 6 simulated ventriculostomy attempts on a model head with StealthStation-coregistered imaging. The performance measures included distance of the ventricular catheter tip to the foramen of Monro and presence of the catheter tip in the ventricle. Data on objective and self-reported stress and workload measures were also collected. The setting was the operating rooms of the National Hospital for Neurology and Neurosurgery, Queen Square, London. Participants were 31 individuals with varying levels of prior ventriculostomy experience, ranging in seniority from medical student to senior resident. Performance at simulated ventriculostomy improved significantly over subsequent attempts, irrespective of previous ventriculostomy experience. Performance improved whether or not the StealthStation display monitor was used for real-time visual feedback, but performance was optimal when it was. Further, performance was inversely correlated with both objective and self-reported measures of stress (traditionally referred to as concurrent validity). Stress and workload measures were well correlated with each other, and they also correlated with technical performance. These initial data support the use of the StealthStation as a training tool for simulated ventriculostomy, providing a safe environment for repeated practice with immediate feedback

  20. Failure analysis of parameter-induced simulation crashes in climate models

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y.

    2013-08-01

    Simulations using IPCC (Intergovernmental Panel on Climate Change)-class climate models can fail or crash for a variety of reasons. Quantitative analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at particular combinations of POP2 parameter values. We applied support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicted model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve metric (AUC > 0.96). The causes of the simulation failures were determined through a global sensitivity analysis. Combinations of 8 parameters related to ocean mixing and viscosity from three different POP2 parameterizations were the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.
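
    The classification workflow can be sketched with scikit-learn: train an RBF-kernel SVM on parameter vectors labeled by run outcome, then score it by ROC AUC on a held-out set. The data below are synthetic stand-ins for the POP2 parameter ensemble:

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.metrics import roc_auc_score

    # Synthetic stand-in: 18 "parameter" columns and an invented failure rule;
    # the study used sampled POP2 parameters and observed CCSM4 crashes.
    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 1.0, size=(500, 18))
    y = ((X[:, 0] + X[:, 3] > 1.4) | (X[:, 7] < 0.05)).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    clf.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
    print(f"validation AUC = {auc:.3f}")
    ```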

  1. Flexible Programmes in Higher Professional Education: Expert Validation of a Flexible Educational Model

    ERIC Educational Resources Information Center

    Schellekens, Ad; Paas, Fred; Verbraeck, Alexander; van Merrienboer, Jeroen J. G.

    2010-01-01

    In a preceding case study, a process-focused demand-driven approach for organising flexible educational programmes in higher professional education (HPE) was developed. Operations management and instructional design contributed to designing a flexible educational model by means of discrete-event simulation. Educational experts validated the model…

  2. Improvement on a simplified model for protein folding simulation.

    PubMed

    Zhang, Ming; Chen, Changjun; He, Yi; Xiao, Yi

    2005-11-01

    Improvements were made on a simplified protein model, the Ramachandran model, to achieve better computer simulation of protein folding. To check the validity of these improvements, we chose the ultrafast-folding protein Engrailed Homeodomain as an example and explored several aspects of its folding. The Engrailed Homeodomain is a mainly alpha-helical protein of 61 residues from Drosophila melanogaster. We found that the simplified model of the Engrailed Homeodomain can fold into a global minimum state with a tertiary structure in good agreement with its native structure.

  3. Evaluation of nonlinearity and validity of nonlinear modeling for complex time series.

    PubMed

    Suzuki, Tomoya; Ikeguchi, Tohru; Suzuki, Masuo

    2007-10-01

    Even if an original time series exhibits nonlinearity, it is not always effective to approximate the time series by a nonlinear model because such nonlinear models have high complexity from the viewpoint of information criteria. Therefore, we propose two measures to evaluate both the nonlinearity of a time series and validity of nonlinear modeling applied to it by nonlinear predictability and information criteria. Through numerical simulations, we confirm that the proposed measures effectively detect the nonlinearity of an observed time series and evaluate the validity of the nonlinear model. The measures are also robust against observational noises. We also analyze some real time series: the difference of the number of chickenpox and measles patients, the number of sunspots, five Japanese vowels, and the chaotic laser. We can confirm that the nonlinear model is effective for the Japanese vowel /a/, the difference of the number of measles patients, and the chaotic laser.
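
    The core idea, that a nonlinear model is only worth adopting if its extra parameters pay for themselves under an information criterion, can be sketched by fitting linear and quadratic one-step predictors to a noisy chaotic series and comparing Gaussian AIC values; this is an illustration, not the authors' specific measures:

    ```python
    import numpy as np

    def aic(residuals, n_params):
        """Gaussian AIC: n * log(RSS / n) + 2k; lower is better."""
        n = residuals.size
        return n * np.log(np.sum(residuals ** 2) / n) + 2 * n_params

    # Chaotic logistic-map series with observational noise: nonlinear
    # structure that a linear one-step model cannot capture (illustrative).
    rng = np.random.default_rng(0)
    x = np.empty(500)
    x[0] = 0.3
    for i in range(499):
        x[i + 1] = 3.9 * x[i] * (1.0 - x[i])
    y = x + 0.01 * rng.standard_normal(x.size)

    past, future = y[:-1], y[1:]
    lin = np.polyfit(past, future, 1)    # linear predictor, 2 parameters
    quad = np.polyfit(past, future, 2)   # nonlinear predictor, 3 parameters
    print(f"AIC linear    : {aic(future - np.polyval(lin, past), 2):.1f}")
    print(f"AIC nonlinear : {aic(future - np.polyval(quad, past), 3):.1f}")
    ```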

  4. Evaluation of nonlinearity and validity of nonlinear modeling for complex time series

    NASA Astrophysics Data System (ADS)

    Suzuki, Tomoya; Ikeguchi, Tohru; Suzuki, Masuo

    2007-10-01

    Even if an original time series exhibits nonlinearity, it is not always effective to approximate the time series by a nonlinear model because such nonlinear models have high complexity from the viewpoint of information criteria. Therefore, we propose two measures to evaluate both the nonlinearity of a time series and validity of nonlinear modeling applied to it by nonlinear predictability and information criteria. Through numerical simulations, we confirm that the proposed measures effectively detect the nonlinearity of an observed time series and evaluate the validity of the nonlinear model. The measures are also robust against observational noises. We also analyze some real time series: the difference of the number of chickenpox and measles patients, the number of sunspots, five Japanese vowels, and the chaotic laser. We can confirm that the nonlinear model is effective for the Japanese vowel /a/, the difference of the number of measles patients, and the chaotic laser.

  5. Development and validation of a cost-utility model for Type 1 diabetes mellitus.

    PubMed

    Wolowacz, S; Pearson, I; Shannon, P; Chubb, B; Gundgaard, J; Davies, M; Briggs, A

    2015-08-01

    To develop a health economic model to evaluate the cost-effectiveness of new interventions for Type 1 diabetes mellitus by their effects on long-term complications (measured through mean HbA1c) while capturing the impact of treatment on hypoglycaemic events. Through a systematic review, we identified complications associated with Type 1 diabetes mellitus and data describing the long-term incidence of these complications. An individual patient simulation model was developed and included the following complications: cardiovascular disease, peripheral neuropathy, microalbuminuria, end-stage renal disease, proliferative retinopathy, ketoacidosis, cataract, hypoglycaemia and adverse birth outcomes. Risk equations were developed from published cumulative incidence data and hazard ratios for the effect of HbA1c, age and duration of diabetes. We validated the model by comparing model predictions with observed outcomes from studies used to build the model (internal validation) and from other published data (external validation). We performed illustrative analyses for typical patient cohorts and a hypothetical intervention. Model predictions were within 2% of expected values in the internal validation and within 8% of observed values in the external validation (percentages represent absolute differences in the cumulative incidence). The model utilized high-quality, recent data specific to people with Type 1 diabetes mellitus. In the model validation, results deviated less than 8% from expected values. © 2014 Research Triangle Institute d/b/a RTI Health Solutions. Diabetic Medicine © 2014 Diabetes UK.
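
    A minimal sketch of the risk-equation mechanism: a baseline annual complication risk is scaled by a hazard ratio per unit of HbA1c above a reference and sampled year by year for a simulated patient. All coefficients are invented and are not the published risk equations:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def annual_risk(base_risk, hba1c, hr_per_percent, ref_hba1c=7.0):
        """Scale a baseline annual complication risk by a hazard ratio per
        1% (absolute) HbA1c above a reference value; all numbers here are
        illustrative placeholders."""
        return base_risk * hr_per_percent ** (hba1c - ref_hba1c)

    def simulate_patient(hba1c, years=40, base_risk=0.005, hr=1.2):
        for year in range(years):
            if rng.random() < annual_risk(base_risk, hba1c, hr):
                return year  # year of first event
        return None

    events = [simulate_patient(9.0) for _ in range(10_000)]
    frac = sum(e is not None for e in events) / len(events)
    print(f"40-year cumulative incidence at HbA1c 9%: {frac:.1%}")
    ```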

  6. Comparison of existing models to simulate anaerobic digestion of lipid-rich waste.

    PubMed

    Béline, F; Rodriguez-Mendez, R; Girault, R; Bihan, Y Le; Lessard, P

    2017-02-01

    Models for anaerobic digestion of lipid-rich waste that take inhibition into account were reviewed and, if necessary, adjusted to the ADM1 model framework in order to compare them. Experimental data from anaerobic digestion of slaughterhouse waste at an organic loading rate (OLR) ranging from 0.3 to 1.9 kg VS m⁻³ d⁻¹ were used to compare and evaluate the models. Experimental data obtained at low OLRs were accurately modeled whatever the model, thereby validating the stoichiometric parameters used and the influent fractionation. However, at higher OLRs, although inhibition parameters were optimized to reduce differences between experimental and simulated data, no model was able to accurately simulate the accumulation of substrates and intermediates, mainly due to the incorrect simulation of pH. A simulation using pH based on experimental data showed that acetogenesis and methanogenesis were the steps most sensitive to LCFA inhibition and enabled identification of the inhibition parameters of both steps. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. The YAV-8B simulation and modeling. Volume 2: Program listing

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Detailed mathematical models of varying complexity representative of the YAV-8B aircraft are defined and documented. These models are used in parameter estimation and in linear analysis computer programs while investigating YAV-8B aircraft handling qualities. Both a six-degree-of-freedom nonlinear model and linearized three-degree-of-freedom longitudinal and lateral-directional models were developed. The nonlinear model is based on the mathematical model used on the MCAIR YAV-8B manned flight simulator. This simulator model has undergone periodic updating based on the results of approximately 360 YAV-8B flights and 8000 hours of wind tunnel testing. Qualified YAV-8B flight test pilots have commented that the handling qualities characteristics of the simulator are quite representative of the real aircraft. These comments are validated herein by comparing data from both static and dynamic flight test maneuvers to those obtained using the nonlinear program.

  8. Simulating the Cyclone Induced Turbulent Mixing in the Bay of Bengal using COAWST Model

    NASA Astrophysics Data System (ADS)

    Prakash, K. R.; Nigam, T.; Pant, V.

    2017-12-01

    Mixing in the upper oceanic layers (up to a few tens of meters from the surface) is an important process for understanding the evolution of sea surface properties. Enhanced mixing due to strong wind forcing at the surface deepens the mixed layer, which affects the air-sea exchange of heat and momentum fluxes and modulates sea surface temperature (SST). In the present study, we used the Coupled-Ocean-Atmosphere-Wave-Sediment Transport (COAWST) model to demonstrate and quantify the enhanced cyclone-induced turbulent mixing in the case of a severe cyclonic storm. The COAWST model was configured over the Bay of Bengal (BoB) and used to simulate the atmospheric and oceanic conditions prevailing during the tropical cyclone (TC) Phailin that occurred over the BoB during 10-15 October 2013. The model-simulated cyclone track was validated against the IMD best track, and model SST was validated against daily AVHRR SST data. Validation shows that the model-simulated track and intensity, SST, and salinity were in good agreement with observations, and the cyclone-induced cooling of the sea surface was well captured by the model. Model simulations show a considerable deepening (by 10-15 m) of the mixed layer and shoaling of the thermocline during TC Phailin. A power spectrum analysis was performed on the zonal and meridional baroclinic current components, which showed the strongest energy at 14 m depth. Model results were analyzed to investigate the non-uniform energy distribution in the water column from the surface up to the thermocline depth. A rotary spectra analysis highlights the downward direction of turbulent mixing during the TC Phailin period. Model simulations were used to quantify and interpret the near-inertial mixing generated by the strong cyclone-induced wind stress and the associated near-inertial energy. These near-inertial oscillations are responsible for enhancing mixing within the strong post-monsoon (October-November) stratification in the BoB.
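
    The rotary-spectrum step can be sketched in Python: the FFT of the complex current w = u + iv separates clockwise from counterclockwise rotary power, and near-inertial motion appears as a clockwise peak in the Northern Hemisphere. The series below is synthetic, not COAWST output:

    ```python
    import numpy as np

    def rotary_spectrum(u, v, dt_hours=1.0):
        """Clockwise rotary power of a horizontal current record, from the
        FFT of w = u + i*v; negative frequencies carry the clockwise
        (Northern Hemisphere near-inertial) motion."""
        w = np.asarray(u, float) + 1j * np.asarray(v, float)
        W = np.fft.fft(w - w.mean()) / w.size
        f = np.fft.fftfreq(w.size, d=dt_hours)  # cycles per hour
        power = np.abs(W) ** 2
        cw_f, cw_p = -f[f < 0], power[f < 0]
        order = np.argsort(cw_f)
        return cw_f[order], cw_p[order]

    # Synthetic clockwise rotation with a ~42 h period, roughly a local
    # inertial period in the central BoB (illustrative, not model output).
    t = np.arange(240.0)  # hourly samples over 10 days
    u = np.cos(2 * np.pi * t / 42.0)
    v = -np.sin(2 * np.pi * t / 42.0)
    f_cw, p_cw = rotary_spectrum(u, v)
    print(f"peak clockwise period: {1.0 / f_cw[np.argmax(p_cw)]:.1f} h")
    ```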

  9. A posteriori model validation for the temporal order of directed functional connectivity maps

    PubMed Central

    Beltz, Adriene M.; Molenaar, Peter C. M.

    2015-01-01

    A posteriori model validation for the temporal order of neural directed functional connectivity maps is rare. This is striking because models that require sequential independence among residuals are regularly implemented. The aim of the current study was (a) to apply to directed functional connectivity maps of functional magnetic resonance imaging data an a posteriori model validation procedure (i.e., white noise tests of one-step-ahead prediction errors combined with decision criteria for revising the maps based upon Lagrange Multiplier tests), and (b) to demonstrate how the procedure applies to single-subject simulated, single-subject task-related, and multi-subject resting state data. Directed functional connectivity was determined by the unified structural equation model family of approaches in order to map contemporaneous and first order lagged connections among brain regions at the group- and individual-levels while incorporating external input, then white noise tests were run. Findings revealed that the validation procedure successfully detected unmodeled sequential dependencies among residuals and recovered higher order (greater than one) simulated connections, and that the procedure can accommodate task-related input. Findings also revealed that lags greater than one were present in resting state data: With a group-level network that contained only contemporaneous and first order connections, 44% of subjects required second order, individual-level connections in order to obtain maps with white noise residuals. Results have broad methodological relevance (e.g., temporal validation is necessary after directed functional connectivity analyses because the presence of unmodeled higher order sequential dependencies may bias parameter estimates) and substantive implications (e.g., higher order lags may be common in resting state data). PMID:26379489
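
    A standard white noise check on one-step-ahead prediction errors can be run with the Ljung-Box test from statsmodels; the authors' exact test statistic may differ, and the under-specified model below is an invented example:

    ```python
    import numpy as np
    from statsmodels.stats.diagnostic import acorr_ljungbox

    rng = np.random.default_rng(0)

    # Residuals from a deliberately under-specified fit: an AR(2) series
    # modeled with only a lag-1 term leaves sequential dependence behind.
    n = 400
    x = np.zeros(n)
    for t in range(2, n):
        x[t] = 0.5 * x[t - 1] + 0.3 * x[t - 2] + rng.standard_normal()
    phi = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])  # lag-1 fit only
    resid = x[1:] - phi * x[:-1]

    lb = acorr_ljungbox(resid, lags=[10], return_df=True)
    print(lb)  # a small p-value flags the unmodeled higher-order dependence
    ```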

  10. A posteriori model validation for the temporal order of directed functional connectivity maps.

    PubMed

    Beltz, Adriene M; Molenaar, Peter C M

    2015-01-01

    A posteriori model validation for the temporal order of neural directed functional connectivity maps is rare. This is striking because models that require sequential independence among residuals are regularly implemented. The aim of the current study was (a) to apply to directed functional connectivity maps of functional magnetic resonance imaging data an a posteriori model validation procedure (i.e., white noise tests of one-step-ahead prediction errors combined with decision criteria for revising the maps based upon Lagrange Multiplier tests), and (b) to demonstrate how the procedure applies to single-subject simulated, single-subject task-related, and multi-subject resting state data. Directed functional connectivity was determined by the unified structural equation model family of approaches in order to map contemporaneous and first order lagged connections among brain regions at the group- and individual-levels while incorporating external input, then white noise tests were run. Findings revealed that the validation procedure successfully detected unmodeled sequential dependencies among residuals and recovered higher order (greater than one) simulated connections, and that the procedure can accommodate task-related input. Findings also revealed that lags greater than one were present in resting state data: With a group-level network that contained only contemporaneous and first order connections, 44% of subjects required second order, individual-level connections in order to obtain maps with white noise residuals. Results have broad methodological relevance (e.g., temporal validation is necessary after directed functional connectivity analyses because the presence of unmodeled higher order sequential dependencies may bias parameter estimates) and substantive implications (e.g., higher order lags may be common in resting state data).

  11. Validation of MHD Models using MST RFP Plasmas

    NASA Astrophysics Data System (ADS)

    Jacobson, C. M.; Chapman, B. E.; den Hartog, D. J.; McCollam, K. J.; Sarff, J. S.; Sovinec, C. R.

    2017-10-01

    Rigorous validation of computational models used in fusion energy sciences over a large parameter space and across multiple magnetic configurations can increase confidence in their ability to predict the performance of future devices. MST is a well diagnosed reversed-field pinch (RFP) capable of operation with plasma current ranging from 60 kA to 500 kA. The resulting Lundquist number S, a key parameter in resistive magnetohydrodynamics (MHD), ranges from 4×10⁴ to 8×10⁶ for standard RFP plasmas and provides substantial overlap with MHD RFP simulations. MST RFP plasmas are simulated using both DEBS, a nonlinear single-fluid visco-resistive MHD code, and NIMROD, a nonlinear extended MHD code, with S ranging from 10⁴ to 10⁵ for single-fluid runs, and the magnetic Prandtl number Pm = 1. Validation metric comparisons are presented, focusing on how normalized magnetic fluctuations at the edge, b, scale with S. Preliminary results for the dominant n = 6 mode are b ∝ S^(-0.20±0.02) for single-fluid NIMROD, b ∝ S^(-0.25±0.05) for DEBS, and b ∝ S^(-0.20±0.02) for experimental measurements; however, there is a significant discrepancy in mode amplitudes. Preliminary two-fluid NIMROD results are also presented. Work supported by US DOE.
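
    The scaling comparison amounts to fitting b ∝ S^α in log-log space; a small Python sketch with synthetic points (not MST measurements) recovers the exponent and its standard error:

    ```python
    import numpy as np

    def fit_power_law(S, b):
        """Fit b = C * S**alpha by least squares in log-log space; return
        alpha and its standard error from the fit covariance."""
        coeffs, cov = np.polyfit(np.log(S), np.log(b), 1, cov=True)
        return coeffs[0], float(np.sqrt(cov[0, 0]))

    # Synthetic points drawn around b ~ S^-0.20 (made up, not MST data).
    rng = np.random.default_rng(2)
    S = np.array([1e4, 3e4, 1e5, 3e5, 1e6])
    b = 0.05 * S ** -0.20 * np.exp(0.02 * rng.standard_normal(S.size))
    alpha, err = fit_power_law(S, b)
    print(f"b ~ S^({alpha:.3f} +/- {err:.3f})")
    ```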

  12. Numerical Simulations of a Multiscale Model of Stratified Langmuir Circulation

    NASA Astrophysics Data System (ADS)

    Malecha, Ziemowit; Chini, Gregory; Julien, Keith

    2012-11-01

    Langmuir circulation (LC), a prominent form of wind and surface-wave driven shear turbulence in the ocean surface boundary layer (BL), is commonly modeled using the Craik-Leibovich (CL) equations, a phase-averaged variant of the Navier-Stokes (NS) equations. Although surface-wave filtering renders the CL equations more amenable to simulation than are the instantaneous NS equations, simulations in wide domains, hundreds of times the BL depth, currently earn the "grand challenge" designation. To facilitate simulations of LC in such spatially-extended domains, we have derived multiscale CL equations by exploiting the scale separation between submesoscale and BL flows in the upper ocean. The numerical algorithm for simulating this multiscale model resembles super-parameterization schemes used in meteorology, but retains a firm mathematical basis. We have validated our algorithm and here use it to perform multiscale simulations of the interaction between LC and upper ocean density stratification. ZMM, GPC, KJ gratefully acknowledge funding from NSF CMG Award 0934827.

  13. A finite element model of a six-year-old child for simulating pedestrian accidents.

    PubMed

    Meng, Yunzhu; Pak, Wansoo; Guleyupoglu, Berkan; Koya, Bharath; Gayzik, F Scott; Untaroiu, Costin D

    2017-01-01

    Child pedestrian protection deserves more attention in vehicle safety design, since children are the most vulnerable road users and face the highest mortality rate. Pediatric Finite Element (FE) models could be used to simulate and understand pedestrian injury mechanisms during crashes in order to mitigate them. Thus, the objective of this study was to develop a computationally efficient (simplified) six-year-old pedestrian (6YO-PS) FE model and validate it based on the latest published pediatric data. The 6YO-PS FE model was developed by morphing the existing GHBMC adult pedestrian model. Retrospective scan data were used to locally adjust the geometry as needed for accuracy. Component test simulations focused only on the lower extremities and pelvis, which are the first body regions impacted during pedestrian accidents. Three-point bending test simulations were performed on the femur and tibia, first with adult material properties and then updated with child material properties. Pelvis impact and knee bending tests were also simulated. Finally, a series of pediatric Car-to-Pedestrian Collision (CPC) simulations were run with pre-impact velocities ranging from 20 km/h up to 60 km/h. The bone models assigned pediatric material properties showed lower stiffness and a good match in terms of fracture force to the test data (less than 6% error). The pelvis impact force predicted by the child model showed a similar trend with test data. The whole pedestrian model was stable during CPC simulations and predicted common pedestrian injuries. Overall, the 6YO-PS FE model developed in this study showed good biofidelity at the component level (lower extremity and pelvis) and stability in CPC simulations. While more validation would improve it, the current model could be used to investigate lower limb injury mechanisms and to predict the impact parameters specified in regulatory testing protocols. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. MRI-based modeling for radiocarpal joint mechanics: validation criteria and results for four specimen-specific models.

    PubMed

    Fischer, Kenneth J; Johnson, Joshua E; Waller, Alexander J; McIff, Terence E; Toby, E Bruce; Bilgen, Mehmet

    2011-10-01

    The objective of this study was to validate the MRI-based joint contact modeling methodology in the radiocarpal joints by comparison of model results with invasive specimen-specific radiocarpal contact measurements from four cadaver experiments. We used a single validation criterion for multiple outcome measures to characterize the utility and overall validity of the modeling approach. For each experiment, a Pressurex film and a Tekscan sensor were sequentially placed into the radiocarpal joints during simulated grasp. Computer models were constructed based on MRI visualization of the cadaver specimens without load. Images were also acquired during the loaded configuration used with the direct experimental measurements. Geometric surface models of the radius, scaphoid and lunate (including cartilage) were constructed from the images acquired without the load. The carpal bone motions from the unloaded state to the loaded state were determined using a series of 3D image registrations. Cartilage thickness was assumed uniform at 1.0 mm with an effective compressive modulus of 4 MPa. Validation was based on experimental versus model contact area, contact force, average contact pressure and peak contact pressure for the radioscaphoid and radiolunate articulations. Contact area was also measured directly from images acquired under load and compared to the experimental and model data. Qualitatively, there was good correspondence between the MRI-based model data and experimental data, with consistent relative size, shape and location of radioscaphoid and radiolunate contact regions. Quantitative data from the model generally compared well with the experimental data for all specimens. Contact area from the MRI-based model was very similar to the contact area measured directly from the images. For all outcome measures except average and peak pressures, at least two specimen models met the validation criteria with respect to experimental measurements for both articulations

  15. Systematic approach to verification and validation: High explosive burn models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menikoff, Ralph; Scovel, Christina A.

    2012-04-16

    Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implement their model in their own hydro code and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data are scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in the electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web-based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a
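
    The proposed automation hinges on data files that carry key experimental parameters as meta-data in a header. A minimal sketch of parsing such a header and emitting a simulation input deck follows; the header keys and deck syntax are invented, not the actual HED format:

    ```python
    # Sample data file with an invented "key = value" header block followed
    # by two columns of measurements (purely illustrative content).
    SAMPLE = """\
    # explosive = PBX-9502
    # density_g_cc = 1.890
    # pulse_duration_us = 0.5
    0.0  0.00
    1.0  0.12
    """

    def parse_header(text):
        """Collect '# key = value' lines into a metadata dictionary."""
        meta = {}
        for line in text.splitlines():
            line = line.strip()
            if line.startswith("#") and "=" in line:
                key, value = line[1:].split("=", 1)
                meta[key.strip()] = value.strip()
        return meta

    meta = parse_header(SAMPLE)
    deck = (f"material {meta['explosive']}\n"
            f"rho0 {meta['density_g_cc']}\n"
            f"pulse {meta['pulse_duration_us']}e-6\n")
    print(deck)  # hypothetical input-deck lines built from the header
    ```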

  16. Uterus models for use in virtual reality hysteroscopy simulators.

    PubMed

    Niederer, Peter; Weiss, Stephan; Caduff, Rosmarie; Bajka, Michael; Szekély, Gabor; Harders, Matthias

    2009-05-01

    Virtual reality models of human organs are needed in surgery simulators which are developed for educational and training purposes. A simulation can only be useful, however, if the mechanical performance of the system in terms of force-feedback for the user as well as the visual representation is realistic. We therefore aim at developing a mechanical computer model of the organ in question which yields realistic force-deformation behavior under virtual instrument-tissue interactions and which, in particular, runs in real time. The modeling of the human uterus is described as it is to be implemented in a simulator for minimally invasive gynecological procedures. To this end, anatomical information which was obtained from specially designed computed tomography and magnetic resonance imaging procedures as well as constitutive tissue properties recorded from mechanical testing were used. In order to achieve real-time performance, the combination of mechanically realistic numerical uterus models of various levels of complexity with a statistical deformation approach is suggested. In view of mechanical accuracy of such models, anatomical characteristics including the fiber architecture along with the mechanical deformation properties are outlined. In addition, an approach to make this numerical representation potentially usable in an interactive simulation is discussed. The numerical simulation of hydrometra is shown in this communication. The results were validated experimentally. In order to meet the real-time requirements and to accommodate the large biological variability associated with the uterus, a statistical modeling approach is demonstrated to be useful.

  17. A Finite Element Model of a Midsize Male for Simulating Pedestrian Accidents.

    PubMed

    Untaroiu, Costin D; Pak, Wansoo; Meng, Yunzhu; Schap, Jeremy; Koya, Bharath; Gayzik, Scott

    2018-01-01

    Pedestrians represent one of the most vulnerable road users and account for nearly 22% of road crash-related fatalities in the world. Therefore, the protection of pedestrians in car-to-pedestrian collisions (CPC) has recently generated increased attention, with regulations involving three subsystem tests. The development of a finite element (FE) pedestrian model could provide a complementary component that characterizes the whole-body response of vehicle-pedestrian interactions and assesses pedestrian injuries. The main goal of this study was to develop and validate a simplified full-body FE model corresponding to a 50th percentile male pedestrian in a standing posture (M50-PS). The FE model mesh and defined material properties are based on a 50th percentile male occupant model. The lower limb-pelvis and lumbar spine regions of the human model were validated against postmortem human surrogate (PMHS) test data recorded in four-point lateral knee bending tests, pelvic/abdomen/shoulder/thoracic impact tests, and lumbar spine bending tests. Then, a pedestrian-to-vehicle impact simulation was performed using the whole pedestrian model, and the results were compared to corresponding PMHS tests. Overall, the simulation results showed that the lower leg response is mostly within the boundaries of the PMHS corridors. In addition, the model shows the capability to predict the most common lower extremity injuries observed in pedestrian accidents. The validated pedestrian model may thus be used by safety researchers in the design of the front ends of new vehicles in order to increase pedestrian protection.

  18. Simplified human model and pedestrian simulation in the millimeter-wave region

    NASA Astrophysics Data System (ADS)

    Han, Junghwan; Kim, Seok; Lee, Tae-Yun; Ka, Min-Ho

    2016-02-01

    The 24 GHz and 77 GHz radar sensors have been studied as strong candidates for advanced driver assistance systems (ADAS) because of their all-weather capability and their accurate range and radial velocity measurement. However, developing a reliable pedestrian recognition system faces many obstacles due to the inaccurate and non-trivial radar responses at these high frequencies and the many combinations of clothes and accessories. To overcome these obstacles, many researchers have used electromagnetic (EM) simulation to characterize the radar scattering response of a human. However, human simulation takes a long time because of the electrically large size of a human in the millimeter-wave region. To reduce simulation time, some researchers have assumed the skin of a human to be a perfect electric conductor (PEC) and have simulated the PEC human model using the physical optics (PO) algorithm, without a specific explanation of how the human body can be modeled as PEC. In this study, the validity of the assumption that the surface of the human body can be treated as PEC in EM simulation is verified, and the simulation result of a dry-skin human model is compared with that of the PEC human model.

  19. A virtual source model for Monte Carlo simulation of helical tomotherapy.

    PubMed

    Yuan, Jiankui; Rong, Yi; Chen, Quan

    2015-01-08

    The purpose of this study was to present a Monte Carlo (MC) simulation method based on a virtual source, jaw, and MLC model to calculate the dose in patients for helical tomotherapy without the need to calculate phase-space files (PSFs). Current studies on tomotherapy MC simulation adopt a full MC model, which includes extensive modeling of the radiation source, the primary and secondary jaws, and the multileaf collimator (MLC). In the full MC model, PSFs need to be created at different scoring planes to facilitate the patient dose calculations. In the present work, the virtual source model (VSM) we established was based on the gold standard beam data of a tomotherapy unit, which can be exported from the treatment planning station (TPS). The TPS-generated sinograms were extracted from the archived patient XML (eXtensible Markup Language) files. The fluence map for the MC sampling was created by incorporating the percentage leaf open time (LOT) with the leaf filter, jaw penumbra, and leaf latency obtained from the sinogram files. The VSM was validated for various geometry setups and clinical situations involving heterogeneous media and delivery quality assurance (DQA) cases. An agreement of < 1% was obtained between the measured and simulated results for percent depth doses (PDDs) and open beam profiles for all three jaw settings in the VSM commissioning. The accuracy of the VSM leaf filter model was verified by comparing the measured and simulated results for a Picket Fence pattern. An agreement of < 2% was achieved between the presented VSM and a published full MC model for heterogeneous phantoms. For complex clinical head and neck (HN) cases, the VSM-based MC simulation of DQA plans agreed with the film measurement, with 98% of planar dose pixels passing the 2%/2 mm gamma criterion. For patient treatment plans, results showed comparable dose-volume histograms (DVHs) for planning target volumes (PTVs) and organs at risk (OARs). Deviations observed in this study were
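    The 2%/2 mm criterion cited above combines a dose-difference tolerance with a distance-to-agreement tolerance in a single gamma index. A minimal 1D sketch of a global-normalization gamma pass rate follows; the toy profiles, grid, and implementation details are illustrative assumptions, not the study's DQA analysis.

```python
import numpy as np

# Minimal 1D gamma-index sketch (global normalization) illustrating the
# 2%/2 mm pass-rate criterion. Real DQA analysis is 2D/3D and uses clinical
# software; the profiles and grids here are toy assumptions.

def gamma_pass_rate(x, ref, x_eval, eval_dose, dd=0.02, dta=2.0):
    """Fraction of reference points with gamma <= 1."""
    d_max = ref.max()                      # global dose normalization
    passed = 0
    for xr, dr in zip(x, ref):
        # gamma at this point: minimum combined dose/distance metric
        dose_term = ((eval_dose - dr) / (dd * d_max)) ** 2
        dist_term = ((x_eval - xr) / dta) ** 2
        gamma = np.sqrt(dose_term + dist_term).min()
        passed += gamma <= 1.0
    return passed / len(ref)

# Toy profiles: evaluated dose slightly shifted relative to reference.
x = np.linspace(0.0, 100.0, 501)           # position in mm
ref = np.exp(-((x - 50.0) / 15.0) ** 2)    # reference profile
ev = np.exp(-((x - 50.4) / 15.0) ** 2)     # measured/simulated profile
print(f"pass rate: {gamma_pass_rate(x, ref, x, ev):.3f}")
```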

  1. TUTORIAL: Validating biorobotic models

    NASA Astrophysics Data System (ADS)

    Webb, Barbara

    2006-09-01

    Some issues in neuroscience can be addressed by building robot models of biological sensorimotor systems. What we can conclude from building models or simulations, however, is determined by a number of factors in addition to the central hypothesis we intend to test. These include the way in which the hypothesis is represented and implemented in simulation, how the simulation output is interpreted, how it is compared to the behaviour of the biological system, and the conditions under which it is tested. These issues will be illustrated by discussing a series of robot models of cricket phonotaxis behaviour.

  2. Alternating Renewal Process Models for Behavioral Observation: Simulation Methods, Software, and Validity Illustrations

    ERIC Educational Resources Information Center

    Pustejovsky, James E.; Runyon, Christopher

    2014-01-01

    Direct observation recording procedures produce reductive summary measurements of an underlying stream of behavior. Previous methodological studies of these recording procedures have employed simulation methods for generating random behavior streams, many of which amount to special cases of a statistical model known as the alternating renewal…
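    To make the setup concrete, the following Python sketch generates a behavior stream from an alternating renewal process and scores it with partial-interval recording, one of the reductive summary measures such procedures produce. The exponential duration distributions, their means, and the interval length are illustrative assumptions, not the study's simulation settings.

```python
import random

# Minimal sketch: a behavior stream from an alternating renewal process,
# scored by partial-interval recording. Distributions and interval length
# are illustrative assumptions only.

def behavior_stream(total_time, mean_event=4.0, mean_gap=8.0, seed=1):
    """Alternate event and inter-event durations until total_time is filled."""
    rng = random.Random(seed)
    t, in_event, episodes = 0.0, False, []
    while t < total_time:
        mean = mean_event if in_event else mean_gap
        dur = rng.expovariate(1.0 / mean)
        if in_event:
            episodes.append((t, min(t + dur, total_time)))
        t += dur
        in_event = not in_event
    return episodes

def partial_interval(episodes, total_time, interval=10.0):
    """Score each interval 1 if any part of an event falls inside it."""
    n = int(total_time / interval)
    hits = [any(s < (i + 1) * interval and e > i * interval
                for s, e in episodes) for i in range(n)]
    return sum(hits) / n

eps = behavior_stream(600.0)
print(f"partial-interval estimate: {partial_interval(eps, 600.0):.2f}")
```

    Sketches like this show why partial-interval recording is "reductive": the interval score systematically overstates the fraction of time the behavior actually occupies.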

  3. The Aliso Canyon Natural Gas Leak : Large Eddy Simulations for Modeling Atmospheric Dynamics and Interpretation of Observations.

    NASA Astrophysics Data System (ADS)

    Prasad, K.; Thorpe, A. K.; Duren, R. M.; Thompson, D. R.; Whetstone, J. R.

    2016-12-01

    The National Institute of Standards and Technology (NIST) has supported the development and demonstration of a measurement capability to accurately locate greenhouse gas sources and measure their flux to the atmosphere over urban domains. However, uncertainties in the transport models that form the basis of all top-down approaches can significantly affect our capability to attribute sources and predict their flux to the atmosphere. Reducing uncertainties between bottom-up and top-down models will require high resolution transport models as well as validation and verification of dispersion models over an urban domain. Tracer experiments involving the release of Perfluorocarbon Tracers (PFTs) at known flow rates offer the best approach for validating dispersion/transport models. However, tracer experiments are limited by cost, the ability to make continuous measurements, and environmental concerns. Natural tracer experiments, such as the leak from the Aliso Canyon underground storage facility, offer a unique opportunity to improve and validate high resolution transport models, test leak hypotheses, and estimate the amount of methane released. High spatial resolution (10 m) Large Eddy Simulations (LES) coupled with WRF atmospheric transport models were performed to simulate the dynamics of the Aliso Canyon methane plume and to quantify the source. High resolution forward simulation results were combined with aircraft- and tower-based in-situ measurements as well as data from NASA airborne imaging spectrometers. Comparison of the simulation results with the measurement data demonstrates the capability of the LES models to accurately model the transport and dispersion of methane plumes over urban domains.

  4. Simulation and experimental validation of the dynamical model of a dual-rotor vibrotactor

    NASA Astrophysics Data System (ADS)

    Miklós, Á.; Szabó, Z.

    2015-01-01

    In this work, a novel design for small vibrotactors called the Dual Excenter is presented, which makes it possible to produce vibrations with independently adjustable frequency and amplitude. This feature has been realized using two coaxially aligned eccentric rotors, which are driven by DC motors independently. The prototype of the device has been built, where mechanical components are integrated on a frame with two optical sensors for the measurement of angular velocity and phase angle. The system is equipped with a digital controller. Simulations confirm the results of analytical investigations and they allow us to model the sampling method of the signals of the angular velocity and the phase angle between the rotors. Furthermore, we model the discrete behavior of the controller, which is a PI controller for the angular velocities and a PID controller for the phase angle. Finally, simulation results are compared to experimental ones, which show that the Dual Excenter concept is feasible.
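    A minimal sketch of the discrete controller structure described above (a PI loop for the angular velocities and a PID loop for the phase angle) is given below; the gains, sample time, and class interface are illustrative assumptions, not the device's actual controller.

```python
# Minimal discrete PID sketch of the control structure the record describes:
# PI on angular velocity (kd = 0) and PID on the inter-rotor phase angle.
# Gains and sample time are illustrative assumptions.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_err = 0.0, 0.0

    def update(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

dt = 1e-3                                          # hypothetical sample time
speed_ctrl = PID(kp=0.8, ki=2.0, kd=0.0, dt=dt)    # PI: derivative gain zero
phase_ctrl = PID(kp=1.5, ki=0.5, kd=0.05, dt=dt)   # PID on phase angle

cmd = speed_ctrl.update(setpoint=150.0, measurement=148.2)  # toy rad/s values
```

    Running both loops at the controller's sample rate reproduces the paper's scheme of independently adjusting vibration frequency (via the common angular velocity) and amplitude (via the phase angle between the two eccentric rotors).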

  5. Development and validation of age-dependent FE human models of a mid-sized male thorax.

    PubMed

    El-Jawahri, Raed E; Laituri, Tony R; Ruan, Jesse S; Rouhana, Stephen W; Barbat, Saeed D

    2010-11-01

    The increasing number of people over 65 years old (YO) is an important research topic in the area of impact biomechanics, and finite element (FE) modeling can provide valuable support for related research. There were three objectives of this study: (1) estimation of the representative age of the previously-documented Ford Human Body Model (FHBM), an FE model which approximates the geometry and mass of a mid-sized male; (2) development of FE models representing two additional ages; and (3) validation of the resulting three models to the extent possible with respect to available physical tests. Specifically, the geometry of the model was compared to published data relating rib angles to age, and the mechanical properties of the different simulated tissues were compared to a number of published aging functions. The FHBM was determined to represent a 53-59 YO mid-sized male. The aforementioned aging functions were used to develop FE models representing two additional ages: 35 and 75 YO. The rib model was validated against human rib specimens and whole-rib tests, under different loading conditions, with and without modeled fracture. In addition, the resulting three age-dependent models were validated by simulating cadaveric tests of blunt and sled impacts. The responses of the models were, in general, within the cadaveric response corridors. When compared to peak responses from individual cadavers similar in size and age to the age-dependent models, some responses were within one standard deviation of the test data. All other responses but one were within two standard deviations.

  6. Parameter discovery in stochastic biological models using simulated annealing and statistical model checking.

    PubMed

    Hussain, Faraz; Jha, Sumit K; Jha, Susmit; Langmead, Christopher J

    2014-01-01

    Stochastic models are increasingly used to study the behaviour of biochemical systems. While the structure of such models is often readily available from first principles, unknown quantitative features are incorporated into the model as parameters. Algorithmic discovery of parameter values from experimentally observed facts remains a challenge for the computational systems biology community. We present a new parameter discovery algorithm that uses simulated annealing, sequential hypothesis testing, and statistical model checking to learn the parameters in a stochastic model. We apply our technique to a model of glucose and insulin metabolism used for in-silico validation of artificial pancreata and demonstrate its effectiveness by developing a parallel CUDA-based implementation for parameter synthesis in this model.
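    The following Python sketch illustrates the general shape of such an algorithm: simulated annealing over a parameter, with a stochastic score standing in for the statistical-model-checking step that estimates how often the model satisfies the observed behavior. The model stub, proposal width, and cooling schedule are illustrative assumptions, not the authors' algorithm.

```python
import math
import random

# Minimal simulated-annealing sketch for parameter discovery in a stochastic
# model. score() is a toy stand-in for statistical model checking; the real
# algorithm also uses sequential hypothesis testing to bound run counts.

def score(theta):
    """Estimate how often the stochastic model satisfies the behavioral
    property under parameter theta (toy model, hypothetical property)."""
    rng = random.Random(42)
    runs = [abs(theta - 0.3) + rng.gauss(0, 0.05) for _ in range(50)]
    return sum(r < 0.1 for r in runs) / len(runs)

def anneal(theta0, steps=2000, t0=1.0, seed=7):
    rng = random.Random(seed)
    theta, s = theta0, score(theta0)
    best, s_best = theta, s
    for k in range(steps):
        temp = t0 * (1.0 - k / steps) + 1e-6       # linear cooling
        cand = theta + rng.gauss(0, 0.05)           # local proposal
        s_cand = score(cand)
        # accept improvements always; worse moves with Boltzmann probability
        if s_cand >= s or rng.random() < math.exp((s_cand - s) / temp):
            theta, s = cand, s_cand
        if s > s_best:
            best, s_best = theta, s
    return best, s_best

print(anneal(0.9))
```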

  7. Validation of a novel basic virtual reality simulator, the LAP-X, for training basic laparoscopic skills.

    PubMed

    Kawaguchi, Koji; Egi, Hiroyuki; Hattori, Minoru; Sawada, Hiroyuki; Suzuki, Takahisa; Ohdan, Hideki

    2014-10-01

    Virtual reality surgical simulators are becoming popular as a means of providing trainees with an opportunity to practice laparoscopic skills. The Lap-X (Epona Medical, Rotterdam, the Netherlands) is a novel VR simulator for training basic skills in laparoscopic surgery. The objective of this study was to validate the Lap-X laparoscopic virtual reality simulator by assessing its face, content, and construct validity in order to determine whether the simulator is adequate for basic skills training. The face and content validity were evaluated using a structured questionnaire. To assess the construct validity, the participants, nine expert surgeons (median age 40, range 32-45; >100 laparoscopic procedures each) and 11 novices, performed three basic laparoscopic tasks using the Lap-X. The participants reported a high level of content validity, with no significant differences between the ratings of the expert surgeons and the novices (Ps > 0.246). The performance of the expert surgeons on the three tasks was significantly better than that of the novices in all parameters (Ps < 0.05). This study demonstrated the face, content and construct validity of the Lap-X. The Lap-X holds real potential as a home and hospital training device.

  8. Enabling co-simulation of tokamak plant models and plasma control systems

    DOE PAGES

    Walker, M. L.

    2017-12-22

    A system for connecting the Plasma Control System and a model of the tokamak Plant in closed loop co-simulation for plasma control development has been in routine use at DIII-D for more than 20 years, and at other fusion labs that use variants of the DIII-D PCS for approximately the last decade. Here, co-simulation refers to the simultaneous execution of two independent codes with the exchange of data - Plant actuator commands and tokamak diagnostic data - between them during execution. Interest in this type of PCS-Plant simulation technology has also been growing recently at other fusion facilities. In fact, use of such closed loop control simulations is assumed to play an even larger role in the development of both the ITER Plasma Control System (PCS) and the experimental operation of the ITER device, where they will be used to support verification/validation of the PCS and also for ITER pulse schedule development and validation. We describe the key use cases that motivate the co-simulation capability and the features that must be provided by the Plasma Control System to support it. These features could be provided by the PCS itself or by a model of the PCS. If the PCS itself is chosen to provide them, there are requirements imposed on its architecture. If a PCS model is chosen, there are requirements imposed on the initial implementation of this simulation as well as long-term consequences for its continued development and maintenance. We describe these issues for each use case and discuss the relative merits of the two choices. Several examples are given illustrating uses of the co-simulation method to address problems of plasma control during the operation of DIII-D and of other devices that use the DIII-D PCS.
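    A minimal sketch of the co-simulation pattern described above follows: two independent components run in lock step and exchange actuator commands and diagnostic data each cycle. The class names, single-state plant, and proportional controller are illustrative assumptions, not the DIII-D PCS interface.

```python
# Minimal closed-loop co-simulation sketch: the plant model produces
# synthetic diagnostics, the control system returns actuator commands.
# The one-state plant and gains are toy assumptions only.

class PlantModel:
    def __init__(self):
        self.current = 0.0                     # plasma current, arbitrary units

    def step(self, actuator_cmd, dt):
        # first-order response of the plant to the commanded actuator
        self.current += dt * (actuator_cmd - 0.1 * self.current)
        return {"ip_measured": self.current}   # synthetic diagnostic data

class ControlSystem:
    def __init__(self, target, gain=2.0):
        self.target, self.gain = target, gain

    def step(self, diagnostics):
        # proportional controller acting on the diagnostic it received
        return self.gain * (self.target - diagnostics["ip_measured"])

plant, pcs = PlantModel(), ControlSystem(target=1.0)
diagnostics = {"ip_measured": 0.0}
for _ in range(1000):                          # lock-step execution
    cmd = pcs.step(diagnostics)                # PCS: diagnostics -> commands
    diagnostics = plant.step(cmd, dt=0.01)     # Plant: commands -> diagnostics
print(f"final current: {diagnostics['ip_measured']:.3f}")
```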

  10. Automated finite element meshing of the lumbar spine: Verification and validation with 18 specimen-specific models.

    PubMed

    Campbell, J Q; Coombs, D J; Rao, M; Rullkoetter, P J; Petrella, A J

    2016-09-06

    The purpose of this study was to seek broad verification and validation of human lumbar spine finite element models created using a previously published automated algorithm. The automated algorithm takes segmented CT scans of lumbar vertebrae, automatically identifies important landmarks and contact surfaces, and creates a finite element model. Mesh convergence was evaluated by examining changes in key output variables in response to mesh density. Semi-direct validation was performed by comparing experimental results for a single specimen to the automated finite element model results for that specimen, with calibrated material properties from a prior study. Indirect validation was based on a comparison of results from automated finite element models of 18 individual specimens, all using one set of generalized material properties, to a range of data from the literature. A total of 216 simulations were run and compared to 186 experimental data ranges in all six primary bending modes up to 7.8 Nm with follower loads up to 1000 N. Mesh convergence results showed less than a 5% difference in key variables when the original mesh density was doubled. The semi-direct validation results showed that the automated method produced results comparable to manual finite element modeling methods. The indirect validation results showed a wide range of outcomes due to variations in the geometry alone. The studies showed that the automated models can be used to reliably evaluate lumbar spine biomechanics, specifically within our intended context of use: in pure bending modes, under relatively low non-injurious simulated in vivo loads, to predict torque-rotation response, disc pressures, and facet forces.
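    The mesh convergence criterion above reduces to a simple check, sketched below: rerun the model with doubled mesh density and require the relative change in each key output to stay under 5%. The output names and values are placeholders, not data from the study.

```python
# Minimal mesh-convergence check: compare key outputs between a baseline
# mesh and a doubled-density mesh. The outputs and numbers are placeholders.

def pct_change(coarse, fine):
    return abs(fine - coarse) / abs(coarse) * 100.0

# hypothetical key outputs: (baseline mesh, doubled-density mesh)
outputs = {
    "torque_rotation_Nm_per_deg": (1.32, 1.36),
    "disc_pressure_MPa": (0.48, 0.49),
    "facet_force_N": (57.0, 55.2),
}

for name, (coarse, fine) in outputs.items():
    change = pct_change(coarse, fine)
    status = "converged" if change < 5.0 else "refine further"
    print(f"{name}: {change:.1f}% -> {status}")
```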

  11. Validation of a mixture-averaged thermal diffusion model for premixed lean hydrogen flames

    NASA Astrophysics Data System (ADS)

    Schlup, Jason; Blanquart, Guillaume

    2018-03-01

    The mixture-averaged thermal diffusion model originally proposed by Chapman and Cowling is validated using multiple flame configurations. Simulations using detailed hydrogen chemistry are done on one-, two-, and three-dimensional flames. The analysis spans flat and stretched, steady and unsteady, and laminar and turbulent flames. Quantitative and qualitative results using the thermal diffusion model compare very well with the more complex multicomponent diffusion model. Comparisons are made using flame speeds, surface areas, species profiles, and chemical source terms. Once validated, this model is applied to three-dimensional laminar and turbulent flames. For these cases, thermal diffusion causes an increase in the propagation speed of the flames as well as increased product chemical source terms in regions of high positive curvature. The results illustrate the necessity for including thermal diffusion, and the accuracy and computational efficiency of the mixture-averaged thermal diffusion model.
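    For reference, the mixture-averaged species diffusion velocity with the thermal diffusion (Soret) contribution is commonly written as follows; the notation here is the generic textbook form, not quoted from the paper:

\[
\mathbf{V}_i = -\frac{D_{i,m}}{X_i}\,\nabla X_i \;-\; \frac{D_{i,m}\,\theta_i}{X_i}\,\frac{\nabla T}{T},
\]

    where \(D_{i,m}\) is the mixture-averaged diffusion coefficient of species \(i\), \(X_i\) its mole fraction, \(\theta_i\) the thermal diffusion ratio, and \(T\) the temperature. In practice, a correction velocity is added so that the species mass fluxes sum to zero. The second term is what drives light species such as H and H2 toward hot regions, which is why it matters for lean hydrogen flames.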

  12. TOUGH-RBSN simulator for hydraulic fracture propagation within fractured media: Model validations against laboratory experiments

    NASA Astrophysics Data System (ADS)

    Kim, Kunhwi; Rutqvist, Jonny; Nakagawa, Seiji; Birkholzer, Jens

    2017-11-01

    This paper presents coupled hydro-mechanical modeling of hydraulic fracturing processes in complex fractured media using a discrete fracture network (DFN) approach. The individual physical processes in fracture propagation are represented by separate program modules: the TOUGH2 code for multiphase flow and mass transport based on the finite volume approach, and the rigid-body-spring network (RBSN) model for mechanical and fracture-damage behavior; the two are coupled with each other. Fractures are modeled as discrete features whose hydrological properties are evaluated from the fracture deformation and aperture change. Verification of the TOUGH-RBSN code is performed against a 2D analytical model for single hydraulic fracture propagation. Subsequently, modeling capabilities for hydraulic fracturing are demonstrated through simulations of laboratory experiments conducted on rock-analogue (soda-lime glass) samples containing a designed network of pre-existing fractures. Sensitivity analyses are also conducted by changing modeling parameters such as the viscosity of the injected fluid, the strength of the pre-existing fractures, and the confining stress conditions. The hydraulic fracturing characteristics attributed to these modeling parameters are investigated through comparisons of the simulation results.

  13. SAGE Validations of Volcanic Jet Simulations

    NASA Astrophysics Data System (ADS)

    Peterson, A. H.; Wohletz, K. H.; Ogden, D. E.; Gisler, G.; Glatzmaier, G.

    2006-12-01

    The SAGE (SAIC Adaptive Grid Eulerian) code employs adaptive mesh refinement in solving the Eulerian equations of complex fluid flow, which is desirable for the simulation of volcanic eruptions. Preliminary eruption simulations demonstrate its ability to resolve multi-material flows over large domains where the dynamics are concentrated in small regions. In order to further validate the application of this code to numerical simulation of explosive eruption phenomena, we focus on one of the fundamental physical processes important to the problem, namely the dynamics of an underexpanded jet. Observations of volcanic eruption plumes and laboratory experiments on analog systems document the eruption of overpressured fluid in a supersonic jet that is governed by vent diameter and level of overpressure. The jet is dominated by inertia (very high Reynolds number) and feeds a thermally convective plume controlled by turbulent admixture of the atmosphere. The height above the vent at which the jet loses its inertia is important to know for convective plume predictions that are used to calculate atmospheric dispersal of volcanic products. We simulate a set of well-documented laboratory experiments that provide detail on underexpanded jet structure through gas density contours, showing the shape and size of the Mach stem. SAGE results are within several percent of the experiments for the position and density of the incident (intercepting) and reflected shocks, slip lines, shear layers, and Mach disk. The simulations also resolve vorticity at the jet margins near the Mach disk, showing turbulent velocity fields down to a scale of 30 micrometers. Benchmarking these results against those of CFDLib (Los Alamos National Laboratory), which solves the full Navier-Stokes equations (including the viscous stress tensor), shows close agreement, indicating that the adaptive mesh refinement used in SAGE may offset the need for explicit calculation of viscous dissipation.

  14. Computational Modelling of Patella Femoral Kinematics During Gait Cycle and Experimental Validation

    NASA Astrophysics Data System (ADS)

    Maiti, Raman

    2016-06-01

    The effect of loading and boundary conditions on patellar mechanics is significant due to the complications arising in patella femoral joints during total knee replacements. To understand patellar mechanics with respect to loading and motion, a computational model representing the patella femoral joint was developed and validated against experimental results. The computational model was created in IDEAS NX and simulated in the MSC ADAMS/VIEW software. The results obtained, in the form of internal-external rotations and anterior-posterior displacements for a new and an experimentally simulated specimen of the patella femoral joint under standard gait conditions, were compared with experimental measurements performed on the Leeds ProSim knee simulator. A good overall agreement between the computational prediction and the experimental data was obtained for patella femoral kinematics. Good agreement between the model and past studies was observed when the ligament load was removed and the medial-lateral displacement was constrained. The model is sensitive to ±5% changes in the kinematic, frictional, force, and stiffness coefficients and insensitive to the time step.

  16. The Mt. Hood challenge: cross-testing two diabetes simulation models.

    PubMed

    Brown, J B; Palmer, A J; Bisgaard, P; Chan, W; Pedula, K; Russell, A

    2000-11-01

    Starting from identical patients with type 2 diabetes, we compared the 20-year predictions of two computer simulation models, a 1998 version of the IMIB model and version 2.17 of the Global Diabetes Model (GDM). Primary measures of outcome were 20-year cumulative rates of: survival, first (incident) acute myocardial infarction (AMI), first stroke, proliferative diabetic retinopathy (PDR), macro-albuminuria (gross proteinuria, or GPR), and amputation. Standardized test patients were newly diagnosed males aged 45 or 75, with high and low levels of glycated hemoglobin (HbA(1c)), systolic blood pressure (SBP), and serum lipids. Both models generated realistic results and appropriate responses to changes in risk factors. Compared with the GDM, the IMIB model predicted much higher rates of mortality and AMI, and fewer strokes. These differences can be explained by differences in model architecture (Markov vs. microsimulation), different evidence bases for cardiovascular prediction (Framingham Heart Study cohort vs. Kaiser Permanente patients), and isolated versus interdependent prediction of cardiovascular events. Compared with IMIB, GDM predicted much higher lifetime costs, because of lower mortality and the use of a different costing method. It is feasible to cross-validate and explicate dissimilar diabetes simulation models using standardized patients. The wide differences in the model results that we observed demonstrate the need for cross-validation. We propose to hold a second 'Mt Hood Challenge' in 2001 and invite all diabetes modelers to attend.
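    The architectural difference cited above (cohort Markov versus microsimulation) can be made concrete with a toy calculation: for independent events the two approaches agree in expectation, while microsimulation additionally permits interdependent events per individual, which the record identifies as one source of divergence. The two-state model and 3% annual event probability below are illustrative assumptions only.

```python
import random

# Toy contrast of a cohort Markov update (expected fractions) versus
# microsimulation (individual stochastic trajectories). The two-state
# model and annual event probability are illustrative assumptions.

P_EVENT = 0.03   # hypothetical annual probability of a first event
YEARS = 20

# Markov cohort: propagate the expected event-free fraction analytically.
event_free = 1.0
for _ in range(YEARS):
    event_free *= (1.0 - P_EVENT)
markov_rate = 1.0 - event_free

# Microsimulation: simulate individuals and count who had the event.
rng = random.Random(0)
n = 100_000
events = sum(
    any(rng.random() < P_EVENT for _ in range(YEARS)) for _ in range(n)
)
micro_rate = events / n

print(f"20-y cumulative event rate: Markov {markov_rate:.3f}, "
      f"microsimulation {micro_rate:.3f}")
```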

  17. Protocols for efficient simulations of long-time protein dynamics using coarse-grained CABS model.

    PubMed

    Jamroz, Michal; Kolinski, Andrzej; Kmiecik, Sebastian

    2014-01-01

    Coarse-grained (CG) modeling is a well-acknowledged simulation approach for gaining insight into long-time-scale protein folding events at reasonable computational cost. Depending on the design of a CG model, the simulation protocols vary from highly case-specific ones, which require user-defined assumptions about the folding scenario, to more sophisticated blind prediction methods for which only a protein sequence is required. Here we describe the framework protocol for simulations of the long-term dynamics of globular proteins using the CABS CG protein model and sequence data. The simulations can start from a random or a selected (e.g., native) structure. The described protocol has been validated using experimental data for protein folding model systems; the prediction results agreed well with the experimental results.

  18. Limitations in electrophysiological model development and validation caused by differences between simulations and experimental protocols.

    PubMed

    Carro, Jesús; Rodríguez-Matas, José F; Monasterio, Violeta; Pueyo, Esther

    2017-10-01

    Models of ion channel dynamics are usually built by fitting isolated-cell experimental values of individual parameters while neglecting the interaction between them. Another shortcoming regards the estimation of ionic current conductances, which is often based on quantification of Action Potential (AP)-derived markers. Although this procedure reduces the uncertainty in the calculation of conductances, many studies evaluate electrophysiological AP-derived markers from single-cell simulations, whereas experimental measurements are obtained from tissue preparations. In this work, we explore the limitations of these approaches to estimate ion channel dynamics and maximum current conductances, and how they could be overcome by using multiscale simulations of the experimental protocols. Four human ventricular cell models, namely ten Tusscher and Panfilov (2006), Grandi et al. (2010), O'Hara et al. (2011), and Carro et al. (2011), were used. Two problems involving scales from ion channels to tissue were investigated: 1) characterization of the voltage-dependent inactivation of the L-type calcium current I_CaL; 2) identification of the major ionic conductance contributors to steady-state AP markers, including APD_90, APD_75, APD_50, APD_25, triangulation, and the maximal and minimal values of V and dV/dt during the AP (V_max, V_min, dV/dt_max, dV/dt_min). Our results show that: 1) I_CaL inactivation characteristics differed significantly when calculated from the model equations and from simulations reproducing the experimental protocols; 2) large differences were found in the ionic current contributors to APD_25, triangulation, V_max, dV/dt_max and dV/dt_min between single cells and 1D tissue. When proposing any new model formulation, or evaluating an existing model, consistency between simulated and experimental data should be verified considering all involved effects and scales.
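    As a concrete illustration of the AP-derived markers listed above, the following Python sketch computes APD_90, V_max, and dV/dt_max from a simulated action potential trace; the synthetic waveform is a toy stand-in for output from any of the four models named.

```python
import numpy as np

# Minimal sketch of computing AP-derived markers (APD_90, V_max, dV/dt_max)
# from an action potential trace. The synthetic AP below is a toy waveform,
# not output from any of the cited models.

def ap_markers(t, v, repol_frac=0.90):
    """APD at a given repolarization fraction, plus V_max and dV/dt_max."""
    v_rest = v[0]
    v_max = v.max()
    amplitude = v_max - v_rest
    threshold = v_max - repol_frac * amplitude   # e.g. 90% repolarized
    i_peak = int(v.argmax())
    # first sample after the peak that falls below the threshold
    below = np.nonzero(v[i_peak:] < threshold)[0]
    apd = t[i_peak + below[0]] - t[i_peak] if below.size else np.nan
    dvdt_max = float(np.max(np.gradient(v, t)))
    return apd, v_max, dvdt_max

# toy AP: fast upstroke at t = 1 ms, then exponential repolarization
t = np.linspace(0.0, 500.0, 5001)                 # time in ms
v = -85.0 + 125.0 * (t > 1.0) * np.exp(-(t - 1.0) / 180.0)   # mV
apd90, vmax, dvdt = ap_markers(t, v)
print(f"APD_90 = {apd90:.0f} ms, V_max = {vmax:.1f} mV, "
      f"dV/dt_max = {dvdt:.1f} mV/ms")
```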

  19. Requirements for Modeling and Simulation for Space Medicine Operations: Preliminary Considerations

    NASA Technical Reports Server (NTRS)

    Dawson, David L.; Billica, Roger D.; Logan, James; McDonald, P. Vernon

    2001-01-01

    The NASA Space Medicine program is now developing plans for more extensive use of high-fidelity medical simulation systems. The use of simulation is seen as a means to use the limited time available for astronaut medical training more effectively. Training systems should be adaptable for use in a variety of training environments, including classrooms or laboratories, space vehicle mockups, analog environments, and microgravity. Modeling and simulation can also provide the space medicine development program with a mechanism for the evaluation of other medical technologies under operationally realistic conditions. Systems and procedures need preflight verification with ground-based testing. Traditionally, component testing has been accomplished, but practical means for "human in the loop" verification of patient care systems have been lacking. Medical modeling and simulation technology offers a potential means to accomplish such validation work. Initial considerations in the development of functional requirements and design standards for simulation systems for space medicine are discussed.

  20. Performance modeling & simulation of complex systems (A systems engineering design & analysis approach)

    NASA Technical Reports Server (NTRS)

    Hall, Laverne

    1995-01-01

    Modeling of the Multi-mission Image Processing System (MIPS) will be described as an example of the use of a modeling tool to design a distributed system that supports multiple application scenarios. This paper examines: (a) modeling tool selection, capabilities, and operation (namely NETWORK 2.5 by CACI), (b) pointers for building or constructing a model and how the MIPS model was developed, (c) the importance of benchmarking or testing the performance of equipment/subsystems being considered for incorporation into the design/architecture, (d) the essential step of model validation and/or calibration using the benchmark results, (e) sample simulation results from the MIPS model, and (f) how modeling and simulation analysis affected the MIPS design process by having a supportive and informative impact.