Sample records for geostatistical simulation methodologies

  1. MODFLOW Ground-Water Model - User Guide to the Subsidence and Aquifer-System Compaction Package (SUB-WT) for Water-Table Aquifers

    USGS Publications Warehouse

    Leake, S.A.; Galloway, D.L.

    2007-01-01

    A new computer program was developed to simulate vertical compaction in models of regional ground-water flow. The program simulates ground-water storage changes and compaction in discontinuous interbeds or in extensive confining units, accounting for stress-dependent changes in storage properties. The new program is a package for MODFLOW, the U.S. Geological Survey modular finite-difference ground-water flow model. Several features of the program make it useful for application in shallow, unconfined flow systems. Geostatic stress can be treated as a function of water-table elevation, and compaction is a function of computed changes in effective stress at the bottom of a model layer. Thickness of compressible sediments in an unconfined model layer can vary in proportion to saturated thickness.
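    The stress relationship the abstract describes can be sketched in a few lines of Python. This is an illustrative calculation only, not code from the SUB-WT package; the specific gravities, elevations, and head-based units are hypothetical choices:

```python
# Sketch of the effective-stress bookkeeping described above (hypothetical
# values; not the MODFLOW source). Geostatic stress at the bottom of a model
# layer is the weight of overlying moist and saturated sediment, and effective
# stress is geostatic stress minus pore pressure (all in units of head here).

def geostatic_stress(land_surface, water_table, layer_bottom,
                     sg_moist=1.7, sg_sat=2.0):
    """Geostatic stress (as equivalent head of water) at layer_bottom."""
    moist = sg_moist * max(land_surface - water_table, 0.0)
    sat = sg_sat * (min(water_table, land_surface) - layer_bottom)
    return moist + sat

def effective_stress(land_surface, water_table, layer_bottom):
    pore_pressure = water_table - layer_bottom   # hydrostatic, head units
    return geostatic_stress(land_surface, water_table, layer_bottom) - pore_pressure

# A declining water table raises effective stress, driving compaction:
before = effective_stress(100.0, 90.0, 50.0)
after = effective_stress(100.0, 80.0, 50.0)
assert after > before
```

    The point of the sketch is the coupling the abstract emphasizes: because geostatic stress depends on the water-table elevation, a water-table decline changes both terms of the effective-stress difference.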

  2. Reservoir property grids improve with geostatistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vogt, J.

    1993-09-01

    Visualization software, reservoir simulators, and many other E and P software applications need reservoir property grids as input. Using geostatistics, as compared to other gridding methods, to produce these grids leads to the best output from the software programs. For the purposes stated herein, geostatistics is simply two types of gridding methods. Mathematically, these methods are based on minimizing or duplicating certain statistical properties of the input data. One geostatistical method, called kriging, is used when the highest possible point-by-point accuracy is desired. The other method, called conditional simulation, is used when one wants the statistics and texture of the resulting grid to be the same as those of the input data. In the following discussion, each method is explained, compared to other gridding methods, and illustrated through example applications. Proper use of geostatistical data in flow simulations, use of geostatistical data for history matching, and situations where geostatistics has no significant advantage over other methods will also be covered.
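    The kriging side of this distinction can be illustrated with a minimal 1-D sketch. The exponential covariance model, its parameters, and the data below are hypothetical, and a production gridder would add search neighborhoods and variogram fitting:

```python
import numpy as np

# Minimal 1-D ordinary-kriging sketch (illustrative, not a production gridder).

def cov(h, sill=1.0, rng=10.0):
    return sill * np.exp(-np.abs(h) / rng)   # exponential covariance model

x = np.array([0.0, 5.0, 12.0])      # data locations
z = np.array([1.0, 3.0, 2.0])       # data values

def krige(x0):
    n = len(x)
    # Ordinary-kriging system: data covariances plus unbiasedness constraint.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(x[:, None] - x[None, :])
    A[n, n] = 0.0
    b = np.append(cov(x - x0), 1.0)
    w = np.linalg.solve(A, b)[:n]
    return w @ z

# Kriging is an exact interpolator: it honors the data points.
assert abs(krige(5.0) - 3.0) < 1e-8
```

    The contrast with conditional simulation is that kriging minimizes point-by-point error variance and therefore smooths, whereas conditional simulation adds back spatially correlated noise so the output grid reproduces the variance and texture of the data.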

  3. 2011 Atlanta, Georgia, Regional Travel Survey | Transportation Secure Data Center | NREL

    Science.gov Websites

    To improve regional travel demand forecasts, the 2011 Regional Travel Survey collected trip data for the Atlanta Regional Commission (ARC); the survey was conducted by PTV NuStats, GeoStats, and PG

  4. Where Is the Sun at Sunset?

    ERIC Educational Resources Information Center

    Puri, Avi

    2016-01-01

    The question in the title above is exploited to analyse the relationship between different astronomical models and frames of reference. The paper highlights the fact that the geostatic model, the favoured model in ordinary discourse, even that of scientists, is at odds with two cherished principles, that of the rectilinear propagation of light,…

  5. Geoscience research databases for coastal Alabama ecosystem management

    USGS Publications Warehouse

    Hummell, Richard L.

    1995-01-01

    Effective management of complex coastal ecosystems necessitates access to scientific knowledge that can be acquired through a multidisciplinary approach involving Federal and State scientists who take advantage of agency expertise and resources for the benefit of all participants working toward a set of common research and management goals. Cooperative geostatic investigations have led toward building databases of fundamental scientific knowledge that can be utilized to manage coastal Alabama's natural resources and future development. These databases have been used to assess the occurrence and economic potential of hard mineral resources in the Alabama EEZ, and to support oil spill contingency planning and environmental analysis for coastal Alabama.

  6. On safe ground? Analysis of European urban geohazards using satellite radar interferometry

    NASA Astrophysics Data System (ADS)

    Capes, Renalt; Teeuw, Richard

    2017-06-01

    Urban geological hazards involving ground instability can be costly, dangerous, and affect many people, yet there is little information about the extent or distribution of geohazards within Europe's urban areas. A reason for this is the impracticality of measuring ground instability associated with the many geohazard processes that are often hidden beneath buildings and are imperceptible to conventional geological survey detection techniques. Satellite radar interferometry, or InSAR, offers a remote sensing technique to map mm-scale ground deformation over wide areas, given an archive of suitable multi-temporal data. The EC FP7 space project PanGeo (2011-2014) used InSAR to map areas of unstable ground in 52 of Europe's cities, representing ∼15% of the EU population. In partnership with Europe's national geological surveys, the PanGeo project developed a standardised geohazard-mapping methodology and recorded 1286 instances of 19 types of geohazard covering 18,000 km². Presented here is an analysis of the PanGeo-project output data, which provides insights into the distribution of European urban geohazards, their frequency, and probability of occurrence. Merging PanGeo data with Eurostat's GeoStat data provides a systematic estimate of population exposures. Satellite radar interferometry is shown to be a valuable tool for the systematic detection and mapping of urban geohazard phenomena.

  7. Geostatistical Approach to Estimating the Gold Ore Characteristics and Gold Reserves: A Case Study Daksa Area, Quang Nam Province, Viet Nam

    NASA Astrophysics Data System (ADS)

    Luan Truong, Xuan; Luong Le, Van; Quang Truong, Xuan

    2015-04-01

    The Daksa gold deposit is the largest in Vietnam. Its geological structure is complicated, hosted mainly in the metamorphosed sedimentary NuiVu formation (PR3-?1nv2). The sulfide gold ore bodies occur in quartz schist and quartz-biotite schist, are related to faults, and are distributed on the wings of an anticline, forming veins, vein networks, and lenses striking northeast-southwest. The results show that Au and its accompanying elements (Ag, Pb, and Zn) are closely correlated. All of them follow a lognormal distribution, in accordance with the distribution law for rare mineral contents. The structure functions fit spherical models with a nugget effect, showing that Au and the accompanying elements are highly variable. Au contents show local anisotropy, ranging from nearly isotropic (K = 1.17) to weakly anisotropic (K = 1.4). Mineralization intensity of the ore bodies is quite high, with spherical conversion coefficients ranging from 0.49 to 0.75 for one body and from 0.66 to 0.97 for another. The nugget effects are consistent with the style of mineralization in the ore bodies studied: disseminated ore, micro-veins, and fracture infillings in quartz veins. All variograms present local anisotropy, indicating that gold mineralization in the study area had at least two stages, consistent with the analysis of mineralography samples. Based on the structure-function study, the authors propose an optimized exploration system for the deposit and use it to evaluate gold reserves by Ordinary Kriging. The high accuracy of the kriging estimates is expressed in the minimum kriging variance, confirmed by comparison with other methods (such as inverse distance weighting) and, in particular, with results from blocks that have already been mined. Key words: geostatistics, Daksa gold deposit, Vietnam, gold mineralization.
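    The nugget-plus-spherical structure function mentioned in the abstract has a standard closed form, sketched below with hypothetical nugget, sill, and range values:

```python
# A nugget-plus-spherical variogram model of the kind the abstract fits to
# the Au data (the parameter values here are hypothetical).

def spherical_variogram(h, nugget=0.2, sill=1.0, a=50.0):
    """gamma(h) for a spherical model with nugget effect and range a."""
    if h <= 0.0:
        return 0.0
    if h >= a:
        return nugget + sill
    r = h / a
    return nugget + sill * (1.5 * r - 0.5 * r ** 3)

# The model jumps to the nugget at short lags and levels off at the sill:
assert spherical_variogram(1e-6) > 0.2
assert abs(spherical_variogram(50.0) - 1.2) < 1e-12
assert spherical_variogram(100.0) == spherical_variogram(500.0)
```

    The discontinuity at the origin (the nugget) is what the abstract ties to fine-scale disseminated and micro-vein mineralization; the range a sets the distance beyond which samples are uncorrelated.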

  8. Creep stability of the proposed AIDA mission target 65803 Didymos: I. Discrete cohesionless granular physics model

    NASA Astrophysics Data System (ADS)

    Zhang, Yun; Richardson, Derek C.; Barnouin, Olivier S.; Maurel, Clara; Michel, Patrick; Schwartz, Stephen R.; Ballouz, Ronald-Louis; Benner, Lance A. M.; Naidu, Shantanu P.; Li, Junfeng

    2017-09-01

    As the target of the proposed Asteroid Impact & Deflection Assessment (AIDA) mission, the near-Earth binary asteroid 65803 Didymos represents a special class of binary asteroids, those whose primaries are at risk of rotational disruption. To gain a better understanding of these binary systems and to support the AIDA mission, this paper investigates the creep stability of the Didymos primary by representing it as a cohesionless self-gravitating granular aggregate subject to rotational acceleration. To achieve this goal, a soft-sphere discrete element model (SSDEM) capable of simulating granular systems in quasi-static states is implemented and a quasi-static spin-up procedure is carried out. We devise three critical spin limits for the simulated aggregates to indicate their critical states triggered by reshaping and surface shedding, internal structural deformation, and shear failure, respectively. The failure condition and mode, and shear strength of an aggregate can all be inferred from the three critical spin limits. The effects of arrangement and size distribution of constituent particles, bulk density, spin-up path, and interparticle friction are numerically explored. The results show that the shear strength of a spinning self-gravitating aggregate depends strongly on both its internal configuration and material parameters, while its failure mode and mechanism are mainly affected by its internal configuration. Additionally, this study provides some constraints on the possible physical properties of the Didymos primary based on observational data and proposes a plausible formation mechanism for this binary system. With a bulk density consistent with observational uncertainty and close to the maximum density allowed for the asteroid, the Didymos primary in certain configurations can remain geo-statically stable without requiring cohesion.

  9. Payload training methodology study

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The results of the Payload Training Methodology Study (PTMS) are documented. Methods and procedures are defined for the development of payload training programs to be conducted at the Marshall Space Flight Center Payload Training Complex (PCT) for the Space Station Freedom program. The study outlines the overall training program concept as well as the six methodologies associated with the program implementation. The program concept outlines the entire payload training program from initial identification of training requirements to the development of detailed design specifications for simulators and instructional material. The following six methodologies are defined: (1) The Training and Simulation Needs Assessment Methodology; (2) The Simulation Approach Methodology; (3) The Simulation Definition Analysis Methodology; (4) The Simulator Requirements Standardization Methodology; (5) The Simulator Development Verification Methodology; and (6) The Simulator Validation Methodology.

  10. The Next Generation in Subsidence and Aquifer-System Compaction Modeling within the MODFLOW Software Family: A New Package for MODFLOW-2005 and MODFLOW-OWHM

    NASA Astrophysics Data System (ADS)

    Boyce, S. E.; Leake, S. A.; Hanson, R. T.; Galloway, D. L.

    2015-12-01

    The Subsidence and Aquifer-System Compaction Packages, SUB and SUB-WT, for MODFLOW are two currently supported subsidence packages within the MODFLOW family of software. The SUB package allows the calculation of instantaneous and delayed releases of water from distributed interbeds (relatively more compressible fine-grained sediments) within a saturated aquifer system or discrete confining beds. The SUB-WT package does not include delayed releases, but does perform a more rigorous calculation of vertical stresses that can vary the effective stress that causes compaction. This calculation of instantaneous compaction can include the effect of water-table fluctuations for unconfined aquifers on effective stress, and can optionally adjust the elastic and inelastic storage properties based on the changes in effective stress. The next generation of subsidence modeling in MODFLOW is under development, and will merge and enhance the capabilities of the SUB and SUB-WT Packages for MODFLOW-2005 and MODFLOW-OWHM. This new version will also provide some additional features such as stress-dependent vertical hydraulic conductivity of interbeds, time-varying geostatic loads, and additional attributes related to aquifer-system compaction and subsidence that will broaden the class of problems that can be simulated. The new version will include a redesigned source code, a new user-friendly input file structure, more output options, and new subsidence solution options. This presentation will discuss progress in developing the new package, the new features being implemented, and their potential applications.

  11. Weapon Simulator Test Methodology Investigation: Comparison of Live Fire and Weapon Simulator Test Methodologies and the Effects of Clothing and Individual Equipment on Marksmanship

    DTIC Science & Technology

    2016-09-15

    Final report, October 2014 – August 2015. Compares live fire and weapon simulator test methodologies and examines the effects of clothing and individual equipment on marksmanship.

  12. A methodology for the assessment of manned flight simulator fidelity

    NASA Technical Reports Server (NTRS)

    Hess, Ronald A.; Malsbury, Terry N.

    1989-01-01

    A relatively simple analytical methodology for assessing the fidelity of manned flight simulators for specific vehicles and tasks is offered. The methodology is based upon an application of a structural model of the human pilot, including motion cue effects. In particular, predicted pilot/vehicle dynamic characteristics are obtained with and without simulator limitations. A procedure for selecting model parameters can be implemented, given a probable pilot control strategy. In analyzing a pair of piloting tasks for which flight and simulation data are available, the methodology correctly predicted the existence of simulator fidelity problems. The methodology permitted the analytical evaluation of a change in simulator characteristics and indicated that a major source of the fidelity problems was a visual time delay in the simulation.

  13. Expanding Simulations as a Means of Tactical Training with Multinational Partners

    DTIC Science & Technology

    2017-06-09

    ...gap through DOTMLPF in combination with an assessment of two case studies involving higher echelon use of simulations. Through this methodology, the findings...

  14. Research in Modeling and Simulation for Airspace Systems Innovation

    NASA Technical Reports Server (NTRS)

    Ballin, Mark G.; Kimmel, William M.; Welch, Sharon S.

    2007-01-01

    This viewgraph presentation provides an overview of some of the applied research and simulation methodologies at the NASA Langley Research Center that support aerospace systems innovation. Risk assessment methodologies, complex systems design and analysis methodologies, and aerospace operations simulations are described. Potential areas for future research and collaboration using interactive and distributed simulations are also proposed.

  15. A methodology towards virtualisation-based high performance simulation platform supporting multidisciplinary design of complex products

    NASA Astrophysics Data System (ADS)

    Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin

    2012-08-01

    Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise an HPS platform. This research is driven by the issues concerning large-scale simulation resource deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework of a virtualisation-based simulation platform (VSIM) is first proposed. The article then investigates and discusses key approaches in VSIM, including simulation resource modelling, a method to automatically deploy simulation resources for dynamic construction of the system environment, and a live migration mechanism in case of faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) result in a great reduction in deployment time and an increased flexibility for simulation environment construction, and (3) achieve fault-tolerant simulation.

  16. Simulation Methodology in Nursing Education and Adult Learning Theory

    ERIC Educational Resources Information Center

    Rutherford-Hemming, Tonya

    2012-01-01

    Simulation is often used in nursing education as a teaching methodology. Simulation is rooted in adult learning theory. Three learning theories, cognitive, social, and constructivist, explain how learners gain knowledge with simulation experiences. This article takes an in-depth look at each of these three theories as each relates to simulation.…

  17. A methodological, task-based approach to Procedure-Specific Simulations training.

    PubMed

    Setty, Yaki; Salzman, Oren

    2016-12-01

    Procedure-Specific Simulations (PSS) are realistic 3D simulations that provide a platform to practice complete surgical procedures in a virtual-reality environment. While PSS have the potential to improve surgeons' proficiency, there are no existing standards or guidelines for PSS development in a structured manner. We employ a unique platform inspired by game design to develop three-dimensional virtual-reality simulations of urethrovesical anastomosis during radical prostatectomy. 3D visualization is supported by stereo vision, providing a fully realistic view of the simulation. The software can be executed on any robotic surgery platform; specifically, we tested the simulation under a Windows environment on the RobotiX Mentor. Using the urethrovesical anastomosis simulation as a representative example, we present a task-based methodological approach to PSS training. The methodology provides tasks in increasing levels of difficulty, from a novice level of basic anatomy identification to an expert level that permits testing new surgical approaches. The modular methodology presented here can be easily extended to support more complex tasks. We foresee this methodology as a tool used to integrate PSS as a complementary training process for surgical procedures.

  18. The Effects of Time Advance Mechanism on Simple Agent Behaviors in Combat Simulations

    DTIC Science & Technology

    2011-12-01

    ...modeling packages that illustrate the differences between discrete-time simulation (DTS) and discrete-event simulation (DES) methodologies. DES models are often referred to as "next-event" (Law and Kelton 2000), while DTS is commonly referred to as "time-step." Many combat models use DTS as their simulation time advance mechanism...
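    The contrast between the two time-advance mechanisms can be sketched directly; the event times and step size below are hypothetical:

```python
import heapq

# Toy contrast between the two time-advance mechanisms the report compares.

event_times = [0.25, 0.8, 0.81, 3.0]

# Discrete-event ("next-event") advance: the clock jumps straight to each
# pending event in time order, regardless of how far apart events are.
def des_clock_values(events):
    q = list(events)
    heapq.heapify(q)
    clock = []
    while q:
        clock.append(heapq.heappop(q))
    return clock

# Discrete-time ("time-step") advance: the clock moves in fixed increments,
# and events are only noticed at the end of the step they fall in.
def dts_event_detection(events, dt=0.5, horizon=3.5):
    seen = []
    t = 0.0
    while t < horizon:
        t += dt
        seen.extend(e for e in events if t - dt < e <= t)
    return seen

assert des_clock_values(event_times) == [0.25, 0.8, 0.81, 3.0]
# With dt=0.5, the events at 0.8 and 0.81 land in the same step, so a
# time-step model cannot resolve their ordering within that step.
```

    This is the root of the behavioral differences such studies examine: DTS trades event-ordering fidelity within a step for a simpler, fixed-cost advance loop.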

  19. Using soft systems methodology to develop a simulation of out-patient services.

    PubMed

    Lehaney, B; Paul, R J

    1994-10-01

    Discrete event simulation is an approach to modelling a system in the form of a set of mathematical equations and logical relationships, usually used for complex problems, which are difficult to address by using analytical or numerical methods. Managing out-patient services is such a problem. However, simulation is not in itself a systemic approach, in that it provides no methodology by which system boundaries and system activities may be identified. The investigation considers the use of soft systems methodology as an aid to drawing system boundaries and identifying system activities, for the purpose of simulating the outpatients' department at a local hospital. The long term aims are to examine the effects that the participative nature of soft systems methodology has on the acceptability of the simulation model, and to provide analysts and managers with a process that may assist in planning strategies for health care.

  20. Expert systems and simulation models; Proceedings of the Seminar, Tucson, AZ, November 18, 19, 1985

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The seminar presents papers on modeling and simulation methodology, artificial intelligence and expert systems, environments for simulation/expert system development, and methodology for simulation/expert system development. Particular attention is given to simulation modeling concepts and their representation, modular hierarchical model specification, knowledge representation, and rule-based diagnostic expert system development. Other topics include the combination of symbolic and discrete event simulation, real time inferencing, and the management of large knowledge-based simulation projects.

  1. 75 FR 68392 - Self-Regulatory Organizations; The Options Clearing Corporation; Order Approving Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-05

    ... Numerical Simulations Risk Management Methodology, November 1, 2010. I. Introduction. On August 25, 2010, The... Analysis and Numerical Simulations (``STANS'') risk management methodology. The rule change alters... collateral within the STANS Monte Carlo simulations. OCC believes the approach currently used to...

  2. Assessment methodology for computer-based instructional simulations.

    PubMed

    Koenig, Alan; Iseli, Markus; Wainess, Richard; Lee, John J

    2013-10-01

    Computer-based instructional simulations are becoming more and more ubiquitous, particularly in military and medical domains. As the technology that drives these simulations grows ever more sophisticated, the underlying pedagogical models for how instruction, assessment, and feedback are implemented within these systems must evolve accordingly. In this article, we review some of the existing educational approaches to medical simulations, and present pedagogical methodologies that have been used in the design and development of games and simulations at the University of California, Los Angeles, Center for Research on Evaluation, Standards, and Student Testing. In particular, we present a methodology for how automated assessments of computer-based simulations can be implemented using ontologies and Bayesian networks, and discuss their advantages and design considerations for pedagogical use. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.

  3. Methodology of modeling and measuring computer architectures for plasma simulations

    NASA Technical Reports Server (NTRS)

    Wang, L. P. T.

    1977-01-01

    A brief introduction to plasma simulation using computers and the difficulties on currently available computers is given. Through the use of an analyzing and measuring methodology - SARA, the control flow and data flow of a particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of a code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have implicitly parallel nature.

  4. Probabilistic Simulation of Multi-Scale Composite Behavior

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2012-01-01

    A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to uncertainties in the constituent (fiber and matrix) properties, in the fabrication process, and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties and laminate and structural responses. By-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented in the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in typical composite laminates and comparing the results with the Monte Carlo simulation method. Available experimental data of composite laminate behavior at all scales fall within the scatters predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and the probabilistic design of a composite redome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the redome reliability by several orders of magnitude without increasing the laminate thickness--a unique feature of structural composites. The old reference denotes that nothing fundamental has been done since that time.
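    The style of uncertainty propagation the abstract describes can be sketched with plain Monte Carlo on the rule of mixtures for the longitudinal modulus, E1 = Vf*Ef + (1 - Vf)*Em. The distributions and moduli below are hypothetical, and this brute-force loop is the benchmark method the abstract compares against, not the PICAN/IPACS algorithms themselves:

```python
import random

# Monte Carlo propagation of constituent-level uncertainty (fiber modulus Ef,
# matrix modulus Em, fiber volume fraction Vf) to a laminate-level property
# via the rule of mixtures. All distributions (GPa) are hypothetical.

random.seed(0)

def sample_E1():
    Ef = random.gauss(230.0, 10.0)   # fiber modulus, GPa
    Em = random.gauss(3.5, 0.3)      # matrix modulus, GPa
    Vf = random.gauss(0.60, 0.02)    # fiber volume fraction
    return Vf * Ef + (1.0 - Vf) * Em

samples = [sample_E1() for _ in range(20000)]
mean = sum(samples) / len(samples)

# Nominal value: 0.6*230 + 0.4*3.5 = 139.4 GPa; the Monte Carlo mean
# should scatter tightly around it.
assert abs(mean - 139.4) < 1.0
```

    Fast probabilistic methods like those in the abstract aim to reproduce the distribution such a loop converges to at a small fraction of the sampling cost.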

  5. An Integrated Modeling and Simulation Methodology for Intelligent Systems Design and Testing

    DTIC Science & Technology

    2002-08-01

    ...simulation and actual execution. KEYWORDS: Model Continuity, Modeling, Simulation, Experimental Frame, Real Time Systems, Intelligent Systems...the methodology for a stand-alone real time system. Then it will scale up to distributed real time systems. For both systems, step-wise simulation...Intelligent real time systems monitor, respond to, or control an external environment. This environment is connected to the digital...

  6. A Model-Based Systems Engineering Methodology for Employing Architecture In System Analysis: Developing Simulation Models Using Systems Modeling Language Products to Link Architecture and Analysis

    DTIC Science & Technology

    2016-06-01

    ...characteristics, experimental design techniques, and analysis methodologies that distinguish each phase of the MBSE MEASA. To ensure consistency...Experimental design selection, simulation analysis, and trade space analysis support the final two stages. Figure 27 segments the MBSE MEASA...rounding has the potential to increase the correlation between columns of the experimental design matrix. The design methodology presented in Vieira...

  7. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. 
A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
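    The inner reliability evaluation that the nested formulation repeats at every design iterate can be sketched with a hypothetical limit state; the unilevel and decoupled formulations described above aim to avoid exactly this expensive loop:

```python
import random

# Sketch of the reliability evaluation inside a nested RBDO loop. The
# capacity model, demand distribution, and target are all hypothetical.

random.seed(1)

def failure_probability(design, n=50000):
    """Monte Carlo estimate of P[g < 0] for limit state g = capacity - demand."""
    capacity = 10.0 * design           # deterministic capacity model
    fails = sum(1 for _ in range(n)
                if capacity - random.gauss(8.0, 1.0) < 0)
    return fails / n

# A reliability-based design constraint: choose the design so P_fail <= 1e-2.
assert failure_probability(0.7) > 1e-2    # capacity 7 < mean demand 8: unsafe
assert failure_probability(1.2) <= 1e-2   # capacity 12 = mean + 4 sigma: safe
```

    In a nested formulation the optimizer calls this evaluation at every candidate design, which is why reformulations that cut its cost (by roughly half in the abstract's unilevel tests) matter.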

  8. Weighted Ensemble Simulation: Review of Methodology, Applications, and Software

    PubMed Central

    Zuckerman, Daniel M.; Chong, Lillian T.

    2018-01-01

    The weighted ensemble (WE) methodology orchestrates quasi-independent parallel simulations run with intermittent communication that can enhance sampling of rare events such as protein conformational changes, folding, and binding. The WE strategy can achieve superlinear scaling—the unbiased estimation of key observables such as rate constants and equilibrium state populations to greater precision than would be possible with ordinary parallel simulation. WE software can be used to control any dynamics engine, such as standard molecular dynamics and cell-modeling packages. This article reviews the theoretical basis of WE and goes on to describe successful applications to a number of complex biological processes—protein conformational transitions, (un)binding, and assembly processes, as well as cell-scale processes in systems biology. We furthermore discuss the challenges that need to be overcome in the next phase of WE methodological development. Overall, the combined advances in WE methodology and software have enabled the simulation of long-timescale processes that would otherwise not be practical on typical computing resources using standard simulation. PMID:28301772

  10. Promoting quality and safety in women's health through the use of transdisciplinary clinical simulation educational modules: methodology and a pilot trial.

    PubMed

    Montgomery, Kymberlee; Morse, Catherine; Smith-Glasgow, Mary Ellen; Posmontier, Bobbie; Follen, Michele

    2012-02-01

    This manuscript presents the methodology used to assess the impact of a clinical simulation module used for training providers specializing in women's health. The methodology presented here will be used for a quantitative study in the future. Copyright © 2012 Elsevier HS Journals, Inc. All rights reserved.

  11. Training effectiveness assessment: Methodological problems and issues

    NASA Technical Reports Server (NTRS)

    Cross, Kenneth D.

    1992-01-01

    The U.S. military uses a large number of simulators to train and sustain the flying skills of helicopter pilots. Despite the enormous resources required to purchase, maintain, and use those simulators, little effort has been expended in assessing their training effectiveness. One reason for this is the lack of an evaluation methodology that yields comprehensive and valid data at a practical cost. Methodological problems and issues that arise in assessing simulator training effectiveness are discussed, along with problems with the classical transfer-of-learning paradigm.

  12. Probabilistic simulation of multi-scale composite behavior

    NASA Technical Reports Server (NTRS)

    Liaw, D. G.; Shiao, M. C.; Singhal, S. N.; Chamis, Christos C.

    1993-01-01

    A methodology is developed to computationally assess the probabilistic composite material properties at all composite scale levels due to the uncertainties in the constituent (fiber and matrix) properties and in the fabrication process variables. The methodology is computationally efficient for simulating the probability distributions of material properties. The sensitivity of the probabilistic composite material property to each random variable is determined. This information can be used to reduce undesirable uncertainties in material properties at the macro scale of the composite by reducing the uncertainties in the most influential random variables at the micro scale. This methodology was implemented into the computer code PICAN (Probabilistic Integrated Composite ANalyzer). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in the material properties of a typical laminate and comparing the results with the Monte Carlo simulation method. The experimental data of composite material properties at all scales fall within the scatters predicted by PICAN.
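
    The micro-to-macro uncertainty propagation described above can be illustrated with a plain Monte Carlo sketch using the Voigt rule of mixtures for the longitudinal ply modulus. The constituent means and scatter below are illustrative values, not the paper's data, and this is generic Monte Carlo, not the PICAN code itself.

```python
import random
import statistics

def sample_longitudinal_modulus(rng):
    """Draw constituent properties, return the ply modulus from the Voigt rule
    of mixtures E = Vf*Ef + (1 - Vf)*Em. Means and scatter are illustrative."""
    E_f = rng.gauss(230.0, 11.5)   # fiber modulus, GPa (5% CV)
    E_m = rng.gauss(3.5, 0.35)     # matrix modulus, GPa (10% CV)
    V_f = rng.gauss(0.60, 0.018)   # fiber volume fraction (3% CV)
    return V_f * E_f + (1.0 - V_f) * E_m

rng = random.Random(42)
samples = [sample_longitudinal_modulus(rng) for _ in range(20000)]
mean_E = statistics.fmean(samples)           # close to 0.6*230 + 0.4*3.5 = 139.4 GPa
cov_E = statistics.stdev(samples) / mean_E   # macro-scale scatter, roughly 6%
```

    Repeating the experiment while shrinking one input's scatter at a time gives exactly the kind of sensitivity ranking the abstract describes.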

  13. The combination of simulation and response methodology and its application in an aggregate production plan

    NASA Astrophysics Data System (ADS)

    Chen, Zhiming; Feng, Yuncheng

    1988-08-01

    This paper describes an algorithmic structure for combining simulation and optimization techniques both in theory and practice. Response surface methodology is used to optimize the decision variables in the simulation environment. A simulation-optimization software has been developed and successfully implemented, and its application to an aggregate production planning simulation-optimization model is reported. The model's objective is to minimize the production cost and to generate an optimal production plan and inventory control strategy for an aircraft factory.
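
    The response-surface step can be sketched as a least-squares quadratic fit to a handful of simulation runs, followed by taking the stationary point of the fitted surface. The production levels and costs below are invented for illustration; a real study would use a designed experiment in several decision variables.

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, 3):
            f = M[r][c] / M[c][c]
            for k in range(c, 4):
                M[r][k] -= f * M[c][k]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][k] * x[k] for k in range(r + 1, 3))) / M[r][r]
    return x

def fit_response_surface(xs, ys):
    """Least-squares fit of y = a + b*x + c*x^2 via the normal equations."""
    S = [sum(x ** k for x in xs) for k in range(5)]
    T = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[S[0], S[1], S[2]], [S[1], S[2], S[3]], [S[2], S[3], S[4]]]
    a, b, c = solve3(A, T)
    return a, b, c, -b / (2.0 * c)   # stationary point of the fitted surface

# Simulated cost observations around the optimum (illustrative numbers):
xs = [80, 90, 100, 110, 120]     # e.g. aggregate production level
ys = [9.2, 6.1, 5.0, 6.0, 9.1]   # e.g. simulated total cost at each level
a, b, c, x_opt = fit_response_surface(xs, ys)   # x_opt near 100
```

    The fitted minimum then seeds the next round of simulation runs, iterating until the surface and the simulator agree.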

  14. American Society of Composites, 32nd Technical Conference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aitharaju, Venkat; Wollschlager, Jeffrey; Plakomytis, Dimitrios

    This paper presents a general methodology by which weave draping manufacturing simulation results can be utilized to include the effects of weave draping and scissor angle in a structural multiscale simulation. While the methodology developed is general in nature, this paper specifically demonstrates it applied to a truncated pyramid, utilizing manufacturing simulation weave draping results from ESI PAM-FORM and multiscale simulation using Altair Multiscale Designer (MDS) and OptiStruct. From a multiscale simulation perspective, the weave draping manufacturing simulation results are used to develop a series of woven unit cells that cover the range of weave scissor angles existing within the part. For each unit cell, a multiscale material model is developed and applied to the corresponding spatial locations within the structural simulation mesh. In addition, the principal material orientation is mapped from the weave draping manufacturing simulation mesh to the structural simulation mesh using Altair HyperMesh mapping technology. Results of the coupled simulation are compared and verified against experimental data available via the General Motors (GM) Department of Energy (DOE) project.

  15. Methodology for turbulence code validation: Quantification of simulation-experiment agreement and application to the TORPEX experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricci, Paolo; Theiler, C.; Fasoli, A.

    A methodology for plasma turbulence code validation is discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The present work extends the analysis carried out in a previous paper [P. Ricci et al., Phys. Plasmas 16, 055703 (2009)] where the validation observables were introduced. Here, it is discussed how to quantify the agreement between experiments and simulations with respect to each observable, how to define a metric to evaluate this agreement globally, and, finally, how to assess the quality of a validation procedure. The methodology is then applied to the simulation of the basic plasma physics experiment TORPEX [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulation models.
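
    A generic version of such a quantitative agreement measure normalizes each simulation-experiment discrepancy by the combined uncertainties and then combines the observables into one weighted figure of merit. The observables, uncertainties, and weights below are hypothetical; the paper's actual metric and weighting scheme may differ.

```python
import math

def observable_distance(sim, exp, sigma_sim, sigma_exp):
    """Normalized simulation-experiment distance for one observable."""
    return abs(sim - exp) / math.sqrt(sigma_sim ** 2 + sigma_exp ** 2)

def global_agreement(records):
    """Weighted composite metric chi over all observables.

    records: list of (sim, exp, sigma_sim, sigma_exp, weight) tuples.
    chi near 1 means simulation and experiment agree to within uncertainties."""
    num = sum(w * observable_distance(s, e, ss, se) ** 2
              for s, e, ss, se, w in records)
    den = sum(w for *_, w in records)
    return math.sqrt(num / den)

# Hypothetical observables: (sim value, measured value, sim sigma, exp sigma, weight)
records = [
    (1.8, 2.0, 0.2, 0.3, 1.0),    # e.g. fluctuation amplitude
    (45.0, 52.0, 4.0, 5.0, 0.5),  # e.g. a profile scale length
]
chi = global_agreement(records)
```

    A chi much larger than 1 flags disagreement beyond the stated uncertainties, which is the trigger for revisiting either the model or the error bars.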

  16. Methodology for testing infrared focal plane arrays in simulated nuclear radiation environments

    NASA Astrophysics Data System (ADS)

    Divita, E. L.; Mills, R. E.; Koch, T. L.; Gordon, M. J.; Wilcox, R. A.; Williams, R. E.

    1992-07-01

    This paper summarizes test methodology for focal plane array (FPA) testing that can be used for benign (clear) and radiation environments, and describes the use of custom dewars and integrated test equipment in an example environment. The test methodology, consistent with American Society for Testing and Materials (ASTM) standards, is presented for the total accumulated gamma dose, transient dose rate, gamma flux, and neutron fluence environments. The merits and limitations of using cobalt-60 for gamma environment simulations and of using various fast-neutron reactors and neutron sources for neutron simulations are presented. Test result examples are presented to demonstrate test data acquisition and FPA parameter performance under different measurement conditions and environmental simulations.

  17. Motion Cues in Flight Simulation and Simulator Induced Sickness

    DTIC Science & Technology

    1988-06-01

    assessed in a driving simulator by means of a response surface methodology central-composite design. The most salient finding of the study was that visual...across treatment conditions. For an orthogonal response surface methodology (RSM) design with only two independent variables, it can be readily shown that...J.E. Fowlkes 8 SESSION III - ETIOLOGICAL FACTORS IN SIMULATOR-INDUCED AFTER EFFECTS THE USE OF VESTIBULAR MODELS FOR DESIGN AND EVALUATION OF FLIGHT

  18. Crash Simulation and Animation: 'A New Approach for Traffic Safety Analysis'

    DOT National Transportation Integrated Search

    2001-02-01

    This research's objective is to present a methodology to supplement the conventional traffic safety analysis techniques. This methodology aims at using computer simulation to animate and visualize crash occurrence at high-risk locations. This methodol...

  19. INTEGRATING DATA ANALYTICS AND SIMULATION METHODS TO SUPPORT MANUFACTURING DECISION MAKING

    PubMed Central

    Kibira, Deogratias; Hatim, Qais; Kumara, Soundar; Shao, Guodong

    2017-01-01

    Modern manufacturing systems are installed with smart devices such as sensors that monitor system performance and collect data to manage uncertainties in their operations. However, multiple parameters and variables affect system performance, making it impossible for a human to make informed decisions without systematic methodologies and tools. Further, the large volume and variety of streaming data collected are beyond the reach of simulation analysis alone, since simulation models must be run with well-prepared data. Novel approaches, combining different methods, are needed to use this data for making guided decisions. This paper proposes a methodology whereby the parameters that most affect system performance are extracted from the data using data analytics methods. These parameters are used to develop scenarios for simulation inputs; system optimizations are performed on simulation data outputs. A case study of a machine shop demonstrates the proposed methodology. This paper also reviews candidate standards for data collection, simulation, and systems interfaces. PMID:28690363
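
    One minimal instance of the extraction step is to rank candidate parameters by the strength of their association with the performance metric and carry only the top-ranked ones into simulation scenarios. The shop-floor records and parameter names below are invented for illustration; a real pipeline would use richer analytics than a correlation screen.

```python
import math
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

def rank_parameters(data, performance, top_k=2):
    """Rank candidate input parameters by |correlation| with the performance metric.

    data: dict of parameter name -> list of observed values (one per record)."""
    scored = sorted(data, key=lambda name: -abs(pearson(data[name], performance)))
    return scored[:top_k]

# Toy shop-floor records (illustrative, not a real data set):
performance = [10.0, 12.0, 15.0, 16.0, 20.0]      # e.g. hourly throughput
data = {
    "spindle_speed": [1.0, 2.0, 3.0, 4.0, 5.0],   # strongly related
    "coolant_temp":  [7.0, 7.1, 6.9, 7.0, 7.2],   # near-constant noise
    "feed_rate":     [2.0, 2.5, 3.1, 3.0, 4.1],   # also related
}
top = rank_parameters(data, performance)
```

    The selected parameters then define the scenario grid for the simulation runs, keeping the simulation study tractable.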

  20. IFC BIM-Based Methodology for Semi-Automated Building Energy Performance Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bazjanac, Vladimir

    2008-07-01

    Building energy performance (BEP) simulation is still rarely used in building design, commissioning and operations. The process is too costly and too labor intensive, and it takes too long to deliver results. Its quantitative results are not reproducible due to arbitrary decisions and assumptions made in simulation model definition, and can be trusted only under special circumstances. A methodology to semi-automate BEP simulation preparation and execution makes this process much more effective. It incorporates principles of information science and aims to eliminate inappropriate human intervention that results in subjective and arbitrary decisions. This is achieved by automating every part of the BEP modeling and simulation process that can be automated, by relying on data from original sources, and by making any necessary data transformation rule-based and automated. This paper describes the new methodology and its relationship to IFC-based BIM and software interoperability. It identifies five steps that are critical to its implementation, and shows what part of the methodology can be applied today. The paper concludes with a discussion of application to simulation with EnergyPlus, and describes data transformation rules embedded in the new Geometry Simplification Tool (GST).

  1. Performance evaluation in full-mission simulation - Methodological advances and research challenges. [in air transport operations

    NASA Technical Reports Server (NTRS)

    Chidester, Thomas R.; Kanki, Barbara G.; Helmreich, Robert L.

    1989-01-01

    The crew-factors research program at NASA Ames has developed a methodology for studying the impact of a variety of variables on the effectiveness of crews flying realistic but high workload simulated trips. The validity of investigations using the methodology is enhanced by careful design of full-mission scenarios, performance assessment using converging sources of data, and recruitment of representative subjects. Recently, portions of this methodology have been adapted for use in assessing the effectiveness of crew coordination among participants in line-oriented flight training.

  2. An Initial Multi-Domain Modeling of an Actively Cooled Structure

    NASA Technical Reports Server (NTRS)

    Steinthorsson, Erlendur

    1997-01-01

    A methodology for the simulation of turbine cooling flows is being developed. The methodology seeks to combine numerical techniques that optimize both accuracy and computational efficiency. Key components of the methodology include the use of multiblock grid systems for modeling complex geometries, and multigrid convergence acceleration for enhancing computational efficiency in highly resolved fluid flow simulations. The use of the methodology has been demonstrated in several turbomachinery flow and heat transfer studies. Ongoing and future work involves implementing additional turbulence models, improving computational efficiency, and adding adaptive mesh refinement (AMR).
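
    The multigrid convergence acceleration mentioned above can be sketched with a two-grid V-cycle for the 1-D Poisson problem: smooth on the fine grid, restrict the residual, solve the coarse-grid correction, prolong it back, and smooth again. This is generic textbook multigrid in stdlib Python, not the methodology's actual flow solver.

```python
import math

def residual(u, f, h):
    """r = f - A u for the 3-point Laplacian with zero Dirichlet boundaries."""
    n = len(u)
    r = [0.0] * n
    for i in range(n):
        left = u[i - 1] if i > 0 else 0.0
        right = u[i + 1] if i < n - 1 else 0.0
        r[i] = f[i] - (2.0 * u[i] - left - right) / h ** 2
    return r

def jacobi(u, f, h, sweeps, omega=2.0 / 3.0):
    """Weighted-Jacobi smoothing, in place."""
    for _ in range(sweeps):
        r = residual(u, f, h)
        for i in range(len(u)):
            u[i] += omega * r[i] * h ** 2 / 2.0

def solve_tridiag(h, d):
    """Direct (Thomas) solve of the 1-D Poisson system with grid spacing h."""
    n = len(d)
    a, b = -1.0 / h ** 2, 2.0 / h ** 2
    c = [0.0] * n
    x = [0.0] * n
    beta = b
    x[0] = d[0] / beta
    for i in range(1, n):
        c[i - 1] = a / beta
        beta = b - a * c[i - 1]
        x[i] = (d[i] - a * x[i - 1]) / beta
    for i in range(n - 2, -1, -1):
        x[i] -= c[i] * x[i + 1]
    return x

def two_grid_cycle(u, f, h):
    """Pre-smooth, coarse-grid correction (exact coarse solve), post-smooth."""
    jacobi(u, f, h, 3)
    r = residual(u, f, h)
    nc = (len(u) - 1) // 2
    # Full-weighting restriction of the residual onto the coarse grid:
    rc = [0.25 * r[2*j] + 0.5 * r[2*j + 1] + 0.25 * r[2*j + 2] for j in range(nc)]
    ec = solve_tridiag(2.0 * h, rc)
    # Linear prolongation of the coarse correction back to the fine grid:
    for j in range(nc):
        u[2*j + 1] += ec[j]
    for i in range(0, len(u), 2):
        left = ec[i // 2 - 1] if i >= 2 else 0.0
        right = ec[i // 2] if i // 2 < nc else 0.0
        u[i] += 0.5 * (left + right)
    jacobi(u, f, h, 3)

# -u'' = pi^2 sin(pi x) on (0, 1), exact solution sin(pi x):
n, h = 63, 1.0 / 64.0
f = [math.pi ** 2 * math.sin(math.pi * (i + 1) * h) for i in range(n)]
u = [0.0] * n
r0 = max(abs(v) for v in residual(u, f, h))
for _ in range(5):
    two_grid_cycle(u, f, h)
r5 = max(abs(v) for v in residual(u, f, h))   # orders of magnitude below r0
```

    The payoff is that the residual drops by a roughly constant factor per cycle regardless of grid size, which is what makes multigrid attractive for the highly resolved simulations the abstract describes.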

  3. Development and Application of a Clinical Microsystem Simulation Methodology for Human Factors-Based Research of Alarm Fatigue.

    PubMed

    Kobayashi, Leo; Gosbee, John W; Merck, Derek L

    2017-07-01

    (1) To develop a clinical microsystem simulation methodology for alarm fatigue research with a human factors engineering (HFE) assessment framework and (2) to explore its application to the comparative examination of different approaches to patient monitoring and provider notification. Problems with the design, implementation, and real-world use of patient monitoring systems result in alarm fatigue. A multidisciplinary team is developing an open-source tool kit to promote bedside informatics research and mitigate alarm fatigue. Simulation, HFE, and computer science experts created a novel simulation methodology to study alarm fatigue. Featuring multiple interconnected simulated patient scenarios with scripted timeline, "distractor" patient care tasks, and triggered true and false alarms, the methodology incorporated objective metrics to assess provider and system performance. Developed materials were implemented during institutional review board-approved study sessions that assessed and compared an experimental multiparametric alerting system with a standard monitor telemetry system for subject response, use characteristics, and end-user feedback. A four-patient simulation setup featuring objective metrics for participant task-related performance and response to alarms was developed along with accompanying structured HFE assessment (questionnaire and interview) for monitor systems use testing. Two pilot and four study sessions with individual nurse subjects elicited true alarm and false alarm responses (including diversion from assigned tasks) as well as nonresponses to true alarms. In-simulation observation and subject questionnaires were used to test the experimental system's approach to suppressing false alarms and alerting providers. A novel investigative methodology applied simulation and HFE techniques to replicate and study alarm fatigue in controlled settings for systems assessment and experimental research purposes.

  4. A Systematic Determination of Skill and Simulator Requirements for Airplane Pilot Certification

    DOT National Transportation Integrated Search

    1985-03-01

    This research report describes: (1) the FAA's ATP airman certification system; (2) needs of the system regarding simulator use; (3) a systematic methodology for meeting these needs; (4) application of the methodology; (5) results of the study; and (6...

  5. Cosmological Conundrums and Discoveries Since Newton

    NASA Astrophysics Data System (ADS)

    Topper, David R.

    Cosmology is a key branch of astronomy, dealing with questions about the structure of the universe. The ancient cosmos - systematically codified by Aristotle, and later given empirical support, especially by Ptolemy - was geocentric, geostatic, and finite. Based on a common-sense view of the world being as it appears to our senses, the ancient model prevailed well into the seventeenth century. The subsequent scientific revolution, however, bequeathed to the eighteenth century and after a radically different cosmic model. The radical change came in two stages. First, Copernicus in the sixteenth century moved the Sun to Earth's previous place at the center of the universe, an idea adopted by Galileo, Kepler, and a few other key thinkers up to Newton. The second stage, often called the "breaking of the sphere," replaced the sphere of a few thousand stars at the edge of the finite universe with myriad stars extending into an infinite universe, filled with Newton's invisible gravity, and with our Earth being the third planet from the Sun in our solar system somewhere within that Euclidean space. Two planets were added to our solar system (one in the eighteenth and one in the nineteenth century), but the overall structure remained essentially as conceived by Newton when he died in 1727. This was the universe Einstein was born into in 1879.

  6. Simulation of Attacks for Security in Wireless Sensor Network.

    PubMed

    Diaz, Alvaro; Sanchez, Pablo

    2016-11-18

    The increasing complexity and low-power constraints of current Wireless Sensor Networks (WSN) require efficient methodologies for network simulation and embedded software performance analysis of nodes. In addition, security is also a very important feature that has to be addressed in most WSNs, since they may work with sensitive data and operate in hostile unattended environments. In this paper, a methodology for security analysis of Wireless Sensor Networks is presented. The methodology allows designing attack-aware embedded software/firmware or attack countermeasures to provide security in WSNs. The proposed methodology includes attacker modeling and attack simulation with performance analysis (node's software execution time and power consumption estimation). After an analysis of different WSN attack types, an attacker model is proposed. This model defines three different types of attackers that can emulate most WSN attacks. In addition, this paper presents a virtual platform that is able to model the node hardware, embedded software and basic wireless channel features. This virtual simulation analyzes the embedded software behavior and node power consumption while it takes into account the network deployment and topology. Additionally, this simulator integrates the previously mentioned attacker model. Thus, the impact of attacks on power consumption and software behavior/execution-time can be analyzed. This provides developers with essential information about the effects that one or multiple attacks could have on the network, helping them to develop more secure WSN systems. This WSN attack simulator is an essential element of the attack-aware embedded software development methodology that is also introduced in this work.
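
    The kind of attack-impact estimate such a simulator produces can be caricatured with a toy energy model: a node pays an energy cost per packet handled, and a flooding attacker inflates the packet count. All rates and energy constants below are invented; the paper's virtual platform models real node hardware and embedded software, which this sketch does not.

```python
import random
import statistics

def count_events(rate_hz, duration_s, rng):
    """Poisson event count sampled via exponential inter-arrival times."""
    t, n = rng.expovariate(rate_hz), 0
    while t < duration_s:
        n += 1
        t += rng.expovariate(rate_hz)
    return n

def node_energy_mj(packets, duration_s, idle_mw=0.3, per_packet_mj=2.5):
    """Crude node energy model: idle power plus a fixed cost per packet handled."""
    return idle_mw * duration_s + per_packet_mj * packets

rng = random.Random(7)
dur = 600.0   # a ten-minute window
baseline, attacked = [], []
for _ in range(5):
    legit = count_events(1.0, dur, rng)        # normal traffic, 1 pkt/s
    baseline.append(node_energy_mj(legit, dur))
    legit = count_events(1.0, dur, rng)
    flood = count_events(20.0, dur, rng)       # flooding attacker, 20 pkt/s
    attacked.append(node_energy_mj(legit + flood, dur))

drain_ratio = statistics.fmean(attacked) / statistics.fmean(baseline)
```

    Even this crude model shows the order-of-magnitude battery drain that motivates attack-aware firmware design.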

  7. Local deformation for soft tissue simulation

    PubMed Central

    Omar, Nadzeri; Zhong, Yongmin; Smith, Julian; Gu, Chengfan

    2016-01-01

    This paper presents a new methodology to localize the deformation range to improve the computational efficiency for soft tissue simulation. This methodology identifies the local deformation range from the stress distribution in soft tissues due to an external force. A stress estimation method based on elastic theory is used to estimate the stress in soft tissues according to depth from the contact surface. The proposed methodology can be used with both mass-spring and finite element modeling approaches for soft tissue deformation. Experimental results show that the proposed methodology can improve the computational efficiency while maintaining the modeling realism. PMID:27286482
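
    The idea of bounding the deformation range by stress decay with depth can be illustrated with the classical Boussinesq point-load solution, where the vertical stress directly beneath the load falls off as 1/z². The force and stress cutoff below are illustrative, and the paper's elastic estimation method may differ in detail.

```python
import math

def axial_stress(P, z):
    """Boussinesq vertical stress directly beneath a surface point load P (R = z)."""
    return 3.0 * P / (2.0 * math.pi * z ** 2)

def deformation_depth(P, sigma_min):
    """Depth at which the axial stress decays to sigma_min:
    solve 3P / (2 pi z^2) = sigma_min for z."""
    return math.sqrt(3.0 * P / (2.0 * math.pi * sigma_min))

# A 1 N probe force with a 100 Pa 'negligible stress' cutoff (illustrative values):
z_range = deformation_depth(1.0, 100.0)   # about 0.07 m
```

    Nodes deeper than z_range can be excluded from the deformation computation, which is the source of the efficiency gain the abstract reports.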

  8. A POLLUTION REDUCTION METHODOLOGY FOR CHEMICAL PROCESS SIMULATORS

    EPA Science Inventory

    A pollution minimization methodology was developed for chemical process design using computer simulation. It is based on a pollution balance that at steady state is used to define a pollution index with units of mass of pollution per mass of products. The pollution balance has be...
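
    A minimal sketch of such an index, assuming it is defined as total pollutant mass out per total product mass out at steady state (the stream names and flows below are invented):

```python
def pollution_index(pollutant_out_kg_h, product_out_kg_h):
    """Steady-state pollution index: mass of pollutants emitted per mass of product."""
    return sum(pollutant_out_kg_h.values()) / sum(product_out_kg_h.values())

# Illustrative outlet streams for a hypothetical process (kg/h):
pollutants = {"SO2": 1.2, "VOC": 0.4}
products = {"main_product": 950.0, "byproduct": 50.0}
index = pollution_index(pollutants, products)   # 1.6 kg pollutant per 1000 kg product
```

    Recomputing the index after each proposed flowsheet change turns the simulator into a screening tool for pollution minimization.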

  9. A computer simulator for development of engineering system design methodologies

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Sobieszczanski-Sobieski, J.

    1987-01-01

    A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.
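
    The simulator's core idea, replacing expensive subsystem analyses with cheap analytic stand-ins while keeping their data couplings, can be sketched as a fixed-point iteration over two coupled functions. The functions and coupling constants below are invented; any contraction would do.

```python
def subsystem_a(x, y_b):
    """Stand-in analytic function replacing, say, an aerodynamics analysis."""
    return 0.5 * y_b + x

def subsystem_b(x, y_a):
    """Stand-in analytic function replacing, say, a structures analysis."""
    return 0.3 * y_a + 1.0

def converge(x, tol=1e-10, max_iter=100):
    """Gauss-Seidel iteration over the coupled system until the outputs settle."""
    y_a = y_b = 0.0
    for _ in range(max_iter):
        y_a_new = subsystem_a(x, y_b)
        y_b_new = subsystem_b(x, y_a_new)
        if abs(y_a_new - y_a) < tol and abs(y_b_new - y_b) < tol:
            return y_a_new, y_b_new
        y_a, y_b = y_a_new, y_b_new
    return y_a, y_b

y_a, y_b = converge(1.0)   # fixed point: y_a = 1.5/0.85, y_b = 0.3*y_a + 1
```

    Because each "analysis" costs nothing, thousands of such converged system evaluations can be run to compare candidate multilevel optimization strategies, which is exactly the simulator's purpose.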

  10. A Methodology for the Design of Application-Specific Cyber-Physical Social Sensing Co-Simulators.

    PubMed

    Sánchez, Borja Bordel; Alcarria, Ramón; Sánchez-Picot, Álvaro; Sánchez-de-Rivera, Diego

    2017-09-22

    Cyber-Physical Social Sensing (CPSS) is a new trend in the context of pervasive sensing. In these new systems, various domains coexist in time, evolve together and influence each other. Thus, application-specific tools are necessary for specifying and validating designs and simulating systems. However, nowadays, different tools are employed to simulate each domain independently. Mainly, the cause of the lack of co-simulation instruments to simulate all domains together is the extreme difficulty of combining and synchronizing various tools. In order to reduce that difficulty, an adequate architecture for the final co-simulator must be selected. Therefore, in this paper the authors investigate and propose a methodology for the design of CPSS co-simulation tools. The paper describes the four steps that software architects should follow in order to design the most adequate co-simulator for a certain application, considering the final users' needs and requirements and various additional factors such as the development team's experience. Moreover, the first practical use case of the proposed methodology is provided. An experimental validation is also included in order to evaluate the performance of the proposed co-simulator and to determine the correctness of the proposal.

  12. Methodology development for evaluation of selective-fidelity rotorcraft simulation

    NASA Technical Reports Server (NTRS)

    Lewis, William D.; Schrage, D. P.; Prasad, J. V. R.; Wolfe, Daniel

    1992-01-01

    This paper addresses the initial step toward the goal of establishing performance and handling qualities acceptance criteria for real-time rotorcraft simulators through a planned research effort to quantify the system capabilities of 'selective fidelity' simulators. Within this framework the simulator is classified based on the required task. The simulator is evaluated by separating the various subsystems (visual, motion, etc.) and applying corresponding fidelity constants based on the specific task. This methodology not only provides an assessment technique, but also a technique to determine the required levels of subsystem fidelity for a specific task.

  13. Incorporating scenario-based simulation into a hospital nursing education program.

    PubMed

    Nagle, Beth M; McHale, Jeanne M; Alexander, Gail A; French, Brian M

    2009-01-01

    Nurse educators are challenged to provide meaningful and effective learning opportunities for both new and experienced nurses. Simulation as a teaching and learning methodology is being embraced by nursing in academic and practice settings to provide innovative educational experiences to assess and develop clinical competency, promote teamwork, and improve care processes. This article provides an overview of the historical basis for using simulation in education, simulation methodologies, and perceived advantages and disadvantages. It also provides a description of the integration of scenario-based programs using a full-scale patient simulator into nursing education programming at a large academic medical center.

  14. VERA Core Simulator Methodology for PWR Cycle Depletion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kochunas, Brendan; Collins, Benjamin S; Jabaay, Daniel

    2015-01-01

    This paper describes the methodology developed and implemented in MPACT for performing high-fidelity pressurized water reactor (PWR) multi-cycle core physics calculations. MPACT is being developed primarily for application within the Consortium for Advanced Simulation of Light Water Reactors (CASL) as one of the main components of the VERA Core Simulator, the others being COBRA-TF and ORIGEN. The methods summarized in this paper include a methodology for performing resonance self-shielding and computing macroscopic cross sections, 2-D/1-D transport, nuclide depletion, thermal-hydraulic feedback, and other supporting methods. These methods represent a minimal set needed to simulate high-fidelity models of a realistic nuclear reactor. Results demonstrating this are presented from the simulation of a realistic model of the first cycle of Watts Bar Unit 1. The simulation, which approximates the cycle operation, is observed to be within 50 ppm boron (ppmB) reactivity for all simulated points in the cycle and approximately 15 ppmB for a consistent statepoint. The verification and validation of the PWR cycle depletion capability in MPACT is the focus of two companion papers.

  16. Methodologies for extracting kinetic constants for multiphase reacting flow simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, S.L.; Lottes, S.A.; Golchert, B.

    1997-03-01

    Flows in industrial reactors often involve complex reactions of many species. A computational fluid dynamics (CFD) computer code, ICRKFLO, was developed to simulate multiphase, multi-species reacting flows. ICRKFLO uses a hybrid technique to calculate species concentration and reaction for a large number of species in a reacting flow. This technique includes a hydrodynamic and reacting flow simulation with a small but sufficient number of lumped reactions to compute flow field properties, followed by a calculation of local reaction kinetics and transport of many subspecies (on the order of 10 to 100). Kinetic rate constants of the numerous subspecies chemical reactions are difficult to determine. A methodology has been developed to extract kinetic constants from experimental data efficiently. A flow simulation of a fluid catalytic cracking (FCC) riser was successfully used to demonstrate this methodology.
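
    For a single first-order reaction, the extraction step reduces to a linear regression of ln(C) against time. The decay data below are synthetic, generated from k = 0.8 s^-1; the paper's multiphase FCC chemistry is far richer than this sketch.

```python
import math

def fit_first_order_rate(times_s, concs):
    """Least-squares slope of ln(C) versus t; for C = C0*exp(-k t) the slope is -k."""
    ys = [math.log(c) for c in concs]
    n = len(times_s)
    mt = sum(times_s) / n
    my = sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(times_s, ys))
             / sum((t - mt) ** 2 for t in times_s))
    return -slope

# Synthetic decay data generated from k = 0.8 1/s (illustrative, not FCC data):
times = [0.0, 0.5, 1.0, 1.5, 2.0]
concs = [1.0, 0.670, 0.449, 0.301, 0.202]
k = fit_first_order_rate(times, concs)   # recovers roughly 0.8
```

    In practice many such constants are fitted jointly against plant or pilot data, with the CFD flow field fixing the residence times.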

  17. Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics

    NASA Technical Reports Server (NTRS)

    Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    Numerical simulation has become an integral part of the engineering design process, and critical design decisions are routinely made based on simulation results and conclusions. Verification and validation of the reliability of the numerical simulation are therefore vitally important. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of a numerical simulation by estimating the numerical approximation error, computational-model-induced errors, and the uncertainties contained in the mathematical models, so that the reliability of the simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that its reliability can be improved.
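
    One standard ingredient of such error estimation is Richardson extrapolation: from solutions on three systematically refined grids one can estimate the observed order of convergence and the remaining discretization error. The numbers below are manufactured so the exact value (1.0) and order (2) are known; this is a generic technique, not necessarily the authors' estimator.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed convergence order p from solutions on grids refined by factor r."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

def richardson(f_medium, f_fine, r, p):
    """Richardson-extrapolated value and the estimated fine-grid discretization error."""
    f_exact = f_fine + (f_fine - f_medium) / (r ** p - 1.0)
    return f_exact, f_fine - f_exact

# Manufactured data: exact value 1.0, second-order scheme, error = C*h^p:
f3, f2, f1 = 1.04, 1.01, 1.0025        # solutions on grids h, h/2, h/4
p = observed_order(f3, f2, f1, 2.0)    # observed order, close to 2
f_star, err = richardson(f2, f1, 2.0, p)
```

    An observed order that disagrees with the scheme's formal order is itself a reliability warning, flagging that the grids are not yet in the asymptotic range.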

  18. Automation method to identify the geological structure of seabed using spatial statistic analysis of echo sounding data

    NASA Astrophysics Data System (ADS)

    Kwon, O.; Kim, W.; Kim, J.

    2017-12-01

    Construction of subsea tunnels has recently increased worldwide. For the safe construction of a subsea tunnel, identifying geological structures, including faults, at the design and construction stages is critically important. Unlike tunnels on land, however, data on geological structure are very difficult to obtain because of the limits of offshore geological surveys. This study addresses that difficulty by developing a technology to identify the geological structure of the seabed automatically from echo sounding data. When investigating a potential site for a deep subsea tunnel, boreholes and geophysical investigations face technical and economic limits; echo sounding data, by contrast, are easily obtainable, and their reliability is higher than that of the above approaches. The study aims to develop an algorithm that identifies large-scale geological structures of the seabed using a geostatistical approach, based on the structural-geology principle that topographic features indicate geological structure. The basic concept of the algorithm is as follows: (1) convert the seabed topography to grid data using the echo sounding data, (2) apply a moving window of optimal size to the grid data, (3) estimate the spatial statistics of the grid data within the window area, (4) set a percentile standard for the spatial statistics, (5) display the values satisfying the standard on a map, and (6) visualize the geological structure on the map. The key elements of this study are the optimal size of the moving window, the choice of spatial statistics, and the determination of the optimal percentile standard. Numerous simulations were run to determine these optimal elements, and a user program implementing the resulting analysis algorithm was developed in R. The program is designed to show how the various spatial statistics vary, and it lets the user easily designate the type of spatial statistic and the percentile standard, making it straightforward to analyze geological structure as the spatial statistics change. This research was supported by the Korea Agency for Infrastructure Technology Advancement under the Ministry of Land, Infrastructure and Transport of the Korean government (Project Number: 13 Construction Research T01).
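    Steps (1)-(5) of the algorithm outlined above can be sketched in a few lines. This is an illustrative Python sketch, not the authors' R program: the toy bathymetry grid, the window size, the choice of standard deviation as the spatial statistic, and the 95th-percentile standard are all assumptions.

```python
import numpy as np

def flag_structures(grid, window=5, stat=np.std, percentile=95):
    """Steps (2)-(5): slide a window over a bathymetry grid, compute a
    spatial statistic in each window area, and flag cells whose statistic
    meets a percentile standard."""
    half = window // 2
    rows, cols = grid.shape
    stats = np.full((rows, cols), np.nan)
    for i in range(half, rows - half):
        for j in range(half, cols - half):
            stats[i, j] = stat(grid[i - half:i + half + 1,
                                    j - half:j + half + 1])
    threshold = np.nanpercentile(stats, percentile)   # step (4)
    return stats >= threshold                         # step (5)

# step (1): a toy "echo sounding" depth grid -- a gentle slope with an
# abrupt offset mimicking a fault trace at column 30
x, _ = np.meshgrid(np.linspace(0, 1, 60), np.linspace(0, 1, 60))
depth = 100 + 5 * x
depth[:, 30:] += 10
flags = flag_structures(depth)
print(int(flags.sum()), "cells flagged near the offset")
```

    Windows overlapping the offset have a much larger standard deviation than the smooth background, so the flagged cells cluster along the simulated fault trace.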

  19. Multiphysics Simulation of Welding-Arc and Nozzle-Arc System: Mathematical-Model, Solution-Methodology and Validation

    NASA Astrophysics Data System (ADS)

    Pawar, Sumedh; Sharma, Atul

    2018-01-01

    This work presents a mathematical model and solution methodology for a multiphysics engineering problem: arc formation during welding and inside a nozzle. A general-purpose commercial CFD solver, ANSYS FLUENT 13.0.0, is used in this work. Arc formation involves strongly coupled gas dynamics and electrodynamics, simulated by the solution of the coupled Navier-Stokes equations, Maxwell's equations, and the radiation heat-transfer equation. Validation of the present numerical methodology is demonstrated by excellent agreement with published results. The developed mathematical model and the user defined functions (UDFs) are independent of the geometry and are applicable to any system that involves arc formation in a 2D axisymmetric coordinate system. The high-pressure flow of SF6 gas in the nozzle-arc system resembles the arc chamber of an SF6 gas circuit breaker; thus, the methodology can be extended to simulate the arcing phenomenon during current interruption.

  20. Predicting Failure Progression and Failure Loads in Composite Open-Hole Tension Coupons

    NASA Technical Reports Server (NTRS)

    Arunkumar, Satyanarayana; Przekop, Adam

    2010-01-01

    Failure types and failure loads in carbon-epoxy [45n/90n/-45n/0n]ms laminate coupons with central circular holes subjected to tensile load are simulated using a progressive failure analysis (PFA) methodology. The progressive failure methodology is implemented using a VUMAT subroutine within the ABAQUS(TradeMark)/Explicit nonlinear finite element code. The degradation model adopted in the present PFA methodology uses an instantaneous complete stress reduction (COSTR) approach to simulate damage at a material point when failure occurs. In-plane modeling parameters such as element size and shape are held constant in the finite element models, irrespective of laminate thickness and hole size, to predict failure loads and failure progression. Comparison to published test data indicates that this methodology accurately simulates brittle, pull-out, and delamination failure types. The sensitivity of the failure progression and the failure load to analytical loading rates and solver precision is demonstrated.

  1. Optimization of lamp arrangement in a closed-conduit UV reactor based on a genetic algorithm.

    PubMed

    Sultan, Tipu; Ahmad, Zeshan; Cho, Jinsoo

    2016-01-01

    The choice of arrangement of the UV lamps in a closed-conduit ultraviolet (CCUV) reactor significantly affects its performance; however, a systematic methodology for the optimal lamp arrangement within the chamber of a CCUV reactor is not well established in the literature. In this work, we propose a viable systematic methodology for the lamp arrangement based on a genetic algorithm (GA). In addition, we analyze the impacts of the diameter, angle, and symmetry of the lamp arrangement on the reduction equivalent dose (RED). The results are compared on the basis of the simulated RED values, evaluated using the computational fluid dynamics software ANSYS FLUENT. The fluence rate was calculated using the commercial software UVCalc3D, and the GA-based lamp arrangement optimization was carried out in MATLAB. The simulation results provide detailed information about the GA-based methodology for the lamp arrangement, the pathogen transport, and the simulated RED values. A significant increase in RED was achieved with the GA-based lamp arrangement methodology, and the increase was highest for the asymmetric lamp arrangement within the chamber of the CCUV reactor. These results demonstrate that the proposed GA-based methodology for symmetric and asymmetric lamp arrangements provides a viable technical solution to the design and optimization of CCUV reactors.
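    As a rough illustration of a GA-based arrangement search, the sketch below evolves lamp positions against a stand-in score. The `red_score` function is a made-up surrogate, not the paper's CFD/UVCalc3D-based RED evaluation, and all GA parameters are assumptions.

```python
import random

def red_score(lamp_positions):
    """Invented surrogate for a RED evaluation: rewards lamps that are
    spread apart and centered within a unit-length duct."""
    spread = min(abs(a - b) for i, a in enumerate(lamp_positions)
                 for b in lamp_positions[i + 1:])
    coverage = 1.0 - abs(sum(lamp_positions) / len(lamp_positions) - 0.5)
    return spread + coverage

def genetic_search(n_lamps=4, pop=30, gens=60, mut=0.1, seed=1):
    rng = random.Random(seed)
    population = [[rng.random() for _ in range(n_lamps)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=red_score, reverse=True)
        parents = population[:pop // 2]               # elitist selection
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_lamps)
            child = a[:cut] + b[cut:]                 # one-point crossover
            if rng.random() < mut:                    # mutation
                child[rng.randrange(n_lamps)] = rng.random()
            children.append(child)
        population = parents + children
    return max(population, key=red_score)

best = genetic_search()
print(sorted(round(p, 2) for p in best))
```

    In the actual study the fitness evaluation would be a full fluence-rate and pathogen-transport simulation, which is why the GA's small population and generation counts matter for total cost.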

  2. A top-down design methodology and its implementation for VCSEL-based optical links design

    NASA Astrophysics Data System (ADS)

    Li, Jiguang; Cao, Mingcui; Cai, Zilong

    2005-01-01

    In order to find the optimal design for a given specification of an optical communication link, an integrated simulation of the electronic, optoelectronic, and optical components of the complete system is required. It is very important to be able to simulate at both the system level and the detailed model level; such a model is feasible thanks to the high potential of the Verilog-AMS language. In this paper, we propose an effective top-down design methodology and employ it in the development of a complete VCSEL-based optical link simulation. The principle of the top-down methodology is that development proceeds from the system level down to the device level. To build a hierarchical model of VCSEL-based optical links, the design framework is organized in three levels of hierarchy. The models are developed and implemented in Verilog-AMS, and the model parameters are fitted to measured data. A sample transient simulation demonstrates the functioning of our implementation. Suggestions for future directions in top-down methodology for optoelectronic systems technology are also presented.

  3. A methodology for the rigorous verification of plasma simulation codes

    NASA Astrophysics Data System (ADS)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, a mathematical issue targeted at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, targeted at assessing that the physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification based on the method of manufactured solutions, as well as solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
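    Solution verification by Richardson extrapolation, as mentioned above, can be illustrated on a toy problem. The sketch assumes solutions on three systematically refined grids with a constant refinement ratio; the model error function is invented for illustration.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Richardson-extrapolation estimate of the observed order of
    accuracy p from solutions on three grids with refinement ratio r."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

def richardson_extrapolate(f_medium, f_fine, p, r=2.0):
    """Error-corrected estimate of the grid-converged solution."""
    return f_fine + (f_fine - f_medium) / (r**p - 1)

# toy example: f(h) = 1 + 3*h**2 mimics a second-order discretization
# whose exact (h -> 0) answer is 1.0
f = lambda h: 1.0 + 3.0 * h**2
fc, fm, ff = f(0.4), f(0.2), f(0.1)
p = observed_order(fc, fm, ff)
print(round(p, 3))                                   # ≈ 2.0
print(round(richardson_extrapolate(fm, ff, p), 6))   # ≈ 1.0
```

    When the observed order p matches the formal order of the scheme, the extrapolated value also yields an estimate of the discretization error on the fine grid.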

  4. A Monte Carlo analysis of breast screening randomized trials.

    PubMed

    Zamora, Luis I; Forastero, Cristina; Guirado, Damián; Lallena, Antonio M

    2016-12-01

    To analyze breast screening randomized trials with a Monte Carlo simulation tool, a simulation tool previously developed to simulate breast screening programmes was adapted for that purpose. The history of women participating in the trials was simulated, including a model for survival after local treatment of invasive cancers. Distributions of the time gained by screening detection over symptomatic detection and of the overall screening sensitivity were used as inputs. Several randomized controlled trials were simulated. Except for the age range of the women involved, all simulations used the same population characteristics, which permitted an analysis of their external validity. The relative risks obtained were compared to those quoted for the trials, whose internal validity was addressed by further investigating the reasons for the disagreements observed. The Monte Carlo simulations produce results in good agreement with most of the randomized trials analyzed, indicating their methodological quality and external validity. A reduction in breast cancer mortality of around 20% appears to be a reasonable value according to the results of the trials that are methodologically correct. Discrepancies observed with the Canada I and II trials may be attributed to low mammography quality and some methodological problems, and the Kopparberg trial appears to show low methodological quality. Monte Carlo simulations are a powerful tool for investigating breast screening randomized controlled trials, helping to establish those whose results are reliable enough to be extrapolated to other populations, and to design trial strategies and, eventually, adapt them during their development. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
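    A minimal sketch of the kind of Monte Carlo trial replication described above, with invented parameters (arm size, baseline breast cancer mortality, and an assumed 20% mortality reduction in the screened arm); the actual tool models individual screening histories, which this sketch does not attempt.

```python
import random

def simulate_trial(n_per_arm, base_mortality, reduction, rng):
    """Hypothetical two-arm trial: each woman in the control arm dies of
    breast cancer with probability base_mortality; screening multiplies
    that probability by (1 - reduction)."""
    control = sum(rng.random() < base_mortality for _ in range(n_per_arm))
    screened = sum(rng.random() < base_mortality * (1 - reduction)
                   for _ in range(n_per_arm))
    return screened / control          # relative risk (RR) estimate

rng = random.Random(42)
# replicate the trial many times to see the spread of the estimated RR
rrs = [simulate_trial(30_000, 0.004, 0.20, rng) for _ in range(200)]
mean_rr = sum(rrs) / len(rrs)
print(round(mean_rr, 2))   # close to the assumed true RR of 0.80
```

    Even with a true 20% reduction, individual replications scatter noticeably around RR = 0.8, which is why comparing a simulated RR distribution against each trial's quoted RR is informative about that trial's internal validity.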

  5. On Designing Multicore-Aware Simulators for Systems Biology Endowed with OnLine Statistics

    PubMed Central

    Calcagno, Cristina; Coppo, Mario

    2014-01-01

    This paper discusses enabling methodologies for the design of a fully parallel, online, interactive tool aiming to support bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are shown on a simulation tool performing the modeling, tuning, and sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories, which turn into big data that should be analysed by statistical and data-mining tools. In the considered approach, the two stages are pipelined in such a way that the simulation stage streams out the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and for the effectiveness of the online analysis in capturing biological systems' behavior, on a multicore platform and representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide key features to software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems, exhibiting multistable and oscillatory behavior, are used as a testbed. PMID:25050327

  6. On designing multicore-aware simulators for systems biology endowed with OnLine statistics.

    PubMed

    Aldinucci, Marco; Calcagno, Cristina; Coppo, Mario; Damiani, Ferruccio; Drocco, Maurizio; Sciacca, Eva; Spinella, Salvatore; Torquati, Massimo; Troina, Angelo

    2014-01-01

    This paper discusses enabling methodologies for the design of a fully parallel, online, interactive tool aiming to support bioinformatics scientists. In particular, the features of these methodologies, supported by the FastFlow parallel programming framework, are shown on a simulation tool performing the modeling, tuning, and sensitivity analysis of stochastic biological models. A stochastic simulation needs thousands of independent simulation trajectories, which turn into big data that should be analysed by statistical and data-mining tools. In the considered approach, the two stages are pipelined in such a way that the simulation stage streams out the partial results of all simulation trajectories to the analysis stage, which immediately produces a partial result. The simulation-analysis workflow is validated for performance and for the effectiveness of the online analysis in capturing biological systems' behavior, on a multicore platform and representative proof-of-concept biological systems. The exploited methodologies include pattern-based parallel programming and data streaming, which provide key features to software designers such as performance portability and efficient in-memory (big) data management and movement. Two paradigmatic classes of biological systems, exhibiting multistable and oscillatory behavior, are used as a testbed.
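    The pipelined simulation-analysis scheme described in this record can be sketched with a generator feeding an online-statistics consumer: Welford's algorithm yields partial mean and variance results as each trajectory arrives, without storing the full stream. The toy stochastic model is an assumption, and this sketch does not use FastFlow.

```python
import random

def simulation_stage(n_trajectories, steps, rng):
    """Stream the final value of each stochastic trajectory as it finishes."""
    for _ in range(n_trajectories):
        x = 0.0
        for _ in range(steps):
            x += rng.gauss(0.0, 1.0)     # toy stochastic dynamics
        yield x

def analysis_stage(stream):
    """Consume samples as they arrive; yield partial (n, mean, variance)."""
    n, mean, m2 = 0, 0.0, 0.0
    for x in stream:
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)         # Welford's online update
        yield n, mean, (m2 / (n - 1) if n > 1 else 0.0)

rng = random.Random(7)
for n, mean, var in analysis_stage(simulation_stage(1000, 100, rng)):
    pass                                 # a partial result exists at every n
print(n, round(mean, 1), round(var))     # variance ≈ 100 for this toy model
```

    In a parallel implementation the two stages run concurrently on different cores, so the analyst can inspect partial statistics while trajectories are still being simulated.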

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dawson, Paul R.; Boyce, Donald E.; Park, Jun-Sang

    A robust methodology is presented to extract slip system strengths from lattice strain distributions for polycrystalline samples obtained from high-energy x-ray diffraction (HEXD) experiments with in situ loading. The methodology consists of matching the evolution of the coefficients of a harmonic expansion of the distributions from simulation to the coefficients derived from measurements. Simulation results are generated via finite element simulations of virtual polycrystals subjected to the loading history applied in the HEXD experiments. Advantages of the methodology include: (1) its ability to utilize the extensive data sets generated by HEXD experiments; (2) its ability to capture trends in distributions that may be noisy (both measured and simulated); and (3) its sensitivity to the ratios of the family strengths. The approach is used to evaluate the slip system strengths of Ti-6Al-4V using samples having relatively equiaxed grains. These strength estimates are compared to values in the literature.

  8. Soft tissue modelling through autowaves for surgery simulation.

    PubMed

    Zhong, Yongmin; Shirinzadeh, Bijan; Alici, Gursel; Smith, Julian

    2006-09-01

    Modelling of soft tissue deformation is of great importance to virtual reality based surgery simulation. This paper presents a new methodology for the simulation of soft tissue deformation by drawing an analogy between autowaves and soft tissue deformation. The potential energy stored in a soft tissue as a result of a deformation caused by an external force is propagated among the mass points of the soft tissue by non-linear autowaves. The novelty of the methodology is that (i) autowave techniques are established to describe the potential energy distribution of a deformation for extrapolating internal forces, and (ii) material non-linearity is modelled with non-linear autowaves, rather than through geometric non-linearity alone. Integration with a haptic device has been achieved to simulate soft tissue deformation with force feedback. The proposed methodology not only deals with large-range deformations, but also accommodates isotropic, anisotropic and inhomogeneous materials by simply changing the diffusion coefficients.

  9. Current target acquisition methodology in force on force simulations

    NASA Astrophysics Data System (ADS)

    Hixson, Jonathan G.; Miller, Brian; Mazz, John P.

    2017-05-01

    The U.S. Army RDECOM CERDEC NVESD MSD's target acquisition models have been used for many years by the military community in force-on-force simulations for training, testing, and analysis. There have been significant improvements to these models over the past few years, notably the transition to the ACQUIRE TTP-TAS (ACQUIRE Targeting Task Performance, Target Angular Size) methodology for all imaging sensors and the development of new discrimination criteria for urban environments and humans. This paper provides an overview of the current target acquisition modeling approach and supplies data for the new discrimination tasks. It discusses advances and changes to the models and methodologies used to: (1) design and compare sensor performance, (2) predict expected target acquisition performance in the field, (3) predict target acquisition performance for combat simulations, and (4) conduct model data validation for combat simulations.

  10. Ensemble modeling of stochastic unsteady open-channel flow in terms of its time-space evolutionary probability distribution - Part 2: numerical application

    NASA Astrophysics Data System (ADS)

    Dib, Alain; Kavvas, M. Levent

    2018-03-01

    The characteristic form of the Saint-Venant equations is solved in a stochastic setting by using a newly proposed Fokker-Planck Equation (FPE) methodology. This methodology computes the ensemble behavior and variability of the unsteady flow in open channels by directly solving for the flow variables' time-space evolutionary probability distribution. The new methodology is tested on a stochastic unsteady open-channel flow problem, with an uncertainty arising from the channel's roughness coefficient. The computed statistical descriptions of the flow variables are compared to the results obtained through Monte Carlo (MC) simulations in order to evaluate the performance of the FPE methodology. The comparisons show that the proposed methodology can adequately predict the results of the considered stochastic flow problem, including the ensemble averages, variances, and probability density functions in time and space. Unlike the large number of simulations performed by the MC approach, only one simulation is required by the FPE methodology. Moreover, the total computational time of the FPE methodology is smaller than that of the MC approach, which could prove to be a particularly crucial advantage in systems with a large number of uncertain parameters. As such, the results obtained in this study indicate that the proposed FPE methodology is a powerful and time-efficient approach for predicting the ensemble average and variance behavior, in both space and time, for an open-channel flow process under an uncertain roughness coefficient.
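    A minimal Monte Carlo counterpart to the comparison above: sample the uncertain roughness coefficient, push each sample through a toy steady uniform-flow relation (Manning's equation), and form ensemble statistics of the flow velocity. The hydraulic radius, slope, and roughness range are invented for illustration; the actual study solves the unsteady Saint-Venant equations.

```python
import random
import statistics

def manning_velocity(n, R=2.0, S=0.001):
    """Manning's equation V = (1/n) * R**(2/3) * S**(1/2) (SI units)."""
    return (1.0 / n) * R ** (2.0 / 3.0) * S ** 0.5

rng = random.Random(0)
# uncertain roughness: uniform over an assumed plausible range
samples = [manning_velocity(rng.uniform(0.02, 0.04)) for _ in range(10_000)]
mean_v = statistics.fmean(samples)        # ensemble average velocity
var_v = statistics.pvariance(samples)     # ensemble variance
print(round(mean_v, 2), round(var_v, 3))
```

    The MC route needs many such samples per output statistic; the appeal of the FPE methodology described above is that one deterministic solve yields the full evolutionary probability distribution directly.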

  11. Passenger rail vehicle safety assessment methodology. Volume I, Summary of safe performance limits.

    DOT National Transportation Integrated Search

    2000-04-01

    This report presents a methodology based on computer simulation that assesses the safe dynamic performance limits of commuter passenger vehicles. The methodology consists of determining the critical design parameters and characteristic properties of bo...

  12. Design Of Combined Stochastic Feedforward/Feedback Control

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim

    1989-01-01

    Methodology accommodates variety of control structures and design techniques. In methodology for combined stochastic feedforward/feedback control, main objectives of feedforward and feedback control laws seen clearly. Inclusion of error-integral feedback, dynamic compensation, rate-command control structure, and the like is integral element of methodology. Another advantage of methodology is flexibility to develop variety of techniques for design of feedback controller with arbitrary structure: stochastic output feedback, multiconfiguration control, decentralized control, or frequency-domain and classical control methods. Control modes of system include capture and tracking of localizer and glideslope, crab, decrab, and flare. By use of recommended incremental implementation, control laws simulated on digital computer and connected with nonlinear digital simulation of aircraft and its systems.

  13. Development of Constraint Force Equation Methodology for Application to Multi-Body Dynamics Including Launch Vehicle Stage Separation

    NASA Technical Reports Server (NTRS)

    Pamadi, Bandu N.; Toniolo, Matthew D.; Tartabini, Paul V.; Roithmayr, Carlos M.; Albertson, Cindy W.; Karlgaard, Christopher D.

    2016-01-01

    The objective of this report is to develop and implement a physics based method for analysis and simulation of multi-body dynamics including launch vehicle stage separation. The constraint force equation (CFE) methodology discussed in this report provides such a framework for modeling constraint forces and moments acting at joints when the vehicles are still connected. Several stand-alone test cases involving various types of joints were developed to validate the CFE methodology. The results were compared with ADAMS(Registered Trademark) and Autolev, two different industry standard benchmark codes for multi-body dynamic analysis and simulations. However, these two codes are not designed for aerospace flight trajectory simulations. After this validation exercise, the CFE algorithm was implemented in Program to Optimize Simulated Trajectories II (POST2) to provide a capability to simulate end-to-end trajectories of launch vehicles including stage separation. The POST2/CFE methodology was applied to the STS-1 Space Shuttle solid rocket booster (SRB) separation and Hyper-X Research Vehicle (HXRV) separation from the Pegasus booster as a further test and validation for its application to launch vehicle stage separation problems. Finally, to demonstrate end-to-end simulation capability, POST2/CFE was applied to the ascent, orbit insertion, and booster return of a reusable two-stage-to-orbit (TSTO) vehicle concept. With these validation exercises, POST2/CFE software can be used for performing conceptual level end-to-end simulations, including launch vehicle stage separation, for problems similar to those discussed in this report.

  14. The Contribution of Human Factors in Military System Development: Methodological Considerations

    DTIC Science & Technology

    1980-07-01

    Indexed topics include risk/uncertainty analysis, project scoring, utility scales, relevance tree techniques (reverse factor analysis), and computer simulation. Descriptors include proficiency test scores (written), aircrew performance in radiation environments, and reaction time.

  15. Evaluation and modeling of autonomous attitude thrust control for the Geostationary Operational Environmental Satellite (GOES)-8 orbit determination

    NASA Technical Reports Server (NTRS)

    Forcey, W.; Minnie, C. R.; Defazio, R. L.

    1995-01-01

    The Geostationary Operational Environmental Satellite (GOES)-8 experienced a series of orbital perturbations from autonomous attitude control thrusting before perigee raising maneuvers. These perturbations influenced differential correction orbital state solutions determined by the Goddard Space Flight Center (GSFC) Goddard Trajectory Determination System (GTDS). The maneuvers induced significant variations in the converged state vector for solutions using increasingly longer tracking data spans. These solutions were used for planning perigee maneuvers as well as initial estimates for orbit solutions used to evaluate the effectiveness of the perigee raising maneuvers. This paper discusses models for the incorporation of attitude thrust effects into the orbit determination process. Results from definitive attitude solutions are modeled as impulsive thrusts in orbit determination solutions created for GOES-8 mission support. Due to the attitude orientation of GOES-8, analysis results are presented that attempt to absorb the effects of attitude thrusting by including a solution for the coefficient of reflectivity, C(R). Models to represent the attitude maneuvers are tested against orbit determination solutions generated during real-time support of the GOES-8 mission. The modeling techniques discussed in this investigation offer benefits to the remaining missions in the GOES NEXT series. Similar missions with large autonomous attitude control thrusting, such as the Solar and Heliospheric Observatory (SOHO) spacecraft and the INTELSAT series, may also benefit from these results.

  16. Project Argo: The design and analysis of an all-propulsive and an aeroassisted version of a manned space transportation vehicle

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Project Argo is the design of a manned Space Transportation Vehicle (STV) that would transport payloads between LEO (altitude between 278 and 500 km above the Earth) and GEO (altitude approximately 35,800 km above the Earth) and would be refueled and refurbished at the Space Station Freedom. Argo would be man's first space-based manned vehicle and would provide a crucial link to geosynchronous orbit, where the vast majority of satellites are located. The vehicle could be built and launched shortly after the space station and give invaluable space experience while serving as a workhorse to deliver and repair satellites. Eventually, if a manned space station is established in GEO, then Argo could serve as the transport between the Space Station Freedom and a Geostation. If necessary, modifications could be made to allow the vehicle to reach the moon or possibly Mars. Project Argo is unique in that it consists of the design and comparison of two different concepts to accomplish the same mission. The first is an all-propulsive vehicle which uses chemical propulsion for all of its major maneuvers between LEO and GEO. The second is a vehicle that uses aeroassisted braking during its return from GEO to LEO by passing through the upper portions of the atmosphere.

  17. Solution to the indexing problem of frequency domain simulation experiments

    NASA Technical Reports Server (NTRS)

    Mitra, Mousumi; Park, Stephen K.

    1991-01-01

    A frequency domain simulation experiment is one in which selected system parameters are oscillated sinusoidally to induce oscillations in one or more system statistics of interest. A spectral (Fourier) analysis of these induced oscillations is then performed. To perform this spectral analysis, all oscillation frequencies must be referenced to a common, independent variable - an oscillation index. In a discrete-event simulation, the global simulation clock is the most natural choice for the oscillation index. However, past efforts to reference all frequencies to the simulation clock generally yielded unsatisfactory results. The reason for these unsatisfactory results is explained in this paper and a new methodology which uses the simulation clock as the oscillation index is presented. Techniques for implementing this new methodology are demonstrated by performing a frequency domain simulation experiment for a network of queues.
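    The core idea, referencing all oscillation frequencies to the simulation clock as a common index, can be illustrated with a toy system: a parameter is oscillated sinusoidally in clock time, the output statistic is sampled on the same clock, and a DFT of the induced oscillation shows power concentrated at the driving frequency. The system response and all numbers here are assumptions, not the paper's queueing-network experiment.

```python
import cmath
import math

T = 1024                       # simulation-clock samples (oscillation index)
f_drive = 8                    # driving frequency, cycles per run
output = []
for t in range(T):
    # oscillate the system parameter sinusoidally in clock time
    theta = 1.0 + 0.5 * math.sin(2 * math.pi * f_drive * t / T)
    output.append(theta ** 2)  # toy response statistic of the "system"

def dft_power(x, k):
    """Power of the DFT of x at integer frequency bin k."""
    n = len(x)
    coeff = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
    return abs(coeff) ** 2 / n

# the induced oscillation shows up at the driving frequency bin
powers = {k: dft_power(output, k) for k in (f_drive - 1, f_drive, f_drive + 1)}
print(max(powers, key=powers.get))   # → 8
```

    Because the driving frequency is an integer number of cycles over the clock span, the induced power falls exactly in bin `f_drive`; in a discrete-event simulation the difficulty addressed by the paper is achieving this alignment when events are irregularly spaced in clock time.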

  18. System Dynamics Modeling for Proactive Intelligence

    DTIC Science & Technology

    2010-01-01

    Contents include modeling resources as part of an integrated multi-methodology system, formalizing proactive observable data with and without simulation analysis, a summary of the probe methodology and results, and an overview of the methodology.

  19. Radioactive waste disposal fees-Methodology for calculation

    NASA Astrophysics Data System (ADS)

    Bemš, Július; Králík, Tomáš; Kubančák, Ján; Vašíček, Jiří; Starý, Oldřich

    2014-11-01

    This paper summarizes the methodological approach used to calculate the fee for low- and intermediate-level radioactive waste disposal and for spent fuel disposal. The methodology itself is based on a simulation of the cash flows related to the operation of the waste disposal system. The paper includes a demonstration of the methodology applied to the conditions of the Czech Republic.

  20. Automated procedure for developing hybrid computer simulations of turbofan engines. Part 1: General description

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Krosel, S. M.; Bruton, W. M.

    1982-01-01

    A systematic, computer-aided, self-documenting methodology for developing hybrid computer simulations of turbofan engines is presented. The methodology presented makes use of a host program that can run on a large digital computer and a machine-dependent target (hybrid) program. The host program performs all the calculations and data manipulations needed to transform user-supplied engine design information into a form suitable for the hybrid computer, and also trims the self-contained engine model to match specified design-point information. Part I contains a general discussion of the methodology, describes a test case, and presents comparisons between the hybrid simulation and specified engine performance data. Part II, a companion document, contains documentation, in the form of computer printouts, for the test case.

  1. Advanced Methodology for Simulation of Complex Flows Using Structured Grid Systems

    NASA Technical Reports Server (NTRS)

    Steinthorsson, Erlendur; Modiano, David

    1995-01-01

    Detailed simulations of viscous flows in complicated geometries pose a significant challenge to current capabilities of Computational Fluid Dynamics (CFD). To enable routine application of CFD to this class of problems, advanced methodologies are required that employ (a) automated grid generation, (b) adaptivity, (c) accurate discretizations and efficient solvers, and (d) advanced software techniques. Each of these ingredients contributes to increased accuracy, efficiency (in terms of human effort and computer time), and/or reliability of CFD software. In the long run, methodologies employing structured grid systems will remain a viable choice for routine simulation of flows in complex geometries only if genuinely automatic grid generation techniques for structured grids can be developed and if adaptivity is employed more routinely. More research in both these areas is urgently needed.

  2. SIMSAT: An object oriented architecture for real-time satellite simulation

    NASA Technical Reports Server (NTRS)

    Williams, Adam P.

    1993-01-01

    Real-time satellite simulators are vital tools in the support of satellite missions. They are used in the testing of ground control systems, the training of operators, the validation of operational procedures, and the development of contingency plans. The simulators must provide high-fidelity modeling of the satellite, which requires detailed system information, much of which is not available until relatively near launch. The short time-scales and resulting high productivity required of such simulator developments culminates in the need for a reusable infrastructure which can be used as a basis for each simulator. This paper describes a major new simulation infrastructure package, the Software Infrastructure for Modelling Satellites (SIMSAT). It outlines the object oriented design methodology used, describes the resulting design, and discusses the advantages and disadvantages experienced in applying the methodology.

  3. Computational Evolutionary Methodology for Knowledge Discovery and Forecasting in Epidemiology and Medicine

    NASA Astrophysics Data System (ADS)

    Rao, Dhananjai M.; Chernyakhovsky, Alexander; Rao, Victoria

    2008-05-01

    Humanity is facing an increasing number of highly virulent and communicable diseases such as avian influenza. Researchers believe that avian influenza has potential to evolve into one of the deadliest pandemics. Combating these diseases requires in-depth knowledge of their epidemiology. An effective methodology for discovering epidemiological knowledge is to utilize a descriptive, evolutionary, ecological model and use bio-simulations to study and analyze it. These types of bio-simulations fall under the category of computational evolutionary methods because the individual entities participating in the simulation are permitted to evolve in a natural manner by reacting to changes in the simulated ecosystem. This work describes the application of the aforementioned methodology to discover epidemiological knowledge about avian influenza using a novel eco-modeling and bio-simulation environment called SEARUMS. The mathematical principles underlying SEARUMS, its design, and the procedure for using SEARUMS are discussed. The bio-simulations and multi-faceted case studies conducted using SEARUMS elucidate its ability to pinpoint timelines, epicenters, and socio-economic impacts of avian influenza. This knowledge is invaluable for proactive deployment of countermeasures in order to minimize negative socioeconomic impacts, combat the disease, and avert a pandemic.

  4. Computational Evolutionary Methodology for Knowledge Discovery and Forecasting in Epidemiology and Medicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Dhananjai M.; Chernyakhovsky, Alexander; Rao, Victoria

    2008-05-08

    Humanity is facing an increasing number of highly virulent and communicable diseases such as avian influenza. Researchers believe that avian influenza has potential to evolve into one of the deadliest pandemics. Combating these diseases requires in-depth knowledge of their epidemiology. An effective methodology for discovering epidemiological knowledge is to utilize a descriptive, evolutionary, ecological model and use bio-simulations to study and analyze it. These types of bio-simulations fall under the category of computational evolutionary methods because the individual entities participating in the simulation are permitted to evolve in a natural manner by reacting to changes in the simulated ecosystem. This work describes the application of the aforementioned methodology to discover epidemiological knowledge about avian influenza using a novel eco-modeling and bio-simulation environment called SEARUMS. The mathematical principles underlying SEARUMS, its design, and the procedure for using SEARUMS are discussed. The bio-simulations and multi-faceted case studies conducted using SEARUMS elucidate its ability to pinpoint timelines, epicenters, and socio-economic impacts of avian influenza. This knowledge is invaluable for proactive deployment of countermeasures in order to minimize negative socioeconomic impacts, combat the disease, and avert a pandemic.

  5. Design for dependability: A simulation-based approach. Ph.D. Thesis, 1993

    NASA Technical Reports Server (NTRS)

    Goswami, Kumar K.

    1994-01-01

    This research addresses issues in simulation-based system level dependability analysis of fault-tolerant computer systems. The issues and difficulties of providing a general simulation-based approach for system level analysis are discussed, and a methodology that addresses and tackles these issues is presented. The proposed methodology is designed to permit the study of a wide variety of architectures under various fault conditions. It permits detailed functional modeling of architectural features such as sparing policies, repair schemes, routing algorithms, and other fault-tolerant mechanisms, and it allows the execution of actual application software. One key benefit of this approach is that the behavior of a system under faults does not have to be pre-defined, as is normally done. Instead, a system can be simulated in detail and injected with faults to determine its failure modes. The thesis describes how object-oriented design is used to incorporate this methodology into a general purpose design and fault injection package called DEPEND. A software model is presented that uses abstractions of application programs to study the behavior and effect of software on hardware faults in the early design stage, when actual code is not available. Finally, an acceleration technique that combines hierarchical simulation, time acceleration algorithms and hybrid simulation to reduce simulation time is introduced.

  6. Some Dimensions of Simulation.

    ERIC Educational Resources Information Center

    Beck, Isabel; Monroe, Bruce

    Beginning with definitions of "simulation" (a methodology for testing alternative decisions under hypothetical conditions), this paper focuses on the use of simulation as an instructional method, pointing out the relationships and differences between role playing, games, and simulation. The term "simulation games" is explored with an analysis of…

  7. Theory of mind and Verstehen (understanding) methodology.

    PubMed

    Kumazaki, Tsutomu

    2016-09-01

    Theory of mind is a prominent, but highly controversial, field in psychology, psychiatry, and philosophy of mind. Simulation theory, theory-theory and other views have been presented in recent decades, none of which are monolithic. In this article, various views on theory of mind are reviewed, and methodological problems within each view are investigated. The relationship between simulation theory and Verstehen (understanding) methodology in traditional human sciences is an intriguing issue, although the latter is not a direct ancestor of the former. From that perspective, lessons for current clinical psychiatry are drawn. © The Author(s) 2016.

  8. Application of CFE/POST2 for Simulation of Launch Vehicle Stage Separation

    NASA Technical Reports Server (NTRS)

    Pamadi, Bandu N.; Tartabini, Paul V.; Toniolo, Matthew D.; Roithmayr, Carlos M.; Karlgaard, Christopher D.; Samareh, Jamshid A.

    2009-01-01

    The constraint force equation (CFE) methodology provides a framework for modeling constraint forces and moments acting at joints that connect multiple vehicles. With implementation in Program to Optimize Simulated Trajectories II (POST 2), the CFE provides a capability to simulate end-to-end trajectories of launch vehicles, including stage separation. In this paper, the CFE/POST2 methodology is applied to the Shuttle-SRB separation problem as a test and validation case. The CFE/POST2 results are compared with STS-1 flight test data.

  9. Using scan statistics for congenital anomalies surveillance: the EUROCAT methodology.

    PubMed

    Teljeur, Conor; Kelly, Alan; Loane, Maria; Densem, James; Dolk, Helen

    2015-11-01

    Scan statistics have been used extensively to identify temporal clusters of health events. We describe the temporal cluster detection methodology adopted by the EUROCAT (European Surveillance of Congenital Anomalies) monitoring system. Since 2001, EUROCAT has implemented a variable window width scan statistic for detecting unusual temporal aggregations of congenital anomaly cases. The scan windows are based on numbers of cases rather than being defined by time. The methodology is embedded in the EUROCAT Central Database for annual application to centrally held registry data. The methodology was incrementally adapted to improve its utility and to address statistical issues. Simulation exercises were used to determine the power of the methodology to identify periods of raised risk (of 1-18 months). In order to operationalize the scan methodology, a number of adaptations were needed, including: estimating date of conception as the unit of time; deciding the maximum length (in time) and recency of clusters of interest; reporting of multiple and overlapping significant clusters; replacing the Monte Carlo simulation with a lookup table to reduce computation time; and placing a threshold on underlying population change and estimating the false positive rate by simulation. Exploration of power found that raised risk periods lasting 1 month are unlikely to be detected except when the relative risk and case counts are high. The variable window width scan statistic is a useful tool for the surveillance of congenital anomalies. Numerous adaptations have improved the utility of the original methodology in the context of temporal cluster detection in congenital anomalies.
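
    The number-of-cases scan window described above can be sketched in a few lines. This is a minimal illustration under assumed inputs (case dates on a numeric time axis, a fixed window of w cases, a uniform null hypothesis), not the EUROCAT implementation, which among other adaptations replaces the Monte Carlo step with a lookup table:

```python
import random

def min_span(times, w):
    """Shortest time span containing w consecutive cases."""
    ts = sorted(times)
    return min(ts[i + w - 1] - ts[i] for i in range(len(ts) - w + 1))

def scan_p_value(times, w, n_sim=2000, seed=0):
    """Monte Carlo p-value: how often does a uniform process over the
    same observation period produce an equally tight cluster of w cases?"""
    rng = random.Random(seed)
    obs = min_span(times, w)
    t0, t1 = min(times), max(times)
    hits = 0
    for _ in range(n_sim):
        sim = [rng.uniform(t0, t1) for _ in range(len(times))]
        if min_span(sim, w) <= obs:
            hits += 1
    return hits / n_sim
```

A tight group of 5 cases within 0.4 time units over a 100-unit period, for example, yields a very small p-value, flagging a candidate cluster for epidemiological review.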

  10. Methodological approach to simulation and choice of ecologically efficient and energetically economic wind turbines (WT)

    NASA Astrophysics Data System (ADS)

    Bespalov, Vadim; Udina, Natalya; Samarskaya, Natalya

    2017-10-01

    Use of wind energy is related to one of the prospective directions among renewed energy sources. A methodological approach is reviewed in the article to simulation and choice of ecologically efficient and energetically economic wind turbines on the designing stage taking into account characteristics of natural-territorial complex and peculiarities of anthropogenic load in the territory of WT location.

  11. Explosion/Blast Dynamics for Constellation Launch Vehicles Assessment

    NASA Technical Reports Server (NTRS)

    Baer, Mel; Crawford, Dave; Hickox, Charles; Kipp, Marlin; Hertel, Gene; Morgan, Hal; Ratzel, Arthur; Cragg, Clinton H.

    2009-01-01

    An assessment methodology is developed to guide quantitative predictions of adverse physical environments and the subsequent effects on the Ares-1 crew launch vehicle associated with the loss of containment of cryogenic liquid propellants from the upper stage during ascent. Development of the methodology is led by a team at Sandia National Laboratories (SNL) with guidance and support from a number of National Aeronautics and Space Administration (NASA) personnel. The methodology is based on the current Ares-1 design and feasible accident scenarios. These scenarios address containment failure from debris impact or structural response to pressure or blast loading from an external source. Once containment is breached, the envisioned assessment methodology includes predictions for the sequence of physical processes stemming from cryogenic tank failure. The investigative techniques, analysis paths, and numerical simulations that comprise the proposed methodology are summarized and appropriate simulation software is identified in this report.

  12. Determining the strengths of HCP slip systems using harmonic analyses of lattice strain distributions

    DOE PAGES

    Dawson, Paul R.; Boyce, Donald E.; Park, Jun-Sang; ...

    2017-10-15

    A robust methodology is presented to extract slip system strengths from lattice strain distributions for polycrystalline samples obtained from high-energy x-ray diffraction (HEXD) experiments with in situ loading. The methodology consists of matching the evolution of coefficients of a harmonic expansion of the distributions from simulation to the coefficients derived from measurements. Simulation results are generated via finite element simulations of virtual polycrystals that are subjected to the loading history applied in the HEXD experiments. Advantages of the methodology include: (1) its ability to utilize extensive data sets generated by HEXD experiments; (2) its ability to capture trends in distributions that may be noisy (both measured and simulated); and (3) its sensitivity to the ratios of the family strengths. The approach is used to evaluate the slip system strengths of Ti-6Al-4V using samples having relatively equiaxed grains. These strength estimates are compared to values in the literature.

  13. Recent advances in computational methodology for simulation of mechanical circulatory assist devices

    PubMed Central

    Marsden, Alison L.; Bazilevs, Yuri; Long, Christopher C.; Behr, Marek

    2014-01-01

    Ventricular assist devices (VADs) provide mechanical circulatory support to offload the work of one or both ventricles during heart failure. They are used in the clinical setting as destination therapy, as bridge to transplant, or more recently as bridge to recovery to allow for myocardial remodeling. Recent developments in computational simulation allow for detailed assessment of VAD hemodynamics for device design and optimization for both children and adults. Here, we provide a focused review of the recent literature on finite element methods and optimization for VAD simulations. As VAD designs typically fall into two categories, pulsatile and continuous flow devices, we separately address computational challenges of both types of designs, and the interaction with the circulatory system with three representative case studies. In particular, we focus on recent advancements in finite element methodology that has increased the fidelity of VAD simulations. We outline key challenges, which extend to the incorporation of biological response such as thrombosis and hemolysis, as well as shape optimization methods and challenges in computational methodology. PMID:24449607

  14. A methodology for thermodynamic simulation of high temperature, internal reforming fuel cell systems

    NASA Astrophysics Data System (ADS)

    Matelli, José Alexandre; Bazzo, Edson

    This work presents a methodology for simulation of fuel cells to be used in power production in small on-site power/cogeneration plants that use natural gas as fuel. The methodology contemplates thermodynamic and electrochemical aspects related to molten carbonate and solid oxide fuel cells (MCFC and SOFC, respectively). Internal steam reforming of the natural gas hydrocarbons is considered for hydrogen production. From inputs such as cell potential, cell power, number of cells in the stack, ancillary systems power consumption, reformed natural gas composition and hydrogen utilization factor, the simulation gives the natural gas consumption, anode and cathode stream gas temperatures and compositions, and thermodynamic, electrochemical and practical efficiencies. Both energetic and exergetic methods are considered for performance analysis. The results obtained from the natural gas reforming thermodynamics simulation show that hydrogen production is maximum around 700 °C, for a steam/carbon ratio equal to 3. Consistent with the literature, the results indicate that the SOFC is more efficient than the MCFC.
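
    The input-to-output bookkeeping described above can be illustrated with a toy stack energy balance. The function names, the Faraday-law hydrogen feed, and the default utilization factor are illustrative assumptions, not values taken from the paper:

```python
def h2_consumption(current, n_cells, utilization=0.8):
    """Hydrogen feed (mol/s) for a given stack current, from Faraday's
    law (2 electrons per H2 molecule), scaled by the utilization factor."""
    F = 96485.0  # Faraday constant, C/mol
    return current * n_cells / (2.0 * F) / utilization

def stack_efficiency(cell_voltage, current, n_cells, fuel_flow_mol_s,
                     ancillary_kw=0.0, lhv_h2=241.8e3):
    """First-law efficiency: net electric power divided by the
    lower-heating-value rate of the hydrogen fed (LHV in J/mol)."""
    p_net = cell_voltage * current * n_cells - ancillary_kw * 1e3
    return p_net / (fuel_flow_mol_s * lhv_h2)
```

For a cell at 0.7 V this reduces to roughly V / 1.25, i.e. about 56% of the hydrogen LHV, before reforming and ancillary losses.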

  15. Euler-euler anisotropic gaussian mesoscale simulation of homogeneous cluster-induced gas-particle turbulence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kong, Bo; Fox, Rodney O.; Feng, Heng

    An Euler–Euler anisotropic Gaussian approach (EE-AG) for simulating gas–particle flows, in which particle velocities are assumed to follow a multivariate anisotropic Gaussian distribution, is used to perform mesoscale simulations of homogeneous cluster-induced turbulence (CIT). A three-dimensional Gauss–Hermite quadrature formulation is used to calculate the kinetic flux for 10 velocity moments in a finite-volume framework. The particle-phase volume-fraction and momentum equations are coupled with the Eulerian solver for the gas phase. This approach is implemented in an open-source CFD package, OpenFOAM, and detailed simulation results are compared with previous Euler–Lagrange simulations in a domain size study of CIT. Here, these results demonstrate that the proposed EE-AG methodology is able to produce comparable results to EL simulations, and this moment-based methodology can be used to perform accurate mesoscale simulations of dilute gas–particle flows.

  16. Euler-euler anisotropic gaussian mesoscale simulation of homogeneous cluster-induced gas-particle turbulence

    DOE PAGES

    Kong, Bo; Fox, Rodney O.; Feng, Heng; ...

    2017-02-16

    An Euler–Euler anisotropic Gaussian approach (EE-AG) for simulating gas–particle flows, in which particle velocities are assumed to follow a multivariate anisotropic Gaussian distribution, is used to perform mesoscale simulations of homogeneous cluster-induced turbulence (CIT). A three-dimensional Gauss–Hermite quadrature formulation is used to calculate the kinetic flux for 10 velocity moments in a finite-volume framework. The particle-phase volume-fraction and momentum equations are coupled with the Eulerian solver for the gas phase. This approach is implemented in an open-source CFD package, OpenFOAM, and detailed simulation results are compared with previous Euler–Lagrange simulations in a domain size study of CIT. Here, these results demonstrate that the proposed EE-AG methodology is able to produce comparable results to EL simulations, and this moment-based methodology can be used to perform accurate mesoscale simulations of dilute gas–particle flows.

  17. pysimm: A Python Package for Simulation of Molecular Systems

    NASA Astrophysics Data System (ADS)

    Fortunato, Michael; Colina, Coray

    pysimm, short for python simulation interface for molecular modeling, is a python package designed to facilitate the structure generation and simulation of molecular systems through convenient and programmatic access to object-oriented representations of molecular system data. This poster presents core features of pysimm and design philosophies that highlight a generalized methodology for incorporation of third-party software packages through API interfaces. The integration with the LAMMPS simulation package is explained to demonstrate this methodology. pysimm began as a back-end python library that powered a cloud-based application on nanohub.org for amorphous polymer simulation. The extension from a specific application library to a general-purpose simulation interface is explained. Additionally, this poster highlights the rapid development of new applications to construct polymer chains capable of controlling chain characteristics such as molecular weight distribution and monomer composition.

  18. A Methodology for Validating Safety Heuristics Using Clinical Simulations: Identifying and Preventing Possible Technology-Induced Errors Related to Using Health Information Systems

    PubMed Central

    Borycki, Elizabeth; Kushniruk, Andre; Carvalho, Christopher

    2013-01-01

    Internationally, health information systems (HIS) safety has emerged as a significant concern for governments. Recently, research has emerged that has documented the ability of HIS to be implicated in the harm and death of patients. Researchers have attempted to develop methods that can be used to prevent or reduce technology-induced errors. Some researchers are developing methods that can be employed prior to systems release. These methods include the development of safety heuristics and clinical simulations. In this paper, we outline our methodology for developing safety heuristics specific to identifying the features or functions of a HIS user interface design that may lead to technology-induced errors. We follow this with a description of a methodological approach to validate these heuristics using clinical simulations. PMID:23606902

  19. An automated methodology development. [software design for combat simulation

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  20. Assessing the Impact of Clothing and Individual Equipment (CIE) on Soldier Physical, Biomechanical, and Cognitive Performance Part 1: Test Methodology

    DTIC Science & Technology

    2018-02-01

    …during Soldier Equipment Configuration Impact on Performance: Establishing a Test Methodology for the…Performance of Medium Rucksack Prototypes. An investigation: comparison of live-fire and weapon simulator test methodologies and…of three extremity armor…

  1. ESP v1.0: Methodology for Exploring Emission Impacts of Future Scenarios in the United States

    EPA Science Inventory

    This article presents a methodology for creating anthropogenic emission inventories that can be used to simulate future regional air quality. The Emission Scenario Projection (ESP) methodology focuses on energy production and use, the principal sources of many air pollutants. Emi...

  2. Three-Dimensional Finite Element Ablative Thermal Response and Thermostructural Design of Thermal Protection Systems

    NASA Technical Reports Server (NTRS)

    Dec, John A.; Braun, Robert D.

    2011-01-01

    A finite element ablation and thermal response program is presented for simulation of three-dimensional transient thermostructural analysis. The three-dimensional governing differential equations and finite element formulation are summarized. A novel probabilistic design methodology for thermal protection systems is presented. The design methodology is an eight-step process beginning with a parameter sensitivity study, followed by a deterministic analysis whereby an optimum design can be determined. The design process concludes with a Monte Carlo simulation where the probabilities of exceeding design specifications are estimated. The design methodology is demonstrated by applying it to the carbon phenolic compression pads of the Crew Exploration Vehicle. The maximum allowed values of bondline temperature and tensile stress are used as the design specifications in this study.
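
    The concluding Monte Carlo step, estimating the probability of exceeding a design specification, can be sketched as follows. The surrogate bondline-temperature model and all numbers here are hypothetical stand-ins for the finite element analysis:

```python
import random

def bondline_temp(heat_flux, thickness):
    """Hypothetical surrogate for the thermal response model: bondline
    temperature (K) as a simple function of two uncertain inputs."""
    return 300.0 + 0.8 * heat_flux / thickness

def prob_exceedance(limit, n=10000, seed=1):
    """Monte Carlo estimate of P(bondline temperature > limit)."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n):
        q = rng.gauss(100.0, 10.0)   # uncertain heat flux (illustrative units)
        t = rng.gauss(0.5, 0.05)     # uncertain ablator thickness
        if bondline_temp(q, t) > limit:
            exceed += 1
    return exceed / n
```

Sampling the uncertain inputs and counting exceedances turns a deterministic analysis into an estimated probability of violating the specification.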

  3. Validating agent oriented methodology (AOM) for netlogo modelling and simulation

    NASA Astrophysics Data System (ADS)

    WaiShiang, Cheah; Nissom, Shane; YeeWai, Sim; Sharbini, Hamizan

    2017-10-01

    AOM (Agent Oriented Modeling) is a comprehensive and unified agent methodology for agent oriented software development. The AOM methodology was proposed to aid developers with the introduction of techniques, terminology, notation and guidelines during agent system development. Although the AOM methodology is claimed to be capable of developing complex real-world systems, its potential is yet to be realized and recognized by the mainstream software community, and the adoption of AOM is still in its infancy. Among the reasons is that there are few case studies or success stories for AOM. This paper presents two case studies on the adoption of AOM for individual-based modelling and simulation. It demonstrates how AOM is useful for epidemiology and ecology studies. Hence, it further validates AOM in a qualitative manner.

  4. BSM2 Plant-Wide Model construction and comparative analysis with other methodologies for integrated modelling.

    PubMed

    Grau, P; Vanrolleghem, P; Ayesa, E

    2007-01-01

    In this paper, a new methodology for integrated modelling of WWTPs has been used for the construction of the Benchmark Simulation Model No. 2 (BSM2). The transformations approach proposed in this methodology does not require the development of specific transformers to interface unit process models, and it allows the construction of tailored models for a particular WWTP while guaranteeing mass and charge continuity for the whole model. The BSM2 PWM, constructed as a case study, is evaluated by means of simulations under different scenarios, and its validity in reproducing the water and sludge lines of a WWTP is demonstrated. Furthermore, the advantages that this methodology presents compared to other approaches for integrated modelling are verified in terms of flexibility and coherence.

  5. CAGE IIIA Distributed Simulation Design Methodology

    DTIC Science & Technology

    2014-05-01

    …Implementing Defence Experimentation (GUIDEx). The key challenges for this methodology are understanding how to design it, define the…operation, and make it available in the other nation's simulations. The challenge for the CAGE campaign of experiments is to continue to build upon this…

  6. A new method for qualitative simulation of water resources systems: 1. Theory

    NASA Astrophysics Data System (ADS)

    Camara, A. S.; Pinheiro, M.; Antunes, M. P.; Seixas, M. J.

    1987-11-01

    A new dynamic modeling methodology, SLIN (Simulação Linguistica), allowing for the analysis of systems defined by linguistic variables, is presented. SLIN applies a set of logical rules, avoiding fuzzy-theoretic concepts. Logical rules are likewise used to make the transition from qualitative to quantitative modes. Extensions of the methodology to simulation-optimization applications and multiexpert system modeling are also discussed.
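
    A rule-based qualitative update over an ordered linguistic scale, in the spirit of (but not taken from) SLIN, might look like the sketch below. The water-quality rule table is a hypothetical example of a system defined by linguistic variables:

```python
# Ordered linguistic scale for the state variable.
LEVELS = ["low", "medium", "high"]

def step_quality(quality, discharge):
    """One qualitative update: water quality degrades one level when
    pollutant discharge is high, and recovers one level when it is low."""
    i = LEVELS.index(quality)
    if discharge == "high":
        i = max(0, i - 1)
    elif discharge == "low":
        i = min(len(LEVELS) - 1, i + 1)
    return LEVELS[i]

def simulate(quality, discharges):
    """Trace the linguistic state through a sequence of inputs."""
    states = [quality]
    for d in discharges:
        quality = step_quality(quality, d)
        states.append(quality)
    return states
```

The entire dynamics is a lookup over ordered labels, so no fuzzy membership functions are needed, only logical rules on the scale.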

  7. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    NASA Astrophysics Data System (ADS)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
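
    The empirical hazard-curve construction can be sketched as follows: aggregate the annual rates of all simulated scenarios whose intensity measure exceeds each level, then convert a rate into a probability of occurrence in a time window under a Poisson assumption. This is a simplified illustration of the empirical estimate, not the paper's Bayesian fitting:

```python
import math

def hazard_curve(events, im_levels):
    """Empirical hazard curve: mean annual rate of exceedance of each
    intensity level, from (annual_rate, intensity) scenario pairs."""
    return [sum(rate for rate, im in events if im > level)
            for level in im_levels]

def prob_in_window(rate, years):
    """Probability of at least one exceedance in a time window,
    assuming Poissonian occurrence in time."""
    return 1.0 - math.exp(-rate * years)
```

For example, a scenario set with exceedance rate 0.002/yr at 5 m inundation depth implies roughly a 10% chance of exceedance in 50 years.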

  8. New scoring methodology improves the sensitivity of the Alzheimer's Disease Assessment Scale-Cognitive subscale (ADAS-Cog) in clinical trials.

    PubMed

    Verma, Nishant; Beretvas, S Natasha; Pascual, Belen; Masdeu, Joseph C; Markey, Mia K

    2015-11-12

    As currently used, the Alzheimer's Disease Assessment Scale-Cognitive subscale (ADAS-Cog) has low sensitivity for measuring Alzheimer's disease progression in clinical trials. A major reason behind the low sensitivity is its sub-optimal scoring methodology, which can be improved to obtain better sensitivity. Using item response theory, we developed a new scoring methodology (ADAS-CogIRT) for the ADAS-Cog, which addresses several major limitations of the current scoring methodology. The sensitivity of the ADAS-CogIRT methodology was evaluated using clinical trial simulations as well as a negative clinical trial, which had shown evidence of a treatment effect. The ADAS-Cog was found to measure impairment in three cognitive domains of memory, language, and praxis. The ADAS-CogIRT methodology required significantly fewer patients and shorter trial durations as compared to the current scoring methodology when both were evaluated in simulated clinical trials. When validated on data from a real clinical trial, the ADAS-CogIRT methodology had higher sensitivity than the current scoring methodology in detecting the treatment effect. The proposed scoring methodology significantly improves the sensitivity of the ADAS-Cog in measuring progression of cognitive impairment in clinical trials focused on the mild-to-moderate Alzheimer's disease stage. This provides a boost to the efficiency of clinical trials, requiring fewer patients and shorter durations for investigating disease-modifying treatments.
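
    The item-response-theory idea behind such rescoring can be illustrated with a minimal two-parameter logistic (2PL) sketch. The items, parameters, and grid-search estimator below are illustrative assumptions, not the ADAS-CogIRT model, which handles graded, multi-domain responses:

```python
import math

def p_endorse(theta, a, b):
    """2PL item response: probability of endorsing an item as a function
    of latent trait theta (a = discrimination, b = difficulty)."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def estimate_theta(responses, items):
    """Grid-search maximum-likelihood estimate of the latent trait.
    responses: 0/1 per item; items: list of (a, b) parameter pairs."""
    best, best_ll = 0.0, -float("inf")
    for i in range(-400, 401):          # scan theta over [-4, 4]
        theta = i / 100.0
        ll = 0.0
        for r, (a, b) in zip(responses, items):
            p = p_endorse(theta, a, b)
            ll += math.log(p if r else 1.0 - p)
        if ll > best_ll:
            best, best_ll = theta, ll
    return best
```

Scoring on the latent trait rather than on a raw sum is what lets item-level information (discrimination, difficulty) sharpen the progression estimate.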

  9. Simulation of flashing signal operations.

    DOT National Transportation Integrated Search

    1982-01-01

    Various guidelines that have been proposed for the operation of traffic signals in the flashing mode were reviewed. The use of existing traffic simulation procedures to evaluate flashing signals was examined and a study methodology for simulating and...

  10. Simulation modeling of route guidance concept

    DOT National Transportation Integrated Search

    1997-01-01

    The methodology of a simulation model developed at the University of New South Wales, Australia, for the evaluation of performance of Dynamic Route Guidance Systems (DRGS) is described. The microscopic simulation model adopts the event update simulat...

  11. The Future Impact of Wind on BPA Power System Load Following and Regulation Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarov, Yuri V.; Lu, Shuai; McManus, Bart

    Wind power is growing at a very fast pace as an alternative generating resource. As the ratio of wind power to total system capacity increases, the impact of wind on various system aspects becomes significant. This paper presents a methodology to study the future impact of wind on BPA power system load following and regulation requirements. Existing methodologies for similar analysis include dispatch model simulation and standard deviation evaluation on load and wind data. The methodology proposed in this paper uses historical data and stochastic processes to simulate the load balancing processes in the BPA power system. It mimics the actual power system operations; therefore the results are close to reality, yet a study based on this methodology is convenient to perform. The capacity, ramp rate and ramp duration characteristics are extracted from the simulation results. System load following and regulation capacity requirements are calculated accordingly. The ramp rate and ramp duration data obtained from the analysis can be used to evaluate generator response or maneuverability requirements and regulating units' energy requirements, respectively.
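
    Extracting capacity and ramp-rate requirements from a net-load (load minus wind) time series can be sketched as below; the sampling interval and hourly schedule are assumed inputs, not BPA data:

```python
def ramp_stats(net_load, dt_minutes=5):
    """Largest upward and downward ramp rates (MW/min) observed in a
    net-load series sampled every dt_minutes."""
    ramps = [(b - a) / dt_minutes for a, b in zip(net_load, net_load[1:])]
    return max(ramps), min(ramps)

def capacity_requirement(net_load, schedule):
    """Balancing capacity needed in each direction: the largest
    deviations of net load from the scheduled generation."""
    dev = [x - s for x, s in zip(net_load, schedule)]
    return max(dev), min(dev)
```

Percentiles of the same deviation and ramp distributions, rather than the extremes, would give probabilistic requirements of the kind used in reserve studies.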

  12. A Physics-Based Engineering Methodology for Calculating Soft Error Rates of Bulk CMOS and SiGe Heterojunction Bipolar Transistor Integrated Circuits

    NASA Astrophysics Data System (ADS)

    Fulkerson, David E.

    2010-02-01

    This paper describes a new methodology for characterizing the electrical behavior and soft error rate (SER) of CMOS and SiGe HBT integrated circuits that are struck by ions. A typical engineering design problem is to calculate the SER of a critical path that commonly includes several circuits such as an input buffer, several logic gates, logic storage, clock tree circuitry, and an output buffer. Using multiple 3D TCAD simulations to solve this problem is too costly and time-consuming for general engineering use. The new methodology handles the problem with ease using simple SPICE simulations. The methodology accurately predicts the measured threshold linear energy transfer (LET) of a bulk CMOS SRAM. It solves for circuit currents and voltage spikes that are close to those predicted by expensive 3D TCAD simulations. It accurately predicts the measured event cross-section vs. LET curve of an experimental SiGe HBT flip-flop. The experimental cross section vs. frequency behavior and other subtle effects are also accurately predicted.
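    In SPICE-level SER analysis of this kind, an ion strike is commonly represented as a double-exponential current pulse injected at the struck node. The sketch below is an illustration of that standard pulse shape, not the paper's specific model; the charge and time constants are assumed values.

```python
import math

# Illustrative double-exponential current pulse often used to model an ion
# strike as a SPICE current source (charge and time constants assumed).
def strike_current(t, q=0.5e-12, tau_f=200e-12, tau_r=10e-12):
    """Injected current (A) at time t (s) for total deposited charge q (C)."""
    return q / (tau_f - tau_r) * (math.exp(-t / tau_f) - math.exp(-t / tau_r))

# Sanity check: numerically integrating the pulse recovers the deposited charge.
dt = 0.1e-12
charge = sum(strike_current(i * dt) * dt for i in range(50000))
```

    The deposited charge q would be scaled with the particle's LET and the sensitive collection depth to sweep out a cross-section vs. LET curve.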

  13. A Measurement and Simulation Based Methodology for Cache Performance Modeling and Tuning

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    We present a cache performance modeling methodology that facilitates the tuning of uniprocessor cache performance for applications executing on shared memory multiprocessors by accurately predicting the effects of source code level modifications. Measurements on a single processor are initially used for identifying parts of code where cache utilization improvements may significantly impact the overall performance. Cache simulation based on trace-driven techniques can be carried out without gathering detailed address traces. Minimal runtime information for modeling cache performance of a selected code block includes: base virtual addresses of arrays, virtual addresses of variables, and loop bounds for that code block. The rest of the information is obtained from the source code. We show that the cache performance predictions are as reliable as those obtained through trace-driven simulations. This technique is particularly helpful for exploring various "what-if" scenarios regarding the cache performance impact of alternative code structures. We explain and validate this methodology using a simple matrix-matrix multiplication program. We then apply this methodology to predict and tune the cache performance of two realistic scientific applications taken from the Computational Fluid Dynamics (CFD) domain.
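    The core idea — that a base address, element size, and loop bounds suffice to model cache behavior without a full address trace — can be illustrated with a minimal direct-mapped cache model. All parameters here (line size, cache geometry, the sequential-walk loop) are illustrative assumptions, not the paper's configuration.

```python
# Minimal sketch: count misses for a modeled loop from only a base address,
# element size, and loop bound, using a direct-mapped cache model.
def count_misses(base, n, elem=8, line=64, lines=512):
    tags = [None] * lines            # direct-mapped: one resident block per set
    misses = 0
    for i in range(n):               # modeled loop: sequential array walk
        addr = base + i * elem       # reconstructed address, no trace needed
        block = addr // line
        idx = block % lines
        if tags[idx] != block:       # cold or conflict miss
            tags[idx] = block
            misses += 1
    return misses

print(count_misses(base=0x1000, n=1024))   # 1024 doubles → 128 (one per line)
```

    Stride and blocking variants of the loop could be swapped in to explore the "what-if" scenarios the abstract mentions.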

  14. Wavelet and Multiresolution Analysis for Finite Element Networking Paradigms

    NASA Technical Reports Server (NTRS)

    Kurdila, Andrew J.; Sharpley, Robert C.

    1999-01-01

    This paper presents a final report on Wavelet and Multiresolution Analysis for Finite Element Networking Paradigms. The focus of this research is to derive and implement: 1) Wavelet based methodologies for the compression, transmission, decoding, and visualization of three dimensional finite element geometry and simulation data in a network environment; 2) methodologies for interactive algorithm monitoring and tracking in computational mechanics; and 3) Methodologies for interactive algorithm steering for the acceleration of large scale finite element simulations. Also included in this report are appendices describing the derivation of wavelet based Particle Image Velocity algorithms and reduced order input-output models for nonlinear systems by utilizing wavelet approximations.

  15. Numerical characteristics of quantum computer simulation

    NASA Astrophysics Data System (ADS)

    Chernyavskiy, A.; Khamitov, K.; Teplov, A.; Voevodin, V.; Voevodin, Vl.

    2016-12-01

    The simulation of quantum circuits is significantly important for the implementation of quantum information technologies. The main difficulty of such modeling is the exponential growth of dimensionality; thus the use of modern high-performance parallel computing is relevant. As is well known, an arbitrary quantum computation in the circuit model can be performed with only single- and two-qubit gates, and we analyze the computational structure and properties of the simulation of such gates. We investigate how the unique properties of quantum physics shape the computational properties of the considered algorithms: quantum parallelism makes the simulation of quantum gates highly parallel, while quantum entanglement leads to the problem of computational locality during simulation. We use the methodology of the AlgoWiki project (algowiki-project.org) to analyze the algorithm. This methodology consists of theoretical (sequential and parallel complexity, macro structure, and visual informational graph) and experimental (locality and memory access, scalability, and more specific dynamic characteristics) parts. The experimental part was carried out on the petascale Lomonosov supercomputer (Moscow State University, Russia). We show that the simulation of quantum gates is a good base for the research and testing of development methods for data-intensive parallel software, and that the considered analysis methodology can be successfully used for the improvement of algorithms in quantum information science.
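    The simulation kernel the abstract analyzes can be sketched directly: applying a single-qubit gate to an n-qubit state vector updates all 2^n amplitudes in independent pairs whose indices differ only in the target bit (the source of the parallelism), while two-qubit gates on distant qubits pair far-apart amplitudes (the source of the locality problem). A minimal sketch of the single-qubit case:

```python
import math

# Apply a 2x2 gate to one qubit of an n-qubit state vector. Each loop
# iteration is independent of the others, which is why the kernel
# parallelizes well; the stride 2**target is what hurts memory locality.
def apply_1q(state, gate, target):
    """state: list of 2**n complex amplitudes; gate: 2x2 matrix."""
    out = state[:]
    bit = 1 << target
    for i in range(len(state)):
        if i & bit == 0:
            a, b = state[i], state[i | bit]
            out[i]       = gate[0][0] * a + gate[0][1] * b
            out[i | bit] = gate[1][0] * a + gate[1][1] * b
    return out

h = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]
print(apply_1q([1, 0, 0, 0], h, target=0))   # |00> → (|00> + |01>)/√2
```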

  16. Methodology for Computational Fluid Dynamic Validation for Medical Use: Application to Intracranial Aneurysm.

    PubMed

    Paliwal, Nikhil; Damiano, Robert J; Varble, Nicole A; Tutino, Vincent M; Dou, Zhongwang; Siddiqui, Adnan H; Meng, Hui

    2017-12-01

    Computational fluid dynamics (CFD) is a promising tool to aid in clinical diagnoses of cardiovascular diseases. However, it uses assumptions that simplify the complexities of the real cardiovascular flow. Due to the high stakes in the clinical setting, it is critical to quantify the effect of these assumptions on CFD simulation results. However, existing CFD validation approaches do not quantify error in the simulation results due to the CFD solver's modeling assumptions. Instead, they directly compare CFD simulation results against validation data. Thus, to quantify the accuracy of a CFD solver, we developed a validation methodology that calculates the CFD model error (arising from modeling assumptions). Our methodology identifies independent error sources in CFD and validation experiments, and calculates the model error by parsing out other sources of error inherent in simulation and experiments. To demonstrate the method, we simulated the flow field of a patient-specific intracranial aneurysm (IA) in the commercial CFD software star-ccm+. Particle image velocimetry (PIV) provided validation datasets for the flow field on two orthogonal planes. The average model error in the star-ccm+ solver was 5.63 ± 5.49% along the intersecting validation line of the orthogonal planes. Furthermore, we demonstrated that our validation method is superior to existing validation approaches by applying three representative existing validation techniques to our CFD and experimental dataset, and comparing the validation results. Our validation methodology offers a streamlined workflow to extract the "true" accuracy of a CFD solver.
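    One plausible reading of "parsing out" independent error sources, sketched below: if the sources are statistically independent, their contributions add in quadrature, so the model error is what remains of the total CFD-vs-PIV discrepancy after subtracting the experimental and numerical components. This is an illustration of the general principle, not the paper's exact formulation, and the percentages are made up.

```python
import math

# Illustrative decomposition: independent error sources add in quadrature,
# so the model error can be parsed out of the total discrepancy.
def model_error(total_pct, experiment_pct, numerical_pct):
    rem = total_pct**2 - experiment_pct**2 - numerical_pct**2
    return math.sqrt(max(rem, 0.0))   # clamp: noise can make rem negative

print(model_error(total_pct=8.0, experiment_pct=4.0, numerical_pct=2.0))
```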

  17. Using Software Zelio Soft in Educational Process to Simulation Control Programs for Intelligent Relays

    NASA Astrophysics Data System (ADS)

    Michalik, Peter; Mital, Dusan; Zajac, Jozef; Brezikova, Katarina; Duplak, Jan; Hatala, Michal; Radchenko, Svetlana

    2016-10-01

    The article addresses the use of intelligent relays and PLC systems in practice, their architecture, and the principles of programming and simulation for the educational process at all types of schools, from secondary schools to universities. The aim of the article is to propose simple example applications that demonstrate the programming methodology on real, simple practical examples and show the use of selected instructions. The practical part describes the process of creating schematics and function blocks, including methodologies for creating programs and simulating output reactions to changing inputs for intelligent relays.

  18. An automated procedure for developing hybrid computer simulations of turbofan engines

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Krosel, S. M.

    1980-01-01

    A systematic, computer-aided, self-documenting methodology for developing hybrid computer simulations of turbofan engines is presented. The methodology makes use of a host program that can run on a large digital computer and a machine-dependent target (hybrid) program. The host program performs all of the calculations and data manipulations needed to transform user-supplied engine design information into a form suitable for the hybrid computer. The host program also trims the self-contained engine model to match specified design-point information. A test case is described, and comparisons between the hybrid simulation and specified engine performance data are presented.

  19. Coupling Hydraulic Fracturing Propagation and Gas Well Performance for Simulation of Production in Unconventional Shale Gas Reservoirs

    NASA Astrophysics Data System (ADS)

    Wang, C.; Winterfeld, P. H.; Wu, Y. S.; Wang, Y.; Chen, D.; Yin, C.; Pan, Z.

    2014-12-01

    Hydraulic fracturing combined with horizontal drilling has made it possible to economically produce natural gas from unconventional shale gas reservoirs. An efficient methodology for evaluating hydraulic fracturing operation parameters, such as fluid and proppant properties, injection rates, and wellhead pressure, is essential for the evaluation and efficient design of these processes. Traditional numerical evaluation and optimization approaches are usually based on simulated fracture properties such as the fracture area. In our opinion, a methodology based on simulated production data is better, because production is the goal of hydraulic fracturing and we can calibrate this approach with production data that is already known. This numerical methodology requires a fully-coupled hydraulic fracture propagation and multi-phase flow model. In this paper, we present a general fully-coupled numerical framework to simulate hydraulic fracturing and post-fracture gas well performance. This three-dimensional, multi-phase simulator focuses on: (1) fracture width increase and fracture propagation that occurs as slurry is injected into the fracture, (2) erosion caused by fracture fluids and leakoff, (3) proppant subsidence and flowback, and (4) multi-phase fluid flow through anisotropic natural and man-made fractures at various scales. Mathematical and numerical details on how to fully couple the fracture propagation and fluid flow parts are discussed. Hydraulic fracturing and production operation parameters, and properties of the reservoir, fluids, and proppants, are taken into account. The well may be horizontal, vertical, or deviated, as well as open-hole or cemented. The simulator is verified against benchmarks from the literature, and we show its application by simulating fracture network (hydraulic and natural fractures) propagation and production data history matching of a field in China. We also conduct a series of real-data modeling studies with different combinations of hydraulic fracturing parameters and present the methodology to design these operations with feedback of simulated production data. The unified model aids in the optimization of hydraulic fracturing design, operations, and production.

  20. Real time flight simulation methodology

    NASA Technical Reports Server (NTRS)

    Parrish, E. A.; Cook, G.; Mcvey, E. S.

    1977-01-01

    Substitutional methods for digitization, input signal-dependent integrator approximations, and digital autopilot design were developed. The software framework of a simulator design package is described. Included are subroutines for iterative designs of simulation models and a rudimentary graphics package.

  1. Cognitive simulators for medical education and training.

    PubMed

    Kahol, Kanav; Vankipuram, Mithra; Smith, Marshall L

    2009-08-01

    Simulators for honing procedural skills (such as surgical skills and central venous catheter placement) have proven to be valuable tools for medical educators and students. While such simulations represent an effective paradigm in surgical education, there is an opportunity to add a layer of cognitive exercises to these basic simulations that can facilitate robust skill learning in residents. This paper describes a controlled methodology, inspired by neuropsychological assessment tasks and embodied cognition, to develop cognitive simulators for laparoscopic surgery. These simulators provide psychomotor skill training and offer the additional challenge of accomplishing cognitive tasks in realistic environments. A generic framework for the design, development and evaluation of such simulators is described. The presented framework is generalizable and can be applied to different task domains. It is independent of the types of sensors, simulation environment and feedback mechanisms that the simulators use. A proof of concept of the framework is provided through developing a simulator that includes cognitive variations to a basic psychomotor task. The results of two pilot studies are presented that show the validity of the methodology in providing effective evaluation and learning environments for surgeons.

  2. Teaching and assessing procedural skills using simulation: metrics and methodology.

    PubMed

    Lammers, Richard L; Davenport, Moira; Korley, Frederick; Griswold-Theodorson, Sharon; Fitch, Michael T; Narang, Aneesh T; Evans, Leigh V; Gross, Amy; Rodriguez, Elliot; Dodge, Kelly L; Hamann, Cara J; Robey, Walter C

    2008-11-01

    Simulation allows educators to develop learner-focused training and outcomes-based assessments. However, the effectiveness and validity of simulation-based training in emergency medicine (EM) requires further investigation. Teaching and testing technical skills require methods and assessment instruments that are somewhat different than those used for cognitive or team skills. Drawing from work published by other medical disciplines as well as educational, behavioral, and human factors research, the authors developed six research themes: measurement of procedural skills; development of performance standards; assessment and validation of training methods, simulator models, and assessment tools; optimization of training methods; transfer of skills learned on simulator models to patients; and prevention of skill decay over time. The article reviews relevant and established educational research methodologies and identifies gaps in our knowledge of how physicians learn procedures. The authors present questions requiring further research that, once answered, will advance understanding of simulation-based procedural training and assessment in EM.

  3. Piloted Evaluation of an Integrated Methodology for Propulsion and Airframe Control Design

    NASA Technical Reports Server (NTRS)

    Bright, Michelle M.; Simon, Donald L.; Garg, Sanjay; Mattern, Duane L.; Ranaudo, Richard J.; Odonoghue, Dennis P.

    1994-01-01

    An integrated methodology for propulsion and airframe control has been developed and evaluated for a Short Take-Off Vertical Landing (STOVL) aircraft using a fixed base flight simulator at NASA Lewis Research Center. For this evaluation the flight simulator is configured for transition flight using a STOVL aircraft model, a full nonlinear turbofan engine model, simulated cockpit and displays, and pilot effectors. The paper provides a brief description of the simulation models, the flight simulation environment, the displays and symbology, the integrated control design, and the piloted tasks used for control design evaluation. In the simulation, the pilots successfully completed typical transition phase tasks such as combined constant deceleration with flight path tracking, and constant acceleration wave-off maneuvers. The pilots' comments on the integrated system performance and the display symbology are discussed and analyzed to identify potential areas of improvement.

  4. GPS system simulation methodology

    NASA Technical Reports Server (NTRS)

    Ewing, Thomas F.

    1993-01-01

    The following topics are presented: background; Global Positioning System (GPS) methodology overview; the graphical user interface (GUI); current models; application to space nuclear power/propulsion; and interfacing requirements. The discussion is presented in vugraph form.

  5. Real time simulation of computer-assisted sequencing of terminal area operations

    NASA Technical Reports Server (NTRS)

    Dear, R. G.

    1981-01-01

    A simulation was developed to investigate the utilization of computer assisted decision making for the task of sequencing and scheduling aircraft in a high density terminal area. The simulation incorporates a decision methodology termed Constrained Position Shifting. This methodology accounts for aircraft velocity profiles, routes, and weight classes in dynamically sequencing and scheduling arriving aircraft. A sample demonstration of Constrained Position Shifting is presented where six aircraft types (including both light and heavy aircraft) are sequenced to land at Denver's Stapleton International Airport. A graphical display is utilized, and Constrained Position Shifting with a maximum shift of four positions (rearward or forward) is compared to first-come, first-served with respect to arrival at the runway. The implementation of computer assisted sequencing and scheduling methodologies is investigated. A time based control concept will be required, and design considerations for such a system are discussed.
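    A toy sketch of the Constrained Position Shifting idea, assuming nothing from the paper beyond the concept itself: search only those landing sequences in which no aircraft moves more than a maximum number of positions from its first-come-first-served order, and pick the one minimizing total inter-arrival separation. The two weight classes and separation times below are illustrative.

```python
from itertools import permutations

# Illustrative separation requirements (seconds) between consecutive
# arrivals, keyed by (leader class, follower class): H = heavy, L = light.
SEP = {('H', 'H'): 96, ('H', 'L'): 181, ('L', 'H'): 72, ('L', 'L'): 72}

def cps_best(classes, max_shift):
    """Brute-force CPS: best sequence with every shift <= max_shift."""
    best, best_time = None, float('inf')
    for seq in permutations(range(len(classes))):
        # Reject sequences that shift any aircraft too far from FCFS order.
        if any(abs(pos - i) > max_shift for pos, i in enumerate(seq)):
            continue
        t = sum(SEP[(classes[a], classes[b])] for a, b in zip(seq, seq[1:]))
        if t < best_time:
            best, best_time = seq, t
    return best, best_time

print(cps_best(['H', 'L', 'H', 'L'], max_shift=2))   # groups classes: 240 s
```

    Real implementations use dynamic programming rather than enumeration, since the constrained search space still grows quickly with the arrival queue length.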

  6. Feasibility of "Standardized Clinician" Methodology for Patient Training on Hospital-to-Home Transitions.

    PubMed

    Wehbe-Janek, Hania; Hochhalter, Angela K; Castilla, Theresa; Jo, Chanhee

    2015-02-01

    Patient engagement in health care is increasingly recognized as essential for promoting the health of individuals and populations. This study pilot tested the standardized clinician (SC) methodology, a novel adaptation of standardized patient methodology, for teaching patient engagement skills for the complex health care situation of transitioning from a hospital back to home. Sixty-seven participants at heightened risk for hospitalization were randomly assigned to either simulation exposure-only or full-intervention group. Both groups participated in simulation scenarios with "standardized clinicians" around tasks related to hospital discharge and follow-up. The full-intervention group was also debriefed after scenario sets and learned about tools for actively participating in hospital-to-home transitions. Measures included changes in observed behaviors at baseline and follow-up and an overall program evaluation. The full-intervention group showed increases in observed tool possession (P = 0.014) and expression of their preferences and values (P = 0.043). The simulation exposure-only group showed improvement in worksheet scores (P = 0.002) and fewer engagement skills (P = 0.021). Both groups showed a decrease in telling an SC about their hospital admission (P < 0.05). Open-ended comments from the program evaluation were largely positive. Both groups benefited from exposure to the SC intervention. Program evaluation data suggest that simulation training is feasible and may provide a useful methodology for teaching patient skills for active engagement in health care. Future studies are warranted to determine if this methodology can be used to assess overall patient engagement and whether new patient learning transfers to health care encounters.

  7. Operational Impacts of Wind Energy Resources in the Bonneville Power Administration Control Area - Phase I Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarov, Yuri V.; Lu, Shuai

    2008-07-15

    This report presents a methodology developed to study the future impact of wind on BPA power system load following and regulation requirements. The methodology uses historical data and stochastic processes to simulate the load balancing processes in the BPA power system, mimicking actual power system operations. Therefore, the results are close to reality, yet a study based on this methodology is convenient to conduct. Existing methodologies for similar analysis include dispatch model simulation and standard deviation evaluation on load and wind data. Dispatch model simulation is constrained by the design of the dispatch program, and standard deviation evaluation is artificial in separating the load following and regulation requirements; neither usually reflects actual operational practice. The methodology used in this study provides not only capacity requirement information; it also analyzes the ramp rate requirements for system load following and regulation processes. The ramp rate data can be used to evaluate generator response/maneuverability requirements, another necessary capability of the generation fleet for the smooth integration of wind energy. The study results are presented in an innovative way such that the increased generation capacity or ramp requirements are compared for two different years, across 24 hours a day. Therefore, the impact of different levels of wind energy on generation requirements at different times can be easily visualized.

  8. Accurate electrical prediction of memory array through SEM-based edge-contour extraction using SPICE simulation

    NASA Astrophysics Data System (ADS)

    Shauly, Eitan; Rotstein, Israel; Peltinov, Ram; Latinski, Sergei; Adan, Ofer; Levi, Shimon; Menadeva, Ovadya

    2009-03-01

    The continuing scaling of transistors toward smaller devices with similar (or larger) drive current per micron and faster switching increases the challenge of predicting and controlling the transistor off-state current. Typically, electrical simulators such as SPICE use the design intent (as-drawn GDS data). In more sophisticated cases, the simulators are fed with the pattern after lithography and etch process simulations. As the importance of electrical simulation accuracy increases and leakage becomes more dominant, there is a need to feed these simulators with more accurate information extracted from physical on-silicon transistors. Our methodology to predict changes in device performance due to systematic lithography and etch effects was used in this paper. In general, the methodology consists of using OPCCmaxTM for systematic Edge-Contour Extraction (ECE) from transistors along the manufacturing flow, capturing image distortions such as line-end shortening, corner rounding and line-edge roughness. These measurements are used for SPICE modeling. A possible application of this new metrology is to provide, ahead of time, physical and electrical statistical data, improving time to market. In this work, we applied our methodology to analyze small and large arrays of 2.14 um2 6T-SRAM, manufactured using the Tower Standard Logic for General Purposes Platform. Four of the six transistors used a "U-shape AA", known to have higher variability. The predicted electrical performance of the transistors' drive current and leakage current, in terms of nominal values and variability, is presented. We also used the methodology to analyze an entire SRAM block array. A study of isolation leakage and variability is presented.

  9. Using simulation to study difficult clinical issues: prenatal counseling at the threshold of viability across American and Dutch cultures.

    PubMed

    Geurtzen, Rosa; Hogeveen, Marije; Rajani, Anand K; Chitkara, Ritu; Antonius, Timothy; van Heijst, Arno; Draaisma, Jos; Halamek, Louis P

    2014-06-01

    Prenatal counseling at the threshold of viability is a challenging yet critically important activity, and care guidelines differ across cultures. Studying how this task is performed in the actual clinical environment is extremely difficult. In this pilot study, we used simulation as a methodology with 2 aims as follows: first, to explore the use of simulation incorporating a standardized pregnant patient as an investigative methodology and, second, to determine similarities and differences in content and style of prenatal counseling between American and Dutch neonatologists. We compared counseling practice between 11 American and 11 Dutch neonatologists, using a simulation-based investigative methodology. All subjects performed prenatal counseling with a simulated pregnant patient carrying a fetus at the limits of viability. The following elements of scenario design were standardized across all scenarios: layout of the physical environment, details of the maternal and fetal histories, questions and responses of the standardized pregnant patient, and the time allowed for consultation. American subjects typically presented several treatment options without bias, whereas Dutch subjects were more likely to explicitly advise a specific course of treatment (emphasis on partial life support). American subjects offered comfort care more frequently than the Dutch subjects and also discussed options for maximal life support more often than their Dutch colleagues. Simulation is a useful research methodology for studying activities difficult to assess in the actual clinical environment such as prenatal counseling at the limits of viability. Dutch subjects were more directive in their approach than their American counterparts, offering fewer options for care and advocating for less invasive interventions. American subjects were more likely to offer a wider range of therapeutic options without providing a recommendation for any specific option.

  10. Probabilistic Simulation of Stress Concentration in Composite Laminates

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Liaw, D. G.

    1994-01-01

    A computational methodology is described to probabilistically simulate the stress concentration factors (SCF's) in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, whereas the finite element analysis is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate SCF's, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the SCF's in three different composite laminates. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the SCF's are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.
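    The coupling idea — propagating uncertainty in material and loading variables through a structural response to obtain a distribution of SCF's rather than a single value — can be illustrated by Monte Carlo sampling. The response function, distributions, and parameter values below are entirely made up for illustration; the paper's method couples actual composite mechanics with finite element analysis.

```python
import random
import statistics

# Hedged illustration: sample uncertain inputs (normalized local stiffness,
# load eccentricity) and propagate them through a made-up SCF response
# function, yielding a distribution of stress concentration factors.
def simulate_scf(n=10000, seed=42):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        stiffness = rng.gauss(1.0, 0.05)   # normalized local stiffness
        ecc = rng.gauss(0.0, 0.02)         # load eccentricity (illustrative)
        samples.append(3.0 / stiffness * (1.0 + 5.0 * abs(ecc)))
    return statistics.mean(samples), statistics.stdev(samples)

mean, std = simulate_scf()
print(f"SCF mean = {mean:.3f}, std = {std:.3f}")
```

    Sensitivity factors like those in the abstract would come from correlating each sampled input with the SCF output.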

  11. Mesoscopic Simulations of Crosslinked Polymer Networks

    NASA Astrophysics Data System (ADS)

    Megariotis, Grigorios; Vogiatzis, Georgios G.; Schneider, Ludwig; Müller, Marcus; Theodorou, Doros N.

    2016-08-01

    A new methodology and the corresponding C++ code for mesoscopic simulations of elastomers are presented. The test system, crosslinked cis-1,4-polyisoprene, is simulated with a Brownian Dynamics/kinetic Monte Carlo algorithm as a dense liquid of soft, coarse-grained beads, each representing 5-10 Kuhn segments. From the thermodynamic point of view, the system is described by a Helmholtz free energy containing contributions from entropic springs between successive beads along a chain, slip-springs representing entanglements between beads on different chains, and non-bonded interactions. The methodology is employed for the calculation of the stress relaxation function from simulations of several microseconds at equilibrium, as well as for the prediction of stress-strain curves of crosslinked polymer networks under deformation.

  12. Q-Sample Construction: A Critical Step for a Q-Methodological Study.

    PubMed

    Paige, Jane B; Morin, Karen H

    2016-01-01

    Q-sample construction is a critical step in Q-methodological studies. Prior to conducting Q-studies, researchers start with a population of opinion statements (concourse) on a particular topic of interest from which a sample is drawn. These sampled statements are known as the Q-sample. Although literature exists on methodological processes to conduct Q-methodological studies, limited guidance exists on the practical steps to reduce the population of statements to a Q-sample. A case exemplar illustrates the steps to construct a Q-sample in preparation for a study that explored the perspectives nurse educators and nursing students hold about simulation design. Experts in simulation and Q-methodology evaluated the Q-sample for readability, clarity, and representativeness of the opinions contained within the concourse. The Q-sample was piloted, and feedback resulted in statement refinement. Researchers, especially those undertaking Q-method studies for the first time, may benefit from the practical considerations for constructing a Q-sample offered in this article. © The Author(s) 2014.

  13. Educational Validity of Business Gaming Simulation: A Research Methodology Framework

    ERIC Educational Resources Information Center

    Stainton, Andrew J.; Johnson, Johnnie E.; Borodzicz, Edward P.

    2010-01-01

    Many past educational validity studies of business gaming simulation, and more specifically total enterprise simulation, have been inconclusive. Studies have focused on the weaknesses of business gaming simulation; which is often regarded as an educational medium that has limitations regarding learning effectiveness. However, no attempts have been…

  14. Integrated corridor management analysis, modeling and simulation (AMS) methodology.

    DOT National Transportation Integrated Search

    2008-03-01

    This AMS Methodologies Document provides a discussion of potential ICM analytical approaches for the assessment of generic corridor operations. The AMS framework described in this report identifies strategies and procedures for tailoring AMS general ...

  15. Calibration of CORSIM models under saturated traffic flow conditions.

    DOT National Transportation Integrated Search

    2013-09-01

    This study proposes a methodology to calibrate microscopic traffic flow simulation models. The proposed methodology has the capability to calibrate simultaneously all the calibration parameters as well as demand patterns for any network topology....

  16. Simulation validation and management

    NASA Astrophysics Data System (ADS)

    Illgen, John D.

    1995-06-01

    Illgen Simulation Technologies, Inc., has been working on interactive verification and validation programs for the past six years. As a result, the company has evolved a methodology that has been adopted and successfully implemented by a number of different verification and validation programs. This methodology employs a unique set of computer-assisted software engineering (CASE) tools to reverse engineer source code and produce analytical outputs (flow charts and tables) that aid the engineer/analyst in the verification and validation process. We have found that the use of CASE tools saves time, which equates to improvements in both schedule and cost. This paper will describe the ISTI-developed methodology and how CASE tools are used in its support. Case studies will be discussed.

  17. Constraint Force Equation Methodology for Modeling Multi-Body Stage Separation Dynamics

    NASA Technical Reports Server (NTRS)

    Toniolo, Matthew D.; Tartabini, Paul V.; Pamadi, Bandu N.; Hotchko, Nathaniel

    2008-01-01

    This paper discusses a generalized approach to the multi-body separation problems in a launch vehicle staging environment based on constraint force methodology and its implementation into the Program to Optimize Simulated Trajectories II (POST2), a widely used trajectory design and optimization tool. This development facilitates the inclusion of stage separation analysis into POST2 for seamless end-to-end simulations of launch vehicle trajectories, thus simplifying the overall implementation and providing a range of modeling and optimization capabilities that are standard features in POST2. Analysis and results are presented for two test cases that validate the constraint force equation methodology in a stand-alone mode and its implementation in POST2.

  18. Dynamic Decision Making under Uncertainty and Partial Information

    DTIC Science & Technology

    2017-01-30

    order to address these problems, we investigated efficient computational methodologies for dynamic decision making under uncertainty and partial...information. In the course of this research, we developed and studied efficient simulation-based methodologies for dynamic decision making under...uncertainty and partial information; (ii) studied the application of these decision making models and methodologies to practical problems, such as those

  19. Hafnium transistor process design for neural interfacing.

    PubMed

    Parent, David W; Basham, Eric J

    2009-01-01

    A design methodology is presented that uses 1-D process simulations of Metal Insulator Semiconductor (MIS) structures to design the threshold voltage of hafnium oxide based transistors used for neural recording. The methodology comprises 1-D analytical equations for specifying the threshold voltage and doping profiles, and 1-D MIS Technology Computer-Aided Design (TCAD) to design a process that implements a specific threshold voltage, which minimized simulation time. The process was then verified with a 2-D process/electrical TCAD simulation. Hafnium oxide films (HfO2) were grown and characterized for dielectric constant and fixed oxide charge at various annealing temperatures, two important design variables in threshold voltage design.
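
    The classic long-channel threshold-voltage relation underlying such 1-D design equations can be sketched as follows. This is a generic textbook estimate, not the paper's process: the doping level, HfO2 permittivity, oxide thickness, and flat-band voltage below are all assumed values for illustration.

```python
import math

q = 1.602e-19        # elementary charge, C
k_B = 1.381e-23      # Boltzmann constant, J/K
eps0 = 8.854e-12     # vacuum permittivity, F/m

def threshold_voltage(Na_cm3, t_ox_m, k_ox, V_fb, T=300.0, ni_cm3=1.45e10):
    """Long-channel estimate: V_T = V_fb + 2*phi_F + Q_dep / C_ox."""
    phi_t = k_B * T / q                       # thermal voltage
    phi_F = phi_t * math.log(Na_cm3 / ni_cm3) # bulk Fermi potential
    Na = Na_cm3 * 1e6                         # cm^-3 -> m^-3
    eps_si = 11.7 * eps0
    Q_dep = math.sqrt(2.0 * eps_si * q * Na * 2.0 * phi_F)  # depletion charge, C/m^2
    C_ox = k_ox * eps0 / t_ox_m               # oxide capacitance, F/m^2
    return V_fb + 2.0 * phi_F + Q_dep / C_ox

# Hypothetical process: Na = 1e17 cm^-3, 10 nm HfO2 (k ~ 20), V_fb = -0.8 V
vt = threshold_voltage(1e17, 10e-9, 20.0, -0.8)
print(f"estimated V_T = {vt:.3f} V")
```

    In a TCAD-backed flow, the same relation is used in reverse: pick a target V_T, then solve for the doping or oxide thickness that achieves it.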

  20. Bio-inspired algorithms applied to molecular docking simulations.

    PubMed

    Heberlé, G; de Azevedo, W F

    2011-01-01

    Nature as a source of inspiration has been shown to have a great beneficial impact on the development of new computational methodologies. In this scenario, analyses of the interactions between a protein target and a ligand can be simulated by biologically inspired algorithms (BIAs). These algorithms mimic biological systems to create new paradigms for computation, such as neural networks, evolutionary computing, and swarm intelligence. This review provides a description of the main concepts behind BIAs applied to molecular docking simulations. Special attention is devoted to evolutionary algorithms, guided-directed evolutionary algorithms, and Lamarckian genetic algorithms. Recent applications of these methodologies to protein targets identified in the Mycobacterium tuberculosis genome are described.
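
    As a minimal illustration of the evolutionary-computing family described above, the sketch below runs a tiny genetic algorithm (truncation selection, uniform crossover, Gaussian mutation) against a toy quadratic "score". The score function and the 3-parameter "pose" are hypothetical stand-ins for a real docking scoring function over protein-ligand poses.

```python
import random

random.seed(42)

def score(pose):
    # Toy stand-in for a docking score (lower is better);
    # hypothetical global minimum at (1.0, -2.0, 0.5).
    target = (1.0, -2.0, 0.5)
    return sum((x - t) ** 2 for x, t in zip(pose, target))

def evolve(pop_size=30, generations=60, mut_sigma=0.3):
    pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=score)
        parents = pop[: pop_size // 2]                 # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [random.choice(g) for g in zip(a, b)]           # uniform crossover
            child = [g + random.gauss(0, mut_sigma) for g in child] # Gaussian mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=score)

best = evolve()
print("best pose:", [round(g, 2) for g in best])
```

    Lamarckian variants add a local-search refinement step whose result is written back into the genome; the skeleton above stays the same.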

  1. Quantum dynamical simulation of photoinduced electron transfer processes in dye-semiconductor systems: theory and application to coumarin 343 at TiO₂.

    PubMed

    Li, Jingrui; Kondov, Ivan; Wang, Haobin; Thoss, Michael

    2015-04-10

    A recently developed methodology to simulate photoinduced electron transfer processes at dye-semiconductor interfaces is outlined. The methodology employs a first-principles-based model Hamiltonian and accurate quantum dynamics simulations using the multilayer multiconfiguration time-dependent Hartree approach. This method is applied to study electron injection in the dye-semiconductor system coumarin 343-TiO2. Specifically, the influence of electronic-vibrational coupling is analyzed. Extending previous work, we consider the influence of Duschinsky rotation of the normal modes as well as anharmonicities of the potential energy surfaces on the electron transfer dynamics.

  2. Adaptive resolution simulation of a biomolecule and its hydration shell: Structural and dynamical properties

    NASA Astrophysics Data System (ADS)

    Fogarty, Aoife C.; Potestio, Raffaello; Kremer, Kurt

    2015-05-01

    A fully atomistic modelling of many biophysical and biochemical processes at biologically relevant length- and time scales is beyond our reach with current computational resources, and one approach to overcome this difficulty is the use of multiscale simulation techniques. In such simulations, when system properties necessitate a boundary between resolutions that falls within the solvent region, one can use an approach such as the Adaptive Resolution Scheme (AdResS), in which solvent particles change their resolution on the fly during the simulation. Here, we apply the existing AdResS methodology to biomolecular systems, simulating a fully atomistic protein with an atomistic hydration shell, solvated in a coarse-grained particle reservoir and heat bath. Using as a test case an aqueous solution of the regulatory protein ubiquitin, we first confirm the validity of the AdResS approach for such systems, via an examination of protein and solvent structural and dynamical properties. We then demonstrate how, in addition to providing a computational speedup, such a multiscale AdResS approach can yield otherwise inaccessible physical insights into biomolecular function. We use our methodology to show that protein structure and dynamics can still be correctly modelled using only a few shells of atomistic water molecules. We also discuss aspects of the AdResS methodology peculiar to biomolecular simulations.

  3. Concentration and spatial distribution of lead in soil used for ammunition destruction.

    PubMed

    do Nascimento Guedes, Jair; do Amaral Sobrinho, Nelson Moura Brasil; Ceddia, Marcos Bacis; Vilella, André Luis Oliveira; Tolón-Becerra, Alfredo; Lastra-Bravo, Xavier Bolívar

    2012-10-01

    Studies on heavy metal contamination in soils used for ammunition disposal and destruction are still emerging. The present study aimed to evaluate the contamination level and spatial distribution of lead in disposal and destruction areas. The site was used for ammunition disposal and destruction activities for 20 years. At the ammunition destruction site (1,296 ha), a sampling system following a 5 m × 5 m grid with 30 points was adopted, and samples were collected at five depths, for a total of 150 samples. During the collection procedure, each sampling grid point was georeferenced using a topographic global positioning system. Data were validated through semivariogram and kriging models using Geostat software. The results demonstrated that the average lead value was 163 mg kg⁻¹, which was close to the investigation limit, and that contamination levels were higher downstream than upstream. The results showed that there was lead contamination at the destruction site and that the contamination existed mainly at the surface layer. However, high lead concentrations were also found at greater soil depths in the destruction area due to frequent detonations. According to the planimetry data, the areas that require intervention decreased significantly with increasing depth, in the following order: 582.7 m² in the 0-20 cm layer; 194.6 m² in the 20-40 cm layer; 101.6 m² in the 40-60 cm layer; and 45.3 m² in the 60-80 cm layer.
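
    The semivariogram underlying the kriging validation above can be estimated empirically as half the mean squared difference between sample pairs, binned by separation distance. A sketch with synthetic data follows; the point locations and "lead" values are invented for illustration, whereas the study's actual inputs are the georeferenced soil samples.

```python
import math
import random

random.seed(0)

def semivariogram(points, values, lag, n_bins):
    """Empirical semivariogram: gamma(h) = mean of 0.5*(z_i - z_j)^2
    over pairs whose separation falls in the lag bin around h."""
    sums = [0.0] * n_bins
    counts = [0] * n_bins
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            h = math.dist(points[i], points[j])
            b = int(h // lag)
            if b < n_bins:
                sums[b] += 0.5 * (values[i] - values[j]) ** 2
                counts[b] += 1
    return [s / c if c else float("nan") for s, c in zip(sums, counts)]

# Synthetic field on a 25 m x 25 m plot: smooth spatial trend plus noise.
pts = [(random.uniform(0, 25), random.uniform(0, 25)) for _ in range(40)]
vals = [100 + 0.6 * px + 0.4 * py + random.gauss(0, 5) for (px, py) in pts]
gamma = semivariogram(pts, vals, lag=5.0, n_bins=5)
print(["%.1f" % g for g in gamma])
```

    Fitting a model (spherical, exponential, ...) to these binned estimates is what supplies the weights used by kriging.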

  4. Multibody simulation of vehicles equipped with an automatic transmission

    NASA Astrophysics Data System (ADS)

    Olivier, B.; Kouroussis, G.

    2016-09-01

    Nowadays, automotive vehicles remain one of the most widely used modes of transportation, and automatic transmissions are increasingly used to provide better driving comfort and a potential optimization of engine performance (by placing the gear shifts at specific engine and vehicle speeds). This paper presents an effective modeling of the vehicle using the multibody methodology (numerically computed under EasyDyn, an open-source, in-house library dedicated to multibody simulations). The transmission part of the vehicle, however, is described by the usual equations of motion computed using a systematic matrix approach: del Castillo's methodology for planetary gear trains. By coupling the analytic equations of the transmission with the equations computed by the multibody methodology, the performance of any vehicle can be obtained if the characteristics of each element in the vehicle are known. The multibody methodology offers the possibility of extending the vehicle model from 1-D motion to 3-D motion by taking the rotations into account and implementing tire models. The modeling presented in this paper remains very efficient and provides an easy and quick vehicle simulation tool which could be used to calibrate the automatic transmission.

  5. Generic simulation of multi-element ladar scanner kinematics in USU LadarSIM

    NASA Astrophysics Data System (ADS)

    Omer, David; Call, Benjamin; Pack, Robert; Fullmer, Rees

    2006-05-01

    This paper presents a generic simulation model for a ladar scanner with up to three scan elements, each having a steering, stabilization, and/or pattern-scanning role. Of interest is the development of algorithms that automatically generate commands to the scan elements given beam-steering objectives out of the ladar aperture and the base motion of the sensor platform. First, a straightforward single-element body-fixed beam-steering methodology is presented. Then a unique multi-element redirective and reflective space-fixed beam-steering methodology is explained. It is shown that standard direction cosine matrix decomposition methods fail when using two orthogonal, space-fixed rotations, thus demanding the development of a new algorithm for beam steering. Finally, a related steering control methodology is presented that uses two separate optical elements mathematically combined to determine the necessary scan element commands. Limits, restrictions, and results on this methodology are presented.

  6. Use of Computer Simulation for the Analysis of Railroad Operations in the St. Louis Terminal Area

    DOT National Transportation Integrated Search

    1977-11-01

    This report discusses the computer simulation methodology, its uses and limitations, and its applicability to the analysis of alternative railroad terminal restructuring plans. Included is a detailed discussion of the AAR Simulation System, an overvi...

  7. PERFORMANCE, RELIABILITY, AND IMPROVEMENT OF A TISSUE-SPECIFIC METABOLIC SIMULATOR

    EPA Science Inventory

    A methodology is described that has been used to build and enhance a simulator for rat liver metabolism providing reliable predictions within a large chemical domain. The tissue metabolism simulator (TIMES) utilizes a heuristic algorithm to generate plausible metabolic maps using...

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fogarty, Aoife C., E-mail: fogarty@mpip-mainz.mpg.de; Potestio, Raffaello, E-mail: potestio@mpip-mainz.mpg.de; Kremer, Kurt, E-mail: kremer@mpip-mainz.mpg.de

    A fully atomistic modelling of many biophysical and biochemical processes at biologically relevant length- and time scales is beyond our reach with current computational resources, and one approach to overcome this difficulty is the use of multiscale simulation techniques. In such simulations, when system properties necessitate a boundary between resolutions that falls within the solvent region, one can use an approach such as the Adaptive Resolution Scheme (AdResS), in which solvent particles change their resolution on the fly during the simulation. Here, we apply the existing AdResS methodology to biomolecular systems, simulating a fully atomistic protein with an atomistic hydration shell, solvated in a coarse-grained particle reservoir and heat bath. Using as a test case an aqueous solution of the regulatory protein ubiquitin, we first confirm the validity of the AdResS approach for such systems, via an examination of protein and solvent structural and dynamical properties. We then demonstrate how, in addition to providing a computational speedup, such a multiscale AdResS approach can yield otherwise inaccessible physical insights into biomolecular function. We use our methodology to show that protein structure and dynamics can still be correctly modelled using only a few shells of atomistic water molecules. We also discuss aspects of the AdResS methodology peculiar to biomolecular simulations.

  9. Optical modeling based on mean free path calculations for quantum dot phosphors applied to optoelectronic devices.

    PubMed

    Shin, Min-Ho; Kim, Hyo-Jun; Kim, Young-Joo

    2017-02-20

    We proposed an optical simulation model for the quantum dot (QD) nanophosphor based on the mean free path concept to understand precisely the optical performance of optoelectronic devices. A measurement methodology was also developed to obtain the desired optical characteristics, such as the mean free path and absorption spectra, for QD nanophosphors to be incorporated into the simulation. The simulation results for QD-based white LED and OLED displays show good agreement with the experimental values from the fabricated devices in terms of spectral power distribution, chromaticity coordinate, CCT, and CRI. The proposed simulation model and measurement methodology can be applied easily to the design of a wide range of optoelectronic devices using QD nanophosphors to obtain high efficiency and the desired color characteristics.

  10. Probabilistic simulation of stress concentration in composite laminates

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Liaw, L.

    1993-01-01

    A computational methodology is described to probabilistically simulate the stress concentration factors in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The probabilistic composite mechanics is used to describe all the uncertainties inherent in composite material properties, while probabilistic finite element analysis is used to describe the uncertainties associated with methods to experimentally evaluate stress concentration factors, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the stress concentration factors in composite laminates made from three different composite systems. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the stress concentration factors are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.
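
    The core idea, propagating input uncertainty through a response function by sampling, can be sketched with a far simpler deterministic model than the paper's probabilistic finite element analysis. Here a Peterson-style cubic fit for the stress concentration factor of a circular hole in a finite-width plate under tension stands in for the laminate model, and the 2% scatter assumed on the hole-diameter ratio is illustrative only.

```python
import random
import statistics

random.seed(1)

def kt_hole(d_over_w):
    # Familiar cubic fit for Kt of a circular hole in a finite-width plate.
    r = d_over_w
    return 3.0 - 3.14 * r + 3.667 * r ** 2 - 1.527 * r ** 3

# Monte Carlo propagation: sample the uncertain geometry, collect Kt.
samples = [kt_hole(random.gauss(0.25, 0.005)) for _ in range(20000)]
mean_kt = statistics.mean(samples)
std_kt = statistics.stdev(samples)
print(f"mean Kt = {mean_kt:.3f}, std = {std_kt:.4f}")
```

    The resulting sample set yields exactly the probability density and cumulative distribution curves the abstract compares against experiment, and finite-difference perturbation of each input gives the sensitivity factors.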

  11. Computational Simulation of the Formation and Material Behavior of Ice

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.; Singhal, Surendra N.; Chamis, Christos C.

    1994-01-01

    Computational methods are described for simulating the formation and the material behavior of ice in prevailing transient environments. The methodology developed at the NASA Lewis Research Center was adopted. A three dimensional finite-element heat transfer analyzer was used to predict the thickness of ice formed under prevailing environmental conditions. A multi-factor interaction model for simulating the material behavior of time-variant ice layers is presented. The model, used in conjunction with laminated composite mechanics, updates the material properties of an ice block as its thickness increases with time. A sample case of ice formation in a body of water was used to demonstrate the methodology. The results showed that the formation and the material behavior of ice can be computationally simulated using the available composites technology.

  12. Simulating Afterburn with LLNL Hydrocodes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daily, L D

    2004-06-11

    Presented here is a working methodology for adapting a Lawrence Livermore National Laboratory (LLNL) developed hydrocode, ALE3D, to simulate weapon damage effects when afterburn is a consideration in the blast propagation. Experiments have shown that afterburn is of great consequence in enclosed environments (i.e., a bomb-in-tunnel scenario, a penetrating conventional munition in a bunker, or a satchel charge placed in a deep underground facility). This empirical energy deposition methodology simulates the anticipated addition of kinetic energy that has been demonstrated by experiment (Kuhl et al., 1998), without explicitly solving the chemistry or resolving the mesh to capture small-scale vorticity. This effort is intended to complement the existing capability of either coupling ALE3D blast simulations with DYNA3D or performing fully coupled ALE3D simulations to predict building or component failure, for applications in National Security offensive strike planning as well as Homeland Defense infrastructure protection.

  13. Tsunami hazard assessments with consideration of uncertain earthquakes characteristics

    NASA Astrophysics Data System (ADS)

    Sepulveda, I.; Liu, P. L. F.; Grigoriu, M. D.; Pritchard, M. E.

    2017-12-01

    The uncertainty quantification of tsunami assessments due to uncertain earthquake characteristics faces important challenges. First, the generated earthquake samples must be consistent with the properties observed in past events. Second, it must adopt an uncertainty propagation method to determine tsunami uncertainties with a feasible computational cost. In this study we propose a new methodology, which improves the existing tsunami uncertainty assessment methods. The methodology considers two uncertain earthquake characteristics: the slip distribution and the location. First, the methodology considers the generation of consistent earthquake slip samples by means of a Karhunen-Loève (K-L) expansion and a translation process (Grigoriu, 2012), applicable to any non-rectangular rupture area and marginal probability distribution. The K-L expansion was recently applied by LeVeque et al. (2016). We have extended the methodology by analyzing accuracy criteria in terms of the tsunami initial conditions. Furthermore, and unlike this reference, we preserve the original probability properties of the slip distribution by avoiding post-sampling treatments such as earthquake slip scaling. Our approach is analyzed and justified in the framework of the present study. Second, the methodology uses a Stochastic Reduced-Order Model (SROM) (Grigoriu, 2009) instead of a classic Monte Carlo simulation, which reduces the computational cost of the uncertainty propagation. The methodology is applied to a real case: we study tsunamis generated at the site of the 2014 Chilean earthquake, generating earthquake samples with expected magnitude Mw 8. We first demonstrate that the stochastic approach of our study generates consistent earthquake samples with respect to the target probability laws. We also show that the results obtained from SROM are more accurate than classic Monte Carlo simulations.
We finally validate the methodology by comparing the simulated tsunamis and the tsunami records for the 2014 Chilean earthquake. Results show that leading wave measurements fall within the tsunami sample space. At later times, however, there are mismatches between measured data and the simulated results, suggesting that other sources of uncertainty are as relevant as the uncertainty of the studied earthquake characteristics.
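
    A bare-bones sketch of the K-L slip-sampling step, under simplifying assumptions: a 1-D fault grid, an exponential covariance with an assumed 20 km correlation length, and Gaussian marginals (so without the translation process the paper applies for non-Gaussian marginals and non-rectangular rupture areas).

```python
import numpy as np

rng = np.random.default_rng(7)

n = 50                                  # subfaults along strike
x = np.linspace(0.0, 100.0, n)          # positions, km
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / 20.0)  # exponential covariance

# K-L basis: eigenpairs of the covariance matrix, reordered largest-first.
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

m = 10                                  # retained K-L modes (truncation)
xi = rng.standard_normal((m, 1000))     # independent N(0,1) coefficients
slips = eigvecs[:, :m] @ (np.sqrt(eigvals[:m])[:, None] * xi)

frac = eigvals[:m].sum() / eigvals.sum()
print(f"{m} modes capture {frac:.1%} of the variance")
```

    The accuracy criterion the paper discusses amounts to choosing the truncation m so that the retained modes reproduce the tsunami initial conditions, not just the slip variance.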

  14. Fidelity assessment of a UH-60A simulation on the NASA Ames vertical motion simulator

    NASA Technical Reports Server (NTRS)

    Atencio, Adolph, Jr.

    1993-01-01

    Helicopter handling qualities research requires that a ground-based simulation be a high-fidelity representation of the actual helicopter, especially over the frequency range of the investigation. This experiment was performed to assess the current capability to simulate the UH-60A Black Hawk helicopter on the Vertical Motion Simulator (VMS) at NASA Ames, to develop a methodology for assessing the fidelity of a simulation, and to find the causes for lack of fidelity. The approach used was to compare the simulation to the flight vehicle for a series of tasks performed in flight and in the simulator. The results show that subjective handling qualities ratings from flight to simulator overlap, and the mathematical model matches the UH-60A helicopter very well over the range of frequencies critical to handling qualities evaluation. Pilot comments, however, indicate a need for improvement in the perceptual fidelity of the simulation in the areas of motion and visual cuing. The methodology used to make the fidelity assessment proved useful in showing differences in pilot work load and strategy, but additional work is needed to refine objective methods for determining causes of lack of fidelity.

  15. Fuzzy inductive reasoning: a consolidated approach to data-driven construction of complex dynamical systems

    NASA Astrophysics Data System (ADS)

    Nebot, Àngela; Mugica, Francisco

    2012-10-01

    Fuzzy inductive reasoning (FIR) is a modelling and simulation methodology derived from the General Systems Problem Solver. It compares favourably with other soft computing methodologies, such as neural networks, genetic or neuro-fuzzy systems, and with hard computing methodologies, such as AR, ARIMA, or NARMAX, when it is used to predict future behaviour of different kinds of systems. This paper contains an overview of the FIR methodology, its historical background, and its evolution.

  16. Methodology for locating defects within hardwood logs and determining their impact on lumber-value yield

    Treesearch

    Thomas Harless; Francis G. Wagner; Phillip Steele; Fred Taylor; Vikram Yadama; Charles W. McMillin

    1991-01-01

    A precise research methodology is described by which internal log-defect locations may help select hardwood log orientation and sawing procedure to improve lumber value. Procedures for data collection, data handling, simulated sawing, and data analysis are described. A single test log verified the methodology. Results from this log showed significant differences in...

  17. The Atomic Intrinsic Integration Approach: A Structured Methodology for the Design of Games for the Conceptual Understanding of Physics

    ERIC Educational Resources Information Center

    Echeverria, Alejandro; Barrios, Enrique; Nussbaum, Miguel; Amestica, Matias; Leclerc, Sandra

    2012-01-01

    Computer simulations combined with games have been successfully used to teach conceptual physics. However, there is no clear methodology for guiding the design of these types of games. To remedy this, we propose a structured methodology for the design of conceptual physics games that explicitly integrates the principles of the intrinsic…

  18. The methodology for modeling queuing systems using Petri nets

    NASA Astrophysics Data System (ADS)

    Kotyrba, Martin; Gaj, Jakub; Tvarůžka, Matouš

    2017-07-01

    This paper deals with the use of Petri nets in the modeling and simulation of queuing systems. The first part explains the basic concepts and properties of Petri nets and queuing systems. The proposed methodology for modeling queuing systems using Petri nets is described in the practical part and is tested on specific cases.
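
    The basic Petri net mechanics used in such a methodology, places holding tokens and transitions that fire when their input places hold enough tokens, can be sketched for a single-server queue. Place and transition names below are illustrative.

```python
# transition -> (consumed tokens per place, produced tokens per place)
TRANSITIONS = {
    "arrive": ({}, {"waiting": 1}),
    "start": ({"waiting": 1, "server_free": 1}, {"in_service": 1}),
    "done": ({"in_service": 1}, {"server_free": 1, "served": 1}),
}

def enabled(marking, t):
    pre, _ = TRANSITIONS[t]
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, t):
    assert enabled(marking, t), f"transition {t!r} not enabled"
    pre, post = TRANSITIONS[t]
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

m = {"server_free": 1}                  # initial marking: one free server
for t in ["arrive", "arrive", "start", "done", "start", "done"]:
    m = fire(m, t)
print(m)                                # two customers served, server free again
```

    A simulation of a queuing system then reduces to choosing which enabled transition fires next (e.g., by sampled inter-arrival and service times) and tracking the marking over time.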

  19. Application of an Integrated Methodology for Propulsion and Airframe Control Design to a STOVL Aircraft

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Mattern, Duane

    1994-01-01

    An advanced methodology for integrated flight propulsion control (IFPC) design for future aircraft, which will use propulsion-system-generated forces and moments for enhanced maneuver capabilities, is briefly described. This methodology has the potential to address in a systematic manner the coupling between the airframe and the propulsion subsystems typical of such enhanced-maneuverability aircraft. Application of the methodology to a short take-off vertical landing (STOVL) aircraft in the landing approach to hover transition flight phase is presented, with a brief description of the various steps in the IFPC design methodology. The details of the individual steps have been described in previous publications, and the objective of this paper is to focus on how the components of the control system designed at each step integrate into the overall IFPC system. The full nonlinear IFPC system was evaluated extensively in nonreal-time simulations as well as piloted simulations. Results from the nonreal-time evaluations are presented in this paper. Lessons learned from this application study are summarized in terms of areas of potential improvements in the STOVL IFPC design as well as identification of technology development areas to enhance the applicability of the proposed design methodology.

  20. Methodology to evaluate the performance of simulation models for alternative compiler and operating system configurations

    USDA-ARS?s Scientific Manuscript database

    Simulation modelers increasingly require greater flexibility for model implementation on diverse operating systems, and they demand high computational speed for efficient iterative simulations. Additionally, model users may differ in preference for proprietary versus open-source software environment...

  1. Simulation/Gaming and the Acquisition of Communicative Competence in Another Language.

    ERIC Educational Resources Information Center

    Garcia-Carbonell, Amparo; Rising, Beverly; Montero, Begona; Watts, Frances

    2001-01-01

    Discussion of communicative competence in second language acquisition focuses on a theoretical and practical meshing of simulation and gaming methodology with theories of foreign language acquisition, including task-based learning, interaction, and comprehensible input. Describes experiments conducted with computer-assisted simulations in…

  2. Diversity of nursing student views about simulation design: a q-methodological study.

    PubMed

    Paige, Jane B; Morin, Karen H

    2015-05-01

    Education of future nurses benefits from well-designed simulation activities. Skillful teaching with simulation requires educators to be constantly aware of how students experience learning and perceive educators' actions. Because revision of simulation activities considers feedback elicited from students, it is crucial to understand the perspective from which students base their response. In a Q-methodological approach, 45 nursing students rank-ordered 60 opinion statements about simulation design into a distribution grid. Factor analysis revealed that nursing students hold five distinct and uniquely personal perspectives-Let Me Show You, Stand By Me, The Agony of Defeat, Let Me Think It Through, and I'm Engaging and So Should You. Results suggest that nurse educators need to reaffirm that students clearly understand the purpose of each simulation activity. Nurse educators should incorporate presimulation assignments to optimize learning and help allay anxiety. The five perspectives discovered in this study can serve as a tool to discern individual students' learning needs. Copyright 2015, SLACK Incorporated.

  3. Mathematical simulation of power conditioning systems. Volume 1: Simulation of elementary units. Report on simulation methodology

    NASA Technical Reports Server (NTRS)

    Prajous, R.; Mazankine, J.; Ippolito, J. C.

    1978-01-01

    Methods and algorithms used for the simulation of the elementary power conditioning units (buck, boost, and buck-boost converters, as well as shunt PWM) are described. Definitions are given of similar converters and reduced parameters. The various parts of the simulation to be carried out are dealt with: local stability, corrective networks, measurement of input-output impedance, and global stability. A simulation example is given.
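
    For the buck unit, the standard averaged model integrates two state equations and settles near V_out = D * V_in; a minimal forward-Euler sketch follows, with assumed component values that are not those of the report.

```python
V_in, D = 12.0, 0.5             # input voltage and duty cycle (assumed)
L, C, R = 100e-6, 100e-6, 10.0  # inductor, capacitor, load (assumed)
dt, steps = 1e-6, 20000         # 20 ms of simulated time

i_L, v_out = 0.0, 0.0
for _ in range(steps):
    di = (D * V_in - v_out) / L # averaged inductor voltage / L
    dv = (i_L - v_out / R) / C  # capacitor current balance / C
    i_L += di * dt
    v_out += dv * dt

print(f"v_out = {v_out:.3f} V (ideal D*V_in = {D * V_in:.1f} V)")
```

    The averaged model captures the start-up transient and steady state but deliberately hides the switching ripple; a switching-level simulation would replace the duty-cycle term with the actual PWM waveform.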

  4. Large-Eddy Simulation (LES) of a Compressible Mixing Layer and the Significance of Inflow Turbulence

    NASA Technical Reports Server (NTRS)

    Mankbadi, Mina Reda; Georgiadis, Nicholas J.; Debonis, James R.

    2017-01-01

    In the context of Large-Eddy Simulation (LES), the effects of inflow turbulence are investigated through the Synthetic Eddy Method (SEM). The growth rate of a turbulent compressible mixing layer corresponding to the operating conditions of Goebel-Dutton Case 2 is investigated herein. The effect of spanwise width on the growth rate of the mixing layer is examined until spanwise-width independence is reached. The error in neglecting inflow turbulence effects is quantified by comparing two methodologies: (1) a hybrid RANS-LES methodology and (2) an SEM-LES methodology. Best practices learned from Case 2 are developed herein and then applied to a higher convective Mach number corresponding to the Case 4 experiments of Goebel and Dutton.

  5. Combining numerical simulations with time-domain random walk for pathogen risk assessment in groundwater

    NASA Astrophysics Data System (ADS)

    Cvetkovic, V.; Molin, S.

    2012-02-01

    We present a methodology that combines numerical simulations of groundwater flow and advective transport in heterogeneous porous media with analytical retention models for computing the infection risk probability from pathogens in aquifers. The methodology is based on the analytical results presented in [1,2] for utilising the colloid filtration theory in a time-domain random walk (TDRW) framework. It is shown that in uniform flow, the numerical simulations of advection yield results comparable to those of the analytical TDRW model for generating advection segments. It is shown that spatial variability of the attachment rate may be significant; however, it appears to affect risk in a different manner depending on whether the flow is uniform or radially converging. Although numerous issues remain open regarding pathogen transport in aquifers on the field scale, the methodology presented here may be useful for screening purposes, and may also serve as a basis for future studies that would include greater complexity.

  6. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation. Volume 2, Part 2: Appendixes B, C, D and E

    NASA Technical Reports Server (NTRS)

    Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.

    1982-01-01

    The derivation of the equations is presented, the rate control algorithm described, and simulation methodologies summarized. A set of dynamics equations that can be used recursively to calculate the forces and torques acting at the joints of an n-link manipulator, given the manipulator joint rates, is derived. The equations are valid for any n-link manipulator system with any kind of joints connected in any sequence. The equations of motion for the class of manipulators consisting of n rigid links interconnected by rotary joints are derived. A technique is outlined for reducing the system of equations to eliminate constraint torques. The linearized dynamics equations for an n-link manipulator system are derived, and the general n-link linearized equations are then applied to a two-link configuration. The coordinated rate control algorithm used to compute individual joint rates when given end-effector rates is described. A short discussion of simulation methodologies is presented.

  7. Introduction to SIMRAND: Simulation of research and development project

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1982-01-01

    SIMRAND (SIMulation of Research ANd Development Projects) is a methodology developed to aid the engineering and management decision process in selecting the optimal set of systems or tasks to be funded on a research and development project. A project may have a set of systems or tasks under consideration whose total cost exceeds the allocated budget. Other factors, such as personnel and facilities, may also enter as constraints. Thus the project's management must select, from among the complete set of systems or tasks under consideration, a partial set that satisfies all project constraints. The SIMRAND methodology uses the analytical techniques of probability theory, the decision analysis of management science, and computer simulation to select this optimal partial set. The SIMRAND methodology is truly a management tool: it initially specifies the information that must be generated by the engineers, thus providing a basis for the management direction of the engineers, and it ranks the alternatives according to the preferences of the decision makers.
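    The selection step can be sketched as a Monte Carlo ranking of feasible task subsets under a budget constraint. This is an illustrative reconstruction, not SIMRAND itself: task names, costs, and payoff samplers are hypothetical inputs.

```python
import itertools
import random

def rank_task_sets(tasks, budget, n_trials=500, seed=0):
    """Rank feasible subsets of R&D tasks by expected total payoff,
    estimated by Monte Carlo simulation of each task's uncertain outcome.

    tasks: name -> (cost, payoff_sampler), where payoff_sampler(rng)
    draws one simulated outcome. A sketch of the selection idea only."""
    rng = random.Random(seed)
    names = list(tasks)
    ranking = []
    for r in range(1, len(names) + 1):
        for subset in itertools.combinations(names, r):
            cost = sum(tasks[t][0] for t in subset)
            if cost > budget:                  # enforce the budget constraint
                continue
            mean_payoff = sum(
                sum(tasks[t][1](rng) for t in subset)
                for _ in range(n_trials)
            ) / n_trials
            ranking.append((mean_payoff, cost, subset))
    ranking.sort(reverse=True)                 # best expected payoff first
    return ranking
```

A decision maker would then read the top-ranked feasible subset, optionally re-ranking under a different preference (utility) function.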

  8. An Energy Storage Assessment: Using Optimal Control Strategies to Capture Multiple Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Di; Jin, Chunlian; Balducci, Patrick J.

    2015-09-01

    This paper presents a methodology for evaluating the benefits of battery storage for multiple grid applications, including energy arbitrage, balancing service, capacity value, distribution system equipment deferral, and outage mitigation. In the proposed method, at each hour a look-ahead optimization is first formulated and solved to determine the battery's base operating point. A minute-by-minute simulation is then performed to simulate the actual battery operation. This methodology is used to assess energy storage alternatives in the Puget Sound Energy system. Different battery storage candidates are simulated for a period of one year to assess different value streams and overall benefits, as part of a financial feasibility evaluation of battery storage projects.
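    The arbitrage component of such an evaluation can be sketched greedily: charge in the cheapest hours and discharge in the dearest ones. This is a simplified illustration only; it ignores efficiency losses, state-of-charge sequencing, and the minute-level simulation layer, and the price values are made up.

```python
def arbitrage_schedule(prices, capacity_mwh, power_mw):
    """Pick the cheapest hours to charge and the most expensive hours to
    discharge, limited by how many full-power hours the battery holds.

    A greedy look-ahead sketch; the paper's method instead solves a
    rolling hourly optimization followed by minute-scale simulation."""
    n_slots = int(capacity_mwh / power_mw)        # full-power hours available
    by_price = sorted(range(len(prices)), key=lambda h: prices[h])
    charge_hours = set(by_price[:n_slots])
    discharge_hours = set(by_price[-n_slots:])
    profit = power_mw * (sum(prices[h] for h in discharge_hours) -
                         sum(prices[h] for h in charge_hours))
    return charge_hours, discharge_hours, profit
```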

  9. Simulation: an evolving methodology for health administration education.

    PubMed

    Taylor, J K; Moore, J A; Holland, M G

    1985-01-01

    Simulation provides a valuable addition to a university's teaching methods. Computer-assisted gaming is especially effective in teaching advanced business strategy and corporate policy when the nature and complexity of the simulation permit. The potential for using simulation techniques in postgraduate professional education and in managerial self-assessment appears to be significant over the next several years.

  10. Uncertainty propagation by using spectral methods: A practical application to a two-dimensional turbulence fluid model

    NASA Astrophysics Data System (ADS)

    Riva, Fabio; Milanese, Lucio; Ricci, Paolo

    2017-10-01

    To reduce the computational cost of uncertainty propagation analysis, which is used to study the impact of input parameter variations on the results of a simulation, a general and simple-to-apply methodology based on decomposing the solution to the model equations in terms of Chebyshev polynomials is discussed. This methodology, based on the work by Scheffel [Am. J. Comput. Math. 2, 173-193 (2012)], approximates the model equation solution with a semi-analytic expression that depends explicitly on time, spatial coordinates, and input parameters. By employing a weighted residual method, a set of nonlinear algebraic equations for the coefficients appearing in the Chebyshev decomposition is then obtained. The methodology is applied to a two-dimensional Braginskii model used to simulate plasma turbulence in basic plasma physics experiments and in the scrape-off layer of tokamaks, in order to study the impact on the simulation results of the input parameter that describes the parallel losses. The uncertainties that characterize the time-averaged density gradient lengths, time-averaged densities, and fluctuation density levels are evaluated. A reasonable estimate of the uncertainty of these distributions can be obtained with a single reduced-cost simulation.
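    The core decomposition step can be illustrated with NumPy's Chebyshev utilities. The "solution" here is a smooth stand-in function, not the Braginskii model: the point is only that a truncated Chebyshev series gives a semi-analytic surrogate that can be evaluated cheaply anywhere in the domain.

```python
import numpy as np
from numpy.polynomial import chebyshev as cheb

def u(x):
    """Stand-in 'model solution' to be decomposed (illustrative only)."""
    return np.exp(-x**2) * np.sin(3 * x)

# Sample at Chebyshev nodes on [-1, 1] and fit a truncated series;
# the coefficients define the semi-analytic surrogate for u(x).
nodes = np.cos(np.pi * (np.arange(33) + 0.5) / 33)
coeffs = cheb.chebfit(nodes, u(nodes), deg=16)

# The surrogate is accurate over the whole interval, not just at nodes.
xs = np.linspace(-1.0, 1.0, 201)
max_err = np.max(np.abs(cheb.chebval(xs, coeffs) - u(xs)))
```

For smooth solutions the coefficients decay rapidly, which is what makes a low-order truncation (and hence a cheap parameter-dependent surrogate) viable.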

  11. Passenger rail vehicle safety assessment methodology. Volume II, Detailed analyses and simulation results.

    DOT National Transportation Integrated Search

    2000-04-01

    This report presents detailed analytic tools and results on dynamic response which are used to develop the safe dynamic performance limits of commuter passenger vehicles. The methodology consists of determining the critical parameters and characteris...

  12. Towards an in-plane methodology to track breast lesions using mammograms and patient-specific finite-element simulations

    NASA Astrophysics Data System (ADS)

    Lapuebla-Ferri, Andrés; Cegoñino-Banzo, José; Jiménez-Mocholí, Antonio-José; Pérez del Palomar, Amaya

    2017-11-01

    In breast cancer screening or diagnosis, it is usual to combine different images in order to locate a lesion as accurately as possible. These images are generated using one or several imaging techniques. As x-ray-based mammography is widely used, a breast lesion is located in the plane of the image (mammogram), but tracking it across mammograms corresponding to different views is a challenging task for physicians. Accordingly, simulation tools and methodologies that use patient-specific numerical models can facilitate the task of fusing information from different images. Additionally, these tools need to be as straightforward as possible to facilitate their translation to the clinical area. This paper presents a patient-specific, finite-element-based and semi-automated simulation methodology to track breast lesions across mammograms. A realistic three-dimensional computer model of a patient’s breast was generated from magnetic resonance imaging to simulate mammographic compressions in cranio-caudal (CC, head-to-toe) and medio-lateral oblique (MLO, shoulder-to-opposite hip) directions. For each compression simulated, a virtual mammogram was obtained and subsequently superimposed onto the corresponding real mammogram, sharing the nipple as a common feature. Two-dimensional rigid-body transformations were applied, and the error distance measured between the centroids of the tumors previously located on each image was 3.84 mm and 2.41 mm for the CC and MLO compressions, respectively. Considering that the scope of this work is to conceive a methodology translatable to clinical practice, the results indicate that it could be helpful in supporting the tracking of breast lesions.

  13. Simulation of nitrate reduction in groundwater - An upscaling approach from small catchments to the Baltic Sea basin

    NASA Astrophysics Data System (ADS)

    Hansen, A. L.; Donnelly, C.; Refsgaard, J. C.; Karlsson, I. B.

    2018-01-01

    This paper describes a modeling approach proposed to simulate the impact of local-scale, spatially targeted N-mitigation measures for the Baltic Sea basin. Spatially targeted N-regulations aim at exploiting the considerable spatial differences in the natural N-reduction taking place in groundwater and surface water. While such measures can be simulated using local-scale physically-based catchment models, use of such detailed models for the 1.8 million km2 Baltic Sea basin is not feasible due to constraints on input data and computing power. Large-scale models that are able to simulate the Baltic Sea basin, on the other hand, do not have adequate spatial resolution to simulate some of the field-scale measures. Our methodology combines knowledge and results from two local-scale physically-based MIKE SHE catchment models, the large-scale and more conceptual E-HYPE model, and auxiliary data in order to enable E-HYPE to simulate how spatially targeted regulation of agricultural practices may affect N-loads to the Baltic Sea. We conclude that the use of E-HYPE with this upscaling methodology enables the impact on N-loads of applying a spatially targeted regulation at the Baltic Sea basin scale to be simulated to the correct order of magnitude. The E-HYPE model together with the upscaling methodology therefore provides a sound basis for large-scale policy analysis; however, we do not expect it to be sufficiently accurate for the detailed design of local-scale measures.

  14. Creating executable architectures using Visual Simulation Objects (VSO)

    NASA Astrophysics Data System (ADS)

    Woodring, John W.; Comiskey, John B.; Petrov, Orlin M.; Woodring, Brian L.

    2005-05-01

    Investigations have been performed to identify a methodology for creating executable models of architectures and simulations of architectures that lead to an understanding of their dynamic properties. Colored Petri Nets (CPNs) are used to describe architectures because of their strong mathematical foundations, the existence of techniques for their verification, and graph theory's well-established history of success in modern science. CPNs have been extended to interoperate with legacy simulations via a High Level Architecture (HLA) compliant interface. It has also been demonstrated that an architecture created as a CPN can be integrated with Department of Defense Architecture Framework products to ensure consistency between static and dynamic descriptions. A computer-aided tool, Visual Simulation Objects (VSO), which aids analysts in specifying, composing and executing architectures, has been developed to verify the methodology and as a prototype commercial product.

  15. Contact Line Dynamics

    NASA Astrophysics Data System (ADS)

    Kreiss, Gunilla; Holmgren, Hanna; Kronbichler, Martin; Ge, Anthony; Brant, Luca

    2017-11-01

    The conventional no-slip boundary condition leads to a non-integrable stress singularity at a moving contact line. This makes numerical simulations of two-phase flow challenging, especially when capillarity of the contact point is essential for the dynamics of the flow. We will describe a modeling methodology, which is suitable for numerical simulations, and present results from numerical computations. The methodology is based on combining a relation between the apparent contact angle and the contact line velocity, with the similarity solution for Stokes flow at a planar interface. The relation between angle and velocity can be determined by theoretical arguments, or from simulations using a more detailed model. In our approach we have used results from phase field simulations in a small domain, but using a molecular dynamics model should also be possible. In both cases more physics is included and the stress singularity is removed.

  16. Simulation as a surgical teaching model.

    PubMed

    Ruiz-Gómez, José Luis; Martín-Parra, José Ignacio; González-Noriega, Mónica; Redondo-Figuero, Carlos Godofredo; Manuel-Palazuelos, José Carlos

    2018-01-01

    Teaching of surgery has been affected by many factors in recent years, such as the reduction of working hours, the optimization of operating room use, and patient safety. Traditional teaching methodology fails to reduce the impact of these factors on the surgeon's training. Simulation as a teaching model minimizes such impact and is more effective than traditional teaching methods for integrating knowledge and clinical-surgical skills. Simulation complements clinical assistance with training, creating a safe learning environment where patient safety is not affected and ethical or legal conflicts are avoided. Simulation uses learning methodologies that allow teaching to be individualized, adapting it to the learning needs of each student. It also allows training of all kinds of technical, cognitive, or behavioural skills. Copyright © 2017 AEC. Published by Elsevier España, S.L.U. All rights reserved.

  17. Methodology based on genetic heuristics for in-vivo characterizing the patient-specific biomechanical behavior of the breast tissues

    PubMed Central

    Lago, M. A.; Rúperez, M. J.; Martínez-Martínez, F.; Martínez-Sanchis, S.; Bakic, P. R.; Monserrat, C.

    2015-01-01

    This paper presents a novel methodology to in-vivo estimate the elastic constants of a constitutive model proposed to characterize the mechanical behavior of the breast tissues. An iterative search algorithm based on genetic heuristics was constructed to in-vivo estimate these parameters using only medical images, thus avoiding invasive measurements of the mechanical response of the breast tissues. For the first time, a combination of overlap and distance coefficients was used for the evaluation of the similarity between a deformed MRI of the breast and a simulation of that deformation. The methodology was validated using breast software phantoms for virtual clinical trials, compressed to mimic MRI-guided biopsies. The biomechanical model chosen to characterize the breast tissues was an anisotropic neo-Hookean hyperelastic model. Results from this analysis showed that the algorithm is able to find the elastic constants of the constitutive equations of the proposed model with a mean relative error of about 10%. Furthermore, the overlap between the reference deformation and the simulated deformation was around 95%, showing the good performance of the proposed methodology. This methodology can be easily extended to characterize the real biomechanical behavior of the breast tissues, which represents a significant novelty in the field of the simulation of breast behavior for applications such as surgical planning, surgical guidance, or cancer diagnosis. This reveals the impact and relevance of the presented work. PMID:27103760
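    The genetic search loop can be sketched generically. In the sketch below the image-similarity fitness (the paper's combination of overlap and distance coefficients) is abstracted as a callable, and the population size, operators, and bounds are hypothetical choices, not the authors' settings.

```python
import random

def genetic_search(fitness, bounds, pop_size=30, generations=60, seed=0):
    """Minimal genetic heuristic: evolve candidate parameter vectors
    (e.g. elastic constants) to maximize a fitness function.

    Elitist: the top half survives each generation; children are formed
    by averaging two parents and mutating one coordinate."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]        # crossover
            i = rng.randrange(dim)                             # mutation
            lo, hi = bounds[i]
            child[i] = min(hi, max(lo, child[i] + rng.gauss(0, 0.1 * (hi - lo))))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```

In the paper's setting, evaluating `fitness` requires running one finite-element compression simulation per candidate, which is what makes an efficient heuristic search necessary.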

  18. Methodology based on genetic heuristics for in-vivo characterizing the patient-specific biomechanical behavior of the breast tissues.

    PubMed

    Lago, M A; Rúperez, M J; Martínez-Martínez, F; Martínez-Sanchis, S; Bakic, P R; Monserrat, C

    2015-11-30

    This paper presents a novel methodology to in-vivo estimate the elastic constants of a constitutive model proposed to characterize the mechanical behavior of the breast tissues. An iterative search algorithm based on genetic heuristics was constructed to in-vivo estimate these parameters using only medical images, thus avoiding invasive measurements of the mechanical response of the breast tissues. For the first time, a combination of overlap and distance coefficients was used for the evaluation of the similarity between a deformed MRI of the breast and a simulation of that deformation. The methodology was validated using breast software phantoms for virtual clinical trials, compressed to mimic MRI-guided biopsies. The biomechanical model chosen to characterize the breast tissues was an anisotropic neo-Hookean hyperelastic model. Results from this analysis showed that the algorithm is able to find the elastic constants of the constitutive equations of the proposed model with a mean relative error of about 10%. Furthermore, the overlap between the reference deformation and the simulated deformation was around 95%, showing the good performance of the proposed methodology. This methodology can be easily extended to characterize the real biomechanical behavior of the breast tissues, which represents a significant novelty in the field of the simulation of breast behavior for applications such as surgical planning, surgical guidance, or cancer diagnosis. This reveals the impact and relevance of the presented work.

  19. Teaching Behavioral Modeling and Simulation Techniques for Power Electronics Courses

    ERIC Educational Resources Information Center

    Abramovitz, A.

    2011-01-01

    This paper suggests a pedagogical approach to teaching the subject of behavioral modeling of switch-mode power electronics systems through simulation by general-purpose electronic circuit simulators. The methodology is oriented toward electrical engineering (EE) students at the undergraduate level, enrolled in courses such as "Power…

  20. Introducing Simulation via the Theory of Records

    ERIC Educational Resources Information Center

    Johnson, Arvid C.

    2011-01-01

    While spreadsheet simulation can be a useful method by which to help students to understand some of the more advanced concepts in an introductory statistics course, introducing the simulation methodology at the same time as these concepts can result in student cognitive overload. This article describes a spreadsheet model that has been…

  1. Modeling ground-based timber harvesting systems using computer simulation

    Treesearch

    Jingxin Wang; Chris B. LeDoux

    2001-01-01

    Modeling ground-based timber harvesting systems with an object-oriented methodology was investigated. Object-oriented modeling and design promote a better understanding of requirements, cleaner designs, and better maintainability of the harvesting simulation system. The model developed simulates chainsaw felling, drive-to-tree feller-buncher, swing-to-tree single-grip...

  2. Overview of Computer Simulation Modeling Approaches and Methods

    Treesearch

    Robert E. Manning; Robert M. Itami; David N. Cole; Randy Gimblett

    2005-01-01

    The field of simulation modeling has grown greatly with recent advances in computer hardware and software. Much of this work has involved large scientific and industrial applications for which substantial financial resources are available. However, advances in object-oriented programming and simulation methodology, concurrent with dramatic increases in computer...

  3. Enhancing Students' Employability through Business Simulation

    ERIC Educational Resources Information Center

    Avramenko, Alex

    2012-01-01

    Purpose: The purpose of this paper is to introduce an approach to business simulation with less dependence on business simulation software to provide innovative work experience within a programme of study, to boost students' confidence and employability. Design/methodology/approach: The paper is based on analysis of existing business simulation…

  4. In Search of Effective Methodology for Organizational Learning: A Japanese Experience

    ERIC Educational Resources Information Center

    Tsuchiya, Shigehisa

    2011-01-01

    The author's personal journey regarding simulation and gaming started about 25 years ago when he happened to realize how powerful computerized simulation could be for organizational change. The metaphors created by computerized simulation enabled him to transform a stagnant university into a high-performance organization. Through extensive…

  5. Terminological Ambiguity: Game and Simulation

    ERIC Educational Resources Information Center

    Klabbers, Jan H. G.

    2009-01-01

    Since its introduction in academia and professional practice during the 1950s, gaming has been linked to simulation. Although both fields have a few important characteristics in common, they are distinct in their form and underlying theories of knowledge and methodology. Nevertheless, in the literature, hybrid terms such as "gaming/simulation" and…

  6. Teaching Camera Calibration by a Constructivist Methodology

    ERIC Educational Resources Information Center

    Samper, D.; Santolaria, J.; Pastor, J. J.; Aguilar, J. J.

    2010-01-01

    This article describes the Metrovisionlab simulation software and practical sessions designed to teach the most important machine vision camera calibration aspects in courses for senior undergraduate students. By following a constructivist methodology, having received introductory theoretical classes, students use the Metrovisionlab application to…

  7. Accurate Simulation of MPPT Methods Performance When Applied to Commercial Photovoltaic Panels

    PubMed Central

    2015-01-01

    A new, simple, quick-calculation methodology for obtaining a solar panel model from the manufacturer's datasheet, in order to perform MPPT simulations, is described. The method takes into account variations in the ambient conditions (sun irradiation and solar cell temperature) and allows fast comparison of MPPT methods, or prediction of their performance when applied to a particular solar panel. The feasibility of the described methodology is checked with four different MPPT methods applied to a commercial solar panel, within a day and under realistic ambient conditions. PMID:25874262
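    This kind of simulation can be sketched with an explicit panel model and the classic perturb-and-observe tracker. The single-exponential I-V expression and the datasheet-style parameter values below are illustrative assumptions, not the paper's model.

```python
import numpy as np

def pv_current(v, isc=8.21, voc=32.9, a=0.06):
    """Hypothetical explicit panel model built from a short-circuit
    current `isc`, open-circuit voltage `voc`, and shape factor `a`."""
    return isc * (1.0 - np.exp((v - voc) / (a * voc)))

def perturb_and_observe(v0=20.0, dv=0.1, steps=500):
    """Minimal P&O MPPT: keep stepping the operating voltage in the
    direction that increased power; reverse when power drops."""
    v, step = v0, dv
    p_prev = v * pv_current(v)
    for _ in range(steps):
        v += step
        p = v * pv_current(v)
        if p < p_prev:        # power dropped: reverse the perturbation
            step = -step
        p_prev = p
    return v, p_prev
```

Once the panel model is in place, different MPPT algorithms can be swapped into the tracking loop and compared against the brute-force maximum of the power curve.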

  8. 3CE Methodology for Conducting a Modeling, Simulation, and Instrumentation Tool Capability Analysis

    DTIC Science & Technology

    2010-05-01

    ...a modeling, simulation, and instrumentation (MS&I) environment. This methodology uses the DoDAF product set to document operational and systems...engineering process were identified and resolved, such as duplication of data elements derived from DoDAF operational and system views used to

  9. Accurate simulation of MPPT methods performance when applied to commercial photovoltaic panels.

    PubMed

    Cubas, Javier; Pindado, Santiago; Sanz-Andrés, Ángel

    2015-01-01

    A new, simple, quick-calculation methodology for obtaining a solar panel model from the manufacturer's datasheet, in order to perform MPPT simulations, is described. The method takes into account variations in the ambient conditions (sun irradiation and solar cell temperature) and allows fast comparison of MPPT methods, or prediction of their performance when applied to a particular solar panel. The feasibility of the described methodology is checked with four different MPPT methods applied to a commercial solar panel, within a day and under realistic ambient conditions.

  10. Educational Game Development Approach to a particular case: the donor's evaluation.

    PubMed

    Borro Escribano, B; del Blanco, A; Torrente, J; Borro Mate, J M; Fernandez Manjon, B

    2015-01-01

    Serious games are a current trend, and almost every sector has used them in recent years for different educational purposes. The main research focus of the eLearning research team at the Complutense University of Madrid is the development of low-cost serious games. During the past 10 years, we have been working with and developing serious games, paying special attention to those related to healthcare. From all these studies, a methodology, the Educational Game Development Approach (EGDA), was defined to design, develop, and evaluate game-like simulations or serious games in healthcare. We present the application of the EGDA to a particular case: the development of a serious game representing the donor's evaluation in an intensive care unit from the point of view of a hospital coordinator. In this simulation, we changed the strategy for selecting teaching cases by exponentially increasing their number. This kind of educational content provides several benefits to students, as they learn while playing; they receive immediate feedback on mistakes and correct moves, along with an objective assessment. These simulations allow the students to practice in a risk-free environment. Moreover, the addition of game elements increases engagement and promotes the retention of important information. A game-like simulation representing a complex medical procedure has been developed through the use of this methodology. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Mathematical model of marine diesel engine simulator for a new methodology of self propulsion tests

    NASA Astrophysics Data System (ADS)

    Izzuddin, Nur; Sunarsih, Priyanto, Agoes

    2015-05-01

    A marine diesel engine simulator, in which the engine rotation is controlled and transmitted through the propeller shaft, provides a new methodology for self-propulsion tests that tracks fuel savings in real time as a vessel operates in the open seas. Considering this circumstance, this paper presents a real-time marine diesel engine simulator system that tracks the real performance of a ship through a computer-simulated model. A mathematical model of the marine diesel engine and the propeller is used in the simulation to estimate the fuel rate, engine rotating speed, and thrust and torque of the propeller, and thus achieve the target vessel speed. The inputs and outputs form a real-time control system for the fuel-saving rate and propeller rotating speed, representing the marine diesel engine characteristics. Self-propulsion tests in calm water were conducted using a vessel model to validate the marine diesel engine simulator. The simulator was then used to evaluate fuel savings by employing a new mathematical model of turbochargers for the marine diesel engine simulator. The control system developed will help users analyze different vessel speed conditions to obtain better characteristics and hence optimize the fuel-saving rate.

  12. Simulation of investment returns of toll projects.

    DOT National Transportation Integrated Search

    2013-08-01

    This research develops a methodological framework to illustrate key stages in applying the simulation of investment returns of toll projects, acting as an example process of helping agencies conduct numerical risk analysis by taking certain uncertain...

  13. Combination and selection of traffic safety expert judgments for the prevention of driving risks.

    PubMed

    Cabello, Enrique; Conde, Cristina; de Diego, Isaac Martín; Moguerza, Javier M; Redchuk, Andrés

    2012-11-02

    In this paper, we describe a new framework for combining experts' judgments for the prevention of driving risks in a truck cabin. In addition, the methodology shows how to choose, among the experts, the one whose predictions best fit the environmental conditions. The methodology is applied to data sets obtained from a highly immersive truck cabin simulator under natural driving conditions. A nonparametric model, based on Nearest Neighbors combined with Restricted Least Squares methods, is developed. Three experts were asked to evaluate the driving risk using a Visual Analog Scale (VAS), in order to measure the driving risk in a truck simulator where the vehicle dynamics factors were stored. Numerical results show that the methodology is suitable for embedding in real-time systems.
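    One common form of "restricted least squares" combination is to fit weights over the experts that are constrained to sum to one. The sketch below shows that step only (enforcing the constraint by substitution); it is a generic illustration, not the authors' exact estimator, and the Nearest Neighbors stage is omitted.

```python
import numpy as np

def combine_experts(X, y):
    """Least-squares combination weights for expert risk scores,
    restricted so the weights sum to one.

    X: (n_samples, n_experts) expert scores; y: reference risk.
    The constraint sum(w) == 1 is enforced by substituting
    w_k = 1 - sum(w_1..w_{k-1}) before solving."""
    Z = X[:, :-1] - X[:, [-1]]         # regressors after substitution
    b = y - X[:, -1]
    w_head, *_ = np.linalg.lstsq(Z, b, rcond=None)
    return np.append(w_head, 1.0 - w_head.sum())
```

The fitted weights give a single combined risk score per sample, `X @ w`, which is what a real-time system would threshold.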

  14. Harvesting model uncertainty for the simulation of interannual variability

    NASA Astrophysics Data System (ADS)

    Misra, Vasubandhu

    2009-08-01

    An innovative modeling strategy is introduced to account for uncertainty in the convective parameterization (CP) scheme of a coupled ocean-atmosphere model. The methodology involves calling the CP scheme several times at every given time step of the model integration to pick the most probable convective state. Each call of the CP scheme is unique in that one of its critical parameter values (which is unobserved but required by the scheme) is chosen randomly over a given range. This methodology is tested with the relaxed Arakawa-Schubert CP scheme in the Center for Ocean-Land-Atmosphere Studies (COLA) coupled general circulation model (CGCM). Relative to the control COLA CGCM, this methodology shows improvement in the El Niño-Southern Oscillation simulation and the Indian summer monsoon precipitation variability.
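    The multi-call selection step can be sketched schematically. Here the "most probable convective state" is approximated as the ensemble member closest to the ensemble mean; the scheme, state, and parameter range are placeholders, not COLA CGCM code.

```python
import random

def most_probable_convective_state(cp_scheme, state, param_range,
                                   n_calls=10, seed=0):
    """Call a convective parameterization several times for one time
    step, each call with its critical parameter drawn at random from
    `param_range`, and keep the outcome closest to the ensemble mean
    (a schematic of the stochastic multi-call strategy)."""
    rng = random.Random(seed)
    lo, hi = param_range
    outcomes = [cp_scheme(state, rng.uniform(lo, hi)) for _ in range(n_calls)]
    mean = sum(outcomes) / len(outcomes)
    return min(outcomes, key=lambda o: abs(o - mean))
```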

  15. USGS Methodology for Assessing Continuous Petroleum Resources

    USGS Publications Warehouse

    Charpentier, Ronald R.; Cook, Troy A.

    2011-01-01

    The U.S. Geological Survey (USGS) has developed a new quantitative methodology for assessing resources in continuous (unconventional) petroleum deposits. Continuous petroleum resources include shale gas, coalbed gas, and other oil and gas deposits in low-permeability ("tight") reservoirs. The methodology is based on an approach combining geologic understanding with well productivities. The methodology is probabilistic, with both input and output variables as probability distributions, and uses Monte Carlo simulation to calculate the estimates. The new methodology is an improvement of previous USGS methodologies in that it better accommodates the uncertainties in undrilled or minimally drilled deposits that must be assessed using analogs. The publication is a collection of PowerPoint slides with accompanying comments.
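    The probabilistic input/output structure can be sketched with a small Monte Carlo calculation. The distributions and parameter values below are illustrative placeholders, not USGS assessment figures.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo trials

# Inputs expressed as probability distributions (illustrative values):
area = rng.triangular(2000, 5000, 9000, n)          # productive area, acres
well_density = rng.triangular(0.01, 0.02, 0.04, n)  # wells per acre
eur = rng.lognormal(np.log(0.9), 0.5, n)            # per-well recovery, BCF

# Each trial combines one draw from every input distribution.
resource = area * well_density * eur                # total resource, BCF

# The output is itself a distribution; report standard fractiles
# (F95 = a 95% chance of at least this much).
f95, f50, f5 = np.percentile(resource, [5, 50, 95])
```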

  16. Combining users' activity survey and simulators to evaluate human activity recognition systems.

    PubMed

    Azkune, Gorka; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming

    2015-04-08

    Evaluating human activity recognition systems usually implies following expensive and time-consuming methodologies in which experiments with humans are run, with the consequent ethical and legal issues. We propose a novel evaluation methodology to overcome these problems, based on surveys of users and a synthetic dataset generator tool. Surveys capture how different users perform activities of daily living, while the synthetic dataset generator creates properly labelled activity datasets modelled with the information extracted from the surveys. Important aspects, such as sensor noise, varying time lapses, and erratic user behaviour, can also be simulated using the tool. The proposed methodology has important advantages that allow researchers to carry out their work more efficiently. To evaluate the approach, a synthetic dataset generated following the proposed methodology is compared to a real dataset by computing the similarity between sensor occurrence frequencies, and the similarity between the two datasets is found to be significant.
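    A generator of this kind can be sketched minimally: an activity "script" extracted from surveys is replayed, with labels attached to every sensor event and noise injected as dropped or spurious events. This is a hypothetical toy generator, not the authors' tool.

```python
import random

def synth_activity_dataset(script, noise=0.05, seed=0):
    """Generate a labelled synthetic sensor dataset from a surveyed
    activity script.

    script: list of (activity, [sensor, ...]) pairs. Each activity
    emits its sensors in order; with probability `noise` an event is
    dropped (missed detection) or a spurious event fires."""
    rng = random.Random(seed)
    events = []
    for activity, sensors in script:
        for s in sensors:
            if rng.random() < noise:
                continue                              # missed detection
            events.append((s, activity))              # labelled event
        if rng.random() < noise:
            events.append(("spurious", activity))     # sensor noise
    return events
```

Because every event carries its ground-truth activity label, the output can be fed directly to a recognition system and scored without human experiments.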

  17. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aldemir, Tunc; Denning, Richard; Catalyurek, Umit

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool for addressing the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner, where the sequencing of events is conditioned on the physical conditions predicted in a simulation environment such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework, and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  18. Modeling Single-Event Transient Propagation in a SiGe BiCMOS Direct-Conversion Receiver

    NASA Astrophysics Data System (ADS)

    Ildefonso, Adrian; Song, Ickhyun; Tzintzarov, George N.; Fleetwood, Zachary E.; Lourenco, Nelson E.; Wachter, Mason T.; Cressler, John D.

    2017-08-01

    The propagation of single-event transient (SET) signals in a silicon-germanium direct-conversion receiver carrying modulated data is explored. A theoretical analysis of transient propagation, verified by simulation, is presented. A new methodology to characterize and quantify the impact of SETs in communication systems carrying modulated data is proposed. The proposed methodology uses a pulsed radiation source to induce distortions in the signal constellation. The error vector magnitude due to SETs can then be calculated to quantify errors. Two different modulation schemes were simulated: QPSK and 16-QAM. The distortions in the constellation diagram agree with the presented circuit theory. Furthermore, the proposed methodology was applied to evaluate the improvements in the SET response due to a known radiation-hardening-by-design (RHBD) technique, where the common-base device of the low-noise amplifier was operated in inverse mode. The proposed methodology can be a valid technique to determine the most sensitive parts of a system carrying modulated data.
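    The error-vector-magnitude (EVM) calculation at the heart of the proposed methodology is standard and easy to sketch: compare received constellation points against the ideal ones. The QPSK symbols and the Gaussian "transient-induced" distortion below are illustrative, not the paper's measured data.

```python
import numpy as np

def evm_percent(received, reference):
    """RMS error vector magnitude as a percentage of the reference
    constellation's RMS amplitude."""
    err = received - reference
    return 100.0 * np.sqrt(np.mean(np.abs(err) ** 2) /
                           np.mean(np.abs(reference) ** 2))

# Unit-energy QPSK symbols plus a hypothetical distortion term
rng = np.random.default_rng(1)
ideal = (rng.choice([-1, 1], 256) + 1j * rng.choice([-1, 1], 256)) / np.sqrt(2)
rx = ideal + 0.05 * (rng.standard_normal(256) + 1j * rng.standard_normal(256))
evm = evm_percent(rx, ideal)
```

In the paper's setup, the distortion term would instead come from pulsed-laser-induced transients, and the EVM before and after a hardening technique quantifies the improvement.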

  19. Design of feedback control systems for stable plants with saturating actuators

    NASA Technical Reports Server (NTRS)

    Kapasouris, Petros; Athans, Michael; Stein, Gunter

    1988-01-01

    A systematic control design methodology is introduced for multi-input/multi-output stable open loop plants with multiple saturations. This new methodology is a substantial improvement over previous heuristic single-input/single-output approaches. The idea is to introduce a supervisor loop so that when the references and/or disturbances are sufficiently small, the control system operates linearly as designed. For signals large enough to cause saturations, the control law is modified in such a way as to ensure stability and to preserve, to the extent possible, the behavior of the linear control design. Key benefits of the methodology are: the modified compensator never produces saturating control signals, integrators and/or slow dynamics in the compensator never windup, the directional properties of the controls are maintained, and the closed loop system has certain guaranteed stability properties. The advantages of the new design methodology are illustrated in the simulation of an academic example and the simulation of the multivariable longitudinal control of a modified model of the F-8 aircraft.
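    The directional property mentioned above can be sketched with a uniform scaling rule: shrink the whole control vector just enough that no component saturates, so the ratio between channels is preserved. This is a simplified illustration of the idea, not the paper's supervisor-loop design; the control values and limits are hypothetical.

```python
def scaled_control(u, limits):
    """Uniformly scale a control vector so every component respects its
    saturation limit while the direction of the control is preserved."""
    k = min([1.0] + [lim / abs(ui) for ui, lim in zip(u, limits) if ui != 0])
    return [k * ui for ui in u]

u = [3.0, -8.0]          # commanded controls (hypothetical)
limits = [2.0, 4.0]      # actuator saturation levels (hypothetical)

v = scaled_control(u, limits)
print(v)  # both components shrink by the same factor (here 0.5)
```

    Component-wise clipping would instead distort the control direction, which is the failure mode the methodology is designed to avoid.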

  20. Force on Force Modeling with Formal Task Structures and Dynamic Geometry

    DTIC Science & Technology

    2017-03-24

    task framework, derived using the MMF methodology to structure a complex mission. It further demonstrated the integration of effects from a range of...application methodology was intended to support a combined developmental testing (DT) and operational testing (OT) strategy for selected systems under test... methodology to develop new or modify existing Models and Simulations (M&S) to: • Apply data from multiple, distributed sources (including test

  1. The SIMRAND methodology - Simulation of Research and Development Projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1984-01-01

    In research and development projects, a commonly occurring management decision is concerned with the optimum allocation of resources to achieve the project goals. Because of resource constraints, management has to make a decision regarding the set of proposed systems or tasks which should be undertaken. SIMRAND (Simulation of Research and Development Projects) is a methodology which was developed for aiding management in this decision. Attention is given to a problem description, aspects of model formulation, the reduction phase of the model solution, the simulation phase, and the evaluation phase. The implementation of the considered approach is illustrated with the aid of an example which involves a simplified network of the type used to determine the price of silicon solar cells.

  2. Modelling of the material flow of Nd-Fe-B magnets under high temperature deformation via finite element simulation method

    PubMed Central

    Chen, Yen-Ju; Lee, Yen-I; Chang, Wen-Cheng; Hsiao, Po-Jen; You, Jr-Shian; Wang, Chun-Chieh; Wei, Chia-Min

    2017-01-01

    Abstract Hot deformation of Nd-Fe-B magnets has been studied for more than three decades. With a good combination of forming process parameters, the remanence and (BH)max values of Nd-Fe-B magnets can be greatly increased due to the formation of anisotropic microstructures during hot deformation. In this work, a methodology is proposed for visualizing the material flow in hot-deformed Nd-Fe-B magnets via finite element simulation. Material flow in hot-deformed Nd-Fe-B magnets could be predicted by simulation, and the predictions agreed with experimental results. By utilizing this methodology, the correlation between strain distribution and the enhancement of magnetic properties can be better understood. PMID:28970869

  3. An Event-Based Approach to Design a Teamwork Training Scenario and Assessment Tool in Surgery.

    PubMed

    Nguyen, Ngan; Watson, William D; Dominguez, Edward

    2016-01-01

    Simulation is a technique recommended for teaching and measuring teamwork, but few published methodologies are available on how best to design simulation for teamwork training in surgery and health care in general. The purpose of this article is to describe a general methodology, called the event-based approach to training (EBAT), to guide the design of simulation for teamwork training and discuss its application to surgery. The EBAT methodology draws on the science of training by systematically introducing training exercise events that are linked to training requirements (i.e., the competencies being trained and the learning objectives) and performance assessment. The EBAT process involves a sequence of linked design steps. Of the 4 teamwork competencies endorsed by the Agency for Healthcare Research and Quality and the Department of Defense, "communication" was chosen to be the focus of our training efforts. A total of 5 learning objectives were defined based on 5 validated teamwork and communication techniques. Diagnostic laparoscopy was chosen as the clinical context to frame the training scenario, and 29 knowledge, skill, and attitude (KSA) requirements were defined based on a review of the published literature on patient safety and input from subject matter experts. Critical events included those that correspond to a specific phase in the normal flow of a surgical procedure as well as clinical events that may occur when performing the operation. Like the targeted KSAs, targeted responses to the critical events were developed from the existing literature and input from content experts. Finally, a 29-item EBAT-derived checklist was created to assess communication performance. Like any instructional tool, simulation is only effective if it is designed and implemented appropriately. 
It is recognized that the effectiveness of simulation depends on whether (1) it is built upon a theoretical framework, (2) it uses preplanned structured exercises or events to allow learners the opportunity to exhibit the targeted KSAs, (3) it assesses performance, and (4) it provides formative and constructive feedback to bridge the gap between the learners' KSAs and the targeted KSAs. The EBAT methodology guides the design of simulation that incorporates these 4 features and, thus, enhances training effectiveness with simulation. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  4. A lava flow simulation model for the development of volcanic hazard maps for Mount Etna (Italy)

    NASA Astrophysics Data System (ADS)

    Damiani, M. L.; Groppelli, G.; Norini, G.; Bertino, E.; Gigliuto, A.; Nucita, A.

    2006-05-01

    Volcanic hazard assessment is of paramount importance for safeguarding the resources exposed to volcanic hazards. In this paper we present ELFM, a lava flow simulation model for evaluating the lava flow hazard on Mount Etna (Sicily, Italy), the most important active volcano in Europe. The major contributions of the paper are: (a) a detailed specification of the lava flow simulation model and of an algorithm implementing it; (b) the definition of a methodological framework for applying the model to this specific volcano. Concerning the former, we propose an extended version of an existing stochastic model that has so far been applied only to the assessment of volcanic hazard on Lanzarote and Tenerife (Canary Islands). Concerning the methodological framework, we argue that model validation is needed to assess the effectiveness of the lava flow simulation model. To that end, a strategy has been devised for the generation of simulation experiments and the evaluation of their outcomes.
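    To give a flavor of the class of stochastic lava flow models referred to above, the toy sketch below walks a flow path downhill on a synthetic elevation grid, choosing among lower neighbors with probability proportional to the elevation drop. This is an assumption-laden illustration, not the ELFM algorithm; the DEM and the weighting rule are invented for the example.

```python
import random

def simulate_flow(dem, start, steps, rng):
    """Toy stochastic lava path: from each cell, move to a randomly chosen
    neighbor among those with lower elevation (weighted by elevation drop)."""
    r, c = start
    path = [start]
    for _ in range(steps):
        nbrs = [(r + dr, c + dc) for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= r + dr < len(dem) and 0 <= c + dc < len(dem[0])]
        lower = [(n, dem[r][c] - dem[n[0]][n[1]]) for n in nbrs
                 if dem[n[0]][n[1]] < dem[r][c]]
        if not lower:            # local minimum: the flow stops
            break
        x = rng.uniform(0, sum(drop for _, drop in lower))
        for n, drop in lower:    # pick a neighbor with drop-weighted probability
            x -= drop
            if x <= 0:
                r, c = n
                break
        path.append((r, c))
    return path

# Synthetic cone-like DEM: elevation decreases away from the summit at (0, 0).
dem = [[10 - i - j for j in range(6)] for i in range(6)]
rng = random.Random(42)
path = simulate_flow(dem, (0, 0), 20, rng)
print(path[0], path[-1])
```

    Running many such random walks from a vent and counting how often each cell is invaded is the usual way this family of models builds a hazard map.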

  5. An ultrasonic methodology for muscle cross section measurement in support of space flight

    NASA Astrophysics Data System (ADS)

    Hatfield, Thomas R.; Klaus, David M.; Simske, Steven J.

    2004-09-01

    The number one priority for any manned space mission is the health and safety of its crew. The study of the short and long term physiological effects on humans is paramount to ensuring crew health and mission success. One of the challenges associated in studying the physiological effects of space flight on humans, such as loss of bone and muscle mass, has been that of readily attaining the data needed to characterize the changes. The small sampling size of astronauts, together with the fact that most physiological data collection tends to be rather tedious, continues to hinder elucidation of the underlying mechanisms responsible for the observed changes that occur in space. Better characterization of the muscle loss experienced by astronauts requires that new technologies be implemented. To this end, we have begun to validate a 360° ultrasonic scanning methodology for muscle measurements and have performed empirical sampling of a limb surrogate for comparison. Ultrasonic wave propagation was simulated using 144 stations of rotated arm and calf MRI images. These simulations were intended to provide a preliminary check of the scanning methodology and data analysis before its implementation with hardware. Pulse-echo waveforms were processed for each rotation station to characterize fat, muscle, bone, and limb boundary interfaces. The percentage error between MRI reference values and calculated muscle areas, as determined from reflection points for calf and arm cross sections, was -2.179% and +2.129%, respectively. These successful simulations suggest that ultrasound pulse scanning can be used to effectively determine limb cross-sectional areas. Cross-sectional images of a limb surrogate were then used to simulate signal measurements at several rotation angles, with ultrasonic pulse-echo sampling performed experimentally at the same stations on the actual limb surrogate to corroborate the results. 
The objective of the surrogate sampling was to compare the signal output of the simulation tool used as a methodology validation for actual tissue signals. The disturbance patterns of the simulated and sampled waveforms were consistent. Although only discussed as a small part of the work presented, the sampling portion also helped identify important considerations such as tissue compression and transducer positioning for future work involving tissue scanning with this methodology.
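    The area reconstruction underlying such a 360° scan can be illustrated with a small sketch: given boundary distances sampled at equally spaced rotation angles, the enclosed cross-sectional area is approximated by summing thin triangles, and a percent error against a reference area is computed as in the abstract. The 144-station circular "limb" below is a hypothetical test case, not the study's data.

```python
import math

def area_from_radii(radii):
    """Approximate enclosed area from boundary distances sampled at equally
    spaced rotation angles: sum of thin triangles 0.5*r_i*r_{i+1}*sin(dtheta)."""
    n = len(radii)
    dtheta = 2 * math.pi / n
    return 0.5 * math.sin(dtheta) * sum(
        radii[i] * radii[(i + 1) % n] for i in range(n))

def percent_error(measured, reference):
    return 100.0 * (measured - reference) / reference

# 144 stations around a circular 'limb' of radius 5 cm (hypothetical numbers).
radii = [5.0] * 144
a = area_from_radii(radii)
print(round(percent_error(a, math.pi * 25), 3))  # ~ -0.032: discretization error
```

    With 144 stations the discretization error is far below the roughly 2% errors reported for the MRI comparison, so station count is unlikely to be the limiting factor.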

  6. Management of health care expenditure by soft computing methodology

    NASA Astrophysics Data System (ADS)

    Maksimović, Goran; Jović, Srđan; Jovanović, Radomir; Aničić, Obrad

    2017-01-01

    In this study, health care expenditure was managed using a soft computing methodology. The main goal was to predict gross domestic product (GDP) from several factors of health care expenditure. Soft computing methodologies were applied since GDP prediction is a very complex task. The performance of the proposed predictors was confirmed by the simulation results. According to the results, support vector regression (SVR) has better prediction accuracy than the other soft computing methodologies considered. The soft computing methods benefit from global optimization capabilities, which help avoid local-minimum issues.

  7. User's Guide for a Computerized Track Maintenance Simulation Cost Methodology

    DOT National Transportation Integrated Search

    1982-02-01

    This User's Guide describes the simulation cost modeling technique developed for costing of maintenance operations of track and its component structures. The procedure discussed provides for separate maintenance cost entries to be associated with def...

  8. Surrogate Modeling of High-Fidelity Fracture Simulations for Real-Time Residual Strength Predictions

    NASA Technical Reports Server (NTRS)

    Spear, Ashley D.; Priest, Amanda R.; Veilleux, Michael G.; Ingraffea, Anthony R.; Hochhalter, Jacob D.

    2011-01-01

    A surrogate model methodology is described for predicting in real time the residual strength of flight structures with discrete-source damage. Starting with design of experiment, an artificial neural network is developed that takes as input discrete-source damage parameters and outputs a prediction of the structural residual strength. Target residual strength values used to train the artificial neural network are derived from 3D finite element-based fracture simulations. A residual strength test of a metallic, integrally-stiffened panel is simulated to show that crack growth and residual strength are determined more accurately in discrete-source damage cases by using an elastic-plastic fracture framework rather than a linear-elastic fracture mechanics-based method. Improving accuracy of the residual strength training data would, in turn, improve accuracy of the surrogate model. When combined, the surrogate model methodology and high-fidelity fracture simulation framework provide useful tools for adaptive flight technology.

  9. Surrogate Modeling of High-Fidelity Fracture Simulations for Real-Time Residual Strength Predictions

    NASA Technical Reports Server (NTRS)

    Spear, Ashley D.; Priest, Amanda R.; Veilleux, Michael G.; Ingraffea, Anthony R.; Hochhalter, Jacob D.

    2011-01-01

    A surrogate model methodology is described for predicting, during flight, the residual strength of aircraft structures that sustain discrete-source damage. Starting with design of experiment, an artificial neural network is developed that takes as input discrete-source damage parameters and outputs a prediction of the structural residual strength. Target residual strength values used to train the artificial neural network are derived from 3D finite element-based fracture simulations. Two ductile fracture simulations are presented to show that crack growth and residual strength are determined more accurately in discrete-source damage cases by using an elastic-plastic fracture framework rather than a linear-elastic fracture mechanics-based method. Improving accuracy of the residual strength training data does, in turn, improve accuracy of the surrogate model. When combined, the surrogate model methodology and high fidelity fracture simulation framework provide useful tools for adaptive flight technology.

  10. Introduction to Stochastic Simulations for Chemical and Physical Processes: Principles and Applications

    ERIC Educational Resources Information Center

    Weiss, Charles J.

    2017-01-01

    An introduction to digital stochastic simulations for modeling a variety of physical and chemical processes is presented. Despite the importance of stochastic simulations in chemistry, the prevalence of turn-key software solutions can impose a layer of abstraction between the user and the underlying approach obscuring the methodology being…

  11. From Know How to Do Now: Instructional Applications of Simulated Interactions within Teacher Education

    ERIC Educational Resources Information Center

    Dotger, Benjamin H.

    2011-01-01

    The induction years of teaching challenge novice educators to quickly transition from what they learned as teacher candidates into what they can do as emerging professionals. This article outlines a simulated interaction methodology to help bridge teacher preparation and practice. Building from examples of simulated interactions between teacher…

  12. Where Are We? An Analysis of the Methods and Focus of the Research on Simulation Gaming.

    ERIC Educational Resources Information Center

    Butler, Richard J.; And Others

    1988-01-01

    Designed to determine whether research in simulation and gaming follows research design methodology and the degree to which research is directed toward learning outcomes measured by Bloom's taxonomy of educational objectives, this article examines studies reported in proceedings from the Association for Business Simulation and Experiential…

  13. Abdominal surgery process modeling framework for simulation using spreadsheets.

    PubMed

    Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja

    2015-08-01

    We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike the existing methods, the proposed solution employs a modular approach to modeling the activities, which ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connection with other similar models. We propose a first-in-first-served approach for simulating the servicing of multiple patients. The uncertain time duration of the activities is modeled using the function "rand()". The patients' movements from one activity to the next are tracked with nested "if()" functions, thus allowing easy re-creation of the process without the need for complex programming. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
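    A minimal Python analogue of the spreadsheet logic described above, assuming uniform random activity durations in place of the workbook's rand() cells and tracking first-in-first-served servicing the way the nested if() formulas do. The patient count, activity count, and duration range are illustrative only, not taken from the model.

```python
import random

def simulate_fifo(n_patients, n_activities, rng):
    """First-in-first-served patient flow: each patient visits the activities
    in order; a step starts once both the patient and the activity are free,
    mirroring the spreadsheet's nested if() bookkeeping."""
    free_at = [0.0] * n_activities       # time at which each activity is next free
    finish = []
    for _ in range(n_patients):
        t = 0.0                          # patient arrives at time zero
        for a in range(n_activities):
            start = max(t, free_at[a])
            # Uncertain duration, like the spreadsheet's rand() draw (hours).
            dur = rng.uniform(0.5, 2.0)
            t = start + dur
            free_at[a] = t
        finish.append(t)
    return finish

rng = random.Random(7)
finish = simulate_fifo(5, 3, rng)
print([round(t, 2) for t in finish])
```

    Because every resource is granted in arrival order, patient completion times come out strictly increasing, which is the behavior the first-in-first-served rule guarantees.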

  14. Solute transport with equilibrium aqueous complexation and either sorption or ion exchange: Simulation methodology and applications

    USGS Publications Warehouse

    Lewis, F.M.; Voss, C.I.; Rubin, J.

    1987-01-01

    Methodologies that account for specific types of chemical reactions in the simulation of solute transport can be developed so they are compatible with solution algorithms employed in existing transport codes. This enables the simulation of reactive transport in complex multidimensional flow regimes, and provides a means for existing codes to account for some of the fundamental chemical processes that occur among transported solutes. Two equilibrium-controlled reaction systems demonstrate a methodology for accommodating chemical interaction into models of solute transport. One system involves the sorption of a given chemical species, as well as two aqueous complexations in which the sorbing species is a participant. The other reaction set involves binary ion exchange coupled with aqueous complexation involving one of the exchanging species. The methodology accommodates these reaction systems through the addition of nonlinear terms to the transport equations for the sorbing species. Example simulation results show (1) the effect equilibrium chemical parameters have on the spatial distributions of concentration for complexing solutes; (2) that an interrelationship exists between mechanical dispersion and the various reaction processes; (3) that dispersive parameters of the porous media cannot be determined from reactive concentration distributions unless the reaction is accounted for or the influence of the reaction is negligible; (4) how the concentration of a chemical species may be significantly affected by its participation in an aqueous complex with a second species which also sorbs; and (5) that these coupled chemical processes influencing reactive transport can be demonstrated in two-dimensional flow regimes. © 1987.
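    For the simplest member of this reaction family, linear equilibrium sorption, the effect on transport reduces to the classic retardation factor; the sketch below computes it and the resulting effective solute velocity. This is a textbook special case used for illustration (the paper's methodology handles more general, nonlinear reaction terms), and the aquifer parameter values are hypothetical.

```python
def retardation_factor(bulk_density, porosity, kd):
    """Linear-equilibrium sorption retards transport by R = 1 + (rho_b/theta)*Kd."""
    return 1.0 + (bulk_density / porosity) * kd

def effective_velocity(v, r_factor):
    """A linearly sorbing solute's center of mass moves at v/R."""
    return v / r_factor

# Hypothetical aquifer values: rho_b = 1.6 g/cm3, theta = 0.35, Kd = 0.5 cm3/g.
R = retardation_factor(1.6, 0.35, 0.5)
print(round(R, 3), round(effective_velocity(1.0, R), 3))
```

    A sorbing plume thus lags a conservative tracer by the factor R, which is why ignoring the reaction biases any dispersivity inferred from reactive concentration data, as noted in point (3) above.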

  15. A Framework for Determining the Return on Investment of Simulation-Based Training in Health Care

    PubMed Central

    Bukhari, Hatim; Andreatta, Pamela; Goldiez, Brian; Rabelo, Luis

    2017-01-01

    This article describes a framework that has been developed to monetize the real value of simulation-based training in health care. Significant consideration has been given to incorporating the intangible and qualitative benefits, not only the tangible and quantitative benefits, of simulation-based training in health care. The framework builds from three works: the value measurement methodology (VMM) used by several departments of the US Government, a methodology documented in several books by Dr Jack Phillips to monetize various training approaches, and a traditional return on investment methodology put forth by Frost and Sullivan, and Immersion Medical. All 3 source materials were adapted to create an integrated methodology that can be readily implemented. This article presents details on each of these methods, shows how they can be integrated, and presents a framework that integrates them. In addition, it describes the concept and the application of the developed framework. As a test of the applicability of the framework, a real case study has been used to demonstrate its application. This case study provides real data on the correlation between pediatric patient cardiopulmonary arrest (CPA) survival rates and simulation-based mock codes at the University of Michigan tertiary care academic medical center. It is important to point out that the proposed framework offers the capability to consider a wide range of benefits and values; on the other hand, there are several limitations that have been discussed and need to be taken into consideration. PMID:28133988

  16. A Framework for Determining the Return on Investment of Simulation-Based Training in Health Care.

    PubMed

    Bukhari, Hatim; Andreatta, Pamela; Goldiez, Brian; Rabelo, Luis

    2017-01-01

    This article describes a framework that has been developed to monetize the real value of simulation-based training in health care. Significant consideration has been given to incorporating the intangible and qualitative benefits, not only the tangible and quantitative benefits, of simulation-based training in health care. The framework builds from three works: the value measurement methodology (VMM) used by several departments of the US Government, a methodology documented in several books by Dr Jack Phillips to monetize various training approaches, and a traditional return on investment methodology put forth by Frost and Sullivan, and Immersion Medical. All 3 source materials were adapted to create an integrated methodology that can be readily implemented. This article presents details on each of these methods, shows how they can be integrated, and presents a framework that integrates them. In addition, it describes the concept and the application of the developed framework. As a test of the applicability of the framework, a real case study has been used to demonstrate its application. This case study provides real data on the correlation between pediatric patient cardiopulmonary arrest (CPA) survival rates and simulation-based mock codes at the University of Michigan tertiary care academic medical center. It is important to point out that the proposed framework offers the capability to consider a wide range of benefits and values; on the other hand, there are several limitations that have been discussed and need to be taken into consideration.

  17. Randomized controlled trials of simulation-based interventions in Emergency Medicine: a methodological review.

    PubMed

    Chauvin, Anthony; Truchot, Jennifer; Bafeta, Aida; Pateron, Dominique; Plaisance, Patrick; Yordanov, Youri

    2018-04-01

    The number of trials assessing Simulation-Based Medical Education (SBME) interventions has rapidly expanded. Many studies show that potential flaws in the design, conduct and reporting of randomized controlled trials (RCTs) can bias their results. We conducted a methodological review of RCTs assessing SBME in Emergency Medicine (EM) and examined their methodological characteristics. We searched MEDLINE via PubMed for RCTs that assessed a simulation intervention in EM, published in 6 general and internal medicine journals and in the top 10 EM journals. The Cochrane Collaboration Risk of Bias tool was used to assess risk of bias, intervention reporting was evaluated based on the "template for intervention description and replication" checklist, and methodological quality was evaluated with the Medical Education Research Study Quality Instrument (MERSQI). Report selection and data extraction were done by 2 independent researchers. Of 1394 RCTs screened, 68 trials assessed a SBME intervention; they represent one quarter of our sample. Cardiopulmonary resuscitation (CPR) is the most frequent topic (81%). Random sequence generation and allocation concealment were performed correctly in 66 and 49% of trials, respectively. Blinding of participants and assessors was performed correctly in 19 and 68%, respectively. Risk of attrition bias was low in three-quarters of the studies (n = 51). Risk of selective reporting bias was unclear in nearly all studies. The mean MERSQI score was 13.4/18. Only 4% of the reports provided a description allowing replication of the intervention. Trials assessing simulation represent one quarter of RCTs in EM. Their quality remains unclear, and reproducing the interventions appears challenging due to reporting issues.

  18. An Improved Response Surface Methodology Algorithm with an Application to Traffic Signal Optimization for Urban Networks

    DOT National Transportation Integrated Search

    1995-01-01

    Prepared ca. 1995. This paper illustrates the use of the simulation-optimization technique of response surface methodology (RSM) in traffic signal optimization of urban networks. It also quantifies the gains of using the common random number (CRN) va...

  19. Analyzing simulation-based PRA data through traditional and topological clustering: A BWR station blackout case study

    DOE PAGES

    Maljovec, D.; Liu, S.; Wang, B.; ...

    2015-07-14

    Here, dynamic probabilistic risk assessment (DPRA) methodologies couple system simulator codes (e.g., RELAP and MELCOR) with simulation controller codes (e.g., RAVEN and ADAPT). Whereas system simulator codes model system dynamics deterministically, simulation controller codes introduce both deterministic (e.g., system control logic and operating procedures) and stochastic (e.g., component failures and parameter uncertainties) elements into the simulation. Typically, a DPRA is performed by sampling values of a set of parameters and simulating the system behavior for that specific set of parameter values. For complex systems, a major challenge in using DPRA methodologies is to analyze the large number of scenarios generated, where clustering techniques are typically employed to better organize and interpret the data. In this paper, we focus on the analysis of two nuclear simulation datasets that are part of the risk-informed safety margin characterization (RISMC) boiling water reactor (BWR) station blackout (SBO) case study. We provide the domain experts a software tool that encodes traditional and topological clustering techniques within an interactive analysis and visualization environment, for understanding the structures of such high-dimensional nuclear simulation datasets. We demonstrate through our case study that both types of clustering techniques complement each other for enhanced structural understanding of the data.
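    As a minimal example of the "traditional" clustering side mentioned above, the sketch below runs a plain k-means on a handful of synthetic two-dimensional scenario points. It is not the RISMC tool or a topological method; the data and the feature interpretation are invented for illustration.

```python
def kmeans(points, centers, iters=20):
    """Plain k-means: assign each point to its nearest center, then move
    each center to the mean of its assigned points."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            groups[d.index(min(d))].append(p)
        centers = [
            tuple(sum(x) / len(g) for x in zip(*g)) if g else c
            for g, c in zip(groups, centers)
        ]
    return centers, groups

# Two synthetic 'scenario' clusters, e.g. (peak clad temperature, recovery
# time) pairs in arbitrary units -- purely illustrative numbers.
pts = [(1.0, 1.1), (0.9, 1.0), (1.1, 0.9),
       (5.0, 5.2), (5.1, 4.9), (4.9, 5.0)]
centers, groups = kmeans(pts, [(0.0, 0.0), (6.0, 6.0)])
print([tuple(round(x, 2) for x in c) for c in centers])
```

    Grouping thousands of DPRA scenarios this way lets an analyst inspect a few representative cluster centers instead of every individual simulation run.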

  20. Computing elastic‐rebound‐motivated earthquake probabilities in unsegmented fault models: a new methodology supported by physics‐based simulators

    USGS Publications Warehouse

    Field, Edward H.

    2015-01-01

    A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.
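    The renewal-model probability calculation at the heart of such elastic-rebound methods can be sketched with a Monte Carlo estimate: draw recurrence intervals from an assumed distribution and compute the probability of an event in the next time window, conditioned on the time already elapsed since the last event. The lognormal parameters below are illustrative assumptions, not values from the study.

```python
import math
import random

def conditional_prob(draws, elapsed, horizon):
    """Estimate P(next event within `horizon` years, given `elapsed` years
    since the last event) from Monte Carlo draws of the recurrence interval."""
    survived = [t for t in draws if t > elapsed]
    hits = sum(1 for t in survived if t <= elapsed + horizon)
    return hits / len(survived)

# Hypothetical lognormal renewal model: median interval 150 yr,
# log-standard deviation 0.5 (illustrative values only).
rng = random.Random(1)
draws = [rng.lognormvariate(math.log(150), 0.5) for _ in range(100000)]
p_now = conditional_prob(draws, 100.0, 30.0)
p_late = conditional_prob(draws, 200.0, 30.0)
print(round(p_now, 2), round(p_late, 2))
```

    With these assumed parameters the conditional probability grows as elapsed time increases, which is the elastic-rebound expectation the methodology builds on.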

  1. Novel thermal management system design methodology for power lithium-ion battery

    NASA Astrophysics Data System (ADS)

    Nieto, Nerea; Díaz, Luis; Gastelurrutia, Jon; Blanco, Francisco; Ramos, Juan Carlos; Rivas, Alejandro

    2014-12-01

    Battery packs composed of large-format lithium-ion cells are increasingly being adopted in hybrid and pure electric vehicles in order to use energy more efficiently and to achieve better environmental performance. Safety and cycle life are two of the main concerns regarding this technology, and both are closely related to the cells' operating behavior and to temperature asymmetries in the system. Therefore, the temperature of the cells in battery packs needs to be controlled by thermal management systems (TMSs). In the present paper an improved design methodology for developing TMSs is proposed. This methodology involves the development of different mathematical models for heat generation, transmission, and dissipation and their coupling and integration into the battery pack product design methodology in order to improve overall safety and performance. The methodology is validated by comparing simulation results with laboratory measurements on a single module of the battery pack designed at IK4-IKERLAN for a traction application. The maximum difference between model predictions and experimental temperature data is 2 °C. The models developed have shown potential for use in battery thermal management studies for EV/HEV applications since they allow for scalability with accuracy and reasonable simulation time.
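    The heat generation/dissipation balance underlying such a TMS model can be illustrated with a lumped-capacitance sketch: thermal mass times the temperature rate equals generated heat minus convective loss. This is a single-node toy model, not the authors' coupled model suite, and all parameter values are hypothetical.

```python
def simulate_cell_temp(q_gen, h_a, m_cp, t_amb, t0, dt, steps):
    """Lumped-capacitance cell model, m*cp*dT/dt = Q_gen - h*A*(T - T_amb),
    integrated with forward Euler."""
    t = t0
    for _ in range(steps):
        t += dt * (q_gen - h_a * (t - t_amb)) / m_cp
    return t

# Hypothetical module values: 5 W heat generation, h*A = 0.5 W/K,
# thermal mass 800 J/K, 25 degC ambient, 1 s time step.
t_end = simulate_cell_temp(5.0, 0.5, 800.0, 25.0, 25.0, 1.0, 20000)
print(round(t_end, 2))  # approaches the steady state T_amb + Q/(hA) = 35 degC
```

    Sizing the cooling so that the steady-state rise Q/(hA) stays within the allowed cell temperature window is the essence of the TMS design problem.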

  2. Adaptive System Modeling for Spacecraft Simulation

    NASA Technical Reports Server (NTRS)

    Thomas, Justin

    2011-01-01

    This invention introduces a methodology and associated software tools for automatically learning spacecraft system models without any assumptions regarding system behavior. Data stream mining techniques were used to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). Evaluation on historical ISS telemetry data shows that adaptive system modeling reduces simulation error anywhere from 50 to 90 percent over existing approaches. The purpose of the methodology is to outline how one can create accurate system models from sensor (telemetry) data; the purpose of the software is to support the methodology. The software provides analysis tools to design the adaptive models, as well as the algorithms to initially build system models and continuously update them from the latest streaming sensor data. The main strengths are as follows: it creates accurate spacecraft system models without in-depth system knowledge or any assumptions about system behavior; it automatically updates/calibrates system models using the latest streaming sensor data; it creates device-specific models that capture the exact behavior of devices of the same type; it adapts to evolving systems; and it can reduce computational complexity (faster simulations).
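    The continuous model-updating idea can be sketched with a streaming linear fit maintained from running sums, so each new telemetry sample refines the model without storing history. This is an illustrative stand-in for the (unspecified) data stream mining techniques; the telemetry relation used is invented.

```python
def make_online_fitter():
    """Streaming linear fit y ~ a*x + b maintained from running sums,
    so the model can be updated with every new telemetry sample."""
    s = {"n": 0, "sx": 0.0, "sy": 0.0, "sxx": 0.0, "sxy": 0.0}
    def update(x, y):
        s["n"] += 1
        s["sx"] += x; s["sy"] += y
        s["sxx"] += x * x; s["sxy"] += x * y
        n = s["n"]
        denom = n * s["sxx"] - s["sx"] ** 2
        if denom == 0:          # not enough distinct samples yet
            return None
        a = (n * s["sxy"] - s["sx"] * s["sy"]) / denom
        b = (s["sy"] - a * s["sx"]) / n
        return a, b
    return update

# Hypothetical EPS telemetry stream: bus power roughly 2.0 * load + 5.0.
update = make_online_fitter()
model = None
for x in range(1, 21):
    model = update(float(x), 2.0 * x + 5.0)
print(tuple(round(v, 3) for v in model))
```

    Because only five running sums are kept, the update cost per sample is constant, which is what makes this style of modeling viable on a continuous telemetry stream.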

  3. Use of the AHP methodology in system dynamics: Modelling and simulation for health technology assessments to determine the correct prosthesis choice for hernia diseases.

    PubMed

    Improta, Giovanni; Russo, Mario Alessandro; Triassi, Maria; Converso, Giuseppe; Murino, Teresa; Santillo, Liberatina Carmela

    2018-05-01

    Health technology assessments (HTAs) are often difficult to conduct because the decision procedures of the HTA algorithm are often complex and not easy to apply. Thus, their use is not always convenient or possible for the assessment of technical requests requiring a multidisciplinary approach. This paper aims to address this issue through a multi-criteria analysis focusing on the analytic hierarchy process (AHP). This methodology allows the decision maker to analyse and evaluate different alternatives and monitor their impact on different actors during the decision-making process. The multi-criteria analysis is then implemented through a simulation model to overcome the limitations of the AHP methodology. Simulations help decision makers to make an appropriate decision and avoid unnecessary and costly attempts. Finally, a decision problem regarding the evaluation of two health technologies, namely two biological prostheses for infected incisional hernias, is analysed to assess the effectiveness of the model. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
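    The AHP step referred to above can be sketched in a few lines: derive a priority vector from a pairwise comparison matrix using the geometric-mean method. The 3x3 comparison matrix below is a hypothetical example on Saaty's 1-9 scale, not the paper's prosthesis evaluation.

```python
import math

def ahp_weights(pairwise):
    """Priority vector via the geometric-mean method: take the geometric
    mean of each row of the pairwise comparison matrix, then normalize."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical 3-criterion comparison on Saaty's 1-9 scale: criterion 1 is
# moderately preferred over 2 (3x) and strongly preferred over 3 (5x).
A = [[1.0,   3.0,   5.0],
     [1/3.0, 1.0,   2.0],
     [1/5.0, 1/2.0, 1.0]]
w = ahp_weights(A)
print([round(x, 3) for x in w])
```

    The resulting weights rank the criteria and feed the downstream simulation; checking the matrix's consistency ratio before using the weights is standard AHP practice.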

  4. Predicting trends of invasive plants richness using local socio-economic data: An application in North Portugal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santos, Mario, E-mail: mgsantoss@gmail.com; Freitas, Raul, E-mail: raulfreitas@portugalmail.com; Crespi, Antonio L., E-mail: aluis.crespi@gmail.com

    2011-10-15

    This study assesses the potential of an integrated methodology for predicting local trends in invasive exotic plant species (invasive richness) using indirect, regional information on human disturbance. The distribution of invasive plants was assessed in North Portugal using herbarium collections and local environmental, geophysical and socio-economic characteristics. Invasive richness response to anthropogenic disturbance was predicted using a dynamic model based on a sequential modeling process (stochastic dynamic methodology, StDM). Derived scenarios showed that invasive richness trends were clearly associated with ongoing socio-economic change. Simulations including scenarios of growing urbanization showed an increase in invasive richness, while simulations in municipalities with decreasing populations showed stable or decreasing levels of invasive richness. The model simulations demonstrate the interest and feasibility of using this methodology in disturbance ecology. - Highlights: Socio-economic data indicate human-induced disturbances. Socio-economic development increases disturbance in ecosystems. Disturbance promotes opportunities for invasive plants. Increased opportunities promote richness of invasive plants. Increases in richness of invasive plants change natural ecosystems.

  5. Numerical simulation of the SAGD process coupled with geomechanical behavior

    NASA Astrophysics Data System (ADS)

    Li, Pingke

    Canada has vast oil sand resources. While a large portion of this resource can be recovered by surface mining techniques, the majority is located at depths requiring in situ recovery technologies. Among the available in situ technologies, the steam assisted gravity drainage (SAGD) process has emerged as one of the most promising for developing in situ oil sands resources. During SAGD operations, saturated steam is continuously injected into the oil sands reservoir, which induces pore pressure and stress variations. As a result, reservoir parameters and processes may also vary, particularly when tensile and shear failure occur. This geomechanical effect is pronounced for oil sands because of their interlocked in situ fabric. Conventional reservoir simulation generally does not take this coupled mechanism into consideration; this research therefore aims to improve reservoir simulation techniques for the SAGD process as applied to the development of oil sands and heavy oil reservoirs. Analyses of decoupled reservoir geomechanical simulation results show that geomechanical behavior in SAGD has an obvious impact on reservoir parameters such as absolute permeability. The issues with coupled reservoir geomechanical simulations of the SAGD process have been clarified, and the permeability variations due to geomechanical behavior in the SAGD process investigated. A sequentially coupled reservoir geomechanical simulation methodology was developed based on the reservoir simulator EXOTHERM and the geomechanical simulator FLAC. In addition, a representative geomechanical model of oil sands material was formulated in this research. Finally, the reservoir geomechanical simulation methodology was verified against the UTF Phase A SAGD project and applied to a SAGD operation with gas-over-bitumen geometry.
Based on this methodology, the geomechanical effect on SAGD production performance can be quantified. This research draws on analyses of laboratory testing results obtained from the literature; no new laboratory testing was conducted as part of this research.
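
The sequential coupling pattern the thesis describes can be caricatured in a few lines: a flow step and a geomechanics step exchange pressure and permeability until they agree. The scalar constitutive relations below are invented purely for illustration and are not oil-sands physics or the EXOTHERM/FLAC formulations:

```python
# Toy sequential coupling: a single-cell "reservoir" and "geomechanics"
# iterate to a fixed point. All constants are illustrative assumptions.

def flow_solve(perm, p_inj=5.0e6, p0=2.0e6):
    """Pressure rises with injection, more easily through higher permeability."""
    return p0 + (p_inj - p0) * perm / (perm + 1.0)

def geomech_solve(pressure, k0=1.0, total_stress=8.0e6, alpha=2.5):
    """Effective stress drops as pore pressure rises; dilation raises permeability."""
    eff = total_stress - pressure
    return k0 * (1.0 + alpha * max(0.0, 1.0 - eff / total_stress))

perm, pressure = 1.0, 0.0
for it in range(100):
    p_new = flow_solve(perm)        # reservoir step at current permeability
    k_new = geomech_solve(p_new)    # geomechanics step at updated pressure
    if abs(p_new - pressure) < 1.0 and abs(k_new - perm) < 1e-9:
        break
    pressure, perm = p_new, k_new
print(f"converged after {it} iterations: p = {pressure/1e6:.3f} MPa, k/k0 = {perm:.3f}")
```

The point of the sketch is the two-way feedback: injection raises pore pressure, reduced effective stress dilates the material and raises permeability, and the higher permeability in turn changes the pressure field, so the two solvers must be iterated rather than run once.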

  6. Statistical power calculations for mixed pharmacokinetic study designs using a population approach.

    PubMed

    Kloprogge, Frank; Simpson, Julie A; Day, Nicholas P J; White, Nicholas J; Tarning, Joel

    2014-09-01

    Simultaneous modelling of dense and sparse pharmacokinetic data is possible with a population approach. To determine the number of individuals required to detect the effect of a covariate, simulation-based power calculation methodologies can be employed. The Monte Carlo Mapped Power method (a simulation-based power calculation methodology using the likelihood ratio test) was extended in the current study to perform sample size calculations for mixed pharmacokinetic studies (i.e. both sparse and dense data collection). A workflow guiding an easy and straightforward pharmacokinetic study design, considering also the cost-effectiveness of alternative study designs, was used in this analysis. Initially, data were simulated for a hypothetical drug and then for the anti-malarial drug, dihydroartemisinin. Two datasets (sampling design A: dense; sampling design B: sparse) were simulated using a pharmacokinetic model that included a binary covariate effect and subsequently re-estimated using (1) the same model and (2) a model not including the covariate effect in NONMEM 7.2. Power calculations were performed for varying numbers of patients with sampling designs A and B. Study designs with statistical power >80% were selected and further evaluated for cost-effectiveness. The simulation studies of the hypothetical drug and the anti-malarial drug dihydroartemisinin demonstrated that the simulation-based power calculation methodology, based on the Monte Carlo Mapped Power method, can be utilised to evaluate and determine the sample size of mixed (part sparsely and part densely sampled) study designs. The developed method can contribute to the design of robust and efficient pharmacokinetic studies.
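
The Monte Carlo logic behind such simulation-based power calculations can be sketched with a deliberately simplified stand-in: a z-test on simulated log-clearance values rather than a NONMEM likelihood-ratio test. Effect size, variability and sample sizes below are assumptions for illustration only:

```python
import numpy as np

def simulated_power(effect=0.3, n_per_arm=40, n_sim=2000, alpha_crit=1.96, seed=0):
    """Monte Carlo power: fraction of simulated trials in which a two-sample
    z-test detects a binary covariate effect on log-clearance."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        ref = rng.normal(0.0, 0.5, n_per_arm)      # log-CL, covariate absent
        cov = rng.normal(effect, 0.5, n_per_arm)   # log-CL, covariate present
        se = np.sqrt(ref.var(ddof=1) / n_per_arm + cov.var(ddof=1) / n_per_arm)
        if abs(cov.mean() - ref.mean()) / se > alpha_crit:
            hits += 1
    return hits / n_sim

# Increase n_per_arm until the estimated power exceeds the 80% target
for n in (20, 40, 80):
    print(n, simulated_power(n_per_arm=n))
```

The Monte Carlo Mapped Power method in the record follows the same simulate-test-count pattern, but uses individual likelihood-ratio contributions from full/reduced population models, which is what lets it handle mixed dense and sparse sampling designs.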

  7. Multibody dynamic simulation of knee contact mechanics

    PubMed Central

    Bei, Yanhong; Fregly, Benjamin J.

    2006-01-01

    Multibody dynamic musculoskeletal models capable of predicting muscle forces and joint contact pressures simultaneously would be valuable for studying clinical issues related to knee joint degeneration and restoration. Current three-dimensional multi-body knee models are either quasi-static with deformable contact or dynamic with rigid contact. This study proposes a computationally efficient methodology for combining multibody dynamic simulation methods with a deformable contact knee model. The methodology requires preparation of the articular surface geometry, development of efficient methods to calculate distances between contact surfaces, implementation of an efficient contact solver that accounts for the unique characteristics of human joints, and specification of an application programming interface for integration with any multibody dynamic simulation environment. The current implementation accommodates natural or artificial tibiofemoral joint models, small or large strain contact models, and linear or nonlinear material models. Applications are presented for static analysis (via dynamic simulation) of a natural knee model created from MRI and CT data and dynamic simulation of an artificial knee model produced from manufacturer’s CAD data. Small and large strain natural knee static analyses required 1 min of CPU time and predicted similar contact conditions except for peak pressure, which was higher for the large strain model. Linear and nonlinear artificial knee dynamic simulations required 10 min of CPU time and predicted similar contact force and torque but different contact pressures, which were lower for the nonlinear model due to increased contact area. This methodology provides an important step toward the realization of dynamic musculoskeletal models that can predict in vivo knee joint motion and loading simultaneously. PMID:15564115

  8. Nonlinear maneuver autopilot for the F-15 aircraft

    NASA Technical Reports Server (NTRS)

    Menon, P. K. A.; Badgett, M. E.; Walker, R. A.

    1989-01-01

    A methodology is described for the development of flight test trajectory control laws based on singular perturbation methodology and nonlinear dynamic modeling. The control design methodology is applied to a detailed nonlinear six-degree-of-freedom simulation of the F-15, and results are presented for level accelerations, a pushover/pullup maneuver, a zoom-and-pushover maneuver, an excess-thrust windup turn, a constant-thrust windup turn, and a constant dynamic pressure/constant load factor trajectory.

  9. A Novel Methodology for the Simulation of Athletic Tasks on Cadaveric Knee Joints with Respect to In Vivo Kinematics

    PubMed Central

    Bates, Nathaniel A.; Nesbitt, Rebecca J.; Shearn, Jason T.; Myer, Gregory D.; Hewett, Timothy E.

    2015-01-01

    Six degree of freedom (6-DOF) robotic manipulators have simulated clinical tests and gait on cadaveric knees to examine knee biomechanics. However, these activities do not necessarily emulate the kinematics and kinetics that lead to anterior cruciate ligament (ACL) rupture. The purpose of this study was to determine the techniques needed to derive reproducible, in vitro simulations from in vivo skin-marker kinematics recorded during simulated athletic tasks. Input of raw, in vivo, skin-marker-derived motion capture kinematics consistently resulted in specimen failure. The protocol described in this study developed an in-depth methodology to adapt in vivo kinematic recordings into 6-DOF knee motion simulations for drop vertical jumps and sidestep cutting. Our simulation method repeatably produced kinetics consistent with vertical ground reaction patterns while preserving specimen integrity. Athletic task simulation represents an advancement that allows investigators to examine ACL-intact and graft biomechanics during motions that generate greater kinetics, and the athletic tasks are more representative of documented cases of ligament rupture. Establishment of baseline functional mechanics within the knee joint during athletic tasks will serve to advance the prevention, repair and rehabilitation of ACL injuries. PMID:25869454

  10. End-To-End Simulation of Launch Vehicle Trajectories Including Stage Separation Dynamics

    NASA Technical Reports Server (NTRS)

    Albertson, Cindy W.; Tartabini, Paul V.; Pamadi, Bandu N.

    2012-01-01

    The development of methodologies, techniques, and tools for analysis and simulation of stage separation dynamics is critically needed for successful design and operation of multistage reusable launch vehicles. As a part of this activity, the Constraint Force Equation (CFE) methodology was developed and implemented in the Program to Optimize Simulated Trajectories II (POST2). The objective of this paper is to demonstrate the capability of POST2/CFE to simulate a complete end-to-end mission. The vehicle configuration selected was the Two-Stage-To-Orbit (TSTO) Langley Glide Back Booster (LGBB) bimese configuration, an in-house concept consisting of a reusable booster and an orbiter having identical outer mold lines. The proximity and isolated aerodynamic databases used for the simulation were assembled using wind-tunnel test data for this vehicle. POST2/CFE simulation results are presented for the entire mission, from lift-off, through stage separation, orbiter ascent to orbit, and booster glide back to the launch site. Additionally, POST2/CFE stage separation simulation results are compared with results from industry standard commercial software used for solving dynamics problems involving multiple bodies connected by joints.

  11. An investigation of human body model morphing for the assessment of abdomen responses to impact against a population of test subjects.

    PubMed

    Beillas, Philippe; Berthet, Fabien

    2017-05-29

    Human body models have the potential to better describe the human anatomy and variability than dummies. However, data sets available to verify the human response to impact are typically limited in numbers, and they are not size or gender specific. The objective of this study was to investigate the use of model morphing methodologies within that context. In this study, a simple human model scaling methodology was developed to morph two detailed human models (Global Human Body Model Consortium models 50th male, M50, and 5th female, F05) to the dimensions of post mortem human surrogates (PMHS) used in published literature. The methodology was then successfully applied to 52 PMHS tested in 14 impact conditions loading the abdomen. The corresponding 104 simulations were compared to the responses of the PMHS and to the responses of the baseline models without scaling (28 simulations). The responses were analysed using the CORA method and peak values. The results suggest that model scaling leads to an improvement of the predicted force and deflection but has more marginal effects on the predicted abdominal compressions. M50 and F05 models scaled to the same PMHS were also found to have similar external responses, but large differences were found between the two sets of models for the strain energy densities in the liver and the spleen for mid-abdomen impact simulations. These differences, which were attributed to the anatomical differences in the abdomen of the baseline models, highlight the importance of the selection of the impact condition for simulation studies, especially if the organ location is not known in the test. While the methodology could be further improved, it shows the feasibility of using model scaling methodologies to compare human models of different sizes and to evaluate scaling approaches within the context of human model validation.

  12. Traffic analysis toolbox volume XIII : integrated corridor management analysis, modeling, and simulation guide

    DOT National Transportation Integrated Search

    2017-02-01

    As part of the Federal Highway Administration (FHWA) Traffic Analysis Toolbox (Volume XIII), this guide was designed to help corridor stakeholders implement the Integrated Corridor Management (ICM) Analysis, Modeling, and Simulation (AMS) methodology...

  14. High-fidelity large eddy simulation for supersonic jet noise prediction

    NASA Astrophysics Data System (ADS)

    Aikens, Kurt M.

    The problem of intense sound radiation from supersonic jets is a concern for both civil and military applications. As a result, many experimental and computational efforts are focused on evaluating possible noise suppression techniques. Large-eddy simulation (LES) is utilized in many computational studies to simulate the turbulent jet flowfield. Integral methods such as the Ffowcs Williams-Hawkings (FWH) method are then used for propagation of the sound waves to the farfield. Improving the accuracy of this two-step methodology and evaluating beveled converging-diverging nozzles for noise suppression are the main tasks of this work. First, a series of numerical experiments are undertaken to ensure adequate numerical accuracy of the FWH methodology. This includes an analysis of different treatments for the downstream integration surface: with or without including an end-cap, averaging over multiple end-caps, and including an approximate surface integral correction term. Second, shock-capturing methods based on characteristic filtering and adaptive spatial filtering are used to extend a highly parallelizable multiblock subsonic LES code to enable simulations of supersonic jets. The code is based on high-order numerical methods for accurate prediction of the acoustic sources and propagation of the sound waves. Furthermore, this new code is more efficient than the legacy version, allows cylindrical multiblock topologies, and is capable of simulating nozzles with resolved turbulent boundary layers when coupled with an approximate turbulent inflow boundary condition. Even though such wall-resolved simulations are more physically accurate, their expense is often prohibitive. To make simulations more economical, a wall model is developed and implemented. The wall modeling methodology is validated for turbulent quasi-incompressible and compressible zero pressure gradient flat plate boundary layers, and for subsonic and supersonic jets. 
The supersonic code additions and the wall model treatment are then utilized to simulate military-style nozzles with and without beveling of the nozzle exit plane. Experiments of beveled converging-diverging nozzles have found reduced noise levels for some observer locations. Predicting the noise for these geometries provides a good initial test of the overall methodology for a more complex nozzle. The jet flowfield and acoustic data are analyzed and compared to similar experiments and excellent agreement is found. Potential areas of improvement are discussed for future research.

  15. SIMCA T 1.0: A SAS Computer Program for Simulating Computer Adaptive Testing

    ERIC Educational Resources Information Center

    Raiche, Gilles; Blais, Jean-Guy

    2006-01-01

    Monte Carlo methodologies are frequently applied to study the sampling distribution of the estimated proficiency level in adaptive testing. These methods eliminate real situational constraints. However, these Monte Carlo methodologies are not currently supported by the available software programs, and when these programs are available, their…

  16. Full-Envelope Launch Abort System Performance Analysis Methodology

    NASA Technical Reports Server (NTRS)

    Aubuchon, Vanessa V.

    2014-01-01

    The implementation of a new dispersion methodology is described, which disperses abort initiation altitude or time along with all other Launch Abort System (LAS) parameters during Monte Carlo simulations. In contrast, the standard methodology assumes that an abort initiation condition is held constant (e.g., aborts initiated at the altitude for Mach 1, the altitude for maximum dynamic pressure, etc.) while dispersing other LAS parameters. The standard method results in large gaps in performance information due to the discrete nature of initiation conditions, while the full-envelope dispersion method provides a significantly more comprehensive assessment of LAS abort performance for the full launch vehicle ascent flight envelope and identifies performance "pinch-points" that may occur at flight conditions outside of those contained in the discrete set. The new method has significantly increased the fidelity of LAS abort simulations and confidence in the results.
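
A toy Monte Carlo comparison illustrates why dispersing the abort-initiation condition continuously can expose pinch-points that a discrete set misses. The margin function, the dip location, and the discrete abort times below are all invented for illustration:

```python
import numpy as np

def abort_margin(t_abort):
    """Toy performance margin vs. abort initiation time (s). The dip near
    t = 37 s stands in for a 'pinch-point' lying between classic discrete
    abort conditions (e.g. Mach 1, max-q)."""
    return 1.0 - 0.8 * np.exp(-((t_abort - 37.0) / 4.0) ** 2)

rng = np.random.default_rng(0)

# Standard approach: hold abort time at a few discrete flight conditions.
discrete = min(abort_margin(t) for t in (20.0, 50.0, 80.0))

# Full-envelope approach: disperse abort time over the whole ascent.
samples = rng.uniform(0.0, 120.0, 5000)
full_env = abort_margin(samples).min()

print(f"worst margin, discrete set:  {discrete:.3f}")
print(f"worst margin, full envelope: {full_env:.3f}")
```

The discrete set reports a comfortable worst-case margin because none of its members happens to fall in the dip; the continuous dispersion finds it.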

  17. Engine dynamic analysis with general nonlinear finite element codes. II - Bearing element implementation, overall numerical characteristics and benchmarking

    NASA Technical Reports Server (NTRS)

    Padovan, J.; Adams, M.; Lam, P.; Fertis, D.; Zeid, I.

    1982-01-01

    Second-year efforts within a three-year study to develop and extend finite element (FE) methodology to efficiently handle the transient/steady state response of rotor-bearing-stator structure associated with gas turbine engines are outlined. The two main areas aim at (1) implanting the squeeze film damper element into a general purpose FE code for testing and evaluation; and (2) determining the numerical characteristics of the FE-generated rotor-bearing-stator simulation scheme. The governing FE field equations are set out and the solution methodology is presented. The choice of ADINA as the general-purpose FE code is explained, and the numerical operational characteristics of the direct integration approach of FE-generated rotor-bearing-stator simulations is determined, including benchmarking, comparison of explicit vs. implicit methodologies of direct integration, and demonstration problems.

  18. Aggregated N-of-1 randomized controlled trials: modern data analytics applied to a clinically valid method of intervention effectiveness.

    PubMed

    Cushing, Christopher C; Walters, Ryan W; Hoffman, Lesa

    2014-03-01

    Aggregated N-of-1 randomized controlled trials (RCTs) combined with multilevel modeling represent a methodological advancement that may help bridge science and practice in pediatric psychology. The purpose of this article is to offer a primer for pediatric psychologists interested in conducting aggregated N-of-1 RCTs. An overview of N-of-1 RCT methodology is provided and 2 simulated data sets are analyzed to demonstrate the clinical and research potential of the methodology. The simulated data example demonstrates the utility of aggregated N-of-1 RCTs for understanding the clinical impact of an intervention for a given individual and the modeling of covariates to explain why an intervention worked for one patient and not another. Aggregated N-of-1 RCTs hold potential for improving the science and practice of pediatric psychology.
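
A minimal sketch of the aggregated N-of-1 idea: estimate each patient's intervention effect from their own randomized periods, then pool across patients. Effect sizes and noise levels are made up, and a simple mean stands in for the full multilevel model the article describes:

```python
import numpy as np

rng = np.random.default_rng(1)
n_patients, n_blocks = 8, 6           # each block: one treatment + one control period
true_mu, patient_sd, noise_sd = 2.0, 1.0, 1.5
patient_effects = rng.normal(true_mu, patient_sd, n_patients)

individual = []
for b in patient_effects:
    tx = rng.normal(b, noise_sd, n_blocks)       # outcomes under intervention
    ctrl = rng.normal(0.0, noise_sd, n_blocks)   # outcomes under control
    individual.append(tx.mean() - ctrl.mean())   # this patient's effect estimate

individual = np.array(individual)
pooled = individual.mean()                       # aggregated effect across the series
print("per-patient effects:", np.round(individual, 2))
print(f"pooled effect: {pooled:.2f} (true {true_mu})")
```

The clinically useful feature the article highlights is visible even in this sketch: the per-patient estimates differ, so covariates can be modelled to explain why the intervention works for some patients and not others, which a between-groups RCT averages away.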

  19. Modal interactions due to friction in the nonlinear vibration response of the "Harmony" test structure: Experiments and simulations

    NASA Astrophysics Data System (ADS)

    Claeys, M.; Sinou, J.-J.; Lambelin, J.-P.; Todeschini, R.

    2016-08-01

    The nonlinear vibration response of an assembly with friction joints - named "Harmony" - is studied both experimentally and numerically. The experimental results exhibit a softening effect and an increase of dissipation with excitation level. Modal interactions due to friction are also evidenced. The numerical methodology proposed groups together well-known structural dynamic methods, including finite elements, substructuring, Harmonic Balance and continuation methods. On the one hand, the application of this methodology proves its capacity to treat a complex system where several friction movements occur at the same time. On the other hand, the main contribution of this paper is the experimental and numerical study of evidence of modal interactions due to friction. The simulation methodology succeeds in reproducing complex form of dynamic behavior such as these modal interactions.

  20. Estimation of elastic moduli in a compressible Gibson half-space by inverting Rayleigh-wave phase velocity

    USGS Publications Warehouse

    Xia, J.; Xu, Y.; Miller, R.D.; Chen, C.

    2006-01-01

    A Gibson half-space model (a non-layered Earth model) has the shear modulus varying linearly with depth in an inhomogeneous elastic half-space. In a half-space of sedimentary granular soil under a geostatic state of initial stress, the density and the Poisson's ratio do not vary considerably with depth. In such an Earth body, the dynamic shear modulus is the parameter that mainly affects the dispersion of propagating waves. We have estimated shear-wave velocities in the compressible Gibson half-space by inverting Rayleigh-wave phase velocities. An analytical dispersion law of Rayleigh-type waves in a compressible Gibson half-space is given in an algebraic form, which makes our inversion process extremely simple and fast. The convergence of the weighted damping solution is guaranteed through selection of the damping factor using the Levenberg-Marquardt method. Calculation efficiency is achieved by reconstructing a weighted damping solution using singular value decomposition techniques. The main advantage of this algorithm is that only three parameters define the compressible Gibson half-space model. Theoretically, to determine the model by the inversion, only three Rayleigh-wave phase velocities at different frequencies are required. This is useful in practice where Rayleigh-wave energy is only developed in a limited frequency range or at certain frequencies, as with data acquired at man-made structures such as dams and levees. Two real examples are presented and verified by borehole S-wave velocity measurements. The results of these real examples are also compared with the results of the layered-Earth model. © Springer 2006.
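
The damped (Levenberg-Marquardt) least-squares inversion through the singular value decomposition that the record describes can be sketched generically. The toy three-parameter dispersion law below is a stand-in for illustration, not the Gibson half-space formula:

```python
import numpy as np

def forward(m, freqs):
    """Toy dispersion-curve stand-in (NOT the Gibson half-space law):
    phase velocity decreasing with frequency, controlled by 3 parameters."""
    v0, grad, decay = m
    return v0 + grad / (1.0 + decay * freqs)

def jacobian(m, freqs, eps=1e-6):
    """Central-difference Jacobian of the forward model."""
    J = np.zeros((freqs.size, m.size))
    for j in range(m.size):
        dm = np.zeros_like(m); dm[j] = eps
        J[:, j] = (forward(m + dm, freqs) - forward(m - dm, freqs)) / (2 * eps)
    return J

def invert(vobs, freqs, m0, lam=1.0, iters=100):
    """Levenberg-Marquardt damped least squares, with the damped step
    built from the SVD of the Jacobian (filter factors s / (s^2 + lam))."""
    m = m0.astype(float)
    r = vobs - forward(m, freqs)
    cost = r @ r
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(jacobian(m, freqs), full_matrices=False)
        dm = Vt.T @ ((s / (s**2 + lam)) * (U.T @ r))
        r_try = vobs - forward(m + dm, freqs)
        if r_try @ r_try < cost:          # accept step, relax damping
            m, r, cost = m + dm, r_try, r_try @ r_try
            lam *= 0.5
        else:                             # reject step, increase damping
            lam *= 4.0
    return m

freqs = np.linspace(5, 50, 12)
m_true = np.array([200.0, 150.0, 0.3])
vobs = forward(m_true, freqs)             # noiseless synthetic phase velocities
m_est = invert(vobs, freqs, m0=np.array([150.0, 100.0, 0.5]))
print(np.round(m_est, 2))
```

The SVD form makes the role of the damping factor explicit: directions with small singular values, which would otherwise amplify noise, are filtered by s/(s² + λ), and λ is adapted so each accepted step reduces the misfit, mirroring the guaranteed-convergence property claimed in the record.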

  1. Comprehensive numerical methodology for direct numerical simulations of compressible Rayleigh-Taylor instability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reckinger, Scott James; Livescu, Daniel; Vasilyev, Oleg V.

    A comprehensive numerical methodology has been developed that handles the challenges introduced by considering the compressible nature of Rayleigh-Taylor instability (RTI) systems, which include sharp interfacial density gradients on strongly stratified background states, acoustic wave generation and removal at computational boundaries, and stratification-dependent vorticity production. The computational framework is used to simulate two-dimensional single-mode RTI to extremely late times for a wide range of flow compressibility and variable density effects. The results show that flow compressibility acts to reduce the growth of RTI for low Atwood numbers, as predicted from linear stability analysis.

  2. Physical Simulation of a Prolonged Plasma-Plume Exposure of a Space Debris Object

    NASA Astrophysics Data System (ADS)

    Shuvalov, V. A.; Gorev, N. B.; Tokmak, N. A.; Kochubei, G. S.

    2018-05-01

    A methodology has been developed for the physical (laboratory) simulation of the prolonged exposure of a space debris object to high-energy ions of a plasma plume, with the aim of removing the object from low-Earth orbit so that it subsequently burns up in the Earth's atmosphere. The methodology is based on criteria of equivalence between the two exposure modes (in the Earth's ionosphere and in the test setup) and on a procedure for accelerated service-life tests with respect to the sputtering of the space debris material and its deceleration by a plasma jet in the Earth's ionosphere.

  3. Flight simulator fidelity assessment in a rotorcraft lateral translation maneuver

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Malsbury, T.; Atencio, A., Jr.

    1992-01-01

    A model-based methodology for assessing flight simulator fidelity in closed-loop fashion is exercised in analyzing a rotorcraft low-altitude maneuver for which flight test and simulation results were available. The addition of a handling qualities sensitivity function to a previously developed model-based assessment criteria allows an analytical comparison of both performance and handling qualities between simulation and flight test. Model predictions regarding the existence of simulator fidelity problems are corroborated by experiment. The modeling approach is used to assess analytically the effects of modifying simulator characteristics on simulator fidelity.

  4. Health Worker Focused Distributed Simulation for Improving Capability of Health Systems in Liberia.

    PubMed

    Gale, Thomas C E; Chatterjee, Arunangsu; Mellor, Nicholas E; Allan, Richard J

    2016-04-01

    The main goal of this study was to produce an adaptable learning platform using virtual learning and distributed simulation, which can be used to train health care workers across a wide geographical area in key safety messages regarding infection prevention and control (IPC). A situationally responsive agile methodology, Scrum, was used to develop a distributed simulation module using short 1-week iterations and continuous synchronous and asynchronous communication involving end users and IPC experts. The module contained content related to standard IPC precautions (including handwashing techniques) and was structured into 3 distinct sections related to donning, doffing, and hazard perception training. Using the Scrum methodology, we were able to link concepts applied to best practices in simulation-based medical education (deliberate practice, continuous feedback, self-assessment, and exposure to uncommon events), pedagogic principles related to adult learning (clear goals, contextual awareness, motivational features), and key learning outcomes regarding IPC, as a rapid response initiative to the Ebola outbreak in West Africa. A gamification approach was used to map learning mechanics and enhance user engagement. The developed IPC module demonstrates how high-frequency, low-fidelity simulations can be rapidly designed using Scrum-based agile methodology. Analytics incorporated into the tool can help demonstrate improved confidence and competence of health care workers who are treating patients within an Ebola virus disease outbreak region. These concepts could be used in a range of evolving disasters where rapid development and communication of key learning messages are required.

  5. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    NASA Astrophysics Data System (ADS)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of national security emergency response planning, and the emergency evacuation of large commercial shopping areas, as typical service systems, is one of the hot research topics. A systematic methodology based on Cellular Automata with a Dynamic Floor Field and an event-driven model is proposed, and the methodology is examined within the context of a case study involving evacuation from a commercial shopping mall. Pedestrian movement is modelled with the Cellular Automata, while the event-driven model governs pedestrian activity patterns; the simulation covers both normal operation and emergency evacuation. The model is composed of four layers: an environment layer, a customer layer, a clerk layer and a trajectory layer. In simulating the movement routes of pedestrians, the model takes into account the purchase intentions of customers and the density of pedestrians. The combined model reflects the behavioural characteristics of customers and clerks in both normal and emergency evacuation situations. The distribution of individual evacuation times as a function of initial position and the dynamics of the evacuation process are studied. Our results indicate that the evacuation model combining Cellular Automata with a Dynamic Floor Field and event-driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of a shopping mall.
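
A minimal sketch of the static-floor-field CA mechanic underlying such evacuation models: the floor field stores each cell's distance to the nearest exit, and pedestrians step synchronously toward decreasing field values. The grid, exit, and pedestrian positions are hypothetical, and conflict handling is deliberately simplified relative to the paper's dynamic-field, multi-layer model:

```python
import numpy as np
from collections import deque

def static_floor_field(grid, exits):
    """BFS distance-to-exit over free cells (the static floor field)."""
    field = np.full(grid.shape, np.inf)
    q = deque()
    for e in exits:
        field[e] = 0
        q.append(e)
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < grid.shape[0] and 0 <= nc < grid.shape[1] \
               and grid[nr, nc] == 0 and field[nr, nc] > field[r, c] + 1:
                field[nr, nc] = field[r, c] + 1
                q.append((nr, nc))
    return field

def evacuate(field, pedestrians, exits, max_steps=500):
    """Synchronous CA update: each pedestrian steps to the neighbour with
    the lowest floor-field value not already claimed this step."""
    peds = set(pedestrians)
    for step in range(1, max_steps + 1):
        moved = set()
        for r, c in sorted(peds):
            best, target = field[r, c], None
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                n = (r + dr, c + dc)
                if 0 <= n[0] < field.shape[0] and 0 <= n[1] < field.shape[1] \
                   and field[n] < best and n not in moved:
                    best, target = field[n], n
            cell = target if target is not None else (r, c)
            if cell not in exits:        # pedestrians reaching an exit leave
                moved.add(cell)
        peds = moved
        if not peds:
            return step                  # steps until the room is empty
    return None

grid = np.zeros((8, 8), dtype=int)       # 0 = walkable, no obstacles
exits = {(0, 4)}
field = static_floor_field(grid, exits)
print("evacuation steps:", evacuate(field, [(7, 0), (7, 7), (4, 4)], exits))
```

In the full model the dynamic floor field adds a pheromone-like trace that couples pedestrians' choices to local density, which is what produces the clogging and layout effects studied in the paper.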

  6. The Use of Computer Simulation Gaming in Teaching Broadcast Economics.

    ERIC Educational Resources Information Center

    Mancuso, Louis C.

    The purpose of this study was to develop a broadcast economic computer simulation and to ascertain how a lecture-computer simulation game compared as a teaching method with a more traditional lecture and case study instructional methods. In each of three sections of a broadcast economics course, a different teaching methodology was employed: (1)…

  7. Simulation-Based Performance Assessment: An Innovative Approach to Exploring Understanding of Physical Science Concepts

    ERIC Educational Resources Information Center

    Gale, Jessica; Wind, Stefanie; Koval, Jayma; Dagosta, Joseph; Ryan, Mike; Usselman, Marion

    2016-01-01

    This paper illustrates the use of simulation-based performance assessment (PA) methodology in a recent study of eighth-grade students' understanding of physical science concepts. A set of four simulation-based PA tasks were iteratively developed to assess student understanding of an array of physical science concepts, including net force,…

  8. Application of control theory to dynamic systems simulation

    NASA Technical Reports Server (NTRS)

    Auslander, D. M.; Spear, R. C.; Young, G. E.

    1982-01-01

    Control theory is applied to dynamic systems simulation. Theory and methodology applicable to controlled ecological life-support systems are considered. Spatial effects on system stability, the design of control systems with uncertain parameters, and an interactive computing language (PARASOL-II) designed for dynamic system simulation, report-quality graphics, data acquisition, and simple real-time control are discussed.

  9. Identifying and Quantifying Emergent Behavior Through System of Systems Modeling and Simulation

    DTIC Science & Technology

    2015-09-01

    This chapter describes the many works used as a basis for this research. Ptolemy is a simulation and rapid-prototyping environment developed at the University of California, Berkeley... This research used the principles of Selberg's 2008...

  10. T-H-A-T-S: timber-harvesting-and-transport-simulator: with subroutines for Appalachian logging

    Treesearch

    A. Jeff Martin

    1975-01-01

    A computer program for simulating harvesting operations is presented. Written in FORTRAN IV, the program contains subroutines that were developed for Appalachian logging conditions. However, with appropriate modifications, the simulator would be applicable for most logging operations and locations. The details of model development and its methodology are presented,...

  11. Rating of Dynamic Coefficient for Simple Beam Bridge Design on High-Speed Railways

    NASA Astrophysics Data System (ADS)

    Diachenko, Leonid; Benin, Andrey; Smirnov, Vladimir; Diachenko, Anastasia

    2018-06-01

    The aim of the work is to improve the methodology for the dynamic computation of simple beam spans under the impact of high-speed trains. Mathematical simulation using numerical and analytical methods of structural mechanics is employed in the research. The article analyses parameters of the effect of high-speed trains on simple-beam bridge structures and suggests a technique for determining the dynamic coefficient applied to the live load. The reliability of the proposed methodology is confirmed by the results of numerical simulation of high-speed train passage over spans at different speeds. The proposed algorithm for dynamic computation is based on a connection between the maximum acceleration of the span in the resonant mode of vibration and the main factors of the stress-strain state. The methodology allows determining both maximum and minimum values of the main internal forces in the structure, which makes it possible to perform endurance checks. It is noted that the dynamic increments for the components of the stress-strain state (bending moments, shear force and vertical deflections) differ, which necessitates a differentiated approach to evaluating dynamic coefficients when performing design verification for limit states of groups I and II. Practical importance: the methodology for determining the dynamic coefficients allows performing the dynamic calculation and determining the main internal forces in simple beam spans without numerical simulation and direct dynamic analysis, which significantly reduces design labour costs.

  12. Approaching the investigation of plasma turbulence through a rigorous verification and validation procedure: A practical example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricci, P., E-mail: paolo.ricci@epfl.ch; Riva, F.; Theiler, C.

    In the present work, a Verification and Validation procedure is presented and applied showing, through a practical example, how it can contribute to advancing our physics understanding of plasma turbulence. Bridging the gap between plasma physics and other scientific domains, in particular, the computational fluid dynamics community, a rigorous methodology for the verification of a plasma simulation code is presented, based on the method of manufactured solutions. This methodology assesses that the model equations are correctly solved, within the order of accuracy of the numerical scheme. The technique to carry out a solution verification is described to provide a rigorous estimate of the uncertainty affecting the numerical results. A methodology for plasma turbulence code validation is also discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The Verification and Validation methodology is then applied to the study of plasma turbulence in the basic plasma physics experiment TORPEX [Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulations carried out with the GBS code [Ricci et al., Plasma Phys. Controlled Fusion 54, 124047 (2012)]. The validation procedure allows progress in the understanding of the turbulent dynamics in TORPEX, by pinpointing the presence of a turbulent regime transition, due to the competition between the resistive and ideal interchange instabilities.
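    The method of manufactured solutions mentioned above can be illustrated independently of any plasma code: pick an exact solution, derive the forcing term that makes it satisfy the model equation, then check that the numerical error shrinks at the scheme's design order under grid refinement. A minimal sketch on a 1D Poisson problem (our own toy solver, not the GBS code):

```python
import math

def solve_poisson(f, n):
    """Solve -u'' = f on (0,1) with u(0)=u(1)=0 using second-order
    central differences and the Thomas tridiagonal algorithm."""
    h = 1.0 / n
    a = [-1.0] * (n - 1)                        # sub-diagonal
    b = [2.0] * (n - 1)                         # main diagonal
    c = [-1.0] * (n - 1)                        # super-diagonal
    d = [h * h * f(i * h) for i in range(1, n)]
    for i in range(1, n - 1):                   # forward elimination
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    u = [0.0] * (n - 1)
    u[-1] = d[-1] / b[-1]
    for i in range(n - 3, -1, -1):              # back substitution
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return u

def mms_error(n):
    """Max-norm error against the manufactured solution u = sin(pi*x),
    whose forcing for -u'' = f is f = pi^2 * sin(pi*x)."""
    f = lambda x: math.pi ** 2 * math.sin(math.pi * x)
    u = solve_poisson(f, n)
    return max(abs(u[i] - math.sin(math.pi * (i + 1) / n)) for i in range(n - 1))

# observed order of accuracy from two grid refinements; should approach 2
p_obs = math.log2(mms_error(20) / mms_error(40))
```

    If `p_obs` deviates from the scheme's theoretical order, the discretization or its implementation is suspect; this is exactly the check code verification formalizes.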

  13. Entropy Filtered Density Function for Large Eddy Simulation of Turbulent Reacting Flows

    NASA Astrophysics Data System (ADS)

    Safari, Mehdi

    Analysis of local entropy generation is an effective means to optimize the performance of energy and combustion systems by minimizing the irreversibilities in transport processes. Large eddy simulation (LES) is employed to describe entropy transport and generation in turbulent reacting flows. The entropy transport equation in LES contains several unclosed terms. These are the subgrid scale (SGS) entropy flux and entropy generation caused by irreversible processes: heat conduction, mass diffusion, chemical reaction and viscous dissipation. The SGS effects are taken into account using a novel methodology based on the filtered density function (FDF). This methodology, entitled entropy FDF (En-FDF), is developed and utilized in the form of the joint entropy-velocity-scalar-turbulent frequency FDF and the marginal scalar-entropy FDF, both of which contain the chemical reaction effects in a closed form. The former constitutes the most comprehensive form of the En-FDF and provides closure for all the unclosed filtered moments. This methodology is applied for LES of a turbulent shear layer involving transport of passive scalars. Predictions show favorable agreement with the data generated by direct numerical simulation (DNS) of the same layer. The marginal En-FDF accounts for entropy generation effects as well as scalar and entropy statistics. This methodology is applied to a turbulent nonpremixed jet flame (Sandia Flame D) and predictions are validated against experimental data. In both flows, sources of irreversibility are predicted and analyzed.

  14. Coarse-grained molecular dynamics simulations for giant protein-DNA complexes

    NASA Astrophysics Data System (ADS)

    Takada, Shoji

    Biomolecules are highly hierarchic and intrinsically flexible; thus, computational modeling calls for multi-scale methodologies. We have been developing a coarse-grained biomolecular model where, on average, 10-20 atoms are grouped into one coarse-grained (CG) particle. Interactions among CG particles are tuned based on atomistic interactions and the fluctuation matching algorithm. CG molecular dynamics methods enable us to simulate much longer time scale motions of much larger molecular systems than fully atomistic models. After broad sampling of structures with CG models, we can easily reconstruct atomistic models, from which one can continue conventional molecular dynamics simulations if desired. Here, we describe our CG modeling methodology for protein-DNA complexes, together with various biological applications, such as the DNA duplication initiation complex, model chromatins, and transcription factor dynamics in a chromatin-like environment.
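    The geometric half of the coarse-graining described above (grouping atoms into beads) is commonly done by placing each bead at the centre of mass of its atom group; fluctuation matching then tunes the bead-bead interactions, which is not shown here. A minimal sketch of the mapping step, with all names and data our own illustration:

```python
def coarse_grain(positions, masses, groups):
    """Map an atomistic structure to CG beads: each bead is placed at the
    centre of mass of its atom group. `positions` are (x, y, z) tuples,
    `groups` lists atom indices per bead."""
    beads = []
    for idx in groups:
        m_tot = sum(masses[i] for i in idx)
        beads.append(tuple(
            sum(masses[i] * positions[i][k] for i in idx) / m_tot
            for k in range(3)))
    return beads
```

    With ~10-20 atoms per group this reduces the particle count by an order of magnitude, which is where the CG speed-up comes from.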

  15. Aviation Human-in-the-Loop Simulation Studies: Experimental Planning, Design, and Data Management

    DTIC Science & Technology

    2014-01-01

    Kevin W. Williams, Bonny Christopher, Gena... This report describes the process by which we designed our human-in-the-loop (HITL) simulation study and the methodology used to collect and analyze the results.

  16. MoSeS: Modelling and Simulation for e-Social Science.

    PubMed

    Townend, Paul; Xu, Jie; Birkin, Mark; Turner, Andy; Wu, Belinda

    2009-07-13

    MoSeS (Modelling and Simulation for e-Social Science) is a research node of the National Centre for e-Social Science. MoSeS uses e-Science techniques to execute an events-driven model that simulates discrete demographic processes; this allows us to project the UK population 25 years into the future. This paper describes the architecture, simulation methodology and latest results obtained by MoSeS.
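    The events-driven demographic simulation described above can be caricatured with a Gillespie-style birth-death process: exponential waiting times between events, each event a birth or a death with probability proportional to its rate. This is a toy stand-in of our own devising, far simpler than the MoSeS model itself:

```python
import random

def project_population(pop0, birth_rate, death_rate, years, seed=0):
    """Toy event-driven projection: draw exponential waiting times between
    demographic events and apply a birth or a death at each event, until
    the horizon `years` is reached. Rates are per person per year."""
    rng = random.Random(seed)
    t, pop = 0.0, pop0
    while pop > 0:
        rate = (birth_rate + death_rate) * pop   # total event rate
        t += rng.expovariate(rate)
        if t >= years:
            break
        if rng.random() < birth_rate / (birth_rate + death_rate):
            pop += 1
        else:
            pop -= 1
    return pop
```

    With a birth rate exceeding the death rate, the expected population grows roughly as pop0 * exp((b - d) * years); real microsimulations like MoSeS additionally track age, household structure and migration per individual.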

  17. Membrane Insertion Profiles of Peptides Probed by Molecular Dynamics Simulations

    DTIC Science & Technology

    2008-07-17

    In-Chul Yeh, Mark A. Olson, Michael S. Lee, and Anders... The report presents a methodology based on molecular dynamics simulation techniques to probe the insertion profiles of small peptides across the membrane interface.

  18. Mathematical model of marine diesel engine simulator for a new methodology of self propulsion tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Izzuddin, Nur; Sunarsih,; Priyanto, Agoes

    As a vessel operates in the open seas, a marine diesel engine simulator whose engine rotation is controlled to transmit through the propeller shaft is a new methodology for self-propulsion tests to track fuel saving in real time. Considering this circumstance, this paper presents a real-time marine diesel engine simulator system to track the real performance of a ship through a computer-simulated model. A mathematical model of the marine diesel engine and the propeller is used in the simulation to estimate fuel rate, engine rotating speed, and the thrust and torque of the propeller, and thus achieve the target vessel's speed. The input and output form a real-time control system of fuel-saving rate and propeller rotating speed representing the marine diesel engine characteristics. Self-propulsion tests in calm water were conducted using a vessel model to validate the marine diesel engine simulator. The simulator was then used to evaluate fuel saving by employing a new mathematical model of turbochargers for the marine diesel engine simulator. The control system developed will be beneficial for users analyzing different vessel speeds to obtain better characteristics and hence optimize the fuel-saving rate.
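    The propeller side of such a mathematical model is conventionally built on the textbook open-water relations T = KT·ρ·n²·D⁴ and Q = KQ·ρ·n²·D⁵, with KT, KQ read from open-water charts at the advance ratio J = Va/(n·D). A minimal sketch with illustrative (not paper-specific) coefficient values:

```python
import math

def propeller_loads(kt, kq, rho, n, d):
    """Open-water propeller model: thrust T = KT*rho*n^2*D^4 [N] and
    torque Q = KQ*rho*n^2*D^5 [N*m], with n in rev/s and D in m."""
    thrust = kt * rho * n ** 2 * d ** 4
    torque = kq * rho * n ** 2 * d ** 5
    return thrust, torque

def open_water_efficiency(j, kt, kq):
    """Open-water efficiency eta0 = J*KT / (2*pi*KQ) at advance ratio J."""
    return j * kt / (2.0 * math.pi * kq)
```

    In an engine-propeller simulator, the torque Q is fed back as the load on the engine shaft while the thrust T drives the ship's surge equation, closing the loop between engine speed and vessel speed.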

  19. A Methodology for Evaluating the Fidelity of Ground-Based Flight Simulators

    NASA Technical Reports Server (NTRS)

    Zeyada, Y.; Hess, R. A.

    1999-01-01

    An analytical and experimental investigation was undertaken to model the manner in which pilots perceive and utilize visual, proprioceptive, and vestibular cues in a ground-based flight simulator. The study was part of a larger research effort which has the creation of a methodology for determining flight simulator fidelity requirements as its ultimate goal. The study utilized a closed-loop feedback structure of the pilot/simulator system which included the pilot, the cockpit inceptor, the dynamics of the simulated vehicle and the motion system. With the exception of time delays which accrued in visual scene production in the simulator, visual scene effects were not included in this study. The NASA Ames Vertical Motion Simulator was used in a simple, single-degree-of-freedom rotorcraft bob-up/down maneuver. Pilot/vehicle analysis and fuzzy-inference identification were employed to study the changes in fidelity which occurred as the characteristics of the motion system were varied over five configurations. The data from three of the five pilots that participated in the experimental study were analyzed in the fuzzy-inference identification. Results indicate that both the analytical pilot/vehicle analysis and the fuzzy-inference identification can be used to reflect changes in simulator fidelity for the task examined.

  1. Simulating household travel study data in metropolitan areas : technical summary.

    DOT National Transportation Integrated Search

    2001-05-01

    The objectives of this study are: 1. To develop and validate a methodology for MPOs to synthesize household travel survey data using local sociodemographic characteristics in conjunction with a national source of simulated travel data. 2. To evalu...

  2. Revised Planning Methodology For Signalized Intersections And Operational Analysis Of Exclusive Left-Turn Lanes, Part-II: Models And Procedures (Final Report)

    DOT National Transportation Integrated Search

    1996-04-01

    This report also describes the procedures for direct estimation of intersection capacity with simulation, including a set of rigorous statistical tests for simulation parameter calibration from field data.

  3. Numerical Simulation of High-Speed Turbulent Reacting Flows

    NASA Technical Reports Server (NTRS)

    Givi, P.; Taulbee, D. B.; Madnia, C. K.; Jaberi, F. A.; Colucci, P. J.; Gicquel, L. Y. M.; Adumitroaie, V.; James, S.

    1999-01-01

    The objectives of this research are: (1) to develop and implement a new methodology for large eddy simulation (LES) of high-speed reacting turbulent flows; and (2) to develop algebraic turbulence closures for statistical description of chemically reacting turbulent flows.

  4. Simulation analysis of route diversion strategies for freeway incident management : final report.

    DOT National Transportation Integrated Search

    1995-02-01

    The purpose of this project was to investigate whether simulation models could : be used as decision aids for defining traffic diversion strategies for effective : incident management. A methodology was developed for using such a model to : determine...

  5. Forecasting Maintenance Shortcomings of a Planned Equipment Density Listing in Support of Expeditionary Missions

    DTIC Science & Technology

    2017-06-01

    This thesis examines the methodology used to build the class IX block embarked on ship prior to deployment. The class IX block is defined as a repository... Model and simulation outputs are evaluated against historical data, and recommendations are provided on improving the methodology implemented in... improving the level of organic support available to deployed units.

  6. Methodology for application of field rainfall simulator to revise c-factor database for conditions of the Czech Republic

    NASA Astrophysics Data System (ADS)

    Neumann, Martin; Dostál, Tomáš; Krása, Josef; Kavka, Petr; Davidová, Tereza; Brant, Václav; Kroulík, Milan; Mistr, Martin; Novotný, Ivan

    2016-04-01

    The presentation will introduce a methodology for determining the crop and cover management factor (C-factor) for the Universal Soil Loss Equation (USLE) using a field rainfall simulator. The aim of the project is to determine the C-factor value for the different phenophases of the main crops of the Central European region, while also taking into account different agrotechnical methods. By using the field rainfall simulator, it is possible to perform the measurements in specific phenophases, which is otherwise difficult to execute due to the variability and fortuity of natural rainfall. Due to the number of measurements needed, two identical simulators will be used, operated by two independent teams, with a coordinated methodology. The methodology will mainly specify the length of simulation, the rainfall intensity, and the sampling technique. The presentation includes a more detailed account of the methods selected. Due to the wide range of variable crops and soils, it is not possible to execute the measurements for all possible combinations. We therefore decided to perform the measurements for previously selected combinations of soils, crops and agrotechnologies that are the most common in the Czech Republic. During the experiments, the volume of surface runoff and the amount of sediment will be measured in their temporal distribution, as well as several other important parameters. The key values of the 3D matrix of combinations of crop, agrotechnique and soil will be determined experimentally; the remaining values will be determined by interpolation or by model analogy. There are several methods used for C-factor calculation from measured experimental data, some of which are not suitable for the type of data gathered. The presentation will discuss the benefits and drawbacks of these methods, as well as the final design of the method used. The problems concerning the selection of a relevant measurement method, as well as the final method of simulation and C-factor determination for the gathered data, will be discussed in more detail. The presentation was supported by research projects QJ1530181 and SGS14/180/OHK1/3T/11.
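    In USLE practice, the C-factor is commonly computed as the erosivity-weighted mean of soil-loss ratios (loss from the cropped plot divided by loss from continuous bare fallow) over crop stages; rainfall-simulator runs supply those per-stage ratios. A minimal sketch of that weighting, with illustrative numbers of our own:

```python
def c_factor(slr_by_stage, erosivity_fraction):
    """USLE cover-management factor: the erosivity-weighted mean of
    per-stage soil-loss ratios (cropped-plot loss / bare-fallow loss).
    `erosivity_fraction` gives each stage's share of annual rainfall
    erosivity and must sum to 1."""
    if abs(sum(erosivity_fraction) - 1.0) > 1e-9:
        raise ValueError("erosivity fractions must sum to 1")
    return sum(s * w for s, w in zip(slr_by_stage, erosivity_fraction))
```

    A stage with a high soil-loss ratio but little of the annual erosivity contributes little to C, which is why the timing of simulator runs against phenophases matters.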

  7. Educational Strategies for Teaching Basic Family Dynamics to Non-Family Therapists.

    ERIC Educational Resources Information Center

    Merkel, William T.; Rudisill, John R.

    1985-01-01

    Presents six-part methodology for teaching basic concepts of family systems to non-family therapists and describes application of methodology to teach primary care physicians. Explains use of simulated encounters in which a physically symptomatic adolescent is interviewed alone, then with his mother, then with his whole family. (Author/NRB)

  8. Level-Set Methodology on Adaptive Octree Grids

    NASA Astrophysics Data System (ADS)

    Gibou, Frederic; Guittet, Arthur; Mirzadeh, Mohammad; Theillard, Maxime

    2017-11-01

    Numerical simulations of interfacial problems in fluids require a methodology capable of tracking surfaces that can undergo changes in topology and capable of imposing jump boundary conditions in a sharp manner. In this talk, we will discuss recent advances in the level-set framework, in particular one that is based on adaptive grids.
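    The core of any level-set method is advecting an implicit function whose zero contour is the interface. A deliberately simple 1D, first-order upwind sketch of that transport step (uniform grid, constant velocity; the adaptive-octree machinery of the talk is well beyond this illustration):

```python
def advect_level_set(phi, vel, dx, dt):
    """One first-order upwind step of phi_t + v * phi_x = 0 on a uniform
    1D grid; the zero crossing of phi marks the interface. Boundary
    values are held fixed for simplicity."""
    n = len(phi)
    new = phi[:]
    for i in range(1, n - 1):
        # upwind difference: look in the direction the information comes from
        dpdx = (phi[i] - phi[i - 1]) / dx if vel > 0 else (phi[i + 1] - phi[i]) / dx
        new[i] = phi[i] - dt * vel * dpdx
    return new
```

    Because the interface is implicit, merging or splitting of the zero set (topology change) requires no special handling, which is the main appeal the abstract alludes to.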

  9. The Research and Evaluation of Serious Games: Toward a Comprehensive Methodology

    ERIC Educational Resources Information Center

    Mayer, Igor; Bekebrede, Geertje; Harteveld, Casper; Warmelink, Harald; Zhou, Qiqi; van Ruijven, Theo; Lo, Julia; Kortmann, Rens; Wenzler, Ivo

    2014-01-01

    The authors present the methodological background to and underlying research design of an ongoing research project on the scientific evaluation of serious games and/or computer-based simulation games (SGs) for advanced learning. The main research questions are: (1) what are the requirements and design principles for a comprehensive social…

  10. Interfacing Network Simulations and Empirical Data

    DTIC Science & Technology

    2009-05-01

    ...contraceptive innovations in Cameroon. He found that real-world adoption rates did not follow simulation models when the network relationships were... Analysis of the Coevolution of Adolescents' Friendship Networks, Taste in Music, and Alcohol Consumption. Methodology, 2: 48-56. Tichy, N.M., Tushman...

  11. Modeling and Simulation in Support of Testing and Evaluation

    DTIC Science & Technology

    1997-03-01

    ...contains standardized automated test methodology, synthetic stimuli, and environments based on TECOM Ground Truth data and physics. The VPG is a distributed... Systems Acquisition Management (FSAM) coursebook, Defense Systems Management College, January 1994. Crocker, Charles M., “Application of the Simulation...

  12. Computational simulation of coupled material degradation processes for probabilistic lifetime strength of aerospace materials

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Bast, Callie C.

    1992-01-01

    The research included ongoing development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation has been randomized and is included in the computer program PROMISS. Also included in the research is the development of methodology to calibrate the above-described constitutive equation using actual experimental materials data together with linear regression of that data, thereby predicting values of the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program PROMISC. Actual experimental materials data were obtained from the open literature for materials typically of interest to those studying aerospace propulsion system components. Material data for Inconel 718 were analyzed using the developed methodology.

  13. An omnibus test for family-based association studies with multiple SNPs and multiple phenotypes.

    PubMed

    Lasky-Su, Jessica; Murphy, Amy; McQueen, Matthew B; Weiss, Scott; Lange, Christoph

    2010-06-01

    We propose an omnibus family-based association test (MFBAT) that can be applied to multiple markers and multiple phenotypes and that has only one degree of freedom. The proposed test statistic extends current FBAT methodology to incorporate multiple markers as well as multiple phenotypes. Using simulation studies, power estimates for the proposed methodology are compared with those of standard methodologies. On the basis of these simulations, we find that MFBAT substantially outperforms other methods, including haplotypic approaches and performing multiple tests with individual single-nucleotide polymorphisms (SNPs) and single phenotypes. The practical relevance of the approach is illustrated by an application to asthma, in which SNP/phenotype combinations are identified that reach overall significance and would not have been identified using other approaches. This methodology is directly applicable to cases in which there are multiple SNPs, such as candidate gene studies; cases in which there are multiple phenotypes, such as expression data; and cases in which there are multiple phenotypes and genotypes, such as genome-wide association studies that incorporate expression profiles as phenotypes. This program is available in the PBAT analysis package.

  14. Model coupling methodology for thermo-hydro-mechanical-chemical numerical simulations in integrated assessment of long-term site behaviour

    NASA Astrophysics Data System (ADS)

    Kempka, Thomas; De Lucia, Marco; Kühn, Michael

    2015-04-01

    The integrated assessment of long-term site behaviour taking into account a high spatial resolution at reservoir scale requires a sophisticated methodology to represent coupled thermal, hydraulic, mechanical and chemical processes of relevance. Our coupling methodology considers the time-dependent occurrence and significance of multi-phase flow processes, mechanical effects and geochemical reactions (Kempka et al., 2014). Hereby, a simplified hydro-chemical coupling procedure was developed (Klein et al., 2013) and validated against fully coupled hydro-chemical simulations (De Lucia et al., 2015). The numerical simulation results elaborated for the pilot site Ketzin demonstrate that mechanical reservoir, caprock and fault integrity are maintained during the time of operation and that after 10,000 years CO2 dissolution is the dominating trapping mechanism and mineralization occurs on the order of 10 % to 25 % with negligible changes to porosity and permeability. De Lucia, M., Kempka, T., Kühn, M. A coupling alternative to reactive transport simulations for long-term prediction of chemical reactions in heterogeneous CO2 storage systems (2014) Geosci Model Dev Discuss 7:6217-6261. doi:10.5194/gmdd-7-6217-2014. Kempka, T., De Lucia, M., Kühn, M. Geomechanical integrity verification and mineral trapping quantification for the Ketzin CO2 storage pilot site by coupled numerical simulations (2014) Energy Procedia 63:3330-3338, doi:10.1016/j.egypro.2014.11.361. Klein E, De Lucia M, Kempka T, Kühn M. Evaluation of longterm mineral trapping at the Ketzin pilot site for CO2 storage: an integrative approach using geo-chemical modelling and reservoir simulation. Int J Greenh Gas Con 2013; 19:720-730. doi:10.1016/j.ijggc.2013.05.014.

  15. A Hybrid Coarse-graining Approach for Lipid Bilayers at Large Length and Time Scales

    PubMed Central

    Ayton, Gary S.; Voth, Gregory A.

    2009-01-01

    A hybrid analytic-systematic (HAS) coarse-grained (CG) lipid model is developed and employed in a large-scale simulation of a liposome. The methodology is termed hybrid analytic-systematic as one component of the interaction between CG sites is variationally determined from the multiscale coarse-graining (MS-CG) methodology, while the remaining component utilizes an analytic potential. The systematic component models the in-plane center of mass interaction of the lipids as determined from an atomistic-level MD simulation of a bilayer. The analytic component is based on the well known Gay-Berne ellipsoid of revolution liquid crystal model, and is designed to model the highly anisotropic interactions at a highly coarse-grained level. The HAS CG approach is the first step in an “aggressive” CG methodology designed to model multi-component biological membranes at very large length and timescales. PMID:19281167

  16. C-Based Design Methodology and Topological Change for an Indian Agricultural Tractor Component

    NASA Astrophysics Data System (ADS)

    Matta, Anil Kumar; Raju, D. Ranga; Suman, K. N. S.; Kranthi, A. S.

    2018-06-01

    The failure of tractor components and their replacement have now become very common in India because of recycling, resale, and duplication. To overcome the problem of failure, we propose a design methodology for topological change co-simulated with software. In the proposed design methodology, the designer checks P_axial, P_cr, P_failure and τ by hand calculations, from which refined topological changes of the R.S. Arm are derived. We explain several techniques employed in the component for the reduction and removal of rib material to change the center of gravity and the centroid point, using SystemC for mixed-level simulation and faster topological changes. The design process in SystemC can be compiled and executed with the TURBO C7 software. The modified component is developed in Pro/E and analyzed in ANSYS. The topologically changed component, with a 120 × 4.75 × 32.5 mm slot at the center, showed greater effectiveness than the original component.
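    The P_cr hand check mentioned above is, for a slender column, the classical Euler buckling load P_cr = π²·E·I/(K·L)². A minimal sketch of that calculation (values in the test are illustrative, not the tractor component's):

```python
import math

def euler_critical_load(e_mod, i_area, length, k=1.0):
    """Euler buckling load P_cr = pi^2 * E * I / (K*L)^2 for a slender
    column: E in Pa, I (second moment of area) in m^4, L in m, and K
    the effective-length factor (1.0 for pinned-pinned ends)."""
    return math.pi ** 2 * e_mod * i_area / (k * length) ** 2
```

    Doubling the effective-length factor K quarters the critical load, which is why end fixity dominates slender-member design checks.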

  17. Evaluation of a methodology for model identification in the time domain

    NASA Technical Reports Server (NTRS)

    Beck, R. T.; Beck, J. L.

    1988-01-01

    A model identification methodology for structural dynamics has been applied to simulated vibrational data as a first step in evaluating its accuracy. The evaluation has taken into account a wide variety of factors which affect the accuracy of the procedure. The effects of each of these factors were observed in both the response time histories and the estimates of the parameters of the model by comparing them with the exact values of the system. Each factor was varied independently but combinations of these have also been considered in an effort to simulate real situations. The results of the tests have shown that for the chain model, the procedure yields robust estimates of the stiffness parameters under the conditions studied whenever uniqueness is ensured. When inaccuracies occur in the results, they are intimately related to non-uniqueness conditions inherent in the inverse problem and not to shortcomings in the methodology.
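    The stiffness-identification idea can be reduced to its simplest possible instance: for a single-degree-of-freedom oscillator, each measured natural period T yields a stiffness estimate k = m·(2π/T)². This toy sketch (our own, far simpler than the chain-model inverse problem in the paper) illustrates the estimate and its averaging over records:

```python
import math

def identify_stiffness(mass, periods):
    """Estimate the stiffness of a single-DOF oscillator from measured
    natural periods: omega = 2*pi/T, k = m*omega^2, averaged over the
    available records."""
    ks = [mass * (2.0 * math.pi / t) ** 2 for t in periods]
    return sum(ks) / len(ks)
```

    In multi-DOF chain models the analogous step inverts modal data for a stiffness vector, and non-uniqueness of that inverse map is exactly the failure mode the abstract highlights.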

  18. Bayesian Inference on Proportional Elections

    PubMed Central

    Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio

    2015-01-01

    Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee the candidate with the most percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied on proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology to answer the probability that a given party will have representation on the chamber of deputies was developed. Inferences were made on a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied on data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software. PMID:25786259
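    The Monte Carlo idea above can be sketched with a simple highest-averages seat allocation: simulate elections by sampling voters, allocate seats, and count how often the party of interest wins at least one seat. We use a plain D'Hondt rule here as an illustrative stand-in, not a faithful implementation of the Brazilian seat-distribution law:

```python
import random

def dhondt(votes, seats):
    """Allocate `seats` by the D'Hondt highest-averages rule: repeatedly
    give a seat to the party with the largest quotient v / (s + 1)."""
    alloc = [0] * len(votes)
    for _ in range(seats):
        quot = [v / (a + 1) for v, a in zip(votes, alloc)]
        alloc[quot.index(max(quot))] += 1
    return alloc

def prob_representation(shares, seats, voters, party, trials=2000, seed=1):
    """Monte Carlo estimate of P(`party` wins >= 1 seat) when each voter
    independently picks a party with the given probabilities."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        counts = [0] * len(shares)
        for _ in range(voters):
            r, acc = rng.random(), 0.0
            for i, s in enumerate(shares):
                acc += s
                if r < acc:
                    counts[i] += 1
                    break
            else:
                counts[-1] += 1
        if dhondt(counts, seats)[party] >= 1:
            hits += 1
    return hits / trials
```

    The estimated probability, not the raw vote share, is the quantity the paper argues pollsters should report for proportional systems.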

  19. Bayesian inference on proportional elections.

    PubMed

    Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio

    2015-01-01

    Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee the candidate with the most percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied on proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology to answer the probability that a given party will have representation on the chamber of deputies was developed. Inferences were made on a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied on data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software.

  20. An immersed boundary method for modeling a dirty geometry data

    NASA Astrophysics Data System (ADS)

    Onishi, Keiji; Tsubokura, Makoto

    2017-11-01

    We present a robust, fast, and low-preparation-cost immersed boundary method (IBM) for simulating incompressible high-Reynolds-number flow around highly complex geometries. The method is achieved by dispersing the momentum through an axial linear projection and by an approximate-domain assumption that satisfies mass conservation in the cells containing the wall. This methodology has been verified against analytical theory and wind tunnel experiment data. Next, we simulate flow around a rotating object and demonstrate the applicability of the methodology to moving-geometry problems. The methodology shows promise as a way of obtaining quick solutions on next-generation large-scale supercomputers. This research was supported by MEXT as ``Priority Issue on Post-K computer'' (Development of innovative design and production processes) and used computational resources of the K computer provided by the RIKEN Advanced Institute for Computational Science.

  1. A Multi-Objective Advanced Design Methodology of Composite Beam-to-Column Joints Subjected to Seismic and Fire Loads

    NASA Astrophysics Data System (ADS)

    Pucinotti, Raffaele; Ferrario, Fabio; Bursi, Oreste S.

    2008-07-01

    A multi-objective advanced design methodology dealing with seismic actions followed by fire on steel-concrete composite full-strength joints with concrete-filled tubes is proposed in this paper. The specimens were designed in detail in order to exhibit a suitable fire behaviour after a severe earthquake. The major aspects of the cyclic behaviour of composite joints are presented and commented upon. The data obtained from monotonic and cyclic experimental tests have been used to calibrate a model of the joint in order to perform seismic simulations on several moment-resisting frames. A hysteretic law was used to take into account the seismic degradation of the joints. Finally, fire tests were conducted with the objective of evaluating the fire resistance of connections already damaged by an earthquake. The experimental activity, together with FE simulation, demonstrated the adequacy of the advanced design methodology.

  2. C-Based Design Methodology and Topological Change for an Indian Agricultural Tractor Component

    NASA Astrophysics Data System (ADS)

    Matta, Anil Kumar; Raju, D. Ranga; Suman, K. N. S.; Kranthi, A. S.

    2018-02-01

    The failure and replacement of tractor components has become very common in India because of recycling, resale, and duplication. To overcome the problem of failure, we propose a design methodology for topological change based on co-simulation with software tools. In the proposed design methodology, the designer checks Paxial, Pcr, Pfailure, and τ by hand calculation, from which refined topological changes of the R.S. Arm are derived. We explain several techniques employed to reduce and remove rib material in the component, shifting its center of gravity and centroid, using system C for mixed-level simulation and faster topological changes. The design process in system C can be compiled and executed with the software TURBO C7. The modified component is developed in Pro/E and analyzed in ANSYS. The topologically changed component, with a 120 × 4.75 × 32.5 mm slot at the center, showed greater effectiveness than the original component.

  3. A Geant4 simulation of the depth dose percentage in brain tumors treatments using protons and carbon ions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diaz, José A. M., E-mail: joadiazme@unal.edu.co; Torres, D. A., E-mail: datorresg@unal.edu.co

    2016-07-07

    The deposited energy and dose distributions of proton and carbon-ion beams in a head are simulated using the free tool package Geant4 and the data analysis package ROOT (C++). The present work shows a methodology for understanding the microscopic processes occurring in a session of hadron therapy using advanced simulation tools.

  4. A Methodology to Assess UrbanSim Scenarios

    DTIC Science & Technology

    2012-09-01

    Education LOE – Line of Effort MMOG – Massively Multiplayer Online Game MC3 – Maneuver Captain’s Career Course MSCCC – Maneuver Support...augmented reality simulations, increased automation and artificial intelligence simulation, and massively multiplayer online games (MMOG), among...distribution is unlimited 12b. DISTRIBUTION CODE 13. ABSTRACT (maximum 200 words) Turn-based strategy games and simulations are vital tools for military

  5. Parallel methodology to capture cyclic variability in motored engines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ameen, Muhsin M.; Yang, Xiaofeng; Kuo, Tang-Wei

    2016-07-28

    Numerical prediction of cycle-to-cycle variability (CCV) in SI engines is extremely challenging for two key reasons: (i) high-fidelity methods such as large eddy simulation (LES) are required to accurately capture the in-cylinder turbulent flowfield, and (ii) CCV is experienced over long timescales, so simulations must be performed for hundreds of consecutive cycles. In this study, a new methodology is proposed to decompose this long time-scale problem into several shorter time-scale problems, which can considerably reduce the computational time without sacrificing the fidelity of the simulations. The strategy is to perform multiple single-cycle simulations in parallel by effectively perturbing simulation parameters such as the initial and boundary conditions. It is shown that by perturbing the initial velocity field based on the intensity of the in-cylinder turbulence, the mean and variance of the in-cylinder flowfield are captured reasonably well. Adding perturbations to the initial pressure field and the boundary pressure improves the predictions. This new approach gives accurate predictions of the flowfield statistics in less than one-tenth of the time required by the conventional approach of simulating consecutive engine cycles.
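The parallel-ensemble idea can be caricatured in a few lines: instead of marching one simulation through hundreds of consecutive cycles, many independent single-cycle runs are launched with initial conditions perturbed according to the turbulence intensity, and the CCV statistics are read off the ensemble. The toy "cycle" model and every number below are invented for illustration; a real application would replace `engine_cycle` with an LES run.

```python
import random
import statistics

def engine_cycle(u0, seed):
    """Toy single-cycle surrogate: the peak quantity depends on the initial
    velocity amplitude plus an in-cycle turbulent fluctuation."""
    rng = random.Random(seed)
    turbulence = rng.gauss(0.0, 0.05 * u0)  # in-cylinder turbulence
    return 1.8 * u0 + turbulence            # stand-in for e.g. peak pressure

def ensemble_ccv(u_mean, turb_intensity, n_cycles=200, seed=42):
    """Run independent 'cycles' whose initial fields are perturbed by the
    turbulence intensity, instead of consecutive serial cycles."""
    rng = random.Random(seed)
    peaks = [engine_cycle(u_mean + rng.gauss(0.0, turb_intensity * u_mean), i)
             for i in range(n_cycles)]
    return statistics.mean(peaks), statistics.stdev(peaks)

# Hypothetical mean initial velocity of 10 with 10% turbulence intensity.
mean_peak, std_peak = ensemble_ccv(10.0, 0.1)
```

In practice the cycles are truly independent jobs, so they parallelize trivially; the serial-time cost collapses to that of one cycle.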

  6. A negotiation methodology and its application to cogeneration planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, S.M.; Liu, C.C.; Luu, S.

    Power system planning has become a complex process in utilities today. This paper presents a methodology for integrated planning with multiple objectives. The methodology uses a graphical representation (Goal-Decision Network) to capture the planning knowledge. The planning process is viewed as a negotiation process that applies three negotiation operators to search for beneficial decisions in a GDN. Also, the negotiation framework is applied to the problem of planning for cogeneration interconnection. The simulation results are presented to illustrate the cogeneration planning process.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dionne, B.; Tzanos, C. P.

    To support the safety analyses required for the conversion of the Belgian Reactor 2 (BR2) from highly enriched uranium (HEU) to low-enriched uranium (LEU) fuel, the simulation of a number of loss-of-flow tests, with or without loss of pressure, has been undertaken. These tests were performed at BR2 in 1963 and used instrumented fuel assemblies (FAs) with thermocouples (TCs) embedded in the cladding, as well as probes to measure FA power on the basis of the coolant temperature rise. The availability of experimental data for these tests offers an opportunity to better establish the credibility of the RELAP5-3D model and methodology used in the conversion analysis. Preliminary analyses showed that the conservative power distributions used historically in the BR2 RELAP model resulted in a significant overestimation of the peak cladding temperature during the transients. It was therefore concluded that better estimates of the steady-state and decay power distributions were needed to accurately predict the cladding temperatures measured during the tests and establish the credibility of the RELAP model and methodology. The new approach ('best estimate' methodology) uses the MCNP5, ORIGEN-2, and BERYL codes to obtain steady-state and decay power distributions for the BR2 core during the tests A/400/1, C/600/3, and F/400/1. This methodology can be easily extended to simulate any BR2 core configuration. Comparisons with measured peak cladding temperatures showed much better agreement when power distributions obtained with the new methodology are used.

  8. Model methodology for estimating pesticide concentration extremes based on sparse monitoring data

    USGS Publications Warehouse

    Vecchia, Aldo V.

    2018-03-22

    This report describes a new methodology for using sparse (weekly or less frequent observations) and potentially highly censored pesticide monitoring data to simulate daily pesticide concentrations and associated quantities used for acute and chronic exposure assessments, such as the annual maximum daily concentration. The new methodology is based on a statistical model that expresses log-transformed daily pesticide concentration in terms of a seasonal wave, flow-related variability, long-term trend, and serially correlated errors. Methods are described for estimating the model parameters, generating conditional simulations of daily pesticide concentration given sparse (weekly or less frequent) and potentially highly censored observations, and estimating concentration extremes based on the conditional simulations. The model can be applied to datasets with as few as 3 years of record, as few as 30 total observations, and as few as 10 uncensored observations. The model was applied to atrazine, carbaryl, chlorpyrifos, and fipronil data for U.S. Geological Survey pesticide sampling sites with sufficient data for applying the model. A total of 112 sites were analyzed for atrazine, 38 for carbaryl, 34 for chlorpyrifos, and 33 for fipronil. The results are summarized in this report; and, R functions, described in this report and provided in an accompanying model archive, can be used to fit the model parameters and generate conditional simulations of daily concentrations for use in investigations involving pesticide exposure risk and uncertainty.
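A hedged sketch of the model's generative form described above: daily log concentration as a seasonal wave plus a long-term trend plus serially correlated (AR(1)) errors. The flow-related term is omitted for brevity, and all parameter values are hypothetical; the report's R functions implement the actual fitting and conditional simulation.

```python
import math
import random

def simulate_log_conc(n_days, mean=-1.0, amp=1.2, phase=100, trend=-0.0002,
                      rho=0.95, sigma=0.3, seed=7):
    """Daily log10 concentration = mean + seasonal wave + linear trend
    + AR(1) serially correlated errors (flow-related term omitted)."""
    rng = random.Random(seed)
    eps = 0.0
    out = []
    for t in range(n_days):
        wave = amp * math.sin(2.0 * math.pi * (t - phase) / 365.25)
        # AR(1) innovation scaled so the stationary error variance is sigma^2.
        eps = rho * eps + rng.gauss(0.0, sigma * math.sqrt(1.0 - rho**2))
        out.append(mean + wave + trend * t + eps)
    return out

logc = simulate_log_conc(3 * 365)
# An acute-exposure metric: annual maximum daily concentration in year 1.
annual_max = max(10.0**v for v in logc[:365])
```

Repeating the simulation many times, conditioned on the sparse observations, yields the distribution of extremes such as `annual_max`.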

  9. A methodology for identification and control of electro-mechanical actuators

    PubMed Central

    Tutunji, Tarek A.; Saleem, Ashraf

    2015-01-01

    Mechatronic systems are fully integrated engineering systems composed of mechanical, electronic, and computer control sub-systems. These integrated systems use electro-mechanical actuators to produce the required motion; therefore, the design of appropriate controllers for these actuators is an essential step in mechatronic system design. In this paper, a three-stage methodology for real-time identification and control of electro-mechanical actuator plants is presented, tested, and validated. First, identification models are constructed from experimental data to approximate the plants’ response. Second, the identified model is used in a simulation environment to design a suitable controller. Finally, the designed controller is applied and tested on the real plant in a Hardware-in-the-Loop (HIL) environment. The described three-stage methodology provides the following practical contributions: • Establishes an easy-to-follow methodology for controller design of electro-mechanical actuators. • Combines off-line and on-line controller design for practical performance. • Modifies the HIL concept by using physical plants with computer control (rather than virtual plants with physical controllers). Simulated and experimental results for two case studies, an induction motor and a vehicle drive system, are presented in order to validate the proposed methodology. These results showed that electro-mechanical actuators can be identified and controlled using an easy-to-duplicate and flexible procedure. PMID:26150992

  10. A methodology for identification and control of electro-mechanical actuators.

    PubMed

    Tutunji, Tarek A; Saleem, Ashraf

    2015-01-01

    Mechatronic systems are fully integrated engineering systems composed of mechanical, electronic, and computer control sub-systems. These integrated systems use electro-mechanical actuators to produce the required motion; therefore, the design of appropriate controllers for these actuators is an essential step in mechatronic system design. In this paper, a three-stage methodology for real-time identification and control of electro-mechanical actuator plants is presented, tested, and validated. First, identification models are constructed from experimental data to approximate the plants' response. Second, the identified model is used in a simulation environment to design a suitable controller. Finally, the designed controller is applied and tested on the real plant in a Hardware-in-the-Loop (HIL) environment. The described three-stage methodology provides the following practical contributions: • Establishes an easy-to-follow methodology for controller design of electro-mechanical actuators. • Combines off-line and on-line controller design for practical performance. • Modifies the HIL concept by using physical plants with computer control (rather than virtual plants with physical controllers). Simulated and experimental results for two case studies, an induction motor and a vehicle drive system, are presented in order to validate the proposed methodology. These results showed that electro-mechanical actuators can be identified and controlled using an easy-to-duplicate and flexible procedure.
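The first two stages can be miniaturized as follows, assuming (purely for illustration) a first-order discrete-time plant y[k+1] = a*y[k] + b*u[k]: stage 1 identifies (a, b) from input/output data by least squares, and stage 2 simulates the identified model under a candidate proportional controller. The third, HIL, stage has no software stand-in here, and the "experimental" data are synthesized from a hypothetical true plant.

```python
import random

def identify_first_order(u, y):
    """Stage 1: least-squares fit of y[k+1] = a*y[k] + b*u[k]
    (normal equations for the 2-parameter regression)."""
    sYY = sYU = sUU = sYy1 = sUy1 = 0.0
    for k in range(len(y) - 1):
        sYY += y[k] * y[k]
        sYU += y[k] * u[k]
        sUU += u[k] * u[k]
        sYy1 += y[k] * y[k + 1]
        sUy1 += u[k] * y[k + 1]
    det = sYY * sUU - sYU * sYU
    a = (sYy1 * sUU - sUy1 * sYU) / det
    b = (sUy1 * sYY - sYy1 * sYU) / det
    return a, b

# Synthesize noisy input/output data from a hypothetical "true" plant.
rng = random.Random(0)
a_true, b_true = 0.9, 0.5
u = [rng.uniform(-1.0, 1.0) for _ in range(500)]
y = [0.0]
for k in range(499):
    y.append(a_true * y[k] + b_true * u[k] + rng.gauss(0.0, 0.01))

a_hat, b_hat = identify_first_order(u, y)

# Stage 2: simulate the identified model under a proportional controller.
kp, r = 0.8, 1.0   # candidate gain and setpoint
yk = 0.0
for _ in range(100):
    yk = a_hat * yk + b_hat * kp * (r - yk)
```

The closed loop settles to b*kp*r / (1 - a + b*kp), so the simulated response lets the gain be tuned before the controller ever touches the physical actuator.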

  11. A novel simulation methodology merging source-sink dynamics and landscape connectivity

    EPA Science Inventory

    Source-sink dynamics are an emergent property of complex species-landscape interactions. This study explores the patterns of source and sink behavior that become established across a large landscape, using a simulation model for the northern spotted owl (Strix occidentalis cauri...

  12. Safety of railroad passenger vehicle dynamics : OMNISIM simulation and test correlations for passenger rail cars

    DOT National Transportation Integrated Search

    2002-07-01

    The purpose of the work is to validate the safety assessment methodology previously developed for passenger rail vehicle dynamics, which requires the application of simulation tools as well as testing of vehicles under different track scenarios. This...

  13. Developing a Simulated-Person Methodology Workshop: An Experiential Education Initiative for Educators and Simulators

    ERIC Educational Resources Information Center

    Peisachovich, Eva Hava; Nelles, L. J.; Johnson, Samantha; Nicholson, Laura; Gal, Raya; Kerr, Barbara; Celia, Popovic; Epstein, Iris; Da Silva, Celina

    2017-01-01

    Numerous forecasts suggest that professional-competence development depends on human encounters. Interaction between organizations, tasks, and individual providers influence human behaviour, affect organizations' or systems' performance, and are a key component of professional-competence development. Further, insufficient or ineffective…

  14. Residual stress investigation of via-last through-silicon via by polarized Raman spectroscopy measurement and finite element simulation

    NASA Astrophysics Data System (ADS)

    Feng, Wei; Watanabe, Naoya; Shimamoto, Haruo; Aoyagi, Masahiro; Kikuchi, Katsuya

    2018-07-01

    The residual stresses induced around through-silicon vias (TSVs) by the fabrication process are a major reliability concern. We propose a methodology to investigate the residual stress in a via-last TSV. First, radial and axial thermal stresses were measured by polarized Raman spectroscopy. The agreement between the simulated stress levels and the measured results validated the detailed simulation model. The validated simulation model was then used to study residual stress via element death/birth methods. The residual stress at room temperature concentrates at the passivation layers owing to the high fabrication process temperatures of 420 °C for the SiN film and 350 °C for the SiO2 films. In the Si substrate, high-level stress was observed near potential device locations, which requires attention to address reliability concerns in stress-sensitive devices. This methodology of residual stress analysis can be adopted to investigate residual stress in other devices.

  15. Modeling Negotiation by a Participatory Approach

    NASA Astrophysics Data System (ADS)

    Torii, Daisuke; Ishida, Toru; Bousquet, François

    In the participatory approach used by social scientists, role-playing games (RPGs) are effective for understanding the real thinking and behavior of stakeholders, but an RPG is not sufficient to handle a dynamic process like negotiation. In this study, a participatory simulation in which user-controlled avatars and autonomous agents coexist is introduced into the participatory approach for modeling negotiation. To establish a modeling methodology for negotiation, we tackled the following two issues. First, to enable domain experts to concentrate on interaction design for participatory simulation, we adopted an architecture in which an interaction layer controls agents, and defined three types of interaction descriptions (interaction protocol, interaction scenario, and avatar control scenario). Second, to enable domain experts and stakeholders to capitalize on participatory simulation, we established a four-step process for acquiring a negotiation model: 1) surveys of and interviews with stakeholders, 2) RPG, 3) interaction design, and 4) participatory simulation. Finally, we discuss our methodology through a case study of agricultural economics in northeast Thailand.

  16. Simulation of Ejecta Production and Mixing Process of Sn Sample under shock loading

    NASA Astrophysics Data System (ADS)

    Wang, Pei; Chen, Dawei; Sun, Haiquan; Ma, Dongjun

    2017-06-01

    Ejection may occur when a strong shock wave releases at the free surface of a metal: ejecta of high-speed particulate matter form and subsequently mix with the surrounding gas. Ejecta production and mixing remains one of the most difficult unresolved problems in shock physics, with many important engineering applications in implosion compression science. The present paper introduces a methodology for the theoretical modeling and numerical simulation of this complex ejection and mixing process. Ejecta production is decoupled from the particle mixing process, and the ejecta state is obtained by direct numerical simulation of the evolution of initial defects on the metal surface. The particle mixing process is then simulated and resolved by a two-phase gas-particle model that uses the aforementioned ejecta state as the initial condition. A preliminary ejecta experiment on a planar Sn metal sample has validated the feasibility of the proposed methodology.

  17. Simulation of thermomechanical fatigue in solder joints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fang, H.E.; Porter, V.L.; Fye, R.M.

    1997-12-31

    Thermomechanical fatigue (TMF) is a very complex phenomenon in electronic component systems and has been identified as one prominent degradation mechanism for surface mount solder joints in the stockpile. In order to precisely predict the TMF-related effects on the reliability of electronic components in weapons, a multi-level simulation methodology is being developed at Sandia National Laboratories. This methodology links simulation codes of continuum mechanics (JAS3D), microstructural mechanics (GLAD), and microstructural evolution (PARGRAIN) to treat the disparate length scales that exist between the macroscopic response of the component and the microstructural changes occurring in its constituent materials. JAS3D is used to predict strain/temperature distributions in the component due to environmental variable fluctuations. GLAD identifies damage initiation and accumulation in detail based on the spatial information provided by JAS3D. PARGRAIN simulates the changes of material microstructure, such as the heterogeneous coarsening in Sn-Pb solder, when the component's service environment varies.

  18. Automated chemical kinetic modeling via hybrid reactive molecular dynamics and quantum chemistry simulations.

    PubMed

    Döntgen, Malte; Schmalz, Felix; Kopp, Wassja A; Kröger, Leif C; Leonhard, Kai

    2018-06-13

    An automated scheme for obtaining chemical kinetic models from scratch using reactive molecular dynamics and quantum chemistry simulations is presented. This methodology combines the phase space sampling of reactive molecular dynamics with the thermochemistry and kinetics prediction capabilities of quantum mechanics. This scheme provides the NASA polynomial and modified Arrhenius equation parameters for all species and reactions that are observed during the simulation and supplies them in the ChemKin format. The ab initio level of theory for predictions is easily exchangeable and the presently used G3MP2 level of theory is found to reliably reproduce hydrogen and methane oxidation thermochemistry and kinetics data. Chemical kinetic models obtained with this approach are ready-to-use for, e.g., ignition delay time simulations, as shown for hydrogen combustion. The presented extension of the ChemTraYzer approach can be used as a basis for methodologically advancing chemical kinetic modeling schemes and as a black-box approach to generate chemical kinetic models.
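Two of the quantities such a scheme exports, modified Arrhenius rate parameters and NASA polynomial thermochemistry, evaluate as sketched below. The coefficients and parameter values shown are placeholders for illustration, not values from the paper.

```python
import math

R_GAS = 8.314462618  # J/(mol K)

def rate_constant(A, n, Ea, T):
    """Modified Arrhenius rate: k(T) = A * T**n * exp(-Ea / (R*T))."""
    return A * T**n * math.exp(-Ea / (R_GAS * T))

def nasa_cp_over_R(a, T):
    """Dimensionless heat capacity from the first five NASA-polynomial
    coefficients: cp/R = a1 + a2*T + a3*T^2 + a4*T^3 + a5*T^4."""
    return a[0] + a[1] * T + a[2] * T**2 + a[3] * T**3 + a[4] * T**4

# Hypothetical parameters: A in 1/s, Ea in J/mol, for a unimolecular step.
k_1000 = rate_constant(A=1.0e13, n=0.0, Ea=120e3, T=1000.0)
k_1500 = rate_constant(A=1.0e13, n=0.0, Ea=120e3, T=1500.0)
```

A chemical kinetic model in ChemKin format is essentially a large table of such (A, n, Ea) triples plus per-species polynomial coefficients, which is what makes the automated output directly usable in ignition delay simulations.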

  19. Analytical methodology for determination of helicopter IFR precision approach requirements. [pilot workload and acceptance level

    NASA Technical Reports Server (NTRS)

    Phatak, A. V.

    1980-01-01

    A systematic analytical approach to determining helicopter IFR precision approach requirements is formulated. The approach is based on the hypothesis that the pilot acceptance level, or opinion rating, of a given system is inversely related to the degree of pilot involvement in the control task. A nonlinear simulation of the helicopter approach-to-landing task, incorporating appropriate models for the UH-1H aircraft, the environmental disturbances, and the human pilot, was developed as a tool for evaluating the pilot acceptance hypothesis. The simulated pilot model is generic in nature and includes analytical representations of human information acquisition, processing, and control strategies. Simulation analyses in the flight director mode indicate that the pilot model used is reasonable. Results of the simulation are used to identify candidate pilot workload metrics and to test the well-known performance-workload relationship. A pilot acceptance analytical methodology is formulated as a basis for further investigation, development, and validation.

  20. Substructuring of multibody systems for numerical transfer path analysis in internal combustion engines

    NASA Astrophysics Data System (ADS)

    Acri, Antonio; Offner, Guenter; Nijman, Eugene; Rejlek, Jan

    2016-10-01

    Noise legislation and increasing customer demands determine the Noise Vibration and Harshness (NVH) development of modern commercial vehicles. In order to meet the stringent legislative requirements for vehicle noise emission, exact knowledge of all vehicle noise sources and their acoustic behavior is required. Transfer path analysis (TPA) is a fairly well established technique for estimating and ranking individual low-frequency noise or vibration contributions via the different transmission paths. Transmission paths from different sources to target points of interest and their contributions can be analyzed by applying TPA. This technique is applied to test measurements, which become available only on prototypes at the end of the design process. In order to overcome the limits of TPA, a numerical transfer path analysis methodology based on the substructuring of a multibody system is proposed in this paper. Being based on numerical simulation, this methodology can be performed starting from the first steps of the design process. The main target of the proposed methodology is to obtain information on the noise source contributions of a dynamic system, considering the possibility of multiple forces acting on the system simultaneously. The contributions of these forces are investigated with particular focus on distributed or moving forces. In this paper, the mathematical basics of the proposed methodology and its advantages in comparison with TPA are discussed. Then, a dynamic system is investigated with a combination of two methods. Being based on the dynamic substructuring (DS) of the investigated model, the proposed methodology requires the evaluation of the contact forces at interfaces, which are computed with a flexible multi-body dynamic (FMBD) simulation. Then, the structure-borne noise paths are computed with the wave based method (WBM).
    As an example application, a 4-cylinder engine is investigated and the proposed methodology is applied to the engine block. The aim is to obtain accurate and clear relationships between excitations and responses of the simulated dynamic system, analyzing the noise and vibration sources inside a car engine and showing the main advantages of a numerical methodology.
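The core bookkeeping of a transfer path analysis, whether measured or numerical, is a complex sum of per-path contributions, FRF times interface force, at a target point; ranking the contribution magnitudes identifies the dominant transmission path. A minimal sketch at a single frequency, with invented path names, FRFs, and forces:

```python
# Each path contributes H_i(w) * F_i(w) to the target response.
# FRFs and forces below are hypothetical single-frequency values.
paths = {
    "engine_mount_left":  (complex(0.8, 0.3), complex(120.0, 40.0)),   # (FRF, force)
    "engine_mount_right": (complex(0.5, -0.2), complex(90.0, 10.0)),
    "exhaust_hanger":     (complex(0.1, 0.05), complex(30.0, 5.0)),
}

contributions = {name: H * F for name, (H, F) in paths.items()}
total = sum(contributions.values())  # synthesized target response

# Rank paths by contribution magnitude (largest first).
ranking = sorted(contributions, key=lambda n: abs(contributions[n]), reverse=True)
```

In the numerical variant proposed in the paper, the forces come from the FMBD simulation and the FRFs from the WBM model, so this ranking is available before any prototype exists.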

  1. The 2017 Academic Emergency Medicine Consensus Conference: Catalyzing System Change Through Healthcare Simulation: Systems, Competency, and Outcomes.

    PubMed

    Bond, William F; Hui, Joshua; Fernandez, Rosemarie

    2018-02-01

    Over the past decade, emergency medicine (EM) took a lead role in healthcare simulation in part due to its demands for successful interprofessional and multidisciplinary collaboration, along with educational needs in a diverse array of cognitive and procedural skills. Simulation-based methodologies have the capacity to support training and research platforms that model micro-, meso-, and macrosystems of healthcare. To fully capitalize on the potential of simulation-based research to improve emergency healthcare delivery will require the application of rigorous methods from engineering, social science, and basic science disciplines. The Academic Emergency Medicine (AEM) Consensus Conference "Catalyzing System Change Through Healthcare Simulation: Systems, Competency, and Outcome" was conceived to foster discussion among experts in EM, engineering, and social sciences, focusing on key barriers and opportunities in simulation-based research. This executive summary describes the overall rationale for the conference, conference planning, and consensus-building approaches and outlines the focus of the eight breakout sessions. The consensus outcomes from each breakout session are summarized in proceedings papers published in this issue of Academic Emergency Medicine. Each paper provides an overview of methodologic and knowledge gaps in simulation research and identifies future research targets aimed at improving the safety and quality of healthcare. © 2017 by the Society for Academic Emergency Medicine.

  2. Technology CAD for integrated circuit fabrication technology development and technology transfer

    NASA Astrophysics Data System (ADS)

    Saha, Samar

    2003-07-01

    In this paper, systematic simulation-based methodologies for integrated circuit (IC) manufacturing technology development and technology transfer are presented. In technology development, technology computer-aided design (TCAD) tools are used to optimize the device and process parameters to develop a new generation of IC manufacturing technology by reverse engineering from the target product specifications, while in technology transfer to a manufacturing co-location, TCAD is used for process centering with respect to the high-volume manufacturing equipment of the target manufacturing facility. A quantitative model is developed to demonstrate the potential benefits of the simulation-based methodology in reducing the cycle time and cost of typical technology development and technology transfer projects over traditional practices. The strategy of predictive simulation to improve the effectiveness of a TCAD-based project is also discussed.

  3. Optimization as a Tool for Consistency Maintenance in Multi-Resolution Simulation

    NASA Technical Reports Server (NTRS)

    Drewry, Darren T; Reynolds, Jr , Paul F; Emanuel, William R

    2006-01-01

    The need for new approaches to the consistent simulation of related phenomena at multiple levels of resolution is great. While many fields of application would benefit from a complete and approachable solution to this problem, such solutions have proven extremely difficult. We present a multi-resolution simulation methodology that uses numerical optimization as a tool for maintaining external consistency between models of the same phenomena operating at different levels of temporal and/or spatial resolution. Our approach follows from previous work in the disparate fields of inverse modeling and spacetime constraint-based animation. As a case study, our methodology is applied to two environmental models of forest canopy processes that make overlapping predictions under unique sets of operating assumptions, and which execute at different temporal resolutions. Experimental results are presented and future directions are addressed.

  4. Low-Order Modeling of Internal Heat Transfer in Biomass Particle Pyrolysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiggins, Gavin M.; Ciesielski, Peter N.; Daw, C. Stuart

    2016-06-16

    We present a computationally efficient, one-dimensional simulation methodology for biomass particle heating under conditions typical of fast pyrolysis. Our methodology is based on identifying the rate-limiting geometric and structural factors for conductive heat transport in biomass particle models with realistic morphology to develop low-order approximations that behave appropriately. Comparisons of transient temperature trends predicted by our one-dimensional method with three-dimensional simulations of woody biomass particles reveal good agreement, if the appropriate equivalent spherical diameter and bulk thermal properties are used. We conclude that, for particle sizes and heating regimes typical of fast pyrolysis, it is possible to simulate biomass particle heating with reasonable accuracy and minimal computational overhead, even when variable size, aspherical shape, anisotropic conductivity, and complex, species-specific internal pore geometry are incorporated.
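In the small-Biot-number limit, the kind of low-order approximation described above reduces to lumped-capacitance heating of a sphere with the equivalent diameter. The sketch below uses hypothetical property values loosely in the range of woody biomass and is not the paper's model.

```python
import math

def heat_particle(d_eq, T0, T_gas, h, rho, cp, dt=1e-4, t_end=1.0):
    """Lumped-capacitance heating of a sphere of equivalent diameter d_eq:
    dT/dt = h*A*(T_gas - T) / (rho*V*cp), integrated by explicit Euler.
    Valid only for small Biot number (a deliberate low-order simplification)."""
    area = math.pi * d_eq**2
    vol = math.pi * d_eq**3 / 6.0
    T = T0
    for _ in range(int(t_end / dt)):
        T += dt * h * area * (T_gas - T) / (rho * vol * cp)
    return T

# Hypothetical fast-pyrolysis-like conditions: a 0.5 mm particle dropped
# into 773 K gas (all property values are illustrative assumptions).
T_final = heat_particle(d_eq=0.5e-3, T0=300.0, T_gas=773.0,
                        h=400.0, rho=540.0, cp=2300.0)
```

The time constant is rho*d_eq*cp/(6*h), about 0.26 s for these numbers, which is why the particle is nearly equilibrated after one second.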

  5. Kalman approach to accuracy management for interoperable heterogeneous model abstraction within an HLA-compliant simulation

    NASA Astrophysics Data System (ADS)

    Leskiw, Donald M.; Zhau, Junmei

    2000-06-01

    This paper reports on results from an ongoing project to develop methodologies for representing and managing multiple, concurrent levels of detail and enabling high performance computing using parallel arrays within distributed object-based simulation frameworks. At this time we present the methodology for representing and managing multiple, concurrent levels of detail and modeling accuracy by using a representation based on the Kalman approach for estimation. The Kalman System Model equations are used to represent model accuracy, Kalman Measurement Model equations provide transformations between heterogeneous levels of detail, and interoperability among disparate abstractions is provided using a form of the Kalman Update equations.
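In the scalar case, the Kalman measurement-update equations referenced above take the familiar form sketched below: the variance P quantifies model accuracy, and the gain K blends a coarse, high-variance model state with a finer representation's output treated as a measurement. This sketches only the estimation framing, not the HLA or parallel-array machinery.

```python
def kalman_update(x_prior, P_prior, z, R_meas):
    """Scalar Kalman measurement update.
    x_prior, P_prior: prior state estimate and its variance (model accuracy);
    z, R_meas: measurement (finer-model output) and its variance."""
    K = P_prior / (P_prior + R_meas)        # Kalman gain
    x_post = x_prior + K * (z - x_prior)    # corrected estimate
    P_post = (1.0 - K) * P_prior            # reduced (posterior) uncertainty
    return x_post, P_post

# A low-detail model's state (variance 4) corrected by a high-detail
# model's output (variance 1); the numbers are purely illustrative.
x, P = kalman_update(x_prior=10.0, P_prior=4.0, z=12.0, R_meas=1.0)
# K = 0.8, so x -> 11.6 and P -> 0.8: the estimate moves most of the way
# toward the more accurate representation, and uncertainty shrinks.
```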

  6. Artificial Intelligence-Based Models for the Optimal and Sustainable Use of Groundwater in Coastal Aquifers

    NASA Astrophysics Data System (ADS)

    Sreekanth, J.; Datta, Bithin

    2011-07-01

Overexploitation of coastal aquifers results in saltwater intrusion, and once intrusion occurs, remediating the contaminated aquifer involves huge cost and long-term measures. Hence, it is important to have strategies for the sustainable use of coastal aquifers. This study develops a methodology for the optimal management of aquifers prone to saltwater intrusion. A linked simulation-optimization-based management strategy is developed. The methodology uses genetic-programming-based models for simulating the aquifer processes, which are then linked to a multi-objective genetic algorithm to obtain optimal management strategies in terms of groundwater extraction from potential well locations in the aquifer.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radojcic, Riko; Nowak, Matt; Nakamoto, Mark

The status of the development of a Design-for-Stress simulation flow that captures the stress effects in packaged 3D-stacked Si products like integrated circuits (ICs) using advanced via-middle Through Si Via technology is outlined. The next set of challenges required to proliferate the methodology and to deploy it for making and dispositioning real Si product decisions are described here. These include the adoption and support of a Process Design Kit (PDK) that includes the relevant material properties, the development of stress simulation methodologies that operate at higher levels of abstraction in a design flow, and the development and adoption of suitable models required to make real product reliability decisions.

  8. An almost-parameter-free harmony search algorithm for groundwater pollution source identification.

    PubMed

    Jiang, Simin; Zhang, Yali; Wang, Pei; Zheng, Maohui

    2013-01-01

The spatiotemporal characterization of unknown sources of groundwater pollution is frequently encountered in environmental problems. This study adopts a simulation-optimization approach that combines a contaminant transport simulation model with a heuristic harmony search algorithm to identify unknown pollution sources. In the proposed methodology, an almost-parameter-free harmony search algorithm is developed. The performance of this methodology is evaluated on an illustrative groundwater pollution source identification problem, and the identified results indicate that the proposed almost-parameter-free harmony search algorithm-based optimization model can give satisfactory estimations, even when irregular geometry, erroneous monitoring data, and a shortage of prior information on potential source locations are considered.
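For orientation, a plain harmony search with fixed parameters is sketched below on a toy source-location misfit; the cited paper's contribution is precisely to adapt these parameters, so this illustrates the base algorithm, not the almost-parameter-free variant:

```python
import random

def harmony_search(obj, bounds, hms=10, hmcr=0.9, par=0.3, iters=2000, seed=1):
    """Basic harmony search minimizing obj over box bounds.
    HMCR/PAR are fixed here for illustration; the cited paper adapts them."""
    rng = random.Random(seed)
    mem = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    scores = [obj(h) for h in mem]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:                 # draw from harmony memory
                v = mem[rng.randrange(hms)][d]
                if rng.random() < par:              # pitch adjustment
                    v += rng.uniform(-1, 1) * 0.05 * (hi - lo)
            else:                                   # random consideration
                v = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, v)))
        s = obj(new)
        worst = max(range(hms), key=scores.__getitem__)
        if s < scores[worst]:                       # replace worst harmony
            mem[worst], scores[worst] = new, s
    best = min(range(hms), key=scores.__getitem__)
    return mem[best], scores[best]

# Toy source-identification flavor: recover a hypothetical (x, y) source
# location by minimizing squared misfit to synthetic observations.
truth = (3.0, 7.0)
obj = lambda p: (p[0] - truth[0])**2 + (p[1] - truth[1])**2
best, err = harmony_search(obj, [(0, 10), (0, 10)])
```

In the real problem the objective would wrap a contaminant transport simulation run, which is what makes the simulation-optimization linkage expensive.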

  9. Scalable tuning of building models to hourly data

    DOE PAGES

    Garrett, Aaron; New, Joshua Ryan

    2015-03-31

Energy models of existing buildings are unreliable unless calibrated so they correlate well with actual energy usage. Manual tuning requires a skilled professional, is prohibitively expensive for small projects, imperfect, non-repeatable, non-transferable, and not scalable to the dozens of sensor channels that smart meters, smart appliances, and cheap/ubiquitous sensors are beginning to make available today. A scalable, automated methodology is needed to quickly and intelligently calibrate building energy models to all available data, increase the usefulness of those models, and facilitate speed-and-scale penetration of simulation-based capabilities into the marketplace for actualized energy savings. The "Autotune" project is a novel, model-agnostic methodology which leverages supercomputing, large simulation ensembles, and big data mining with multiple machine learning algorithms to allow automatic calibration of simulations that match measured experimental data in a way that is deployable on commodity hardware. This paper shares several methodologies employed to reduce the combinatorial complexity to a computationally tractable search problem for hundreds of input parameters. Furthermore, accuracy metrics are provided which quantify model error to measured data for either monthly or hourly electrical usage from a highly-instrumented, emulated-occupancy research home.
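The calibration accuracy metrics referred to above are commonly expressed as the normalized mean bias error (NMBE) and the coefficient of variation of the RMSE (CV(RMSE)), e.g. in ASHRAE Guideline 14; a minimal sketch with made-up data (one common convention shown; some references divide by n-1):

```python
from math import sqrt

def nmbe(measured, simulated):
    """Normalized mean bias error, in percent of the measured mean."""
    n = len(measured)
    mean = sum(measured) / n
    return 100.0 * sum(s - m for m, s in zip(measured, simulated)) / (n * mean)

def cv_rmse(measured, simulated):
    """Coefficient of variation of the RMSE, in percent of the measured mean."""
    n = len(measured)
    mean = sum(measured) / n
    rmse = sqrt(sum((s - m) ** 2 for m, s in zip(measured, simulated)) / n)
    return 100.0 * rmse / mean

# Hypothetical hourly electricity use (kWh): measured vs. simulated.
measured  = [10.0, 12.0, 11.0, 13.0]
simulated = [11.0, 11.0, 11.0, 13.0]
```

A calibration run would minimize these metrics over the building model's input parameters; NMBE catches systematic bias while CV(RMSE) penalizes hour-by-hour mismatch.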

  10. Computational study of configurational and vibrational contributions to the thermodynamics of substitutional alloys: The case of Ni3Al

    NASA Astrophysics Data System (ADS)

    Michelon, M. F.; Antonelli, A.

    2010-03-01

We have developed a methodology to study the thermodynamics of order-disorder transformations in n-component substitutional alloys that combines nonequilibrium methods, which can efficiently compute free energies, with Monte Carlo simulations, in which configurational and vibrational degrees of freedom are simultaneously considered on an equal footing. Furthermore, with this methodology one can easily perform simulations in the canonical and in the isobaric-isothermal ensembles, which allow the investigation of the bulk volume effect. We have applied this methodology to calculate configurational and vibrational contributions to the entropy of the Ni3Al alloy as functions of temperature. The simulations show that when the volume of the system is kept constant, the vibrational entropy does not change upon transition, while constant-pressure calculations indicate that the volume increase at the order-disorder transition causes a vibrational entropy increase of 0.08 kB/atom. This is significant when compared to the configurational entropy increase of 0.27 kB/atom. Our calculations also indicate that the inclusion of vibrations reduces by about 30% the order-disorder transition temperature determined by considering solely the configurational degrees of freedom.

  11. Discrete crack growth analysis methodology for through cracks in pressurized fuselage structures

    NASA Technical Reports Server (NTRS)

    Potyondy, David O.; Wawrzynek, Paul A.; Ingraffea, Anthony R.

    1994-01-01

A methodology for simulating the growth of long through cracks in the skin of pressurized aircraft fuselage structures is described. Crack trajectories are allowed to be arbitrary and are computed as part of the simulation. The interaction between the mechanical loads acting on the structure and the local structural response near the crack tips is accounted for by employing a hierarchical modeling strategy. The structural response for each cracked configuration is obtained using a geometrically nonlinear shell finite element analysis procedure. Four stress intensity factors, two for membrane behavior and two for bending using Kirchhoff plate theory, are computed using an extension of the modified crack closure integral method. Crack trajectories are determined by applying the maximum tangential stress criterion. Crack growth results in localized mesh deletion, and the deletion regions are remeshed automatically using a newly developed all-quadrilateral meshing algorithm. The effectiveness of the methodology and its applicability to performing practical analyses of realistic structures is demonstrated by simulating curvilinear crack growth in a fuselage panel that is representative of a typical narrow-body aircraft. The predicted crack trajectory and fatigue life compare well with measurements of these same quantities from a full-scale pressurized panel test.
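The maximum tangential stress criterion mentioned above has a closed-form kink angle for a straight crack under mixed-mode membrane loading; a sketch under one common sign convention (this is the textbook criterion, not the paper's shell implementation):

```python
from math import atan, sqrt, degrees

def kink_angle(KI, KII):
    """Crack kink angle (radians) from the maximum tangential stress
    criterion, i.e. the root of KI*sin(t) + KII*(3*cos(t) - 1) = 0.
    Sign convention assumed here: positive KII gives a negative angle."""
    if KII == 0.0:
        return 0.0                      # pure mode I: straight-ahead growth
    r = KII / KI
    # Closed form via the half-angle substitution t = tan(theta/2).
    return 2.0 * atan((1.0 - sqrt(1.0 + 8.0 * r * r)) / (4.0 * r))
```

For equal mode mixity (KI = KII) this gives roughly -53 degrees, and in the pure mode II limit it approaches about -70.5 degrees, the classical values.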

  12. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks are to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND I Computer Program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
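The core Monte Carlo step described above can be sketched as follows, with hypothetical task-cost distributions and an assumed exponential (risk-averse) utility standing in for the expert-assessed cumulative distribution and utility functions:

```python
import random
from math import exp

def expected_utility(tasks, utility, n=20000, seed=0):
    """Monte Carlo estimate of expected utility for one alternative
    network path; `tasks` is a list of samplers for task-level cost."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        cost = sum(t(rng) for t in tasks)   # one simulated project outcome
        total += utility(cost)
    return total / n

# Hypothetical alternatives: two paths with uniform task costs (arbitrary
# units) and a risk-averse utility that decreases with total cost.
utility = lambda c: exp(-c / 10.0)
path_a = [lambda r: r.uniform(2, 4), lambda r: r.uniform(1, 3)]   # two tasks
path_b = [lambda r: r.uniform(4, 8)]                              # one task
best = max(("A", path_a), ("B", path_b),
           key=lambda kv: expected_utility(kv[1], utility))[0]
```

The selected alternative is the one maximizing expected utility, mirroring SIMRAND's choice among alternative networks.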

  13. Performance and cost evaluation of health information systems using micro-costing and discrete-event simulation.

    PubMed

    Rejeb, Olfa; Pilet, Claire; Hamana, Sabri; Xie, Xiaolan; Durand, Thierry; Aloui, Saber; Doly, Anne; Biron, Pierre; Perrier, Lionel; Augusto, Vincent

    2018-06-01

Innovation and health-care funding reforms have contributed to the deployment of Information and Communication Technology (ICT) to improve patient care. Many health-care organizations consider the application of ICT a crucial key to enhancing health-care management. The purpose of this paper is to provide a methodology to assess the organizational impact of a high-level Health Information System (HIS) on the patient pathway. We propose an integrated performance evaluation of HIS through the combination of formal modeling using Architecture of Integrated Information Systems (ARIS) models, a micro-costing approach for cost evaluation, and a Discrete-Event Simulation (DES) approach. The methodology is applied to the consultation process for cancer treatment. Simulation scenarios are established to assess the impact of HIS on the patient pathway. We demonstrate that although a high-level HIS lengthens the consultation, the occupation rate of oncologists is lower and quality of service is higher (through the amount of information available during the consultation to formulate the diagnosis). The method also makes it possible to determine the most cost-effective ICT elements for improving care process quality while minimizing costs. The methodology is flexible enough to be applied to other health-care systems.
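A minimal discrete-event simulation of a single oncologist serving a consultation queue, in the spirit of the DES component above; the deterministic arrival and service times are illustrative stand-ins, not the paper's ARIS-derived models:

```python
import heapq

def simulate_consultations(arrivals, service_time, horizon):
    """Minimal event-driven simulation of one oncologist serving a FIFO
    consultation queue; returns the oncologist's occupation rate over
    `horizon`. (Toy deterministic sketch for illustration only.)"""
    events = [(t, "arrival") for t in arrivals]
    heapq.heapify(events)
    queue, busy_until, busy_time = [], 0.0, 0.0
    while events:
        t, kind = heapq.heappop(events)
        if kind == "arrival":
            queue.append(t)
        if queue and t >= busy_until:        # server free: start next consult
            queue.pop(0)
            busy_until = t + service_time
            busy_time += service_time
            heapq.heappush(events, (busy_until, "departure"))
    return busy_time / horizon

# Hypothetical clinic day: arrivals every 30 min over 8 h, 20 min consults.
rate = simulate_consultations([30.0 * i for i in range(16)], 20.0, 480.0)
```

Comparing such occupation rates across scenarios (e.g. with and without a HIS-lengthened consultation) is the kind of question the paper's DES answers.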

  14. Simulation of a complete X-ray digital radiographic system for industrial applications.

    PubMed

    Nazemi, E; Rokrok, B; Movafeghi, A; Choopan Dastjerdi, M H

    2018-05-19

Simulating X-ray images is of great importance in industry and medicine. Such simulation permits optimizing the parameters that affect image quality without the limitations of an experimental procedure. This study presents a novel methodology to simulate a complete industrial X-ray digital radiographic system composed of an X-ray tube and a computed radiography (CR) image plate using the Monte Carlo N-Particle eXtended (MCNPX) code. An industrial X-ray tube with a maximum voltage of 300 kV and a current of 5 mA was simulated. A 3-layer uniform plate consisting of a polymer overcoat layer, a phosphor layer, and a polycarbonate backing layer was also defined and simulated as the CR imaging plate. To model image formation in the image plate, the absorbed dose was first calculated in each pixel inside the phosphor layer of the CR imaging plate using the mesh tally in the MCNPX code and then converted to a gray value using a mathematical relationship determined in a separate procedure. To validate the simulation results, an experimental setup was designed, and images of two step wedges made of aluminum and steel were captured experimentally and compared with the simulations. The results show that the simulated images are in good agreement with the experimental ones, demonstrating the ability of the proposed methodology to simulate an industrial X-ray imaging system. Copyright © 2018 Elsevier Ltd. All rights reserved.
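The dose-to-gray-value relationship is determined empirically in the study; for illustration only, the sketch below assumes a logarithmic plate response with hypothetical coefficients:

```python
from math import log10

def dose_to_gray(dose, a=1000.0, b=500.0, bits=16):
    """Map an absorbed dose per pixel to a digital gray value.
    A logarithmic response with hypothetical coefficients a, b is
    assumed here; the cited work fits the relationship empirically."""
    gv = a + b * log10(max(dose, 1e-12))     # guard against zero dose
    return int(min(max(gv, 0), 2 ** bits - 1))

# Applying the mapping to a tiny dose grid (arbitrary units), as one
# would per mesh-tally pixel:
image = [[dose_to_gray(d) for d in row]
         for row in [[1.0, 2.0], [4.0, 8.0]]]
```

Any monotone calibration curve would serve; the essential step is that each mesh-tally dose value maps to one pixel of the synthetic radiograph.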

  15. Realistic micromechanical modeling and simulation of two-phase heterogeneous materials

    NASA Astrophysics Data System (ADS)

    Sreeranganathan, Arun

This dissertation research focuses on micromechanical modeling and simulations of two-phase heterogeneous materials exhibiting anisotropic and non-uniform microstructures with long-range spatial correlations. Completed work involves development of methodologies for realistic micromechanical analyses of materials using a combination of stereological techniques, two- and three-dimensional digital image processing, and finite element based modeling tools. The methodologies are developed via their application to two technologically important material systems, namely, discontinuously reinforced aluminum composites containing silicon carbide particles as reinforcement, and boron-modified titanium alloys containing in situ formed titanium boride whiskers. Microstructural attributes such as the shape, size, volume fraction, and spatial distribution of the reinforcement phase in these materials were incorporated in the models without any simplifying assumptions. Instrumented indentation was used to determine the constitutive properties of individual microstructural phases. Micromechanical analyses were performed using realistic 2D and 3D models and the results were compared with experimental data. Results indicated that 2D models fail to capture the deformation behavior of these materials and that 3D analyses are required for realistic simulations. The effect of clustering of silicon carbide particles and associated porosity on the mechanical response of discontinuously reinforced aluminum composites was investigated using 3D models. Parametric studies were carried out using computer-simulated microstructures incorporating realistic microstructural attributes.
The intrinsic merit of this research is the development and integration of the required enabling techniques and methodologies for the representation, modeling, and simulation of complex microstructural geometry in two- and three-dimensional space, facilitating a better understanding of the effects of microstructural geometry on the mechanical behavior of materials.

  16. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    NASA Astrophysics Data System (ADS)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

In this paper, a Reliability-Sensitivity-Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability and sensitivity analysis of complex components with arbitrary distribution parameters is investigated using the perturbation method, the response surface method, the Edgeworth series, and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparison with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent, and computationally efficient method for reliability analysis in finite element modeling engineering practice.

  17. Methodology for the Preliminary Design of High Performance Schools in Hot and Humid Climates

    ERIC Educational Resources Information Center

    Im, Piljae

    2009-01-01

A methodology to develop an easy-to-use toolkit for the preliminary design of high performance schools in hot and humid climates was presented. The toolkit proposed in this research will allow decision makers without simulation knowledge to easily and accurately evaluate energy-efficient measures for K-5 schools, which would contribute to the…

  18. A DDDAS Framework for Volcanic Ash Propagation and Hazard Analysis

    DTIC Science & Technology

    2012-01-01

probability distribution for the input variables (for example, Hermite polynomials for normally distributed parameters, or Legendre for uniformly...parameters and windfields will drive our simulations. We will use uncertainty quantification methodology – polynomial chaos quadrature in combination with data integration to complete the DDDAS loop.

  19. Dynamic determination of kinetic parameters and computer simulation of growth of Clostridium perfringens in cooked beef

    USDA-ARS?s Scientific Manuscript database

    The objective of this research was to develop a new one-step methodology that uses a dynamic approach to directly construct a tertiary model for prediction of the growth of C. perfringens in cooked beef. This methodology was based on numerical analysis and optimization of both primary and secondary...

  20. Lithographic process window optimization for mask aligner proximity lithography

    NASA Astrophysics Data System (ADS)

    Voelkel, Reinhard; Vogler, Uwe; Bramati, Arianna; Erdmann, Andreas; Ünal, Nezih; Hofmann, Ulrich; Hennemeyer, Marc; Zoberbier, Ralph; Nguyen, David; Brugger, Juergen

    2014-03-01

We introduce a complete methodology for process window optimization in proximity mask aligner lithography. The commercially available lithography simulation software LAB from GenISys GmbH was used for simulation of light propagation and 3D resist development. The methodology was tested for the practical example of lines and spaces, 5 micron half-pitch, printed in a 1 micron thick layer of AZ® 1512HS positive photoresist on a silicon wafer. A SUSS MicroTec MA8 mask aligner, equipped with MO Exposure Optics®, was used in simulation and experiment. MO Exposure Optics® is the latest generation of illumination systems for mask aligners, providing telecentric illumination and excellent light uniformity over the full mask field. It allows the lithography engineer to freely shape the angular spectrum of the illumination light (customized illumination), which is a mandatory requirement for process window optimization. Three different illumination settings were tested for proximity gaps of 0 to 100 microns. The results obtained prove that the introduced process window methodology is a major step toward more robust processes in mask aligner lithography. The most remarkable outcome of the study is that a smaller exposure gap does not automatically lead to better print results in proximity lithography, contrary to what the "good instinct" of a lithographer would expect. With more than 5,000 mask aligners installed in research and industry worldwide, the proposed process window methodology could have significant impact on yield improvement and cost saving in industry.

  1. Numerical aerodynamic simulation facility preliminary study: Executive study

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A computing system was designed with the capability of providing an effective throughput of one billion floating point operations per second for three dimensional Navier-Stokes codes. The methodology used in defining the baseline design, and the major elements of the numerical aerodynamic simulation facility are described.

  2. Effect of Accessory Power Take-off Variation on a Turbofan Engine Performance

    DTIC Science & Technology

    2012-09-26

amount of energy from the low pressure spool shaft. A high bypass turbofan engine was modeled using the Numerical Propulsion System Simulation (NPSS)...

  3. Understanding Contamination; Twenty Years of Simulating Radiological Contamination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emily Snyder; John Drake; Ryan James

A wide variety of simulated contamination methods have been developed by researchers to reproducibly test radiological decontamination methods. Some twenty years ago a method of non-radioactive contamination simulation was proposed at the Idaho National Laboratory (INL) that mimicked the character of radioactive cesium and zirconium contamination on stainless steel. It involved baking the contamination into the surface of the stainless steel in order to 'fix' it into a tenacious, tightly bound oxide layer. This type of contamination was particularly applicable to nuclear processing facilities (and nuclear reactors) where oxide growth and exchange of radioactive materials within the oxide layer became the predominant model for material/contaminant interaction. Additional simulation methods and their empirically derived basis (from a nuclear fuel reprocessing facility) are discussed. In the last ten years the INL, working with the Defense Advanced Research Projects Agency (DARPA) and the National Homeland Security Research Center (NHSRC), has continued to develop contamination simulation methodologies. The most notable of these newer methodologies was developed to compare the efficacy of different decontamination technologies against radiological dispersal device (RDD, 'dirty bomb') type of contamination. There are many different scenarios for how RDD contamination may be spread, but the most commonly used one at the INL involves the dispersal of an aqueous solution containing radioactive Cs-137. This method was chosen during the DARPA projects and has continued through the NHSRC series of decontamination trials and also gives a tenacious 'fixed' contamination. Much has been learned about the interaction of cesium contamination with building materials, particularly concrete, throughout these tests. The effects of porosity, cation-exchange capacity of the material and the amount of dirt and debris on the surface are very important factors. 
The interaction of the contaminant/substrate with the particular decontamination technology is also very important. Results of decontamination testing from hundreds of contaminated coupons have led to certain conclusions about the contamination and the type of decontamination methods being deployed. A recent addition to the DARPA-initiated methodology simulates the deposition of nuclear fallout. This contamination differs from previous tests in that it has been developed and validated purely to simulate a 'loose' type of contamination. This may represent the first time that a radiologically contaminated 'fallout' simulant has been developed to reproducibly test decontamination methods. While no contaminant/methodology may serve as a complete example of all aspects that could be seen in the field, the study of this family of simulation methods provides insight into the nature of radiological contamination.

  4. Establish an Agent-Simulant Technology Relationship (ASTR)

    DTIC Science & Technology

    2017-04-14

for quantitative measures that characterize simulant performance in testing, such as the ability to be removed from surfaces. Component-level ASTRs...Overall Test and Agent-Simulant Technology Relationship (ASTR) process. 1.2 Background. a. Historically, many tests did not develop quantitative...methodology report. Report provides a VX-TPP ASTR for post-decon contact hazard and off-gassing. In the Stryker production verification test (PVT

  5. Combat Simulation Using Breach Computer Language

    DTIC Science & Technology

    1979-09-01

simulation and weapon system analysis computer language. Two types of models were constructed: a stochastic duel and a dynamic engagement model. The duel model validates the BREACH approach by comparing results with mathematical solutions. The dynamic model shows the capability of the BREACH...

  6. Free-Energy Profiles of Membrane Insertion of the M2 Transmembrane Peptide from Influenza A Virus

    DTIC Science & Technology

    2008-12-01

The insertion of the M2 transmembrane peptide from influenza A virus into a membrane has been studied with molecular-dynamics simulations...performed replica-exchange molecular-dynamics simulations with umbrella-sampling techniques to characterize the probability distribution and conformation...atomic-detailed molecular dynamics (MD) simulation techniques represent a valuable complementary methodology to investigate membrane-insertion of

  7. Simulators IV; Proceedings of the SCS Conference, Orlando, FL, Apr. 6-9, 1987

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fairchild, B.T.

    1987-01-01

The conference presents papers on the applicability of AI techniques to simulation models, the simulation of a reentry vehicle on Simstar, Simstar missile simulation, measurement issues associated with simulator sickness, and tracing the etiology of simulator sickness. Consideration is given to a simulator of a steam generator tube bundle response to a blowdown transient, the census of simulators for fossil fueled boiler and gas turbine plant operation training, and a new approach for flight simulator visual systems. Other topics include past and present simulated aircraft maintenance trainers, an AI-simulation based approach for aircraft maintenance training, simulator qualification using EPRI methodology, and the role of instinct in organizational dysfunction.

  8. Virtual reality and live simulation: a comparison between two simulation tools for assessing mass casualty triage skills.

    PubMed

    Luigi Ingrassia, Pier; Ragazzoni, Luca; Carenzo, Luca; Colombo, Davide; Ripoll Gallardo, Alba; Della Corte, Francesco

    2015-04-01

    This study tested the hypothesis that virtual reality simulation is equivalent to live simulation for testing naive medical students' abilities to perform mass casualty triage using the Simple Triage and Rapid Treatment (START) algorithm in a simulated disaster scenario and to detect the improvement in these skills after a teaching session. Fifty-six students in their last year of medical school were randomized into two groups (A and B). The same scenario, a car accident, was developed identically on the two simulation methodologies: virtual reality and live simulation. On day 1, group A was exposed to the live scenario and group B was exposed to the virtual reality scenario, aiming to triage 10 victims. On day 2, all students attended a 2-h lecture on mass casualty triage, specifically the START triage method. On day 3, groups A and B were crossed over. The groups' abilities to perform mass casualty triage in terms of triage accuracy, intervention correctness, and speed in the scenarios were assessed. Triage and lifesaving treatment scores were assessed equally by virtual reality and live simulation on day 1 and on day 3. Both simulation methodologies detected an improvement in triage accuracy and treatment correctness from day 1 to day 3 (P<0.001). The time to complete each scenario and its decrease from day 1 to day 3 were detected equally in the two groups (P<0.05). Virtual reality simulation proved to be a valuable tool, equivalent to live simulation, to test medical students' abilities to perform mass casualty triage and to detect improvement in such skills.
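The START algorithm taught and tested in the study is a short published decision procedure; a simplified sketch of the adult flow (walking, respiration, perfusion, and mentation checks, with the standard thresholds):

```python
def start_triage(can_walk, breathing, breathing_after_airway,
                 resp_rate, radial_pulse, obeys_commands):
    """Simplified START mass-casualty triage for an adult victim:
    green = minor, red = immediate, yellow = delayed, black = expectant.
    (Educational sketch; real protocols include capillary refill and
    retriage steps omitted here.)"""
    if can_walk:
        return "green"                  # walking wounded
    if not breathing:
        # open the airway; breathing resumes -> immediate, else expectant
        return "red" if breathing_after_airway else "black"
    if resp_rate > 30:
        return "red"                    # respiratory distress
    if not radial_pulse:
        return "red"                    # inadequate perfusion
    if not obeys_commands:
        return "red"                    # altered mental status
    return "yellow"                     # delayed
```

Encoding the algorithm this way also suggests how a virtual-reality scenario can score triage accuracy automatically against the expected category per victim.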

  9. JASMINE simulator

    NASA Astrophysics Data System (ADS)

    Yamada, Yoshiyuki; Gouda, Naoteru; Yano, Taihei; Kobayashi, Yukiyasu; Tsujimoto, Takuji; Suganuma, Masahiro; Niwa, Yoshito; Sako, Nobutada; Hatsutori, Yoichi; Tanaka, Takashi

    2006-06-01

We describe the simulation tools of the JASMINE project (the JASMINE simulator). The JASMINE project stands at the stage where its basic design will be determined within a few years. It is therefore very important to simulate the data stream generated by the astrometric fields of JASMINE in order to support investigations into error budgets, sampling strategy, data compression, data analysis, scientific performance, etc. Of course, component simulations are needed, but total simulations, which include all components from observation target to satellite system, are also very important. We find that new software technologies, such as Object-Oriented (OO) methodologies, are ideal tools for the simulation system of JASMINE (the JASMINE simulator). In this article, we explain the framework of the JASMINE simulator.

  10. Alternating Renewal Process Models for Behavioral Observation: Simulation Methods, Software, and Validity Illustrations

    ERIC Educational Resources Information Center

    Pustejovsky, James E.; Runyon, Christopher

    2014-01-01

    Direct observation recording procedures produce reductive summary measurements of an underlying stream of behavior. Previous methodological studies of these recording procedures have employed simulation methods for generating random behavior streams, many of which amount to special cases of a statistical model known as the alternating renewal…

  11. Evaluation of methodology for detecting/predicting migration of forest species

    Treesearch

    Dale S. Solomon; William B. Leak

    1996-01-01

    Available methods for analyzing migration of forest species are evaluated, including simulation models, remeasured plots, resurveys, pollen/vegetation analysis, and age/distance trends. Simulation models have provided some of the most drastic estimates of species changes due to predicted changes in global climate. However, these models require additional testing...

  12. Bootstrapping Methods Applied for Simulating Laboratory Works

    ERIC Educational Resources Information Center

    Prodan, Augustin; Campean, Remus

    2005-01-01

    Purpose: The aim of this work is to implement bootstrapping methods into software tools, based on Java. Design/methodology/approach: This paper presents a category of software e-tools aimed at simulating laboratory works and experiments. Findings: Both students and teaching staff use traditional statistical methods to infer the truth from sample…

  13. Robotics, Artificial Intelligence, Computer Simulation: Future Applications in Special Education.

    ERIC Educational Resources Information Center

    Moore, Gwendolyn B.; And Others

    The report describes three advanced technologies--robotics, artificial intelligence, and computer simulation--and identifies the ways in which they might contribute to special education. A hybrid methodology was employed to identify existing technology and forecast future needs. Following this framework, each of the technologies is defined,…

  14. Enhancing Student Engagement through Simulation in Programming Sessions

    ERIC Educational Resources Information Center

    Isiaq, Sakirulai Olufemi; Jamil, Md Golam

    2018-01-01

    Purpose: The purpose of this paper is to explore the use of a simulator for teaching programming to foster student engagement and meaningful learning. Design/methodology/approach: An exploratory mixed-method research approach was adopted in a classroom-based environment at a UK university. A rich account of student engagement dimensions…

  15. Engaging Workers in Simulation-Based E-Learning

    ERIC Educational Resources Information Center

    Slotte, Virpi; Herbert, Anne

    2008-01-01

    Purpose: The purpose of this paper is to evaluate learners' attitudes to the use of simulation-based e-learning as part of workplace learning when socially situated interaction and blended learning are specifically included in the instructional design. Design/methodology/approach: Responses to a survey questionnaire of 298 sales personnel were…

  16. Costing Educational Wastage: A Pilot Simulation Study. Current Surveys and Research in Statistics.

    ERIC Educational Resources Information Center

    Berstecher, D.

This pilot simulation study examines the important methodological problems involved in costing educational wastage, focusing specifically on the cost implications of educational wastage in primary education. The purpose of the study is to provide a clearer picture of the underlying rationale and interrelated consequences of reducing educational…

  17. Automatic mathematical modeling for real time simulation program (AI application)

    NASA Technical Reports Server (NTRS)

    Wang, Caroline; Purinton, Steve

    1989-01-01

    A methodology is described for automatic mathematical modeling and generating simulation models. The major objective was to create a user friendly environment for engineers to design, maintain, and verify their models; to automatically convert the mathematical models into conventional code for computation; and finally, to document the model automatically.

  18. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.

    1984-01-01

A long-term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10-year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. Discussed are the numerous factors that potentially have a degrading effect on system reliability, and the ways in which these factors, peculiar to highly reliable fault-tolerant systems, are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  19. Experience of nursing students with standardized patients in simulation-based learning: Q-methodology study.

    PubMed

    Ha, Eun-Ho

    2018-04-23

Standardized patients (SPs) boost self-confidence, improve problem solving, enhance critical thinking, and advance the clinical judgment of nursing students. The aim of this study was to examine nursing students' experience with SPs in simulation-based learning. Q-methodology was used. Department of nursing in Seoul, South Korea. Fourth-year undergraduate nursing students (n = 47). A total of 47 fourth-year undergraduate nursing students ranked 42 Q statements about experiences with SPs into a normal distribution grid. The following three viewpoints were obtained: 1) SPs are helpful for patient care (patient-centered view), 2) SPs' roles are important for nursing student learning (SP-roles-centered view), and 3) SPs can promote competency of nursing students (student-centered view). These results indicate that SPs may improve nursing students' confidence and nursing competency. Professors should reflect these three viewpoints in simulation-based learning to effectively engage SPs. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Entangled trajectories Hamiltonian dynamics for treating quantum nuclear effects

    NASA Astrophysics Data System (ADS)

    Smith, Brendan; Akimov, Alexey V.

    2018-04-01

    A simple and robust methodology, dubbed Entangled Trajectories Hamiltonian Dynamics (ETHD), is developed to capture quantum nuclear effects such as tunneling and zero-point energy through the coupling of multiple classical trajectories. The approach reformulates the classically mapped second-order Quantized Hamiltonian Dynamics (QHD-2) in terms of coupled classical trajectories. The method partially enforces the uncertainty principle and facilitates tunneling. The applicability of the method is demonstrated by studying the dynamics in symmetric double well and cubic metastable state potentials. The methodology is validated using exact quantum simulations and is compared to QHD-2. We illustrate its relationship to the rigorous Bohmian quantum potential approach, from which ETHD can be derived. Our simulations show a remarkable agreement of the ETHD calculation with the quantum results, suggesting that ETHD may be a simple and inexpensive way of including quantum nuclear effects in molecular dynamics simulations.

  1. Verification of a Constraint Force Equation Methodology for Modeling Multi-Body Stage Separation

    NASA Technical Reports Server (NTRS)

    Tartabini, Paul V.; Roithmayr, Carlos; Toniolo, Matthew D.; Karlgaard, Christopher; Pamadi, Bandu N.

    2008-01-01

    This paper discusses the verification of the Constraint Force Equation (CFE) methodology and its implementation in the Program to Optimize Simulated Trajectories II (POST2) for multibody separation problems using three specially designed test cases. The first test case involves two rigid bodies connected by a fixed joint; the second case involves two rigid bodies connected with a universal joint; and the third test case is that of Mach 7 separation of the Hyper-X vehicle. For the first two cases, the POST2/CFE solutions compared well with those obtained using industry standard benchmark codes, namely AUTOLEV and ADAMS. For the Hyper-X case, the POST2/CFE solutions were in reasonable agreement with the flight test data. The CFE implementation in POST2 facilitates the analysis and simulation of stage separation as an integral part of POST2 for seamless end-to-end simulations of launch vehicle trajectories.

  2. Selection of stationary phase particle geometry using X-ray computed tomography and computational fluid dynamics simulations.

    PubMed

    Schmidt, Irma; Minceva, Mirjana; Arlt, Wolfgang

    2012-02-17

X-ray computed tomography (CT) is used to determine local parameters related to column packing homogeneity and hydrodynamics in columns packed with spherically and irregularly shaped particles of the same size. The results showed that the variation of porosity and the axial dispersion coefficient along the column axis is insignificant compared to their radial distribution. The methodology of using data obtained by CT measurements to perform a CFD simulation of a batch separation of model binary mixtures with different concentrations and separation factors is demonstrated. The results of the CFD simulation study show that columns packed with spherically shaped particles provide higher yield than columns packed with irregularly shaped particles only below a certain value of the separation factor. The presented methodology can be used for selecting a suitable packing material for a particular separation task. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. MM/PBSA analysis of molecular dynamics simulations of bovine beta-lactoglobulin: free energy gradients in conformational transitions?

    PubMed

    Fogolari, Federico; Moroni, Elisabetta; Wojciechowski, Marcin; Baginski, Maciej; Ragona, Laura; Molinari, Henriette

    2005-04-01

The pH-driven opening and closure of the beta-lactoglobulin EF loop, acting as a lid and closing the internal cavity of the protein, has been studied by molecular dynamics (MD) simulations and free energy calculations based on molecular mechanics/Poisson-Boltzmann (PB) solvent-accessible surface area (MM/PBSA) methodology. The forms above and below the transition pH presumably differ only in the protonation state of residue Glu89. MM/PBSA calculations are able to reproduce qualitatively the thermodynamics of the transition. The analysis of MD simulations using a combination of the MM/PBSA methodology and the colony energy approach is able to highlight the driving forces implied in the transition. The analysis suggests that global rearrangements take place before the equilibrium local conformation is reached. This conclusion may bear general relevance to conformational transitions in lipocalins and in proteins in general. (c) 2005 Wiley-Liss, Inc.

  4. Additional confirmation of the validity of laboratory simulation of cloud radiances

    NASA Technical Reports Server (NTRS)

    Davis, J. M.; Cox, S. K.

    1986-01-01

    The results of a laboratory experiment are presented that provide additional verification of the methodology adopted for simulation of the radiances reflected from fields of optically thick clouds using the Cloud Field Optical Simulator (CFOS) at Colorado State University. The comparison of these data with their theoretically derived counterparts indicates that the crucial mechanism of cloud-to-cloud radiance field interaction is accurately simulated in the CFOS experiments and adds confidence to the manner in which the optical depth is scaled.

  5. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David; Agarwal, Deborah A.; Sun, Xin

    2011-09-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.

  6. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, D.; Agarwal, D.; Sun, X.

    2011-01-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.

  7. Modeling Amorphous Microporous Polymers for CO2 Capture and Separations.

    PubMed

    Kupgan, Grit; Abbott, Lauren J; Hart, Kyle E; Colina, Coray M

    2018-06-13

This review concentrates on the advances of atomistic molecular simulations to design and evaluate amorphous microporous polymeric materials for CO2 capture and separations. A description of atomistic molecular simulations is provided, including simulation techniques, structural generation approaches, relaxation and equilibration methodologies, and considerations needed for validation of simulated samples. The review provides general guidelines and a comprehensive update of the recent literature (since 2007) to promote the acceleration of the discovery and screening of amorphous microporous polymers for CO2 capture and separation processes.

  8. Probabilistic Based Modeling and Simulation Assessment

    DTIC Science & Technology

    2010-06-01

different crash and blast scenarios. With the integration of the high-fidelity neck and head model, a methodology to calculate the probability of injury...variability, correlation, and multiple (often competing) failure metrics. Important scenarios include vehicular collisions, blast/fragment impact, and...first area of focus is to develop a methodology to integrate probabilistic analysis into finite element analysis of vehicle collisions and blast.

  9. Coordinated crew performance in commercial aircraft operations

    NASA Technical Reports Server (NTRS)

    Murphy, M. R.

    1977-01-01

    A specific methodology is proposed for an improved system of coding and analyzing crew member interaction. The complexity and lack of precision of many crew and task variables suggest the usefulness of fuzzy linguistic techniques for modeling and computer simulation of the crew performance process. Other research methodologies and concepts that have promise for increasing the effectiveness of research on crew performance are identified.

  10. RT 24 - Architecture, Modeling & Simulation, and Software Design

    DTIC Science & Technology

    2010-11-01

Focus on tool extensions (UPDM, SysML, SoaML, BPMN); leverage "best of breed" architecture methodologies; provide tooling to support the methodology. DoDAF...Capability. Example: BPMN; DoDAF 2.0 MetaModel to BPMN MetaModel mapping; SysML to DoDAF 2.0; DoDAF V2.0 Models (OV-2) as SysML Diagrams, Requirement

  11. Finite element analysis of steady and transiently moving/rolling nonlinear viscoelastic structure. III - Impact/contact simulations

    NASA Technical Reports Server (NTRS)

    Nakajima, Yukio; Padovan, Joe

    1987-01-01

In a three-part series of papers, a generalized finite element methodology is formulated to handle traveling load problems involving large deformation fields in structures composed of viscoelastic media. The main thrust of this paper is to develop an overall finite element methodology and associated solution algorithms to handle the transient aspects of moving problems involving contact/impact-type loading fields. Based on the methodology and algorithms formulated, several numerical experiments are considered, including the rolling/sliding impact of tires with road obstructions.

  12. An engineering methodology for implementing and testing VLSI (Very Large Scale Integrated) circuits

    NASA Astrophysics Data System (ADS)

    Corliss, Walter F., II

    1989-03-01

The engineering methodology for producing a fully tested VLSI chip from a design layout is presented. A 16-bit correlator, NPS CORN88, that was previously designed, was used as a vehicle to demonstrate this methodology. The study of the design and simulation tools, MAGIC and MOSSIM II, was the focus of the design and validation process. The design was then implemented and the chip was fabricated by MOSIS. This fabricated chip was then used to develop a testing methodology for using the digital test facilities at NPS. NPS CORN88 was the first full custom VLSI chip, designed at NPS, to be tested with the NPS digital analysis system, the Tektronix DAS 9100 series tester. The capabilities and limitations of these test facilities are examined. NPS CORN88 test results are included to demonstrate the capabilities of the digital test system. A translator, MOS2DAS, was developed to convert the MOSSIM II simulation program to the input files required by the DAS 9100 device verification software, 91DVS. Finally, a tutorial for using the digital test facilities, including the DAS 9100 and associated support equipment, is included as an appendix.

  13. Reliability-Based Stability Analysis of Rock Slopes Using Numerical Analysis and Response Surface Method

    NASA Astrophysics Data System (ADS)

    Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.

    2017-08-01

While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computation cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques to perform probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to directly use FORM in numerical slope stability evaluations, as it requires the definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on the response surface method, in which an explicit performance function is developed from the results of numerical simulations so that FORM can be applied. The implementation of the proposed methodology is demonstrated on a large potential rock wedge in Sumela Monastery, Turkey. The accuracy of the developed performance function in truly representing the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with the Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, at the cost of reduced accuracy, with an error of 24%.
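    Once a response surface supplies an explicit performance function, the FORM step described above can be sketched with the standard Hasofer-Lind/Rackwitz-Fiessler (HL-RF) iteration. The linear performance function below is a hypothetical stand-in for a fitted response surface, not the Sumela slope model:

```python
import math

def hlrf_beta(g, n, tol=1e-8, max_iter=50, h=1e-6):
    """Hasofer-Lind reliability index via the HL-RF iteration,
    for a performance function g defined in standard normal space."""
    u = [0.0] * n                       # start at the origin (mean point)
    for _ in range(max_iter):
        gu = g(u)
        grad = []
        for i in range(n):              # forward-difference gradient
            up = u[:]
            up[i] += h
            grad.append((g(up) - gu) / h)
        norm2 = sum(gi * gi for gi in grad)
        scale = (sum(gi * ui for gi, ui in zip(grad, u)) - gu) / norm2
        u_new = [scale * gi for gi in grad]
        if max(abs(a - b) for a, b in zip(u_new, u)) < tol:
            u = u_new
            break
        u = u_new
    return math.sqrt(sum(ui * ui for ui in u))

# hypothetical linear performance function standing in for the response surface
beta = hlrf_beta(lambda u: 3.0 - u[0] - u[1], 2)
pf = 0.5 * math.erfc(beta / math.sqrt(2))   # Pf = Phi(-beta)
```

    For a real slope problem, g would be the response-surface fit to the numerical simulation results, with the physical random variables first transformed to standard normal space.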

  14. A novel methodology for building robust design rules by using design based metrology (DBM)

    NASA Astrophysics Data System (ADS)

    Lee, Myeongdong; Choi, Seiryung; Choi, Jinwoo; Kim, Jeahyun; Sung, Hyunju; Yeo, Hyunyoung; Shim, Myoungseob; Jin, Gyoyoung; Chung, Eunseung; Roh, Yonghan

    2013-03-01

    This paper addresses a methodology for building robust design rules by using design based metrology (DBM). Conventional method for building design rules has been using a simulation tool and a simple pattern spider mask. At the early stage of the device, the estimation of simulation tool is poor. And the evaluation of the simple pattern spider mask is rather subjective because it depends on the experiential judgment of an engineer. In this work, we designed a huge number of pattern situations including various 1D and 2D design structures. In order to overcome the difficulties of inspecting many types of patterns, we introduced Design Based Metrology (DBM) of Nano Geometry Research, Inc. And those mass patterns could be inspected at a fast speed with DBM. We also carried out quantitative analysis on PWQ silicon data to estimate process variability. Our methodology demonstrates high speed and accuracy for building design rules. All of test patterns were inspected within a few hours. Mass silicon data were handled with not personal decision but statistical processing. From the results, robust design rules are successfully verified and extracted. Finally we found out that our methodology is appropriate for building robust design rules.

  15. Ensemble modeling of stochastic unsteady open-channel flow in terms of its time-space evolutionary probability distribution - Part 1: theoretical development

    NASA Astrophysics Data System (ADS)

    Dib, Alain; Kavvas, M. Levent

    2018-03-01

The Saint-Venant equations are commonly used as the governing equations for modeling spatially varied unsteady flow in open channels. The presence of uncertainties in the channel or flow parameters renders these equations stochastic, thus requiring their solution in a stochastic framework in order to quantify the ensemble behavior and the variability of the process. While the Monte Carlo approach can be used for such a solution, its computational expense and the large number of simulations required are significant disadvantages. This study proposes, explains, and derives a new methodology for solving the stochastic Saint-Venant equations in only one shot, without the need for a large number of simulations. The proposed methodology is derived by developing the nonlocal Lagrangian-Eulerian Fokker-Planck equation of the characteristic form of the stochastic Saint-Venant equations for an open-channel flow process with an uncertain roughness coefficient. A numerical method for its solution is subsequently devised. The application and validation of this methodology are provided in a companion paper, in which the statistical results computed by the proposed methodology are compared against the results obtained by the Monte Carlo approach.
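    For contrast with the one-shot Fokker-Planck solution, the Monte Carlo approach the authors compare against works roughly as follows: sample the uncertain roughness coefficient many times, run the flow model for each sample, and compute ensemble statistics. A toy sketch with Manning's equation standing in for the full Saint-Venant solver (all numbers illustrative):

```python
import random
import statistics

def manning_velocity(n, R=2.0, S=0.001):
    """Manning's equation (SI units): v = R^(2/3) * S^(1/2) / n."""
    return R ** (2.0 / 3.0) * S ** 0.5 / n

L = 10_000.0            # illustrative reach length [m]
rng = random.Random(0)
times = []
for _ in range(5000):                 # Monte Carlo ensemble
    n = rng.uniform(0.02, 0.04)       # uncertain roughness coefficient
    times.append(L / manning_velocity(n))

mean_t = statistics.mean(times)       # ensemble behavior ...
std_t = statistics.pstdev(times)      # ... and variability of travel time
```

    The proposed methodology obtains the same kind of ensemble statistics from a single solution of the evolutionary probability distribution, instead of the 5000 model runs above.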

  16. High Resolution Imaging Using Phase Retrieval. Volume 2

    DTIC Science & Technology

    1991-10-01

aberrations of the telescope. It will also correct aberrations due to atmospheric turbulence for a ground-based telescope, and can be used with several other...retrieval algorithm, based on the Ayers/Dainty blind deconvolution algorithm, was also developed. A new methodology for exploring the uniqueness of phase...Simulation Experiments: Initial Simulations with Noisy Modulus Data; Simulations of a Space-Based Amplitude

  17. An Overview of the Greyscales Lethality Assessment Methodology

    DTIC Science & Technology

    2011-01-01

code has already been integrated into the Weapon Systems Division MECA and DUEL missile engagement simulations. It can also be integrated into...capable of being incorporated into a variety of simulations.

  18. An in vitro simulation method for the tribological assessment of complete natural hip joints

    PubMed Central

    Fisher, John; Williams, Sophie

    2017-01-01

The use of hip joint simulators to evaluate the tribological performance of total hip replacements is widely reported in the literature; however, in vitro simulation studies investigating the tribology of the natural hip joint are limited, with heterogeneous methodologies reported. An in vitro simulation system for the complete natural hip joint, enabling the acetabulum and femoral head to be positioned with different orientations whilst maintaining the correct joint centre of rotation, was successfully developed for this study. The efficacy of the simulation system was assessed by testing complete, matched natural porcine hip joints and porcine hip hemiarthroplasty joints in a pendulum friction simulator. The results showed evidence of biphasic lubrication, with a non-linear increase in friction being observed in both groups. Lower overall mean friction factor values in the complete natural joint group, which increased at a lower rate over time, suggest that the exudation of fluid and transition to solid-phase lubrication occurred more slowly in the complete natural hip joint than in the hip hemiarthroplasty joint. It is envisaged that this methodology will be used to investigate morphological risk factors for developing hip osteoarthritis, as well as the effectiveness of early interventional treatments for degenerative hip disease. PMID:28886084

  19. How to assess the impact of a physical parameterization in simulations of moist convection?

    NASA Astrophysics Data System (ADS)

    Grabowski, Wojciech

    2017-04-01

A numerical model capable of simulating moist convection (e.g., a cloud-resolving model or a large-eddy simulation model) consists of a fluid flow solver combined with required representations (i.e., parameterizations) of physical processes. The latter typically include cloud microphysics, radiative transfer, and unresolved turbulent transport. Traditional approaches to investigating the impacts of such parameterizations on convective dynamics involve parallel simulations with different parameterization schemes or with different scheme parameters. Such methodologies are not reliable because of the natural variability of a cloud field that is affected by the feedback between the physics and dynamics. For instance, changing the cloud microphysics typically leads to a different realization of the cloud-scale flow, making it difficult to separate dynamical and microphysical impacts. This presentation introduces a novel modeling methodology, piggybacking, that allows studying the impact of a physical parameterization on cloud dynamics with confidence. The focus is on the impact of the cloud microphysics parameterization. Specific examples of the piggybacking approach include simulations concerning the hypothesized deep convection invigoration in polluted environments, the validity of the saturation adjustment in modeling condensation in moist convection, and the separation of physical impacts from statistical uncertainty in simulations applying particle-based Lagrangian microphysics (the super-droplet method).
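    The piggybacking idea can be reduced to a toy sketch: two parameterizations see the same simulated state at every step, but only the driving scheme feeds back into the dynamics, so their tendencies can be compared on an identical flow realization. The linear "schemes" below are hypothetical placeholders for real microphysics:

```python
# two hypothetical "microphysics" schemes returning a moisture tendency
def scheme_a(q):
    return 0.10 * q   # driving scheme: feeds back into the dynamics

def scheme_b(q):
    return 0.12 * q   # piggybacking scheme: diagnostic only

q = 1.0               # toy model state (a moisture-like scalar)
dt = 0.1
tend_a_hist, tend_b_hist = [], []
for _ in range(100):
    ta, tb = scheme_a(q), scheme_b(q)   # both schemes see the SAME state
    tend_a_hist.append(ta)
    tend_b_hist.append(tb)
    q -= dt * ta                        # only the driver modifies the flow
```

    Comparing `tend_a_hist` against `tend_b_hist` isolates the parameterization difference on an identical realization, which parallel simulations with separately evolving flows cannot do.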

  20. Accounting for Uncertainties in Strengths of SiC MEMS Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel; Evans, Laura; Beheim, Glen; Trapp, Mark; Jadaan, Osama; Sharpe, William N., Jr.

    2007-01-01

A methodology has been devised for accounting for uncertainties in the strengths of silicon carbide structural components of microelectromechanical systems (MEMS). The methodology enables prediction of the probabilistic strengths of complexly shaped MEMS parts using data from tests of simple specimens. This methodology is intended to serve as a part of a rational basis for designing SiC MEMS, supplementing methodologies that have been borrowed from the art of designing macroscopic brittle material structures. The need for this or a similar methodology arises as a consequence of the fundamental nature of MEMS and the brittle silicon-based materials of which they are typically fabricated. When tested to fracture, MEMS and structural components thereof show wide part-to-part scatter in strength. The methodology involves the use of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) software in conjunction with the ANSYS Probabilistic Design System (PDS) software to simulate or predict the strength responses of brittle material components while simultaneously accounting for the effects of variability of geometrical features on the strength responses. As such, the methodology involves the use of an extended version of the ANSYS/CARES/PDS software system described in Probabilistic Prediction of Lifetimes of Ceramic Parts (LEW-17682-1/4-1), Software Tech Briefs supplement to NASA Tech Briefs, Vol. 30, No. 9 (September 2006), page 10. The ANSYS PDS software enables the ANSYS finite-element-analysis program to account for uncertainty in the design-and-analysis process. The ANSYS PDS software accounts for uncertainty in material properties, dimensions, and loading by assigning probabilistic distributions to user-specified model parameters and performing simulations using various sampling techniques.
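    The part-to-part strength scatter described above is commonly modeled with a weakest-link Weibull distribution; a minimal inverse-CDF sampling sketch (the modulus, characteristic strength, and area ratio below are illustrative values, not CARES/Life outputs):

```python
import math
import random

def sample_strength(rng, m=10.0, sigma0=400.0, area_ratio=1.0):
    """Inverse-CDF draw from a weakest-link Weibull strength distribution.

    m          : Weibull modulus (scatter decreases as m grows)
    sigma0     : characteristic strength [MPa]
    area_ratio : stressed area relative to the reference area (size effect)
    """
    u = rng.random()
    return sigma0 * (-math.log(1.0 - u) / area_ratio) ** (1.0 / m)

rng = random.Random(1)
strengths = [sample_strength(rng) for _ in range(2000)]
mean_s = sum(strengths) / len(strengths)
```

    Increasing `area_ratio` lowers the sampled strengths, reproducing the brittle-material size effect that makes simple-specimen data transferable to complexly shaped parts.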

  1. Mining data from hemodynamic simulations for generating prediction and explanation models.

    PubMed

    Bosnić, Zoran; Vračar, Petar; Radović, Milos D; Devedžić, Goran; Filipović, Nenad D; Kononenko, Igor

    2012-03-01

One of the most common causes of human death is stroke, which can be caused by carotid bifurcation stenosis. In our work, we aim at proposing a prototype of a medical expert system that could significantly aid medical experts in detecting hemodynamic abnormalities (increased artery wall shear stress). Based on the acquired simulated data, we apply several methodologies for 1) predicting magnitudes and locations of maximum wall shear stress in the artery, 2) estimating reliability of computed predictions, and 3) providing user-friendly explanation of the model's decision. The obtained results indicate that the evaluated methodologies can provide a useful tool for the given problem domain. © 2012 IEEE

  2. Advanced power analysis methodology targeted to the optimization of a digital pixel readout chip design and its critical serial powering system

    NASA Astrophysics Data System (ADS)

    Marconi, S.; Orfanelli, S.; Karagounis, M.; Hemperek, T.; Christiansen, J.; Placidi, P.

    2017-02-01

A dedicated power analysis methodology, based on modern digital design tools and integrated with the VEPIX53 simulation framework developed within the RD53 collaboration, is being used to guide vital choices for the design and optimization of the next generation ATLAS and CMS pixel chips and their critical serial powering circuit (shunt-LDO). Power consumption is studied at different stages of the design flow under different operating conditions. Significant effort is put into extensive investigations of dynamic power variations in relation to the decoupling seen by the powering network. Shunt-LDO simulations are also reported to prove the reliability at the system level.

  3. LES, DNS, and RANS for the Analysis of High-Speed Turbulent Reacting Flows

    NASA Technical Reports Server (NTRS)

    Colucci, P. J.; Jaberi, F. A.; Givi, P.

    1996-01-01

A filtered density function (FDF) method suitable for chemically reactive flows is developed in the context of large eddy simulation. The advantage of the FDF methodology is its inherent ability to resolve subgrid-scale (SGS) scalar correlations that otherwise have to be modeled. Because of the lack of robust models to accurately predict these correlations in turbulent reactive flows, simulations involving turbulent combustion are often met with a degree of skepticism. The FDF methodology avoids the closure problem associated with these terms and treats the reaction in an exact manner. The scalar FDF approach is particularly attractive since it can be coupled with existing hydrodynamic computational fluid dynamics (CFD) codes.

  4. Parameterizing the Spatial Markov Model From Breakthrough Curve Data Alone

    NASA Astrophysics Data System (ADS)

    Sherman, Thomas; Fakhari, Abbas; Miller, Savannah; Singha, Kamini; Bolster, Diogo

    2017-12-01

    The spatial Markov model (SMM) is an upscaled Lagrangian model that effectively captures anomalous transport across a diverse range of hydrologic systems. The distinct feature of the SMM relative to other random walk models is that successive steps are correlated. To date, with some notable exceptions, the model has primarily been applied to data from high-resolution numerical simulations and correlation effects have been measured from simulated particle trajectories. In real systems such knowledge is practically unattainable and the best one might hope for is breakthrough curves (BTCs) at successive downstream locations. We introduce a novel methodology to quantify velocity correlation from BTC data alone. By discretizing two measured BTCs into a set of arrival times and developing an inverse model, we estimate velocity correlation, thereby enabling parameterization of the SMM in studies where detailed Lagrangian velocity statistics are unavailable. The proposed methodology is applied to two synthetic numerical problems, where we measure all details and thus test the veracity of the approach by comparison of estimated parameters with known simulated values. Our results suggest that our estimated transition probabilities agree with simulated values and using the SMM with this estimated parameterization accurately predicts BTCs downstream. Our methodology naturally allows for estimates of uncertainty by calculating lower and upper bounds of velocity correlation, enabling prediction of a range of BTCs. The measured BTCs fall within the range of predicted BTCs. This novel method to parameterize the SMM from BTC data alone is quite parsimonious, thereby widening the SMM's practical applicability.
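    The correlated successive steps that distinguish the SMM from ordinary random walks can be sketched with a two-velocity-class transition matrix: each particle keeps or switches its velocity class at every fixed space step, and the arrival times at a downstream plane form the simulated BTC. The classes and probabilities below are illustrative, not fitted to any data set:

```python
import random

V = [2.0, 0.5]          # illustrative fast/slow velocity classes [m/s]
P = [[0.8, 0.2],        # transition matrix: rows sum to 1;
     [0.2, 0.8]]        # persistence > 0.5 gives correlated successive steps

def arrival_time(rng, n_steps=50, dx=1.0):
    """Time for one particle to cross n_steps fixed space increments."""
    state = rng.randrange(2)
    t = 0.0
    for _ in range(n_steps):
        t += dx / V[state]
        state = 0 if rng.random() < P[state][0] else 1
    return t

rng = random.Random(7)
btc = sorted(arrival_time(rng) for _ in range(2000))   # simulated breakthrough curve
```

    The inverse problem the paper addresses runs in the other direction: given two measured BTCs at successive planes, estimate the transition probabilities in `P` rather than assume them.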

  5. Convergence studies of deterministic methods for LWR explicit reflector methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Canepa, S.; Hursin, M.; Ferroukhi, H.

    2013-07-01

The standard approach in modern 3-D core simulators, employed either for steady-state or transient simulations, is to use albedo coefficients or explicit reflectors at the core axial and radial boundaries. In the latter approach, few-group homogenized nuclear data are a priori produced with lattice transport codes using 2-D reflector models. Recently, the explicit reflector methodology of the deterministic CASMO-4/SIMULATE-3 code system was identified to potentially constitute one of the main sources of errors for core analyses of the Swiss operating LWRs, which all belong to the GII design. Considering that some of the new GIII designs will rely on very different reflector concepts, a review and assessment of the reflector methodology for various LWR designs appeared relevant. Therefore, the purpose of this paper is to first recall the concepts of the explicit reflector modelling approach as employed by CASMO/SIMULATE. Then, for selected reflector configurations representative of both GII and GIII designs, a benchmarking of the few-group nuclear data produced with the deterministic lattice code CASMO-4 and its successor, CASMO-5, is conducted. On this basis, a convergence study with regard to geometrical requirements when using deterministic methods with 2-D homogeneous models is conducted, and the effect on the downstream 3-D core analysis accuracy is evaluated for a typical GII reflector design in order to assess the results against available plant measurements. (authors)

  6. Urban Flow and Pollutant Dispersion Simulation with Multi-scale coupling of Meteorological Model with Computational Fluid Dynamic Analysis

    NASA Astrophysics Data System (ADS)

    Liu, Yushi; Poh, Hee Joo

    2014-11-01

Computational Fluid Dynamics (CFD) analysis has become increasingly important in modern urban planning aimed at creating highly livable cities. This paper presents a multi-scale modeling methodology that couples the Weather Research and Forecasting (WRF) model with the open-source CFD simulation tool OpenFOAM. This coupling enables simulation of wind flow and pollutant dispersion in urban built-up areas with a high-resolution mesh. In this methodology, the meso-scale model WRF provides the boundary conditions for the micro-scale CFD model OpenFOAM. The advantage is that realistic weather conditions are taken into account in the CFD simulation, and complex building layouts can be handled with ease by the meshing utilities of OpenFOAM. The result is validated against the Joint Urban 2003 Tracer Field Tests in Oklahoma City, with reasonably good agreement between the CFD simulation and field observations. The WRF-OpenFOAM coupling provides urban planners with a reliable environmental modeling tool for actual urban built-up areas, and it can be further extended to future weather conditions for scenario studies on climate change impact.
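A minimal sketch of the meso-to-micro handoff the record describes, under the assumption (ours, not the paper's) that coarse WRF-level winds are condensed into a log-law inlet profile evaluated on the finer CFD inlet heights; the heights, speeds, and plain log-law form are illustrative only.

```python
import numpy as np

# One-way coupling sketch: fit a neutral log-law wind profile,
# u(z) = (u_star / kappa) * ln(z / z0), to a few WRF output levels,
# then evaluate it on a fine set of CFD inlet heights.
kappa = 0.41
z_wrf = np.array([10.0, 40.0, 80.0, 150.0])     # WRF output heights, m (assumed)
u_wrf = np.array([3.1, 4.2, 4.8, 5.3])          # WRF wind speeds, m/s (assumed)

# least-squares fit of u = slope * ln(z) + intercept
A = np.vstack([np.log(z_wrf), np.ones_like(z_wrf)]).T
slope, intercept = np.linalg.lstsq(A, u_wrf, rcond=None)[0]
u_star = slope * kappa                           # friction velocity
z0 = np.exp(-intercept / slope)                  # aerodynamic roughness length, m

z_cfd = np.linspace(2.0, 200.0, 100)             # CFD inlet heights, m
u_inlet = (u_star / kappa) * np.log(z_cfd / z0)  # inlet profile for the CFD run
```

In practice the coupling also carries direction, turbulence quantities, and time variation; this shows only the basic profile transfer idea.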

  7. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    USGS Publications Warehouse

    Crovelli, R.A.

    1988-01-01

The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. © 1988 International Association for Mathematical Geology.

  8. Simulated Crisis in Obstetric Anesthesia: Design and Evaluation of a Distance Education Presentation

    PubMed Central

    Murray, W. Bosseau; Schneider, Art; Underberg, Karin; Henry, Jody; Foster, Pat; Vaduva, Sorin; Venable, J. Clark; Shindel, Michelle

    2001-01-01

    Patient simulators are useful tools for training residents and all levels of medical personnel. Simulator usefulness, in small group sessions, is limited by the costs of training large numbers of people. We present an interrupted methodology designed to involve a large group at a location remote from the simulator. The goal was to enable the remote participants to take part in decision making while under time pressure. Two volunteers were chosen as hands-on participants while eighteen remaining anesthesiology residents observed from a lecture room via a closed circuit audio/video feed. A series of five crises in obstetric anesthesia was presented. After each crisis the simulation was paused and the observers were given three minutes to formulate a differential diagnosis and plan to be carried out. At the end of the session facilitators led a debriefing session with all participants. Surveys completed after the simulation indicated that most residents felt personally involved in the simulation, despite being physically removed from it. Surveys also showed that residents believed they learned more from this format than they would have from a lecture. Residents recalled an average of 3.4 crises two days after the session. This paper presents a model for distance education using a simulator and shows that residents believed remote, interrupted, interactive simulator training is valuable. The interrupted nature and involvement of remotely located peers differentiate this methodology from a passive viewing of a remote session. Further study is warranted to quantify the effectiveness of group and/or distance training with a simulator. PMID:27175412

  9. Validation of simulated earthquake ground motions based on evolution of intensity and frequency content

    USGS Publications Warehouse

    Rezaeian, Sanaz; Zhong, Peng; Hartzell, Stephen; Zareian, Farzin

    2015-01-01

    Simulated earthquake ground motions can be used in many recent engineering applications that require time series as input excitations. However, applicability and validation of simulations are subjects of debate in the seismological and engineering communities. We propose a validation methodology at the waveform level and directly based on characteristics that are expected to influence most structural and geotechnical response parameters. In particular, three time-dependent validation metrics are used to evaluate the evolving intensity, frequency, and bandwidth of a waveform. These validation metrics capture nonstationarities in intensity and frequency content of waveforms, making them ideal to address nonlinear response of structural systems. A two-component error vector is proposed to quantify the average and shape differences between these validation metrics for a simulated and recorded ground-motion pair. Because these metrics are directly related to the waveform characteristics, they provide easily interpretable feedback to seismologists for modifying their ground-motion simulation models. To further simplify the use and interpretation of these metrics for engineers, it is shown how six scalar key parameters, including duration, intensity, and predominant frequency, can be extracted from the validation metrics. The proposed validation methodology is a step forward in paving the road for utilization of simulated ground motions in engineering practice and is demonstrated using examples of recorded and simulated ground motions from the 1994 Northridge, California, earthquake.
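As an illustration of a time-dependent intensity metric, the sketch below computes the cumulative Arias intensity and the 5-95% significant duration for a synthetic accelerogram. This is a standard evolving-intensity measure, not necessarily the exact metric used by the authors, and the accelerogram is synthetic.

```python
import numpy as np

# Cumulative Arias intensity, Ia(t) = (pi / (2*g)) * integral of a(t)**2 dt,
# tracks how shaking intensity evolves over the waveform; the 5-95% range of
# its normalized curve (Husid plot) gives the significant duration.
g = 9.81
dt = 0.01
t = np.arange(0.0, 20.0, dt)
envelope = (t / 2.0) * np.exp(1.0 - t / 2.0)      # smooth build-up and decay
rng = np.random.default_rng(1)
acc = envelope * rng.standard_normal(t.size)       # synthetic accelerogram, m/s^2

ia = np.cumsum(acc**2) * dt * np.pi / (2.0 * g)    # evolving Arias intensity
ia_norm = ia / ia[-1]                              # normalized (0 to 1)
t5 = t[np.searchsorted(ia_norm, 0.05)]             # time at 5% of total Ia
t95 = t[np.searchsorted(ia_norm, 0.95)]            # time at 95% of total Ia
d5_95 = t95 - t5                                   # significant duration, s
```

Comparing `ia_norm` curves (and analogous evolving-frequency measures) for a recorded and a simulated pair is the kind of waveform-level check the record advocates.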

  10. Inversion and Prediction of Consolidation Settlement Characteristics of the Fluvial Sediments Based on Void Ratio Variation in the Northern Modern Yellow River Subaqueous Delta, China

    NASA Astrophysics Data System (ADS)

    Liu, Xiao; Liu, Jie; Feng, Xiuli

    2018-06-01

The modern Yellow River delta formed near the estuary of the Yellow River and is characterized by a short formation time, a rapid sedimentation rate, and a loose structure, which make the sediments prone to compaction and consolidation under geostatic and overburden stress. It is one of the key areas of land-subsidence disasters in China, posing a series of safety hazards to production and daily life. Based on data from numerous surface cores and ten drill holes ranging from 12 to 40 m obtained from the northern modern Yellow River subaqueous delta, this paper discusses an inversion method suited to calculating the consolidation settlement characteristics of the modern Yellow River subaqueous delta, and inverts and predicts the consolidation settlement characteristics of the delta sediments. The actual void ratio of the delta sediments at depths from 3 to 15 m shows a significant power-function relationship with depth, while the void ratio of the sediments below 15 m changes little with depth. The pre-consolidation settlement (from deposition to sampling) of the delta sediments is between 0.91 and 1.96 m, with a consolidation settlement per unit depth between 9.6 and 14.0 cm m-1. The post-consolidation settlement (from sampling to stability) of the subaqueous delta sediments is between 0.65 and 1.56 m, with a consolidation settlement per unit depth between 7.6 and 13.1 cm m-1 under the overburden stress. The delta sediments at a burial depth of 3 to 7 m contribute the most to the possible consolidation settlement in the later stage.
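The settlement calculation implied by the record can be sketched as one-dimensional consolidation driven by a void-ratio reduction, using a power-law void-ratio profile e(z) = a * z**b of the kind reported for 3-15 m depth. All coefficients below are hypothetical placeholders, not the paper's fitted values.

```python
import numpy as np

# Hypothetical power-law fit of in-situ void ratio versus depth (3-15 m).
a, b = 1.4, -0.25
z = np.linspace(3.0, 15.0, 121)            # depth grid, m
e_now = a * z**b                            # fitted in-situ void ratio
de = 0.08                                   # assumed further void-ratio drop

# One-dimensional consolidation: a layer of thickness dz compresses by
# dz * (e_now - e_final) / (1 + e_now), summed over the profile.
dz = np.diff(z)
e_mid = 0.5 * (e_now[:-1] + e_now[1:])      # void ratio at layer midpoints
settlement = np.sum(dz * de / (1.0 + e_mid))            # metres
per_unit_depth = settlement / (z[-1] - z[0]) * 100.0    # cm per metre of depth
```

With real data, `e_now` comes from the core measurements and `de` from the difference between the measured and the stability-state void ratio at each depth.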

  11. Systemic characterization and evaluation of particle packings as initial sets for discrete element simulations

    NASA Astrophysics Data System (ADS)

    Morfa, Carlos Recarey; Cortés, Lucía Argüelles; Farias, Márcio Muniz de; Morales, Irvin Pablo Pérez; Valera, Roberto Roselló; Oñate, Eugenio

    2018-07-01

A methodology that comprises several characterization properties for particle packings is proposed in this paper. The methodology takes into account factors such as the dimension and shape of particles, space occupation, homogeneity, connectivity, and isotropy, among others. This classification and integration of several properties makes it possible to carry out a characterization process that systemically evaluates particle packings in order to guarantee the quality of the initial meshes in discrete element simulations, at both the micro- and macroscales. Several new properties were created, and improvements to existing ones are presented. Properties from other disciplines were adapted for use in the evaluation of particle systems. The methodology makes it easy to characterize media at the microscale (continuous geometries—steels, rock microstructures, etc., and discrete geometries) and at the macroscale. A global, systemic, and integral system for characterizing and evaluating particle sets, based on fuzzy logic, is presented. Such a system allows researchers to apply a single evaluation criterion based on the aim of their research. Examples of applications are shown.

  12. Numerical Methodology for Coupled Time-Accurate Simulations of Primary and Secondary Flowpaths in Gas Turbines

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Athavale, M. M.; Hendricks, R. C.; Steinetz, B. M.

    2006-01-01

Detailed information on the flow fields in the secondary flowpaths and their interaction with the primary flows in gas turbine engines is necessary for successful designs with optimized secondary flow streams. The present work focuses on the development of a simulation methodology for coupled, time-accurate solutions of the two flowpaths. The secondary flowstream is treated using SCISEAL, an unstructured adaptive Cartesian grid code developed for secondary flows and seals, while the main-path flow is solved using TURBO, a density-based code capable of resolving rotor-stator interaction in multi-stage machines. An interface is being tested that links the two codes at the rim seal and allows data exchange between them for parallel, coupled execution. A description of the coupling methodology and the current status of the interface development are presented. Representative steady-state solutions of the secondary flow in the UTRC HP Rig disc cavity are also presented.

  13. Systemic characterization and evaluation of particle packings as initial sets for discrete element simulations

    NASA Astrophysics Data System (ADS)

    Morfa, Carlos Recarey; Cortés, Lucía Argüelles; Farias, Márcio Muniz de; Morales, Irvin Pablo Pérez; Valera, Roberto Roselló; Oñate, Eugenio

    2017-10-01

A methodology that comprises several characterization properties for particle packings is proposed in this paper. The methodology takes into account factors such as the dimension and shape of particles, space occupation, homogeneity, connectivity, and isotropy, among others. This classification and integration of several properties makes it possible to carry out a characterization process that systemically evaluates particle packings in order to guarantee the quality of the initial meshes in discrete element simulations, at both the micro- and macroscales. Several new properties were created, and improvements to existing ones are presented. Properties from other disciplines were adapted for use in the evaluation of particle systems. The methodology makes it easy to characterize media at the microscale (continuous geometries—steels, rock microstructures, etc., and discrete geometries) and at the macroscale. A global, systemic, and integral system for characterizing and evaluating particle sets, based on fuzzy logic, is presented. Such a system allows researchers to apply a single evaluation criterion based on the aim of their research. Examples of applications are shown.

  14. Data-Driven Simulation-Enhanced Optimization of People-Based Print Production Service

    NASA Astrophysics Data System (ADS)

    Rai, Sudhendu

This paper describes a systematic six-step data-driven simulation-based methodology for optimizing people-based service systems that exhibit high variety and variability on a large distributed scale. The methodology is exemplified through its application in the printing services industry, where Xerox Corporation has successfully deployed it across small, mid-sized, and large print shops, generating over 250 million in profits across the customer value chain. Each step of the methodology is described in detail: co-development and testing of innovative concepts in partnership with customers; development of software and hardware tools to implement those concepts; establishment of work processes and practices for customer engagement and service implementation; creation of training and infrastructure for large-scale deployment; integration of the innovative offering within the framework of existing corporate offerings; and, lastly, monitoring and deployment of financial and operational metrics for estimating return on investment and continually renewing the offering.

  15. Adaptive enhanced sampling with a path-variable for the simulation of protein folding and aggregation

    NASA Astrophysics Data System (ADS)

    Peter, Emanuel K.

    2017-12-01

    In this article, we present a novel adaptive enhanced sampling molecular dynamics (MD) method for the accelerated simulation of protein folding and aggregation. We introduce a path-variable L based on the un-biased momenta p and displacements dq for the definition of the bias s applied to the system and derive 3 algorithms: general adaptive bias MD, adaptive path-sampling, and a hybrid method which combines the first 2 methodologies. Through the analysis of the correlations between the bias and the un-biased gradient in the system, we find that the hybrid methodology leads to an improved force correlation and acceleration in the sampling of the phase space. We apply our method on SPC/E water, where we find a conservation of the average water structure. We then use our method to sample dialanine and the folding of TrpCage, where we find a good agreement with simulation data reported in the literature. Finally, we apply our methodologies on the initial stages of aggregation of a hexamer of Alzheimer's amyloid β fragment 25-35 (Aβ 25-35) and find that transitions within the hexameric aggregate are dominated by entropic barriers, while we speculate that especially the conformation entropy plays a major role in the formation of the fibril as a rate limiting factor.

  16. A DFT-Based Computational-Experimental Methodology for Synthetic Chemistry: Example of Application to the Catalytic Opening of Epoxides by Titanocene.

    PubMed

    Jaraíz, Martín; Enríquez, Lourdes; Pinacho, Ruth; Rubio, José E; Lesarri, Alberto; López-Pérez, José L

    2017-04-07

    A novel DFT-based Reaction Kinetics (DFT-RK) simulation approach, employed in combination with real-time data from reaction monitoring instrumentation (like UV-vis, FTIR, Raman, and 2D NMR benchtop spectrometers), is shown to provide a detailed methodology for the analysis and design of complex synthetic chemistry schemes. As an example, it is applied to the opening of epoxides by titanocene in THF, a catalytic system with abundant experimental data available. Through a DFT-RK analysis of real-time IR data, we have developed a comprehensive mechanistic model that opens new perspectives to understand previous experiments. Although derived specifically from the opening of epoxides, the prediction capabilities of the model, built on elementary reactions, together with its practical side (reaction kinetics simulations of real experimental conditions) make it a useful simulation tool for the design of new experiments, as well as for the conception and development of improved versions of the reagents. From the perspective of the methodology employed, because both the computational (DFT-RK) and the experimental (spectroscopic data) components can follow the time evolution of several species simultaneously, it is expected to provide a helpful tool for the study of complex systems in synthetic chemistry.

  17. De-individualized psychophysiological strain assessment during a flight simulation test—Validation of a space methodology

    NASA Astrophysics Data System (ADS)

    Johannes, Bernd; Salnitski, Vyacheslav; Soll, Henning; Rauch, Melina; Hoermann, Hans-Juergen

For the evaluation of an operator's skill, reliability indicators of work quality as well as of psychophysiological state during work have to be considered. The methodology and measurement equipment presented herein were developed and tested in numerous terrestrial and space experiments using a simulation of a spacecraft docking on a space station. In this study, however, the method was applied to a comparable terrestrial task—the flight simulator test (FST) used in the DLR selection procedure for ab initio pilot applicants for passenger airlines. This provided a large amount of data for statistical verification of the space methodology. For the evaluation of applicants' strain levels during the FST, psychophysiological measurements were used to construct a "psychophysiological arousal vector" (PAV), which is sensitive to various individual reaction patterns of the autonomic nervous system to mental load. Its changes and increases are interpreted as "strain". In the first evaluation study, 614 subjects were analyzed. The subjects first underwent a calibration procedure for the assessment of their autonomic outlet type (AOT), and on the following day they performed the FST, which included three tasks and was evaluated by instructors applying well-established and standardized rating scales. This new method may promote a wide range of future applications in aviation and space psychology.

  18. System analysis through bond graph modeling

    NASA Astrophysics Data System (ADS)

    McBride, Robert Thomas

    2005-07-01

Modeling and simulation play an integral role in the engineering design process. An accurate mathematical description of a system provides the design engineer the flexibility to perform trade studies quickly and accurately to expedite the design process. Most often, the mathematical model of the system contains components of different engineering disciplines. A modeling methodology that can handle these types of systems might be used in an indirect fashion to extract added information from the model. This research examines the ability of a modeling methodology to provide added insight into system analysis and design. The modeling methodology used is bond graph modeling. An investigation into the creation of a bond graph model using the Lagrangian of the system is provided. Upon creation of the bond graph, system analysis is performed. To aid in the system analysis, an object-oriented approach to bond graph modeling is introduced. A framework is provided to simulate the bond graph directly. Through object-oriented simulation of a bond graph, the information contained within the bond graph can be exploited to create a measurement of system efficiency. A definition of system efficiency is given. This measurement of efficiency is used in the design of different controllers of varying architectures. Optimal control of a missile autopilot is discussed within the framework of the calculated system efficiency.

  19. Adaptive enhanced sampling with a path-variable for the simulation of protein folding and aggregation.

    PubMed

    Peter, Emanuel K

    2017-12-07

    In this article, we present a novel adaptive enhanced sampling molecular dynamics (MD) method for the accelerated simulation of protein folding and aggregation. We introduce a path-variable L based on the un-biased momenta p and displacements dq for the definition of the bias s applied to the system and derive 3 algorithms: general adaptive bias MD, adaptive path-sampling, and a hybrid method which combines the first 2 methodologies. Through the analysis of the correlations between the bias and the un-biased gradient in the system, we find that the hybrid methodology leads to an improved force correlation and acceleration in the sampling of the phase space. We apply our method on SPC/E water, where we find a conservation of the average water structure. We then use our method to sample dialanine and the folding of TrpCage, where we find a good agreement with simulation data reported in the literature. Finally, we apply our methodologies on the initial stages of aggregation of a hexamer of Alzheimer's amyloid β fragment 25-35 (Aβ 25-35) and find that transitions within the hexameric aggregate are dominated by entropic barriers, while we speculate that especially the conformation entropy plays a major role in the formation of the fibril as a rate limiting factor.

  20. Reevaluation of tephra volumes for the 1982 eruption of El Chichón volcano, Mexico

    NASA Astrophysics Data System (ADS)

    Nathenson, M.; Fierstein, J.

    2012-12-01

In a recent numerical simulation of tephra transport and deposition for the 1982 eruption, Bonasia et al. (2012) used masses for the tephra layers (A-1, B, and C) based on the volume data of Carey and Sigurdsson (1986), calculated by the methodology of Rose et al. (1973). For reasons that are not clear, using the same methodology we obtained volumes for layers A-1 and B much smaller than those previously reported. For example, for layer A-1, Carey and Sigurdsson (1986) reported a volume of 0.60 km3, whereas we obtain a volume of 0.23 km3. Moreover, applying the more recent methodology of tephra-volume calculation (Pyle, 1989; Fierstein and Nathenson, 1992) and using the isopach maps in Carey and Sigurdsson (1986), we calculate a total tephra volume of 0.52 km3 (A-1, 0.135; B, 0.125; and C, 0.26 km3). In contrast, Carey and Sigurdsson (1986) report a much larger total volume of 2.19 km3. Such disagreement not only reflects the differing methodologies, but we propose that the volumes calculated with the methodology of Pyle and of Fierstein and Nathenson—involving the use of straight lines on a plot of log thickness versus square root of area—better represent the actual fall deposits. After measuring the areas of the isomass contours for the HAZMAP and FALL3D simulations in Bonasia et al. (2012), we applied the Pyle and Fierstein-Nathenson methodology to calculate the tephra masses deposited on the ground. These masses from five of the simulations range from 70% to 110% of those reported by Carey and Sigurdsson (1986), whereas that for layer B in the HAZMAP calculation is 160%. In the Bonasia et al. (2012) study, the mass erupted by the volcano is a critical input used in the simulation to produce an ash cloud that deposits tephra on the ground. 
Masses on the ground (as calculated by us) for five of the simulations range from 20% to 46% of the masses used as simulation inputs, whereas that for layer B in the HAZMAP calculation is 74%. It is not clear why the percentages are so variable, nor why the output volumes are such small percentages of the input erupted mass. From our volume calculations, the masses on the ground from the simulations are factors of 2.3 to 10 times what was actually deposited. Given this finding from our reevaluation of volumes, the simulations appear to overestimate the hazards from eruptions of the sizes that occurred at El Chichón. Bonasia, R., A. Costa, A. Folch, G. Macedonio, and L. Capra (2012), Numerical simulation of tephra transport and deposition of the 1982 El Chichón eruption and implications for hazard assessment, J. Volc. Geotherm. Res., 231-232, 39-49. Carey, S., and H. Sigurdsson (1986), The 1982 eruptions of El Chichon volcano, Mexico: Observations and numerical modelling of tephra-fall distribution, Bull. Volcanol., 48, 127-141. Fierstein, J., and M. Nathenson (1992), Another look at the calculation of fallout tephra volumes, Bull. Volcanol., 54, 156-167. Pyle, D.M. (1989), The thickness, volume and grainsize of tephra fall deposits, Bull. Volcanol., 51, 1-15. Rose, W.I., Jr., S. Bonis, R.E. Stoiber, M. Keller, and T. Bickford (1973), Studies of volcanic ash from two recent Central American eruptions, Bull. Volcanol., 37, 338-364.
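The Pyle (1989) and Fierstein and Nathenson (1992) methodology cited above fits a straight line to log thickness versus the square root of isopach area, i.e. T = T0 * exp(-k * sqrt(A)), whose integral over area gives a deposit volume of V = 2 * T0 / k**2. A sketch with hypothetical isopach data (not the El Chichón values):

```python
import numpy as np

# Hypothetical isopach data: thickness of each contour and the area it encloses.
thickness_m = np.array([1.0, 0.5, 0.2, 0.1, 0.05])     # isopach thickness, m
area_km2 = np.array([4.0, 25.0, 120.0, 350.0, 900.0])  # enclosed area, km^2

# Straight line on a log-thickness vs sqrt(area) plot:
# ln(T) = ln(T0) - k * sqrt(A)
sqrtA = np.sqrt(area_km2)
slope, intercept = np.polyfit(sqrtA, np.log(thickness_m), 1)
k = -slope                             # thinning rate, 1/km
T0_km = np.exp(intercept) / 1000.0     # extrapolated thickness at sqrt(A)=0, km

# Integrating T0 * exp(-k*sqrt(A)) over all area gives the deposit volume.
volume_km3 = 2.0 * T0_km / k**2
```

Multi-segment deposits are handled the same way, with one straight-line fit (and one closed-form integral) per segment.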

  1. A Computational Methodology for Simulating Thermal Loss Testing of the Advanced Stirling Convertor

    NASA Technical Reports Server (NTRS)

    Reid, Terry V.; Wilson, Scott D.; Schifer, Nicholas A.; Briggs, Maxwell H.

    2012-01-01

The U.S. Department of Energy (DOE) and Lockheed Martin Space Systems Company (LMSSC) have been developing the Advanced Stirling Radioisotope Generator (ASRG) for use as a power system for space science missions. This generator would use two high-efficiency Advanced Stirling Convertors (ASCs), developed by Sunpower Inc. and NASA Glenn Research Center (GRC). The ASCs convert thermal energy from a radioisotope heat source into electricity. As part of ground testing of these ASCs, different operating conditions are used to simulate expected mission conditions. These conditions require achieving a particular operating frequency, hot-end and cold-end temperatures, and specified electrical power output for a given net heat input. In an effort to improve net heat input predictions, numerous tasks have been performed to provide a more accurate value for net heat input into the ASCs, including the use of multi-dimensional numerical models. Validation test hardware has also been used to provide a direct comparison of numerical results and to validate the multi-dimensional numerical models used to predict convertor net heat input and efficiency. These validation tests were designed to simulate the temperature profile of an operating Stirling convertor and resulted in a measured net heat input of 244.4 W. The methodology was applied to the multi-dimensional numerical model, which resulted in a net heat input of 240.3 W, a value 1.7 percent less than that measured during laboratory testing. The resulting computational methodology and results are discussed.

  2. Replica exchange enveloping distribution sampling (RE-EDS): A robust method to estimate multiple free-energy differences from a single simulation.

    PubMed

    Sidler, Dominik; Schwaninger, Arthur; Riniker, Sereina

    2016-10-21

In molecular dynamics (MD) simulations, free-energy differences are often calculated using free-energy perturbation or thermodynamic integration (TI) methods. However, both techniques are only suited to calculating free-energy differences between two end states. Enveloping distribution sampling (EDS) presents an attractive alternative that allows multiple free-energy differences to be calculated in a single simulation. In EDS, a reference state is simulated which "envelopes" the end states. The challenge of this methodology is the determination of optimal reference-state parameters to ensure equal sampling of all end states. Currently, the automatic determination of the reference-state parameters for multiple end states is an unsolved issue that limits the application of the methodology. To resolve this, we have generalised the replica-exchange EDS (RE-EDS) approach, introduced by Lee et al. [J. Chem. Theory Comput. 10, 2738 (2014)] for constant-pH MD simulations. By exchanging configurations between replicas with different reference-state parameters, the complexity of the parameter-choice problem can be substantially reduced. A new robust scheme to estimate the reference-state parameters from a short initial RE-EDS simulation with default parameters was developed, allowing the calculation of 36 free-energy differences between nine small-molecule inhibitors of phenylethanolamine N-methyltransferase from a single simulation. The resulting free-energy differences were in excellent agreement with values obtained previously by TI and two-state EDS simulations.
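The EDS reweighting idea described in the record can be illustrated on a toy one-dimensional system (this shows plain EDS averaging only, not the RE-EDS replica-exchange scheme): a reference potential envelopes two harmonic end states, and free-energy differences follow from reference-ensemble averages, which can be checked against the exact harmonic result.

```python
import numpy as np

# Two harmonic "end states" V_A, V_B and the EDS reference potential
#   V_R = -(1/beta) * ln( exp(-beta*V_A) + exp(-beta*V_B) ),
# which envelopes both wells. Free-energy differences follow from
#   F_i - F_R = -(1/beta) * ln < exp(-beta*(V_i - V_R)) >_R .
# For harmonic wells, F_B - F_A = (1/(2*beta)) * ln(k_B/k_A) exactly.
beta = 1.0
kA_, kB_ = 1.0, 4.0                       # force constants of the two wells
x = np.linspace(-10.0, 10.0, 20001)
dxg = x[1] - x[0]
VA = 0.5 * kA_ * x**2
VB = 0.5 * kB_ * x**2
VR = -np.log(np.exp(-beta * VA) + np.exp(-beta * VB)) / beta

# Reference-ensemble density and reweighted averages (grid integration stands
# in for the sampling that an actual EDS simulation would perform).
pR = np.exp(-beta * VR)
pR /= pR.sum() * dxg
avg = lambda V: np.sum(pR * np.exp(-beta * (V - VR))) * dxg

dF_BA = (-np.log(avg(VB)) + np.log(avg(VA))) / beta
exact = 0.5 * np.log(kB_ / kA_) / beta    # analytic harmonic result
```

The practical difficulty the record addresses is choosing reference-state offsets so that a real simulation of V_R visits all end states; the toy grid integration sidesteps that sampling problem entirely.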

  3. Estimating and validating ground-based timber harvesting production through computer simulation

    Treesearch

    Jingxin Wang; Chris B. LeDoux

    2003-01-01

    Estimating ground-based timber harvesting systems production with an object oriented methodology was investigated. The estimation model developed generates stands of trees, simulates chain saw, drive-to-tree feller-buncher, swing-to-tree single-grip harvester felling, and grapple skidder and forwarder extraction activities, and analyzes costs and productivity. It also...

  4. Possibilities of Particle Finite Element Methods in Industrial Forming Processes

    NASA Astrophysics Data System (ADS)

    Oliver, J.; Cante, J. C.; Weyler, R.; Hernandez, J.

    2007-04-01

    The work investigates the possibilities offered by the particle finite element method (PFEM) in the simulation of forming problems involving large deformations, multiple contacts, and new boundaries generation. The description of the most distinguishing aspects of the PFEM, and its application to simulation of representative forming processes, illustrate the proposed methodology.

  5. Audit Workplace Simulations as a Methodology to Increase Undergraduates' Awareness of Competences

    ERIC Educational Resources Information Center

    Bautista-Mesa, Rafael; Molina Sánchez, Horacio; Ramírez Sobrino, Jesús Nicolás

    2018-01-01

    This paper describes an audit workplace simulation and investigates its effects on students' perceptions of competences, required as important in the auditing industry. Within the competence-based teaching framework, this training activity involves cooperative learning as it combines first-undergraduate and senior students within one team. First,…

  6. Probabilistic inspection strategies for minimizing service failures

    NASA Technical Reports Server (NTRS)

    Brot, Abraham

    1994-01-01

The INSIM computer program, which simulates the 'limited fatigue life' environment in which aircraft structures generally operate, is described. The use of INSIM to develop inspection strategies that aim to minimize service failures is demonstrated. Damage-tolerance methodology, inspection thresholds, and customized inspections are simulated using the probability of failure as the driving parameter.

  7. Computer Series, 52: Scientific Exploration with a Microcomputer: Simulations for Nonscientists.

    ERIC Educational Resources Information Center

    Whisnant, David M.

    1984-01-01

    Describes two simulations, written for Apple II microcomputers, focusing on scientific methodology. The first is based on the tendency of colloidal iron in high concentrations to stick to fish gills and cause breathing difficulties. The second, modeled after the dioxin controversy, examines a hypothetical chemical thought to cause cancer. (JN)

  8. A Simulation of Lean Manufacturing: The Lean Lemonade Tycoon 2

    ERIC Educational Resources Information Center

    Ncube, Lisa B.

    2010-01-01

    This article discusses the functions and effectiveness of games and simulations in the learning processes, in particular as an experiential learning methodology. The application of the game Lemonade Tycoon in the development of lean manufacturing concepts is described. This article addresses the use of the game to teach the principles of lean…

  9. 75 FR 55619 - Self-Regulatory Organizations; The Options Clearing Corporation; Notice of Filing of Proposed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-13

    ... Numerical Simulations Risk Management Methodology September 7, 2010. Pursuant to Section 19(b)(1) of the... for incorporation in the System for Theoretical Analysis and Numerical Simulations (``STANS'') risk... ETFs \\3\\ in the STANS margin calculation process.\\4\\ When OCC began including common stock and ETFs in...

  10. Revised Planning Methodology For Signalized Intersections And Operational Analysis Of Exclusive Left-Turn Lanes, A Simulation-Based Method, Part - I: Literature Review (Final Report)

    DOT National Transportation Integrated Search

    1996-04-01

    The study investigates the application of simulation along with field observations for the estimation of exclusive left-turn saturation flow rate and capacity. The entire research has covered the following principal subjects: (1) a saturation flow model ...

  11. Designing Online Problem Representation Engine for Conceptual Change

    ERIC Educational Resources Information Center

    Lee, Chwee Beng; Ling, Keck Voon

    2012-01-01

    Purpose: This paper aims to describe the web-based scaffold dynamic simulation system (PRES-on) designed for pre-service teachers. Design/methodology/approach: The paper describes the initial design of a web-based scaffold dynamic simulation system (PRES-on) as a cognitive tool for learners to represent problems. For the widespread use of the…

  12. A Novel Multi-scale Simulation Strategy for Turbulent Reacting Flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James, Sutherland C.

    In this project, a new methodology was proposed to bridge the gap between Direct Numerical Simulation (DNS) and Large Eddy Simulation (LES). This novel methodology, titled Lattice-Based Multiscale Simulation (LBMS), creates a lattice structure of One-Dimensional Turbulence (ODT) models. This model has been shown to capture turbulent combustion with high fidelity by fully resolving interactions between turbulence and diffusion. By creating a lattice of coupled ODT models, LBMS overcomes the principal shortcoming of ODT: its inability to capture large-scale three-dimensional flow structures. However, by spacing these lattices significantly apart, LBMS avoids the curse of dimensionality that creates the untenable computational costs associated with DNS. This project has shown that LBMS is capable of reproducing statistics of isotropic turbulent flows while coarsening the spacing between lines significantly. It also investigates and resolves issues that arise when coupling ODT lines, such as flux reconstruction perpendicular to a given ODT line, preservation of conserved quantities when eddies cross a coarse cell volume, and boundary condition application. Robust parallelization is also investigated.

  13. Subject specific finite element modeling of periprosthetic femoral fracture using element deactivation to simulate bone failure.

    PubMed

    Miles, Brad; Kolos, Elizabeth; Walter, William L; Appleyard, Richard; Shi, Angela; Li, Qing; Ruys, Andrew J

    2015-06-01

    Subject-specific finite element (FE) modeling methodology could predict periprosthetic femoral fracture (PFF) for cementless hip arthroplasty in the early postoperative period. This study develops a methodology for subject-specific finite element modeling by using the element deactivation technique to simulate bone failure, validated against experimental testing, thereby predicting periprosthetic femoral fracture in the early postoperative period. Material assignments for biphasic and triphasic models were undertaken. Failure modeling with the element deactivation feature available in ABAQUS 6.9 was used to simulate crack initiation and propagation in the bony tissue based upon a threshold of fracture strain. The crack mode for the biphasic models was very similar to the experimental crack mode, with a similar shape and path of the crack. The fracture load is sensitive to the friction coefficient at the implant-bone interface. The development of a novel technique to simulate bone failure by element deactivation of subject-specific finite element models could aid prediction of fracture load in addition to fracture risk characterization for PFF. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
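    The element-deactivation idea can be caricatured with a parallel-bar load-sharing model (purely illustrative; the paper's subject-specific ABAQUS models are far richer): active elements share the load, any element whose strain exceeds its strength limit is deactivated, the load redistributes, and the loop repeats — a crude analogue of crack propagation.

```python
import random

def progressive_failure(n_elements, load, strain_limit_mean=1.0, seed=0):
    """Toy illustration of 'element deactivation':

    active elements share the applied load equally; any element whose
    strain exceeds its (randomly perturbed) limit is deactivated, the
    load redistributes, and the loop repeats until no further elements
    fail -- a crude stand-in for crack growth in an FE mesh.
    """
    rng = random.Random(seed)
    limits = [strain_limit_mean * rng.uniform(0.8, 1.2) for _ in range(n_elements)]
    active = set(range(n_elements))
    while active:
        strain = load / len(active)           # equal load sharing (toy model)
        failed = {i for i in active if strain > limits[i]}
        if not failed:
            break
        active -= failed                      # deactivate the failed elements
    return len(active)

surviving_low = progressive_failure(100, load=50.0)    # modest load: strain 0.5 < all limits
surviving_high = progressive_failure(100, load=200.0)  # high load: strain 2.0 > all limits
```

    Real implementations deactivate elements based on a fracture-strain criterion evaluated from the stress solution, not from equal load sharing, but the iterate-deactivate-redistribute structure is the same.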

  14. Large Eddy Simulation of Entropy Generation in a Turbulent Mixing Layer

    NASA Astrophysics Data System (ADS)

    Sheikhi, Reza H.; Safari, Mehdi; Hadi, Fatemeh

    2013-11-01

    The entropy transport equation is considered in large eddy simulation (LES) of turbulent flows. The irreversible entropy generation in this equation provides a more general description of subgrid scale (SGS) dissipation due to heat conduction, mass diffusion and viscosity effects. A new methodology is developed, termed the entropy filtered density function (En-FDF), to account for all individual entropy generation effects in turbulent flows. The En-FDF represents the joint probability density function of entropy, frequency, velocity and scalar fields within the SGS. An exact transport equation is developed for the En-FDF, which is modeled by a system of stochastic differential equations, incorporating the second law of thermodynamics. The modeled En-FDF transport equation is solved by a Lagrangian Monte Carlo method. The methodology is employed to simulate a turbulent mixing layer involving transport of passive scalars and entropy. Various modes of entropy generation are obtained from the En-FDF and analyzed. Predictions are assessed against data generated by direct numerical simulation (DNS). The En-FDF predictions are in good agreement with the DNS data.

  15. Baccalaureate nursing students' perspectives of peer tutoring in simulation laboratory, a Q methodology study.

    PubMed

    Li, Ting; Petrini, Marcia A; Stone, Teresa E

    2018-02-01

    The study aim was to identify the perspectives of baccalaureate nursing students toward peer tutoring in the simulation laboratory. Insight into the nursing students' experiences and baseline data related to their perception of peer tutoring will assist in improving nursing education. Q methodology was applied to explore the students' perspectives of peer tutoring in the simulation laboratory. A convenience P-sample of 40 baccalaureate nursing students was used. Each participant sorted 58 selected Q statements into a quasi-normal distribution on an 11-point bipolar scale ranging from -5 to +5. The PQMethod software was used to analyze the collected data. Three discrete factors emerged: Factor I ("Facilitate or empower" knowledge acquisition), Factor II ("Safety net" support environment), and Factor III ("Mentoring" learning how to learn). The findings indicate that peer tutoring is an effective supplementary strategy for promoting baccalaureate students' knowledge acquisition, establishing a supportive safety net, and facilitating their ability to learn in the simulation laboratory. Copyright © 2017 Elsevier Ltd. All rights reserved.
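    The arithmetic at the heart of grouping Q-sorts into factors is inter-sort correlation: participants whose sorts correlate strongly load on the same factor. A minimal sketch with invented sorts on the study's -5 to +5 scale (real Q analysis, as in PQMethod, continues with factor extraction and rotation):

```python
from statistics import mean, pstdev

def pearson(xs, ys):
    """Pearson correlation between two Q-sorts."""
    mx, my = mean(xs), mean(ys)
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (pstdev(xs) * pstdev(ys))

# Three hypothetical Q-sorts over 11 statements, each a ranking on the
# -5..+5 bipolar scale (values invented for illustration).
sort_a = [-5, -4, -3, -2, -1, 0, 1, 2, 3, 4, 5]
sort_b = [-4, -5, -3, -1, -2, 0, 2, 1, 3, 5, 4]   # similar viewpoint to A
sort_c = [5, 4, 3, 2, 1, 0, -1, -2, -3, -4, -5]   # opposite viewpoint

r_ab = pearson(sort_a, sort_b)   # high: A and B would load on one factor
r_ac = pearson(sort_a, sort_c)   # -1: C holds the mirror-image viewpoint
```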

  16. A Bayesian method for using simulator data to enhance human error probabilities assigned by existing HRA methods

    DOE PAGES

    Groth, Katrina M.; Smith, Curtis L.; Swiler, Laura P.

    2014-04-05

    In recent years, several international agencies have begun to collect data on human performance in nuclear power plant simulators [1]. This data provides a valuable opportunity to improve human reliability analysis (HRA), but these improvements will not be realized without implementation of Bayesian methods. Bayesian methods are widely used to incorporate sparse data into models in many parts of probabilistic risk assessment (PRA), but they have not been adopted by the HRA community. In this article, we provide a Bayesian methodology to formally use simulator data to refine the human error probabilities (HEPs) assigned by existing HRA methods. We demonstrate the methodology with a case study, wherein we use simulator data from the Halden Reactor Project to update the probability assignments from the SPAR-H method. The case study demonstrates the ability to use performance data, even sparse data, to improve existing HRA methods. Furthermore, this paper also serves as a demonstration of the value of Bayesian methods to improve the technical basis of HRA.
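    As a hedged illustration of the general idea — a standard conjugate beta-binomial update, not the paper's specific SPAR-H formulation — the HEP from an existing HRA method can serve as the mean of a beta prior that simulator observations then shift:

```python
def update_hep(prior_hep, prior_strength, n_trials, n_errors):
    """Conjugate beta-binomial update of a human error probability.

    The HEP assigned by an existing HRA method is treated as the mean of
    a beta prior carrying `prior_strength` pseudo-trials of weight;
    simulator observations (n_errors out of n_trials) then update it.
    Generic illustration only -- not the SPAR-H-specific method.
    """
    alpha = prior_hep * prior_strength
    beta = (1.0 - prior_hep) * prior_strength
    post_alpha = alpha + n_errors
    post_beta = beta + (n_trials - n_errors)
    return post_alpha / (post_alpha + post_beta)   # posterior mean HEP

# An HRA method assigns HEP = 0.01; 50 simulator crews attempt the task
# and 2 fail (numbers are hypothetical).
posterior = update_hep(prior_hep=0.01, prior_strength=100, n_trials=50, n_errors=2)
```

    The posterior mean lands between the prior HEP (0.01) and the raw simulator rate (0.04), weighted by the prior strength — which is exactly how sparse data can refine, rather than replace, an existing assignment.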

  17. The Simulation of Daily Temperature Time Series from GCM Output. Part II: Sensitivity Analysis of an Empirical Transfer Function Methodology.

    NASA Astrophysics Data System (ADS)

    Winkler, Julie A.; Palutikof, Jean P.; Andresen, Jeffrey A.; Goodess, Clare M.

    1997-10-01

    Empirical transfer functions have been proposed as a means for `downscaling' simulations from general circulation models (GCMs) to the local scale. However, subjective decisions made during the development of these functions may influence the ensuing climate scenarios. This research evaluated the sensitivity of a selected empirical transfer function methodology to 1) the definition of the seasons for which separate specification equations are derived, 2) adjustments for known departures of the GCM simulations of the predictor variables from observations, 3) the length of the calibration period, 4) the choice of function form, and 5) the choice of predictor variables. A modified version of the Climatological Projection by Model Statistics method was employed to generate control (1 × CO2) and perturbed (2 × CO2) scenarios of daily maximum and minimum temperature for two locations with diverse climates (Alcantarilla, Spain, and Eau Claire, Michigan). The GCM simulations used in the scenario development were from the Canadian Climate Centre second-generation model (CCC GCMII).
    Variations in the downscaling methodology were found to have a statistically significant impact on the 2 × CO2 climate scenarios, even though the 1 × CO2 scenarios for the different transfer function approaches were often similar. The daily temperature scenarios for Alcantarilla and Eau Claire were most sensitive to the decision to adjust for deficiencies in the GCM simulations, the choice of predictor variables, and the seasonal definitions used to derive the functions (i.e., fixed seasons, floating seasons, or no seasons). The scenarios were less sensitive to the choice of function form (i.e., linear versus nonlinear) and to an increase in the length of the calibration period.
    The results of Part I, which identified significant departures of the CCC GCMII simulations of two candidate predictor variables from observations, together with those presented here in Part II, 1) illustrate the importance of detailed comparisons of observed and GCM 1 × CO2 series of candidate predictor variables as an initial step in impact analysis, 2) demonstrate that decisions made when developing the transfer functions can have a substantial influence on the 2 × CO2 scenarios and their interpretation, 3) highlight the uncertainty in the appropriate criteria for evaluating transfer function approaches, and 4) suggest that automation of empirical transfer function methodologies is inappropriate because of differences in the performance of transfer functions between sites and because of spatial differences in the GCM's ability to adequately simulate the predictor variables used in the functions.
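    In its simplest (linear, one-predictor) form, a specification equation of the kind studied here is an ordinary least-squares regression from a GCM predictor to a local predictand. A minimal sketch with invented calibration data:

```python
from statistics import mean

def fit_linear(xs, ys):
    """Ordinary least squares for a one-predictor transfer function."""
    mx, my = mean(xs), mean(ys)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical calibration data: GCM grid-cell temperature (predictor)
# vs. observed local daily maximum temperature (predictand), in deg C.
gcm_t = [10.0, 12.0, 15.0, 18.0, 21.0, 24.0]
local_t = [11.2, 13.5, 16.9, 20.1, 23.4, 26.8]

a, b = fit_linear(gcm_t, local_t)
downscaled = a + b * 20.0   # local estimate for a 2xCO2 grid-cell value of 20 C
```

    Every subjective choice the paper examines — which predictors enter, over which season the fit is made, how long the calibration record is, whether the GCM series is bias-adjusted first — changes the fitted `a` and `b`, and therefore the downscaled scenario.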

  18. A Flight-Calibrated Methodology for Determination of Cassini Thruster On-Times for Reaction Wheel Biases

    NASA Technical Reports Server (NTRS)

    Sarani, Siamak

    2010-01-01

    This paper describes a methodology for accurate and flight-calibrated determination of the on-times of the Cassini spacecraft Reaction Control System (RCS) thrusters, without any form of dynamic simulation, for the reaction wheel biases. The hydrazine usage and the delta V vector in body frame are also computed from the respective thruster on-times. The Cassini spacecraft, the largest and most complex interplanetary spacecraft ever built, continues to undertake ambitious and unique scientific observations of planet Saturn, Titan, Enceladus, and other moons of Saturn. In order to maintain a stable attitude during the course of its mission, this three-axis stabilized spacecraft uses two different control systems: the RCS and the reaction wheel assembly control system. The RCS is used to execute a commanded spacecraft slew, to maintain three-axis attitude control, control spacecraft's attitude while performing science observations with coarse pointing requirements, e.g. during targeted low-altitude Titan and Enceladus flybys, bias the momentum of reaction wheels, and to perform RCS-based orbit trim maneuvers. The use of RCS often imparts undesired delta V on the spacecraft. The Cassini navigation team requires accurate predictions of the delta V in spacecraft coordinates and inertial frame resulting from slews using RCS thrusters and more importantly from reaction wheel bias events. It is crucial for the Cassini spacecraft attitude control and navigation teams to be able to, quickly but accurately, predict the hydrazine usage and delta V for various reaction wheel bias events without actually having to spend time and resources simulating the event in flight software-based dynamic simulation or hardware-in-the-loop simulation environments. The methodology described in this paper, and the ground software developed thereof, are designed to provide just that. 
This methodology assumes a priori knowledge of thrust magnitudes and thruster pulse rise and tail-off time constants for eight individual attitude control thrusters, the spacecraft's wet mass and its center of mass location, and a few other key parameters.

  19. Evaluative methodology for comprehensive water quality management planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dyer, H. L.

    Computer-based evaluative methodologies have been developed to provide for the analysis of coupled phenomena associated with natural resource comprehensive planning requirements. Provisions for planner/computer interaction have been included. Each of the simulation models developed is described in terms of its coded procedures. An application of the models for water quality management planning is presented; and the data requirements for each of the models are noted.

  20. Computational Chemistry Toolkit for Energetic Materials Design

    DTIC Science & Technology

    2006-11-01

    industry are aggressively engaged in efforts to develop multiscale modeling and simulation methodologies to model and analyze complex phenomena across...energetic materials design. It is hoped that this toolkit will evolve into a collection of well-integrated multiscale modeling methodologies... [table fragment: Experimental / Theoretical / This Work values for 1,5-diamino-4-methyl-tetrazolium nitrate (8.4, 41.7, 47.5) and 1,5-diamino-4-methyl-tetrazolium azide (138.1, 161.6)]

  1. Pressure Distribution and Performance Impacts of Aerospike Nozzles on Rotating Detonation Engines

    DTIC Science & Technology

    2017-06-01

    design methodology at both on- and off-design conditions anticipated throughout the combustion cycle. Steady-state, non-reacting computational fluid...operation. Therefore, the nozzle contour was designed using a traditional, steady-state design methodology at both on- and off-design conditions...anticipated throughout the combustion cycle. Steady-state, non-reacting computational fluid dynamics (CFD) simulations were performed on various nozzle

  2. Five-Junction Solar Cell Optimization Using Silvaco Atlas

    DTIC Science & Technology

    2017-09-01

    experimental sources [1], [4], [6]. f. Numerical Method The method selected for solving the non-linear equations that make up the simulation can be...and maximize efficiency. Optimization of solar cell efficiency is carried out via nearly orthogonal balanced design of experiments methodology. Silvaco...Optimization of solar cell efficiency is carried out via nearly orthogonal balanced design of experiments methodology. Silvaco ATLAS is utilized to

  3. Synchronization of autonomous objects in discrete event simulation

    NASA Technical Reports Server (NTRS)

    Rogers, Ralph V.

    1990-01-01

    Autonomous objects in event-driven discrete event simulation offer the potential to combine the freedom of unrestricted movement and positional accuracy through Euclidean space of time-driven models with the computational efficiency of event-driven simulation. The principal challenge to autonomous object implementation is object synchronization. The concept of a spatial blackboard is offered as a potential methodology for synchronization. The issues facing implementation of a spatial blackboard are outlined and discussed.
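    A minimal sketch of the event-driven core with a blackboard-style shared state (names and structure are illustrative; the paper's spatial blackboard is a richer synchronization concept): objects post timestamped positions to a shared store, and the loop advances simulation time to the next scheduled event instead of ticking a fixed step.

```python
import heapq

blackboard = {}   # object id -> (time, position): shared state others may read
events = []       # priority queue of pending events, ordered by event time

def schedule(time, obj_id, position):
    """Schedule an autonomous object's state-update event."""
    heapq.heappush(events, (time, obj_id, position))

def run(until):
    """Event-driven loop: jump directly from event to event up to `until`."""
    processed = []
    while events and events[0][0] <= until:
        time, obj_id, position = heapq.heappop(events)
        blackboard[obj_id] = (time, position)   # publish state to the blackboard
        processed.append((time, obj_id))
    return processed

schedule(5.0, "vehicle_1", (0.0, 0.0))
schedule(2.0, "vehicle_2", (3.0, 4.0))
schedule(9.0, "vehicle_1", (1.0, 1.0))

order = run(until=8.0)   # events execute in time order, regardless of insertion order
```

    The synchronization problem the abstract raises shows up exactly here: an object reading `blackboard` between events sees other objects' positions only as of their last posted event, which is what a spatial blackboard scheme must reconcile.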

  4. Assessing the changes of groundwater recharge / irrigation water use between SRI and traditional irrigation schemes in Central Taiwan

    NASA Astrophysics Data System (ADS)

    Chen, Shih-Kai; Jang, Cheng-Shin; Tsai, Cheng-Bin

    2015-04-01

    To respond to agricultural water shortages caused by climate change without affecting future rice yield, the application of water-saving irrigation, such as the SRI methodology, is being considered for rice cultivation in Taiwan. However, flooded paddy fields are an important source of groundwater recharge in Central Taiwan, so the water-saving benefit of this new methodology and its impact on reduced groundwater recharge should be assessed jointly in this area. The objective of this study was to evaluate the changes in groundwater recharge and irrigation water use between the SRI and traditional irrigation schemes (continuous irrigation, rotational irrigation). An experimental paddy field located in the proximal area of the Choushui River alluvial fan (the largest groundwater pumping region in Taiwan) was chosen as the study area. The 3-D finite element groundwater model (FEMWATER), with variable boundary condition analog functions, was applied to simulate the groundwater recharge process and amount under the traditional irrigation schemes and the SRI methodology. The use of effective rainfall was either taken into account or not in different simulation scenarios for each irrigation scheme. The simulation results showed that infiltration rates did not vary significantly with the use of effective rainfall, but the low soil moisture setting in deep soil layers resulted in higher infiltration rates. Taking the use of effective rainfall into account, the average infiltration rates for continuous irrigation, rotational irrigation, and the SRI methodology in the first crop season of 2013 were 4.04 mm/day, 4.00 mm/day, and 3.92 mm/day, respectively. The groundwater recharge amount of the SRI methodology was slightly lower than those of the traditional irrigation schemes, reduced by 4% and 2% compared with continuous irrigation and rotational irrigation, respectively.
    The field irrigation requirement of the SRI methodology was significantly lower than those of the traditional irrigation schemes, saving 35% and 9% compared with continuous irrigation and rotational irrigation, respectively. The water-saving benefit of the SRI methodology thus significantly outweighed the disadvantage of reduced groundwater recharge. The results could be used as a basis for the relevant government agencies to formulate integrated water resource management strategies in this area. Keywords: SRI, Paddy field, Infiltration, Groundwater recharge

  5. Discrete event simulation modelling of patient service management with Arena

    NASA Astrophysics Data System (ADS)

    Guseva, Elena; Varfolomeyeva, Tatyana; Efimova, Irina; Movchan, Irina

    2018-05-01

    This paper describes a simulation modeling methodology intended to aid in solving practical problems in researching and analysing complex systems. The paper gives a review of simulation platforms and an example of simulation model development with Arena 15.0 (Rockwell Automation). The provided example of a simulation model for patient service management helps to evaluate the workload of the clinic doctors; determine the number of general practitioners, surgeons, traumatologists, and other specialized doctors required for patient service; and develop recommendations to ensure timely delivery of medical care and improve the efficiency of clinic operation.

  6. Boosting flood warning schemes with fast emulator of detailed hydrodynamic models

    NASA Astrophysics Data System (ADS)

    Bellos, V.; Carbajal, J. P.; Leitao, J. P.

    2017-12-01

    Floods are among the most destructive catastrophic events, and their frequency has increased over recent decades. To reduce flood impact and risk, flood warning schemes are installed in flood-prone areas. Frequently, these schemes are based on numerical models which quickly provide predictions of water levels and other relevant observables. However, the high complexity of flood wave propagation in the real world and the need for accurate predictions in urban environments or in floodplains hinder the use of detailed simulators. This creates the central difficulty: fast predictions are needed that still meet the accuracy requirements. Most physics-based detailed simulators, although accurate, cannot fulfill the speed demand even if High Performance Computing techniques are used (required simulation times are on the order of minutes to hours). As a consequence, most flood warning schemes are based on coarse ad-hoc approximations that cannot take advantage of a detailed hydrodynamic simulation. In this work, we present a methodology for developing a flood warning scheme using a Gaussian process based emulator of a detailed hydrodynamic model. The methodology consists of two main stages: 1) an offline stage to build the emulator; 2) an online stage using the emulator to predict and generate warnings. The offline stage consists of the following steps: a) definition of the critical sites of the area under study, and specification of the observables to predict at those sites, e.g. water depth, flow velocity, etc.; b) generation of a detailed simulation dataset to train the emulator; c) calibration of the required parameters (if measurements are available). The online stage is carried out using the emulator to predict the relevant observables quickly, while the detailed simulator runs in parallel to verify key predictions of the emulator. The speed gain given by the emulator also allows uncertainty in predictions to be quantified using ensemble methods.
    The above methodology is applied in a real-world scenario.
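    A bare-bones Gaussian-process emulator can be sketched in a few lines. The training data below are invented (a real scheme would train on detailed hydrodynamic runs and would also use the GP's predictive variance); only the posterior mean with an RBF kernel is shown.

```python
import math

def rbf(x1, x2, length=1.0):
    """Squared-exponential (RBF) kernel."""
    return math.exp(-0.5 * ((x1 - x2) / length) ** 2)

def solve(A, rhs):
    """Naive Gauss-Jordan elimination (adequate for the tiny systems here)."""
    n = len(A)
    M = [row[:] + [rhs[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))   # partial pivoting
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * v for a, v in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def gp_mean(x_train, y_train, x_new, length=1.0, noise=1e-6):
    """Posterior mean of a zero-mean GP: the emulator's cheap prediction
    standing in for a full hydrodynamic run."""
    K = [[rbf(a, b, length) + (noise if i == j else 0.0)
          for j, b in enumerate(x_train)] for i, a in enumerate(x_train)]
    alpha = solve(K, y_train)
    return sum(w * rbf(x, x_new, length) for w, x in zip(alpha, x_train))

# Hypothetical training set: rainfall intensity -> simulated water depth
# at a critical site (values invented for illustration).
rain = [0.0, 10.0, 20.0, 30.0, 40.0]
depth = [0.0, 0.05, 0.22, 0.55, 1.10]

predicted = gp_mean(rain, depth, 25.0, length=10.0)
```

    With negligible noise the GP interpolates its training runs exactly, which is why a handful of expensive simulations can stand in for the simulator across the whole input range.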

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toltz, A; Seuntjens, J; Hoesl, M

    Purpose: With the aim of reducing acute esophageal radiation toxicity in pediatric patients receiving craniospinal irradiation (CSI), we investigated the implementation of an in-vivo, adaptive proton therapy range verification methodology. Simulation experiments and in-phantom measurements were conducted to validate the range verification technique for this clinical application. Methods: A silicon diode array system has been developed and experimentally tested in phantom for passively scattered proton beam range verification for a prostate treatment case by correlating properties of the detector signal to the water equivalent path length (WEPL). We propose to extend the methodology to verify range distal to the vertebral body for pediatric CSI cases by placing this small volume dosimeter in the esophagus of the anesthetized patient immediately prior to treatment. A set of calibration measurements was performed to establish a time signal to WEPL fit for a “scout” beam in a solid water phantom. Measurements are compared against Monte Carlo simulation in GEANT4 using the Tool for Particle Simulation (TOPAS). Results: Measurements with the diode array in a spread out Bragg peak of 14 cm modulation width and 15 cm range (177 MeV passively scattered beam) in solid water were successfully validated against proton fluence rate simulations in TOPAS. The resulting calibration curve allows for a sensitivity analysis of detector system response with dose rate in simulation and with individual diode position through simulation on patient CT data. Conclusion: Feasibility has been shown for the application of this range verification methodology to pediatric CSI. An in-vivo measurement to determine the WEPL to the inner surface of the esophagus will allow for personalized adjustment of the treatment plan to ensure sparing of the esophagus while confirming target coverage.
    A Toltz acknowledges partial support by the CREATE Medical Physics Research Training Network grant of the Natural Sciences and Engineering Research Council (Grant number: 432290)

  8. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation

    NASA Technical Reports Server (NTRS)

    Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelly, J. H.; Depkovich, T. M.

    1984-01-01

    A generic computer simulation for manipulator systems (ROBSIM) was implemented and the specific technologies necessary to increase the role of automation in various missions were developed. The specific items developed are: (1) capability for definition of a manipulator system consisting of multiple arms, load objects, and an environment; (2) capability for kinematic analysis, requirements analysis, and response simulation of manipulator motion; (3) postprocessing options such as graphic replay of simulated motion and manipulator parameter plotting; (4) investigation and simulation of various control methods including manual force/torque and active compliance control; (5) evaluation and implementation of three obstacle avoidance methods; (6) video simulation and edge detection; and (7) software simulation validation.

  9. Large-Eddy Simulation of Wind-Plant Aerodynamics: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Churchfield, M. J.; Lee, S.; Moriarty, P. J.

    In this work, we present results of a large-eddy simulation of the 48 multi-megawatt turbines composing the Lillgrund wind plant. Turbulent inflow wind is created by performing an atmospheric boundary layer precursor simulation and turbines are modeled using a rotating, variable-speed actuator line representation. The motivation for this work is that few others have done wind plant large-eddy simulations with a substantial number of turbines, and the methods for carrying out the simulations are varied. We wish to draw upon the strengths of the existing simulations and our growing atmospheric large-eddy simulation capability to create a sound methodology for performing this type of simulation. We have used the OpenFOAM CFD toolbox to create our solver.

  10. Development of an adaptive failure detection and identification system for detecting aircraft control element failures

    NASA Technical Reports Server (NTRS)

    Bundick, W. Thomas

    1990-01-01

    A methodology for designing a failure detection and identification (FDI) system to detect and isolate control element failures in aircraft control systems is reviewed. An FDI system design for a modified B-737 aircraft resulting from this methodology is also reviewed, and the results of evaluating this system via simulation are presented. The FDI system performed well in a no-turbulence environment, but it experienced an unacceptable number of false alarms in atmospheric turbulence. An adaptive FDI system, which adjusts thresholds and other system parameters based on the estimated turbulence level, was developed and evaluated. The adaptive system performed well over all turbulence levels simulated, reliably detecting all but the smallest magnitude partially-missing-surface failures.
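    In caricature, the adaptive idea reduces to scaling the detection threshold with the estimated turbulence level, so that turbulence-induced residual noise does not trip the alarm. The sketch below is illustrative only (invented noise model, gains, and fault size), not the B-737 FDI design:

```python
import random

def detect_failures(residuals, turbulence, base_threshold=1.0, gain=2.0):
    """Adaptive FDI caricature: flag a control-element fault whenever the
    residual magnitude exceeds base_threshold + gain * turbulence level."""
    threshold = base_threshold + gain * turbulence
    return [i for i, r in enumerate(residuals) if abs(r) > threshold]

rng = random.Random(7)
turbulence = 0.5
# Residual noise grows with turbulence; a genuine fault adds a bias of
# 5.0 at step 50 (all values invented for illustration).
residuals = [rng.gauss(0.0, 0.3 * (1 + turbulence)) for _ in range(100)]
residuals[50] += 5.0

fixed_alarms = detect_failures(residuals, turbulence=0.0)            # threshold stays 1.0
adaptive_alarms = detect_failures(residuals, turbulence=turbulence)  # threshold rises to 2.0
```

    Raising the threshold with turbulence can only remove alarms, never add them, which is the trade the paper evaluates: fewer false alarms at the cost of missing the smallest failures.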

  11. Process optimization of rolling for zincked sheet technology using response surface methodology and genetic algorithm

    NASA Astrophysics Data System (ADS)

    Ji, Liang-Bo; Chen, Fang

    2017-07-01

    Numerical simulation and intelligent optimization technology were adopted for the rolling and extrusion of zincked sheet. Using response surface methodology (RSM), a genetic algorithm (GA), and data processing technology, an efficient optimization of process parameters for the rolling of zincked sheet was investigated. First, the influence of roller gap, rolling speed, and friction factor on the reduction rate and plate shortening rate was analyzed. Then a predictive response surface model for the comprehensive quality index of the part was created using RSM, and simulated and predicted values were compared. Through the genetic algorithm, the optimal process parameters for rolling were solved and verified, and the optimum process parameters were obtained. The approach is feasible and effective.
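    The RSM-plus-GA pattern can be sketched as follows. The quadratic surface coefficients, parameter bounds, and GA settings below are all invented; in the paper's workflow, RSM would first fit the surface to simulation data, and the GA would then search it:

```python
import random

def response_surface(x):
    """Hypothetical fitted quadratic response surface for the comprehensive
    quality index as a function of (roller gap, rolling speed, friction).
    Coefficients are invented; RSM would fit them to simulation data."""
    gap, speed, friction = x
    return (-(gap - 1.2) ** 2 - 0.5 * (speed - 3.0) ** 2
            - 2.0 * (friction - 0.15) ** 2)

def genetic_optimize(f, bounds, pop_size=40, generations=60, seed=3):
    """Bare-bones real-coded GA: tournament selection, averaging crossover,
    Gaussian mutation clipped to the parameter bounds."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            a, b = rng.sample(pop, 2)
            return a if f(a) > f(b) else b          # binary tournament
        children = []
        for _ in range(pop_size):
            p1, p2 = pick(), pick()
            child = [(u + v) / 2 for u, v in zip(p1, p2)]            # crossover
            child = [min(hi, max(lo, g + rng.gauss(0, 0.05 * (hi - lo))))
                     for g, (lo, hi) in zip(child, bounds)]          # mutation
            children.append(child)
        pop = children
    return max(pop, key=f)

bounds = [(0.5, 2.0), (1.0, 5.0), (0.05, 0.3)]   # gap (mm), speed (m/s), friction
best = genetic_optimize(response_surface, bounds)
```

    Because the GA evaluates the cheap fitted surface rather than the simulation itself, thousands of candidate parameter sets can be screened at negligible cost.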

  12. Probabilistic lifetime strength of aerospace materials via computational simulation

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Keating, Jerome P.; Lovelace, Thomas B.; Bast, Callie C.

    1991-01-01

    The results of a second year effort of a research program are presented. The research included development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic phenomenological constitutive relationship, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects of primitive variables. These primitive variables often originate in the environment and may include stress from loading, temperature, chemical, or radiation attack. This multifactor interaction constitutive equation is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the constitutive equation using actual experimental materials data together with the multiple linear regression of that data.

  13. Variability-aware compact modeling and statistical circuit validation on SRAM test array

    NASA Astrophysics Data System (ADS)

    Qiao, Ying; Spanos, Costas J.

    2016-03-01

    Variability modeling at the compact transistor model level can enable statistically optimized designs in view of limitations imposed by the fabrication technology. In this work we propose a variability-aware compact model characterization methodology based on stepwise parameter selection. Transistor I-V measurements are obtained from a bit-transistor-accessible SRAM test array fabricated using a collaborating foundry's 28nm FDSOI technology. Our in-house customized Monte Carlo simulation bench can incorporate these statistical compact models, and simulation results on SRAM writability performance closely match the measured distributions. Our proposed statistical compact model parameter extraction methodology also has the potential of predicting non-Gaussian behavior in statistical circuit performances through mixtures of Gaussian distributions.

  14. Validation Methodology to Allow Simulated Peak Reduction and Energy Performance Analysis of Residential Building Envelope with Phase Change Materials: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tabares-Velasco, P. C.; Christensen, C.; Bianchi, M.

    2012-08-01

    Phase change materials (PCM) represent a potential technology to reduce peak loads and HVAC energy consumption in residential buildings. This paper summarizes NREL efforts to obtain accurate energy simulations when PCMs are modeled in residential buildings: the overall methodology to verify and validate the Conduction Finite Difference (CondFD) and PCM algorithms in EnergyPlus is presented in this study. It also shows preliminary results of three residential building enclosure technologies containing PCM: PCM-enhanced insulation, PCM-impregnated drywall and thin PCM layers. The results are compared based on predicted peak reduction and energy savings using two algorithms in EnergyPlus: the PCM and Conduction Finite Difference (CondFD) algorithms.

  15. Accelerated Aging of the M119 Simulator

    NASA Technical Reports Server (NTRS)

    Bixon, Eric R.

    2000-01-01

    This paper addresses the storage requirements, shelf life, and reliability of the M119 Whistling Simulator. Experimental conditions have been determined and the data analysis has been completed for the accelerated testing of the system. A general methodology to evaluate the shelf life of the system as a function of storage time, temperature, and relative humidity is discussed.

  16. Agent-Based Simulation of Learning Dissemination in a Project-Based Learning Context Considering the Human Aspects

    ERIC Educational Resources Information Center

    Seman, Laio Oriel; Hausmann, Romeu; Bezerra, Eduardo Augusto

    2018-01-01

    Contribution: This paper presents the "PBL classroom model," an agent-based simulation (ABS) that allows testing of several scenarios of a project-based learning (PBL) application by considering different levels of soft-skills, and students' perception of the methodology. Background: While the community has made great advances in…

  17. Bolometer Simulation Using SPICE

    NASA Technical Reports Server (NTRS)

    Jones, Hollis H.; Aslam, Shahid; Lakew, Brook

    2004-01-01

    A general model is presented that assimilates the thermal and electrical properties of the bolometer; this block model demonstrates the Electro-Thermal Feedback (ETF) effect on the bolometer's performance. This methodology is used to construct a SPICE model that, by way of analogy, combines the thermal and electrical phenomena into one simulation session. The resulting circuit diagram is presented and discussed.

  18. Forest growth and timber quality: crown models and simulation methods for sustainable forest management

    Treesearch

    Dennis P. Dykstra; Robert A. Monserud

    2009-01-01

    The purpose of the international conference from which these proceedings are drawn was to explore relationships between forest management activities and timber quality. Sessions were organized to explore models and simulation methodologies that contribute to an understanding of tree development over time and the ways that management and harvesting activities can...

  19. How Mode of Delivery Affects Comprehension of an Operations Management Simulation: Online vs Face-to-Face Classrooms

    ERIC Educational Resources Information Center

    Riley, Jason M.; Ellegood, William A.; Solomon, Stanislaus; Baker, Jerrine

    2017-01-01

    Purpose: This study aims to understand how mode of delivery, online versus face-to-face, affects comprehension when teaching operations management concepts via a simulation. Conceptually, the aim is to identify factors that influence the students' ability to learn and retain new concepts. Design/methodology/approach: Leveraging Littlefield…

  20. An Old Tool Reexamined: Using the Star Power Simulation to Teach Social Inequality

    ERIC Educational Resources Information Center

    Prince, Barbara F.; Kozimor-King, Michele Lee; Steele, Jennifer

    2015-01-01

    This study examined the effectiveness of the Star Power simulation for teaching stratification and inequality to students of the net generation. The data for this study were obtained through the use of survey methodology and content analysis of 126 course papers from introductory sociology classes. Papers were analyzed for identification and…

  1. 3D and 4D Simulations for Landscape Reconstruction and Damage Scenarios: GIS Pilot Applications

    ERIC Educational Resources Information Center

    Pesaresi, Cristano; Van Der Schee, Joop; Pavia, Davide

    2017-01-01

    The project "3D and 4D Simulations for Landscape Reconstruction and Damage Scenarios: GIS Pilot Applications" has been devised with the intention to deal with the demand for research, innovation and applicative methodology on the part of the international programme, requiring concrete results to increase the capacity to know, anticipate…

  2. Evaluation of Calipers II: Using Simulations to Assess Complex Learning Site Visit Findings. CRESST Report 821

    ERIC Educational Resources Information Center

    Matrundola, Deborah La Torre; Chang, Sandy; Herman, Joan

    2012-01-01

    The purpose of these case studies was to examine the ways technology and professional development supported the use of the SimScientists assessment systems. Qualitative research methodology was used to provide narrative descriptions of six classes implementing simulation-based assessments for either the topic of Ecosystems or Atoms and Molecules.…

  3. Force Measurement on the GLAST Delta II Flight

    NASA Technical Reports Server (NTRS)

    Gordon, Scott; Kaufman, Daniel

    2009-01-01

    This viewgraph presentation reviews the interface force measurement at spacecraft separation of GLAST Delta II. The contents include: 1) Flight Force Measurement (FFM) Background; 2) Team Members; 3) GLAST Mission Overview; 4) Methodology Development; 5) Ground Test Validation; 6) Flight Data; 7) Coupled Loads Simulation (VCLA & Reconstruction); 8) Basedrive Simulation; 9) Findings; and 10) Summary and Conclusions.

  4. A Methodology to Assess the Benefit of Operational or Tactic Adjustments to Reduce Marine Corps Fuel Consumption

    DTIC Science & Technology

    2015-12-01

    simulation M777A2 howitzer MAGTF Marine Air-Ground Task Force MANA Map Aware Non-Uniform Automata MCWL Marine Corps Warfighting Lab MEB Marine...met. The project developed a Map Aware Non-Uniform Automata (MANA) model for each SPMAGTF size. The MANA models simulated the maneuver and direct

  5. Procedure to Generate the MPACT Multigroup Library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Kang Seog

    The CASL neutronics simulator MPACT is under development for coupled neutronics and thermal-hydraulic (T-H) simulation of light water reactors. This document reviews the current procedure used to generate the MPACT multigroup library. Detailed methodologies and procedures are included in this document for further discussion aimed at improving the MPACT multigroup library.

  6. Supply Chain Simulator: A Scenario-Based Educational Tool to Enhance Student Learning

    ERIC Educational Resources Information Center

    Siddiqui, Atiq; Khan, Mehmood; Akhtar, Sohail

    2008-01-01

    Simulation-based educational products are excellent set of illustrative tools that proffer features like visualization of the dynamic behavior of a real system, etc. Such products have great efficacy in education and are known to be one of the first-rate student centered learning methodologies. These products allow students to practice skills such…

  7. Teaching Lean Six Sigma within a Supply Chain Context: The Airplane Supply Chain Simulation

    ERIC Educational Resources Information Center

    Ellis, Scott C.; Goldsby, Thomas J.; Bailey, Ana M.; Oh, Jae-Young

    2014-01-01

    Lean six sigma is a management methodology that firms can employ to achieve substantial improvement in supply chain performance. However, few pedagogical exercises facilitate students' use of a comprehensive set of lean six sigma principles within a supply chain context. We describe the Airplane Supply Chain Simulation that helps students…

  8. Solid rocket booster internal flow analysis by highly accurate adaptive computational methods

    NASA Technical Reports Server (NTRS)

    Huang, C. Y.; Tworzydlo, W.; Oden, J. T.; Bass, J. M.; Cullen, C.; Vadaketh, S.

    1991-01-01

    The primary objective of this project was to develop an adaptive finite element flow solver for simulating internal flows in the solid rocket booster. Described here is a unique flow simulator code for analyzing highly complex flow phenomena in the solid rocket booster. New methodologies and features incorporated into this analysis tool are described.

  9. Processes and Factors Underlying Adolescent Males' Attitudes and Decision-Making in Relation to an Unplanned Pregnancy

    ERIC Educational Resources Information Center

    Condon, John T.; Corkindale, Carolyn J.; Russell, Alan; Quinlivan, Julie A.

    2006-01-01

    This research examined adolescent males' decision-making when confronted with a hypothetical unplanned pregnancy in a sexual partner. An innovative methodology, involving a computerized simulation game was utilized with 386 Australian males (mean age of 15 years). Data were gathered from responses made during the simulation, and questionnaires…

  10. Hardware and software reliability estimation using simulations

    NASA Technical Reports Server (NTRS)

    Swern, Frederic L.

    1994-01-01

    The simulation technique is used to explore the validation of both hardware and software. It was concluded that simulation is a viable means for validating both hardware and software and associating a reliability number with each. This is useful in determining the overall probability of system failure of an embedded processor unit, and in improving both the code and the hardware where necessary to meet reliability requirements. The methodologies were proven using simple programs and simple hardware models.

  11. Evaluation of probable maximum snow accumulation: Development of a methodology for climate change studies

    NASA Astrophysics Data System (ADS)

    Klein, Iris M.; Rousseau, Alain N.; Frigon, Anne; Freudiger, Daphné; Gagnon, Patrick

    2016-06-01

    Probable maximum snow accumulation (PMSA) is one of the key variables used to estimate the spring probable maximum flood (PMF). A robust methodology for evaluating the PMSA is imperative so the ensuing spring PMF is a reasonable estimation. This is of particular importance in times of climate change (CC), since it is known that solid precipitation in Nordic landscapes will in all likelihood change over the next century. In this paper, a PMSA methodology based on simulated data from regional climate models is developed. Moisture maximization represents the core concept of the proposed methodology, with precipitable water being the key variable. Results of stationarity tests indicate that CC will affect the monthly maximum precipitable water and, thus, the ensuing ratio used to maximize important snowfall events. Therefore, a non-stationary approach is used to describe the monthly maximum precipitable water. Outputs from three simulations produced by the Canadian Regional Climate Model were used to give first estimates of potential PMSA changes for southern Quebec, Canada. A sensitivity analysis of the computed PMSA was performed with respect to the number of time-steps used (the so-called snowstorm duration) and the threshold for a snowstorm to be maximized or not. The developed methodology is robust and a powerful tool to estimate the relative change of the PMSA. Absolute results are of the same order of magnitude as those obtained with the traditional method and observed data, but they also depend strongly on the climate projection used and show spatial variability.
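    The moisture-maximization core of the methodology can be sketched in a few lines: the ratio of the monthly maximum precipitable water to the storm's precipitable water scales an observed snowfall event up to its maximized counterpart. The cap and all numerical values below are illustrative assumptions, not quantities from the study.

```python
# Sketch of moisture maximization for a snowfall event (illustrative values).

def maximization_ratio(pw_storm_mm, pw_monthly_max_mm, cap=2.0):
    """Scale factor applied to an observed event; a cap is commonly
    imposed to avoid unphysically large maximized events (assumed here)."""
    return min(pw_monthly_max_mm / pw_storm_mm, cap)

def maximized_snowfall(event_snowfall_mm, pw_storm_mm, pw_monthly_max_mm):
    return event_snowfall_mm * maximization_ratio(pw_storm_mm, pw_monthly_max_mm)

# A storm that occurred with 8 mm of precipitable water, in a month whose
# maximum precipitable water is 12 mm, is scaled by 12/8 = 1.5.
print(maximized_snowfall(30.0, 8.0, 12.0))  # 45.0
```

    In a non-stationary setting, the monthly maximum precipitable water would itself be a function of time, which is how the climate-change signal enters the PMSA estimate.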

  12. Evaluation of glucose controllers in virtual environment: methodology and sample application.

    PubMed

    Chassin, Ludovic J; Wilinska, Malgorzata E; Hovorka, Roman

    2004-11-01

    Adaptive systems to deliver medical treatment in humans are safety-critical systems and require particular care in both the testing and the evaluation phase, which are time-consuming, costly, and confounded by ethical issues. The objective of the present work is to develop a methodology to test glucose controllers of an artificial pancreas in a simulated (virtual) environment. A virtual environment comprising a model of the carbohydrate metabolism and models of the insulin pump and the glucose sensor is employed to simulate individual glucose excursions in subjects with type 1 diabetes. The performance of the control algorithm within the virtual environment is evaluated by considering treatment and operational scenarios. The developed methodology includes two dimensions: testing in relation to specific life style conditions, i.e. fasting, post-prandial, and life style (metabolic) disturbances; and testing in relation to various operating conditions, i.e. expected operating conditions, adverse operating conditions, and system failure. We define safety and efficacy criteria and describe the measures to be taken prior to clinical testing. The use of the methodology is exemplified by tuning and evaluating a model predictive glucose controller being developed for a wearable artificial pancreas focused on fasting conditions. Our methodology to test glucose controllers in a virtual environment is instrumental in anticipating the results of real clinical tests for different physiological conditions and for different operating conditions. The thorough testing in the virtual environment reduces costs and speeds up the development process.
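    The testing loop at the heart of such a methodology can be illustrated with a deliberately crude stand-in: a one-compartment virtual patient and a proportional controller, run through a fasting scenario and checked against a safety criterion. The dynamics, parameters, and thresholds below are toy assumptions and bear no relation to the paper's metabolic model or control algorithm.

```python
# Toy closed-loop test of a glucose controller in a virtual environment.
# The one-compartment dynamics and proportional controller are illustrative
# stand-ins, not the paper's models.

def virtual_patient_step(g, insulin, dt=5.0):
    egp = 0.05   # endogenous glucose production, mmol/L per min (assumed)
    s_i = 0.01   # insulin sensitivity (assumed)
    return g + dt * (egp - s_i * insulin * g)

def controller(g, target=6.0, gain=2.0):
    # Proportional control; insulin delivery cannot be negative.
    return max(0.0, gain * (g - target))

def run_fasting_scenario(g0=9.0, steps=288):  # 24 h at 5-minute steps
    g, trace = g0, [g0]
    for _ in range(steps):
        g = virtual_patient_step(g, controller(g))
        trace.append(g)
    return trace

trace = run_fasting_scenario()
# Safety criterion: no simulated hypoglycemia (here, glucose < 3.5 mmol/L)
print(all(g > 3.5 for g in trace), round(trace[-1], 2))  # True 6.39
```

    The other scenario dimensions (post-prandial disturbances, sensor or pump failure) would be exercised by perturbing the same loop before any clinical testing.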

  13. Assessing Consequential Scenarios in a Complex Operational Environment Using Agent Based Simulation

    DTIC Science & Technology

    2017-03-16

    RWISE) 93 5.1.5 Conflict Modeling, Planning, and Outcomes Experimentation Program (COMPOEX) 94 5.1.6 Joint Non -Kinetic Effects Model (JNEM)/Athena... experimental design and testing. 4.3.8 Types and Attributes of Agent-Based Model Design Patterns Using the aforementioned ABM flowchart design methodology ...speed, or flexibility during tactical US Army wargaming. The report considers methodologies to improve analysis of the human domain, identifies

  14. Staff Training for Business Process Improvement: The Benefit of Role-Plays in the Case of KreditSim

    ERIC Educational Resources Information Center

    Borner, Rene; Moormann, Jurgen; Wang, Minhong

    2012-01-01

    Purpose: The paper aims to explore staff's experience with role-plays using the example of training bank employees in Six Sigma as a major methodology for business process improvement. Design/methodology/approach: The research is based on a case study. A role-play, KreditSim, is used to simulate a loan approval process that has to be improved by…

  15. A generalized methodology for identification of threshold for HRU delineation in SWAT model

    NASA Astrophysics Data System (ADS)

    M, J.; Sudheer, K.; Chaubey, I.; Raj, C.

    2016-12-01

    The Soil and Water Assessment Tool (SWAT) is a comprehensive distributed hydrological model widely used to support a variety of water-resource decisions. The simulation accuracy of such a distributed model depends on how the watershed is subdivided. SWAT divides the watershed into sub-basins, and the sub-basins into small computing units known as hydrologic response units (HRUs). HRUs are delineated from unique combinations of land use, soil type, and slope within each sub-watershed and are not spatially defined. Computations in SWAT are performed at the HRU level, aggregated to the sub-basin outlet, and then routed through the stream system. Generally, to decrease the model's computation time, the modeler specifies threshold percentages of land use, soil, and slope; these thresholds constrain the minimum area for constructing an HRU. In the current HRU delineation practice in SWAT, any land use, soil, or slope class occupying less than the threshold fraction of a sub-basin is subsumed into the dominant class, which introduces some ambiguity into the process simulations through inappropriate representation of the area. The information lost at a given threshold, however, depends strongly on the purpose of the study. This research therefore studies the effects of HRU delineation threshold values on SWAT sediment simulations and suggests guidelines for selecting appropriate thresholds with sediment simulation accuracy in mind. A preliminary study was done on an Illinois watershed by assigning different thresholds for land use and soil.
A general methodology was proposed for identifying an appropriate threshold for HRU delineation in the SWAT model that considers both computational time and simulation accuracy. The methodology can be adopted for identifying an appropriate threshold in any watershed with a single simulation of the model at a zero-zero threshold.
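    The threshold step described above lends itself to a compact sketch: classes whose area fraction falls below the threshold are dropped, and the surviving (dominant) classes absorb the eliminated area through renormalization. The class names, fractions, and threshold below are hypothetical, and this is an illustration of the idea rather than SWAT's actual implementation.

```python
# Sketch of threshold-based HRU delineation within one sub-basin (hypothetical
# land-use fractions; not SWAT's internal algorithm).

def delineate_hrus(area_fractions, threshold):
    """area_fractions maps class name -> fraction of sub-basin area.
    Classes below the threshold are dropped and the rest renormalized,
    so the dominant classes absorb the eliminated area."""
    kept = {c: f for c, f in area_fractions.items() if f >= threshold}
    if not kept:  # degenerate case: keep only the single dominant class
        dominant = max(area_fractions, key=area_fractions.get)
        kept = {dominant: area_fractions[dominant]}
    total = sum(kept.values())
    return {c: f / total for c, f in kept.items()}

landuse = {"forest": 0.55, "pasture": 0.30, "urban": 0.10, "wetland": 0.05}
hrus = delineate_hrus(landuse, threshold=0.10)
print(hrus)  # "wetland" is subsumed; remaining fractions sum to 1
```

    Raising the threshold reduces the number of HRUs (and hence run time) at the cost of misrepresenting the eliminated classes, which is exactly the trade-off the proposed methodology quantifies.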

  16. Simulation of earthquake caused building damages for the development of fast reconnaissance techniques

    NASA Astrophysics Data System (ADS)

    Schweier, C.; Markus, M.; Steinle, E.

    2004-04-01

    Catastrophic events like strong earthquakes can cause great losses of life and economic value. More efficient reconnaissance techniques could help to reduce the loss of life, as many victims die after, not during, the event. A basic prerequisite for improving the rescue teams' work is better planning of the measures, which can only be done on the basis of reliable and detailed information about the actual situation in the affected regions. Therefore, a group of projects at Karlsruhe University aims at the development of a tool for fast information retrieval after strong earthquakes. The focus is on urban areas, where most losses occur. In this paper the approach for a damage analysis of buildings is presented. It consists of an automatic methodology to model buildings in three dimensions, a comparison of pre- and post-event models to detect changes, and a subsequent classification of the changes into damage types. The process is based on information extraction from airborne laserscanning data, i.e., digital surface models (DSM) acquired by scanning an area with pulsed laser light. To date, no laserscanning-derived DSMs of areas that suffered earthquake damage are available to the authors. Therefore, it was necessary to simulate such data for the development of the damage detection methodology. In this paper two different methodologies used for simulating the data are presented. The first method is to create CAD models of undamaged buildings based on their construction plans and alter them artificially as if they had suffered serious damage. A laserscanning data set is then simulated from these models, which can be compared with real laserscanning data acquired of the buildings (in their intact state). The other approach is to use measurements of actually damaged buildings and simulate their intact state.
It is possible to model the geometrical structure of these damaged buildings based on digital photography taken after the event by evaluating the images with photogrammetrical methods. The intact state of the buildings is simulated based on on-site investigations, and finally laserscanning data are simulated for both states.

  17. A 3D, fully Eulerian, VOF-based solver to study the interaction between two fluids and moving rigid bodies using the fictitious domain method

    NASA Astrophysics Data System (ADS)

    Pathak, Ashish; Raessi, Mehdi

    2016-04-01

    We present a three-dimensional (3D) and fully Eulerian approach to capturing the interaction between two fluids and moving rigid structures by using the fictitious domain and volume-of-fluid (VOF) methods. The solid bodies can have arbitrarily complex geometry and can pierce the fluid-fluid interface, forming contact lines. The three-phase interfaces are resolved and reconstructed by using a VOF-based methodology. Then, a consistent scheme is employed for transporting mass and momentum, allowing for simulations of three-phase flows of large density ratios. The Eulerian approach significantly simplifies numerical resolution of the kinematics of rigid bodies of complex geometry and with six degrees of freedom. The fluid-structure interaction (FSI) is computed using the fictitious domain method. The methodology was developed in a message passing interface (MPI) parallel framework accelerated with graphics processing units (GPUs). The computationally intensive solution of the pressure Poisson equation is ported to GPUs, while the remaining calculations are performed on CPUs. The performance and accuracy of the methodology are assessed using an array of test cases, focusing individually on the flow solver and the FSI in surface-piercing configurations. Finally, an application of the proposed methodology in simulations of ocean wave energy converters is presented.

  18. Power system market implementation in a deregulated environment

    NASA Astrophysics Data System (ADS)

    Silva, Carlos

    2000-10-01

    The opening of the power system markets (also known as deregulation) gives rise to issues never before seen by this industry. One of the most important is the control of information about the cost of generation. Information that used to be common knowledge is now kept private by the new agents of the system (generator companies, distribution companies, etc.). Data such as the generator cost functions are now known only by the owning companies. The result is a new system consisting of a group of independent firms seeking the maximization of their own profit. There have been many proposals to organize the new market in an economically efficient manner. Nevertheless, the uniqueness of the electric power system has prevented the development of such a market. This thesis evaluates the most common proposals using simulations in an auction setting. In addition, a new methodology is proposed, based on mechanism design, a common technique in economics, that solves some of the practical problems of power system markets (such as the management of limited transmission capacity). In this methodology, when each company acts in its own best interest, the outcome is efficient in spite of the information problem cited above. This new methodology, along with the existing ones, is tested using simulation and analyzed to create a clear comparison of benefits and disadvantages.

  19. Computational simulation of probabilistic lifetime strength for aerospace materials subjected to high temperature, mechanical fatigue, creep and thermal fatigue

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Bast, Callie C.; Trimble, Greg A.

    1992-01-01

    This report presents the results of a fourth-year effort of a research program, conducted for NASA-LeRC by the University of Texas at San Antonio (UTSA). The research included on-going development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subject to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation has been randomized and is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the above-described constitutive equation using actual experimental materials data together with regression analysis of that data, thereby predicting values for the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program, PROMISC. Actual experimental materials data were obtained from industry and the open literature for materials typical of applications in aerospace propulsion system components. Material data for Inconel 718 were analyzed using the developed methodology.
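    The multifactor interaction equation can be sketched as a product of per-variable terms; a common form in this line of work raises, for each primitive variable, a ratio of remaining capability to a calibrated exponent. The exact terms PROMISS uses are not reproduced here; the single temperature effect, its values, and the randomized exponent below are illustrative assumptions.

```python
import random

# Sketch of a randomized multifactor interaction strength model:
# S/S0 = prod_i ((A_iF - A_i) / (A_iF - A_i0)) ** a_i, with A_i the current
# value of primitive variable i, A_i0 a reference value, A_iF its ultimate
# (failure) value, and a_i an empirical exponent. Values are illustrative.

def strength_ratio(effects):
    """effects: iterable of (current, reference, ultimate, exponent)."""
    ratio = 1.0
    for a, a0, af, expo in effects:
        ratio *= ((af - a) / (af - a0)) ** expo
    return ratio

def sample_strength_ratio(rng):
    # Randomize the exponent of a single temperature effect (assumed spread).
    temperature_effect = (800.0, 20.0, 1200.0, rng.gauss(0.5, 0.05))
    return strength_ratio([temperature_effect])

rng = random.Random(0)
samples = [sample_strength_ratio(rng) for _ in range(1000)]
mean = sum(samples) / len(samples)
print(round(mean, 3))  # mean strength ratio, near (400/1180) ** 0.5
```

    Regression of experimental strength data against each primitive variable (the PROMISC step) is what pins down the empirical exponents.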

  20. Computational simulation of probabilistic lifetime strength for aerospace materials subjected to high temperature, mechanical fatigue, creep, and thermal fatigue

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Bast, Callie C.; Trimble, Greg A.

    1992-01-01

    The results of a fourth-year effort of a research program conducted for NASA-LeRC by The University of Texas at San Antonio (UTSA) are presented. The research included on-going development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue, or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation was randomized and is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the above-described constitutive equation using actual experimental materials data together with regression analysis of that data, thereby predicting values for the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program, PROMISC. Actual experimental materials data were obtained from industry and the open literature for materials typical of applications in aerospace propulsion system components. Material data for Inconel 718 were analyzed using the developed methodology.

  1. Alice’s Delirium: A Theatre-based Simulation Scenario for Nursing

    PubMed Central

    Posner, Glenn D

    2018-01-01

    As an educational methodology, simulation has been used by nursing education at the academic level for numerous years and has started to gain traction in the onboarding education and professional development of practicing nurses. Simulation allows the learner to apply knowledge and skills in a safe environment where mistakes and learning can happen without an impact on patient safety. The development of a simulation scenario to demonstrate the benefits of simulation education methodologies to a large group of nurse educators was requested by nursing education leadership at The Ottawa Hospital (TOH). Since the demonstration of this scenario in the fall of 2016, there has been significant uptake and adaptation of this particular scenario within the nursing education departments of TOH. Originally written to be used with a simulated patient (SP), "Alice" has since been adapted to be used with a high-fidelity manikin within an inpatient surgery department continuing professional development (CPD) program for practicing nurses, orientation for nurses to a level 2 trauma unit and at the corporate level of nursing orientation using an SP. Therefore, this scenario is applicable to nurses practicing in an area of inpatient surgery at varying levels, from novice to expert. It could easily be adapted for use with medicine nursing education programs. The case presented in this technical report is of the simulation scenario used for the inpatient surgery CPD program. Varying adaptations of the case are included in the appendices. PMID:29872592

  2. Development of an Efficient CFD Model for Nuclear Thermal Thrust Chamber Assembly Design

    NASA Technical Reports Server (NTRS)

    Cheng, Gary; Ito, Yasushi; Ross, Doug; Chen, Yen-Sen; Wang, Ten-See

    2007-01-01

    The objective of this effort is to develop an efficient and accurate computational methodology to predict both detailed thermo-fluid environments and global characteristics of the internal ballistics for a hypothetical solid-core nuclear thermal thrust chamber assembly (NTTCA). Several numerical and multi-physics thermo-fluid models, such as real fluid, chemically reacting, turbulence, conjugate heat transfer, porosity, and power generation, were incorporated into an unstructured-grid, pressure-based computational fluid dynamics solver as the underlying computational methodology. The numerical simulations of the detailed thermo-fluid environment of a single flow element provide a mechanism to estimate the thermal stress and possible occurrence of mid-section corrosion of the solid core. In addition, the numerical results of the detailed simulation were employed to fine-tune the porosity model to mimic the pressure drop and thermal load of the coolant flow through a single flow element. The use of the tuned porosity model enables an efficient simulation of the entire NTTCA system and an evaluation of its performance during the design cycle.
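    The porosity-model "tuning" amounts to calibrating loss coefficients so the coarse model reproduces the detailed simulation's pressure drop. With a single quadratic loss coefficient the fit is direct; the loss form and every number below are assumptions for illustration, not the actual NTTCA model.

```python
# Calibrate a porous-medium loss coefficient so a coarse model matches the
# pressure drop from a (hypothetical) detailed single-flow-element simulation.

def pressure_drop(k_loss, rho, velocity):
    # Quadratic (dynamic-head) loss model, an assumed form.
    return k_loss * 0.5 * rho * velocity ** 2

def tune_loss_coefficient(target_dp, rho, velocity):
    # One parameter, one target: the calibration inverts the model directly.
    return target_dp / (0.5 * rho * velocity ** 2)

target_dp = 1.2e5     # Pa, from the hypothetical detailed simulation
rho, v = 0.12, 250.0  # illustrative hot-hydrogen-like conditions
k = tune_loss_coefficient(target_dp, rho, v)
print(round(k, 3), pressure_drop(k, rho, v) == target_dp)  # 32.0 True
```

    With several flow conditions to match, the same idea becomes a least-squares fit over the loss coefficients rather than a direct inversion.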

  3. The Future Impact of Wind on BPA Power System Ancillary Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarov, Yuri V.; Lu, Shuai; McManus, Bart

    Wind power is growing at a very fast pace as an alternative generating resource. As the ratio of wind power to total system capacity increases, the impact of wind on various aspects of the system becomes significant. This paper presents a methodology to study the future impact of wind on BPA power system ancillary services, including load following and regulation. Existing approaches for similar analysis include dispatch model simulation and standard deviation evaluation. The methodology proposed in this paper uses historical data and stochastic processes to simulate the load balancing processes in the BPA power system. Capacity, ramp rate and ramp duration characteristics are then extracted from the simulation results, and load following and regulation requirements are calculated accordingly. Because it mimics actual power system operations, the results are more realistic, yet the approach remains convenient to perform. Further, the ramp rate and ramp duration data obtained from the analysis can be used to evaluate generator response or maneuverability and energy requirements, respectively, in addition to the capacity requirement.
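    The extraction step can be sketched directly: given a simulated balancing-requirement time series, take the largest excursion as the capacity requirement, the steepest step as the ramp rate, and the longest same-direction run as the ramp duration. The series and the metric definitions below are illustrative assumptions, not BPA's actual definitions.

```python
# Extract capacity, ramp-rate, and ramp-duration requirements from a
# simulated load-following series (MW at 5-minute resolution, hypothetical).

def balancing_requirements(series, dt_min=5):
    capacity = max(abs(x) for x in series)
    ramps = [b - a for a, b in zip(series, series[1:])]
    ramp_rate = max(abs(r) for r in ramps) / dt_min  # MW per minute
    # Ramp duration: longest run of consecutive same-direction ramps.
    longest = run = 1
    for prev, cur in zip(ramps, ramps[1:]):
        run = run + 1 if (prev >= 0) == (cur >= 0) else 1
        longest = max(longest, run)
    return capacity, ramp_rate, longest * dt_min

mw = [0, 40, 90, 150, 120, 80, 30, -20, -60, -30, 10]
cap, rate, dur = balancing_requirements(mw)
print(cap, rate, dur)  # 150 12.0 25
```

    Applied to a year of simulated 5-minute balancing data, the same three statistics become the load-following and regulation requirements discussed above.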

  4. A methodology to determine the elastic moduli of crystals by matching experimental and simulated lattice strain pole figures using discrete harmonics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wielewski, Euan; Boyce, Donald E.; Park, Jun-Sang

    Determining reliable single crystal material parameters for complex polycrystalline materials is a significant challenge for the materials community. In this work, a novel methodology for determining those parameters is outlined and successfully applied to the titanium alloy Ti-6Al-4V. Utilizing the results from a lattice strain pole figure experiment conducted at the Cornell High Energy Synchrotron Source, an iterative approach is used to optimize the single crystal elastic moduli by comparing experimental and simulated lattice strain pole figures at discrete load steps during a uniaxial tensile test. Due to the large number of unique measurements taken during the experiments, comparisons were made using the discrete spherical harmonic modes of both the experimental and simulated lattice strain pole figures, allowing the complete pole figures to be used to determine the single crystal elastic moduli.
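    The iterative matching loop can be caricatured in one dimension: adjust a single Young's modulus until simulated strains agree with "measured" ones at each load step. The linear strain model, the gradient-descent update, the learning rate, and the synthetic data below are all assumptions for illustration; the actual methodology optimizes anisotropic single crystal moduli against discrete spherical-harmonic modes of the pole figures.

```python
# Simplified 1D stand-in for the iterative elastic-moduli optimization:
# fit one isotropic modulus E so simulated lattice strains match synthetic
# "measurements" at several load steps (all values hypothetical).

def simulated_strain(stress_mpa, e_gpa):
    # Elastic lattice strain under uniaxial stress (E converted to MPa).
    return stress_mpa / (e_gpa * 1000.0)

def calibrate_modulus(loads, measured, e0=80.0, iters=50, lr=2e8):
    e = e0
    for _ in range(iters):
        # Gradient of the summed squared strain misfit with respect to E.
        grad = sum(2.0 * (simulated_strain(s, e) - m) * (-s / (e * e * 1000.0))
                   for s, m in zip(loads, measured))
        e -= lr * grad
    return e

loads = [100.0, 200.0, 300.0]              # MPa, hypothetical load steps
measured = [s / 110_000.0 for s in loads]  # synthetic data for E = 110 GPa
e_fit = calibrate_modulus(loads, measured)
print(round(e_fit, 1))  # recovers 110.0
```

    With several independent moduli, the same misfit loop would run over the harmonic-mode coefficients instead of scalar strains.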

  5. Proof-of-Concept Study for Uncertainty Quantification and Sensitivity Analysis using the BRL Shaped-Charge Example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, Justin Matthew

    These are the slides for a graduate presentation at Mississippi State University. It covers the following: the BRL Shaped-Charge Geometry in PAGOSA, mesh refinement study, surrogate modeling using a radial basis function network (RBFN), ruling out parameters using sensitivity analysis (equation of state study), uncertainty quantification (UQ) methodology, and sensitivity analysis (SA) methodology. In summary, a mesh convergence study was used to ensure that solutions were numerically stable by comparing PDV data between simulations. A Design of Experiments (DOE) method was used to reduce the simulation space to study the effects of the Jones-Wilkins-Lee (JWL) Parameters for the Composition B main charge. Uncertainty was quantified by computing the 95% data range about the median of simulation output using a brute force Monte Carlo (MC) random sampling method. Parameter sensitivities were quantified using the Fourier Amplitude Sensitivity Test (FAST) spectral analysis method where it was determined that detonation velocity, initial density, C1, and B1 controlled jet tip velocity.
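    The brute-force Monte Carlo step described above can be sketched in a few lines. The surrogate model and the input spreads below are hypothetical stand-ins, since the actual PAGOSA simulations and JWL parameter distributions are not reproduced here.

```python
import random
import statistics

def jet_tip_velocity(det_velocity, density):
    """Hypothetical surrogate standing in for the PAGOSA output; the real
    study ran hydrocode simulations that are not reproduced here."""
    return 8.0 * det_velocity / density

def mc_95_range(n=20000, seed=0):
    """Brute-force Monte Carlo UQ: sample uncertain inputs, collect the
    outputs, and report the 95% data range about the median."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n):
        det_v = rng.gauss(7.9, 0.1)   # detonation velocity (assumed spread)
        rho = rng.gauss(1.7, 0.02)    # initial density (assumed spread)
        outputs.append(jet_tip_velocity(det_v, rho))
    outputs.sort()
    return outputs[int(0.025 * n)], statistics.median(outputs), outputs[int(0.975 * n)]
```

    The returned triple brackets the median by the 2.5th and 97.5th percentiles of the sampled output.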

  6. An Agent-Based Modeling Framework and Application for the Generic Nuclear Fuel Cycle

    NASA Astrophysics Data System (ADS)

    Gidden, Matthew J.

    Key components of a novel methodology and implementation of an agent-based, dynamic nuclear fuel cycle simulator, Cyclus, are presented. The nuclear fuel cycle is a complex, physics-dependent supply chain. To date, existing dynamic simulators have not treated constrained fuel supply, time-dependent, isotopic-quality based demand, or fuel fungibility particularly well. Utilizing an agent-based methodology that incorporates sophisticated graph theory and operations research techniques can overcome these deficiencies. This work describes a simulation kernel and agents that interact with it, highlighting the Dynamic Resource Exchange (DRE), the supply-demand framework at the heart of the kernel. The key agent-DRE interaction mechanisms are described, which enable complex entity interaction through the use of physics and socio-economic models. The translation of an exchange instance to a variant of the Multicommodity Transportation Problem, which can be solved feasibly or optimally, follows. An extensive investigation of solution performance and fidelity is then presented. Finally, recommendations for future users of Cyclus and the DRE are provided.
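    As a rough illustration of casting a resource exchange as a transportation problem, the sketch below builds a feasible (greedy, not optimal) flow assignment. This is not the DRE itself: Cyclus formulates a multicommodity problem and solves it with proper LP/MIP techniques, and the supplier and consumer names here are invented.

```python
def greedy_transport(supply, demand, cost):
    """Build a feasible flow for a (single-commodity) transportation instance
    by filling the cheapest supplier-consumer arcs first. A greedy sketch
    only; an LP solver would find the cost-optimal flow."""
    supply = dict(supply)
    demand = dict(demand)
    flows = {}
    for (s, d), _ in sorted(cost.items(), key=lambda kv: kv[1]):
        q = min(supply[s], demand[d])   # ship as much as both sides allow
        if q > 0:
            flows[(s, d)] = q
            supply[s] -= q
            demand[d] -= q
    return flows
```

    For example, two suppliers and two consumers with per-arc costs produce a feasible flow that exhausts all supply when total supply matches total demand.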

  7. UWB Tracking Algorithms: AOA and TDOA

    NASA Technical Reports Server (NTRS)

    Ni, Jianjun David; Arndt, D.; Ngo, P.; Gross, J.; Refford, Melinda

    2006-01-01

    Ultra-Wideband (UWB) tracking prototype systems are currently under development at NASA Johnson Space Center for various applications on space exploration. For long range applications, a two-cluster Angle of Arrival (AOA) tracking method is employed for implementation of the tracking system; for close-in applications, a Time Difference of Arrival (TDOA) positioning methodology is exploited. Both AOA and TDOA are chosen to utilize the achievable fine time resolution of UWB signals. This talk presents a brief introduction to AOA and TDOA methodologies. Theoretical analysis of these two algorithms reveals how the governing parameters affect the tracking resolution. For the AOA algorithm, simulations show that a tracking resolution of less than 0.5% of the range can be achieved with the currently achievable time resolution of UWB signals. For the TDOA algorithm used in close-in applications, simulations show that high (sub-inch) tracking resolution is achieved with a chosen tracking baseline configuration. The analytical and simulated results provide insightful guidance for the UWB tracking system design.
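    A minimal TDOA solver in the spirit described above can be written as a brute-force residual minimization over a search grid. The receiver layout and the grid search are illustrative only; a deployed tracker would use an iterative least-squares solver.

```python
import math

def tdoa_residual(p, receivers, tdoas, c):
    """Sum of squared differences between measured and predicted TDOAs;
    all time differences are taken relative to receiver 0."""
    d0 = math.dist(p, receivers[0])
    return sum((math.dist(p, r) - d0 - c * t) ** 2
               for r, t in zip(receivers[1:], tdoas))

def tdoa_locate(receivers, tdoas, c=1.0, span=10.0, steps=200):
    """Brute-force grid search for the position minimizing the TDOA residual."""
    best, best_err = (0.0, 0.0), float("inf")
    for i in range(steps + 1):
        for j in range(steps + 1):
            p = (span * i / steps, span * j / steps)
            err = tdoa_residual(p, receivers, tdoas, c)
            if err < best_err:
                best, best_err = p, err
    return best
```

    With four receivers at the corners of a square and noise-free time differences, the grid search recovers the source to within the grid spacing.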

  8. Use of measurement theory for operationalization and quantification of psychological constructs in systems dynamics modelling

    NASA Astrophysics Data System (ADS)

    Fitkov-Norris, Elena; Yeghiazarian, Ara

    2016-11-01

    The analytical tools available to social scientists have traditionally been adapted from tools originally designed for the analysis of natural science phenomena. This article discusses the applicability of system dynamics, a qualitatively based modelling approach, as an analysis and simulation tool that bridges the gap between the social and natural sciences. After a brief overview of the system dynamics modelling methodology, the advantages and limiting factors of system dynamics for potential applications in the field of social sciences and human interactions are discussed. Issues arise with regard to the operationalization and quantification of latent constructs at the simulation-building stage of the system dynamics methodology, and measurement theory is proposed as a ready-made solution to the problem of dynamic model calibration, with a view to improving simulation model reliability and validity and encouraging the development of standardised, modular system dynamics models that can be used in social science research.

  9. A VERSATILE SHARP INTERFACE IMMERSED BOUNDARY METHOD FOR INCOMPRESSIBLE FLOWS WITH COMPLEX BOUNDARIES

    PubMed Central

    Mittal, R.; Dong, H.; Bozkurttas, M.; Najjar, F.M.; Vargas, A.; von Loebbecke, A.

    2010-01-01

    A sharp interface immersed boundary method for simulating incompressible viscous flow past three-dimensional immersed bodies is described. The method employs a multi-dimensional ghost-cell methodology to satisfy the boundary conditions on the immersed boundary and the method is designed to handle highly complex three-dimensional, stationary, moving and/or deforming bodies. The complex immersed surfaces are represented by grids consisting of unstructured triangular elements; while the flow is computed on non-uniform Cartesian grids. The paper describes the salient features of the methodology with special emphasis on the immersed boundary treatment for stationary and moving boundaries. Simulations of a number of canonical two- and three-dimensional flows are used to verify the accuracy and fidelity of the solver over a range of Reynolds numbers. Flow past suddenly accelerated bodies are used to validate the solver for moving boundary problems. Finally two cases inspired from biology with highly complex three-dimensional bodies are simulated in order to demonstrate the versatility of the method. PMID:20216919
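    In one dimension, the ghost-cell idea reduces to choosing the ghost-node value so that interpolation across the immersed boundary honors the prescribed condition. The sketch below shows only this Dirichlet closure, with the boundary assumed midway between the ghost and interior nodes, not the paper's full multi-dimensional treatment.

```python
def ghost_value(u_interior, u_boundary):
    """Set the ghost node so that the linear profile between the ghost cell
    and the mirrored interior point passes through the Dirichlet value on the
    immersed boundary (assumed to lie midway between the two nodes)."""
    return 2.0 * u_boundary - u_interior
```

    The interior stencil can then be applied unchanged at cells adjacent to the body, since the ghost value already encodes the boundary condition.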

  10. A method to identify and analyze biological programs through automated reasoning

    PubMed Central

    Yordanov, Boyan; Dunn, Sara-Jane; Kugler, Hillel; Smith, Austin; Martello, Graziano; Emmott, Stephen

    2016-01-01

    Predictive biology is elusive because rigorous, data-constrained, mechanistic models of complex biological systems are difficult to derive and validate. Current approaches tend to construct and examine static interaction network models, which are descriptively rich, but often lack explanatory and predictive power, or dynamic models that can be simulated to reproduce known behavior. However, in such approaches implicit assumptions are introduced as typically only one mechanism is considered, and exhaustively investigating all scenarios is impractical using simulation. To address these limitations, we present a methodology based on automated formal reasoning, which permits the synthesis and analysis of the complete set of logical models consistent with experimental observations. We test hypotheses against all candidate models, and remove the need for simulation by characterizing and simultaneously analyzing all mechanistic explanations of observed behavior. Our methodology transforms knowledge of complex biological processes from sets of possible interactions and experimental observations to precise, predictive biological programs governing cell function. PMID:27668090
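    The synthesize-all-models idea can be illustrated at toy scale by enumerating Boolean update rules and keeping those consistent with observations; a hypothesis then holds only if it holds in every surviving model. Real implementations use constraint solvers rather than enumeration, and the two-regulator gene below is invented.

```python
from itertools import product

def consistent_models(observations):
    """Enumerate all 16 Boolean update rules of two regulators and keep those
    that reproduce every observed (input state, next state) transition."""
    models = []
    for table in product([0, 1], repeat=4):
        f = {(a, b): table[2 * a + b] for a in (0, 1) for b in (0, 1)}
        if all(f[inputs] == output for inputs, output in observations):
            models.append(f)
    return models

def always_true(models, inputs):
    """Test a hypothesis against *all* consistent models simultaneously."""
    return all(f[inputs] == 1 for f in models)
```

    Two observations constrain two of the four truth-table entries, leaving four consistent models; a hypothesis about an unobserved input is not entailed by all of them.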

  11. Simulations of the National Ignition Facility Opacity Sample

    NASA Astrophysics Data System (ADS)

    Martin, M. E.; London, R. A.; Heeter, R. F.; Dodd, E. S.; Devolder, B. G.; Opachich, Y. P.; Liedahl, D. A.; Perry, T. S.

    2017-10-01

    A platform to study the opacity of high temperature materials at the National Ignition Facility has been developed. Experiments to study the opacity of materials relevant to inertial confinement fusion and stellar astrophysics are being conducted. The initial NIF experiments are focused on reaching the same plasma conditions (T > 150 eV and Ne >= 7x10^21 cm^-3) for iron as those achieved in previous experiments at Sandia National Laboratories' (SNL) Z-facility, which have shown discrepancies between opacity theory and experiment. We developed a methodology, using 1D HYDRA simulations, to study the effects of tamper thickness on the conditions of iron-magnesium samples. We heat the sample using an x-ray drive from 2D LASNEX hohlraum simulations. We also use this methodology to predict sample uniformity and expansion for comparison with experimental data. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. Lawrence Livermore National Security, LLC.

  12. Simulation of olive grove gross primary production by the combination of ground and multi-sensor satellite data

    NASA Astrophysics Data System (ADS)

    Brilli, L.; Chiesi, M.; Maselli, F.; Moriondo, M.; Gioli, B.; Toscano, P.; Zaldei, A.; Bindi, M.

    2013-08-01

    We developed and tested a methodology to estimate olive (Olea europaea L.) gross primary production (GPP) combining ground and multi-sensor satellite data. An eddy-covariance station placed in an olive grove in central Italy provided carbon and water fluxes over two years (2010-2011), which were used as reference to evaluate the performance of a GPP estimation methodology based on a Monteith type model (modified C-Fix) and driven by meteorological and satellite (NDVI) data. A major issue was related to the consideration of the two main olive grove components, i.e. olive trees and inter-tree ground vegetation: this issue was addressed by the separate simulation of carbon fluxes within the two ecosystem layers, followed by their recombination. In this way the eddy covariance GPP measurements were successfully reproduced, with the exception of two periods that followed tillage operations. For these periods measured GPP could be approximated by considering synthetic NDVI values which simulated the expected response of inter-tree ground vegetation to tillages.
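    A Monteith-type, two-layer GPP calculation of the kind described can be sketched as follows. The light-use efficiency, the fAPAR-NDVI coefficients, and the layer cover fraction are illustrative placeholders, not the calibrated modified C-Fix values.

```python
def fapar_from_ndvi(ndvi):
    """Assumed linear fAPAR-NDVI relation; coefficients are illustrative."""
    return max(0.0, min(1.0, 1.24 * ndvi - 0.17))

def gpp(par, ndvi, t_cor=1.0, w_cor=1.0, eps_max=1.2):
    """Monteith-type GPP: maximum light-use efficiency scaled by absorbed
    light (fAPAR * PAR) and by temperature/water limitation factors."""
    return eps_max * fapar_from_ndvi(ndvi) * par * t_cor * w_cor

def grove_gpp(par, ndvi_tree, ndvi_ground, f_tree, t_cor=1.0, w_cor=1.0):
    """Two-layer olive grove: simulate each layer separately, then recombine
    by the (assumed) fractional cover of the tree layer."""
    return (f_tree * gpp(par, ndvi_tree, t_cor, w_cor)
            + (1.0 - f_tree) * gpp(par, ndvi_ground, t_cor, w_cor))
```

    The recombined grove flux necessarily falls between the fluxes of the two layers, weighted by cover fraction.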

  13. Creep force modelling for rail traction vehicles based on the Fastsim algorithm

    NASA Astrophysics Data System (ADS)

    Spiryagin, Maksym; Polach, Oldrich; Cole, Colin

    2013-11-01

    The evaluation of creep forces is a complex task and their calculation is a time-consuming process for multibody simulation (MBS). A methodology of creep forces modelling at large traction creepages has been proposed by Polach [Creep forces in simulations of traction vehicles running on adhesion limit. Wear. 2005;258:992-1000; Influence of locomotive tractive effort on the forces between wheel and rail. Veh Syst Dyn. 2001(Suppl);35:7-22] adapting his previously published algorithm [Polach O. A fast wheel-rail forces calculation computer code. Veh Syst Dyn. 1999(Suppl);33:728-739]. The most common method for creep force modelling used by software packages for MBS of running dynamics is the Fastsim algorithm by Kalker [A fast algorithm for the simplified theory of rolling contact. Veh Syst Dyn. 1982;11:1-13]. However, the Fastsim code has some limitations which do not allow modelling the creep force - creep characteristic in agreement with measurements for locomotives and other high-power traction vehicles, mainly for large traction creep at low-adhesion conditions. This paper describes a newly developed methodology based on a variable contact flexibility increasing with the ratio of the slip area to the area of adhesion. This variable contact flexibility is introduced in a modification of Kalker's code Fastsim by replacing the constant Kalker's reduction factor, widely used in MBS, by a variable reduction factor together with a slip-velocity-dependent friction coefficient decreasing with increasing global creepage. The proposed methodology is presented in this work and compared with measurements for different locomotives. The modification allows use of the well recognised Fastsim code for simulation of creep forces at large creepages in agreement with measurements without modifying the proven modelling methodology at small creepages.
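    The flavor of the approach, a saturating creep-force law with separate reduction factors for the adhesion and slip areas and a slip-velocity-dependent falling friction coefficient, can be sketched as below. This is a Polach-style formula with illustrative constants, not the paper's calibrated modification of Fastsim.

```python
import math

def creep_force(creepage, normal_load, mu0=0.4, A=0.4, B=0.6,
                k_A=1.0, k_S=0.4, stiffness=50.0, speed=20.0):
    """Polach-style creep force: friction falls with slip velocity, and the
    adhesion (k_A) and slip (k_S) areas get separate reduction factors.
    All parameter values are illustrative, not fitted to any vehicle."""
    slip_velocity = abs(creepage) * speed
    mu = mu0 * ((1.0 - A) * math.exp(-B * slip_velocity) + A)  # falling friction
    eps = stiffness * abs(creepage) / mu   # tangential stress gradient (sketch)
    force = (2.0 * normal_load * mu / math.pi) * (
        k_A * eps / (1.0 + (k_A * eps) ** 2) + math.atan(k_S * eps))
    return math.copysign(force, creepage)
```

    The sketch reproduces the qualitative shape discussed above: the force rises with creepage, peaks near the adhesion limit, and then falls as friction decreases with slip velocity.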

  14. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed when optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general-purpose framework for the optimization of terminating discrete-event simulation models. The methodology combines a chance-constraint approach for problem formulation with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.
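    The chance-constraint formulation can be illustrated with a toy terminating simulation: accept a resource level only if the estimated probability of meeting a response-time limit exceeds a target, then take the smallest acceptable level. The turnaround model and all numbers below are invented.

```python
import random

def simulate_turnaround(staff, rng):
    """Hypothetical terminating simulation: one replication returns a
    turnaround time whose mean shrinks as staffing grows."""
    return rng.expovariate(staff / 40.0)

def meets_chance_constraint(staff, limit=80.0, alpha=0.95, n=5000, seed=1):
    """Chance constraint: require P(turnaround <= limit) >= alpha, with the
    probability estimated from n independent replications."""
    rng = random.Random(seed)
    hits = sum(simulate_turnaround(staff, rng) <= limit for _ in range(n))
    return hits / n >= alpha

def minimize_staff(max_staff=20):
    """Smallest resource level that satisfies the chance constraint."""
    for staff in range(1, max_staff + 1):
        if meets_chance_constraint(staff):
            return staff
    return None
```

    Replacing point estimates by an estimated probability of satisfying the constraint is what distinguishes this from deterministic optimization.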

  15. Development of a Probabilistic Assessment Methodology for Evaluation of Carbon Dioxide Storage

    USGS Publications Warehouse

    Burruss, Robert A.; Brennan, Sean T.; Freeman, P.A.; Merrill, Matthew D.; Ruppert, Leslie F.; Becker, Mark F.; Herkelrath, William N.; Kharaka, Yousif K.; Neuzil, Christopher E.; Swanson, Sharon M.; Cook, Troy A.; Klett, Timothy R.; Nelson, Philip H.; Schenk, Christopher J.

    2009-01-01

    This report describes a probabilistic assessment methodology developed by the U.S. Geological Survey (USGS) for evaluation of the resource potential for storage of carbon dioxide (CO2) in the subsurface of the United States as authorized by the Energy Independence and Security Act (Public Law 110-140, 2007). The methodology is based on USGS assessment methodologies for oil and gas resources created and refined over the last 30 years. The resource that is evaluated is the volume of pore space in the subsurface in the depth range of 3,000 to 13,000 feet that can be described within a geologically defined storage assessment unit consisting of a storage formation and an enclosing seal formation. Storage assessment units are divided into physical traps (PTs), which in most cases are oil and gas reservoirs, and the surrounding saline formation (SF), which encompasses the remainder of the storage formation. The storage resource is determined separately for these two types of storage. Monte Carlo simulation methods are used to calculate a distribution of the potential storage size for individual PTs and the SF. To estimate the aggregate storage resource of all PTs, a second Monte Carlo simulation step is used to sample the size and number of PTs. The probability of successful storage for individual PTs or the entire SF, defined in this methodology by the likelihood that the amount of CO2 stored will be greater than a prescribed minimum, is based on an estimate of the probability of containment using present-day geologic knowledge. The report concludes with a brief discussion of needed research data that could be used to refine assessment methodologies for CO2 sequestration.
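    The second Monte Carlo step, sampling both the number and the sizes of physical traps to build an aggregate distribution, can be sketched as follows. The lognormal size model and the trap-count range are illustrative, not USGS figures.

```python
import random

def sample_pt_storage(rng):
    """Storage size of one physical trap (lognormal; illustrative only)."""
    return rng.lognormvariate(1.0, 0.8)

def aggregate_storage(n_iter=2000, seed=7):
    """Second-stage Monte Carlo: sample the number of traps, then sample and
    sum their individual sizes, yielding a distribution of aggregate storage
    reported as conservative (P95) to optimistic (P5) fractiles."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_iter):
        n_traps = rng.randint(20, 60)   # assumed trap-count range
        totals.append(sum(sample_pt_storage(rng) for _ in range(n_traps)))
    totals.sort()
    return {"P95": totals[int(0.05 * n_iter)],
            "P50": totals[int(0.50 * n_iter)],
            "P5": totals[int(0.95 * n_iter)]}
```

    Sampling the count and the sizes jointly, rather than multiplying point estimates, is what propagates both sources of uncertainty into the aggregate.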

  16. Case-Crossover Analysis of Air Pollution Health Effects: A Systematic Review of Methodology and Application

    PubMed Central

    Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo

    2010-01-01

    Background Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818
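    For concreteness, the referent-selection rule of the time-stratified design (one of the two designs found to perform best in simulation) can be written in a few lines: referents are the other days of the same calendar month that fall on the same day of the week as the case day.

```python
import datetime

def time_stratified_referents(case_date):
    """Time-stratified CCO referent selection: all other days in the same
    calendar month falling on the same day of the week as the case day."""
    refs = []
    d = case_date.replace(day=1)
    while d.month == case_date.month:
        if d.weekday() == case_date.weekday() and d != case_date:
            refs.append(d)
        d += datetime.timedelta(days=1)
    return refs
```

    Restricting referents to the same month and weekday controls for long-term trend, seasonality and day-of-week effects by design.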

  17. Controlling allosteric networks in proteins

    NASA Astrophysics Data System (ADS)

    Dokholyan, Nikolay

    2013-03-01

    We present a novel methodology based on graph theory and discrete molecular dynamics simulations for delineating allosteric pathways in proteins. We use this methodology to uncover the structural mechanisms responsible for the coupling of distal sites on proteins and utilize it for allosteric modulation of proteins. We will present examples where inference of allosteric networks and their rewiring allows us to "rescue" the cystic fibrosis transmembrane conductance regulator (CFTR), a protein associated with the fatal genetic disease cystic fibrosis. We also use our methodology to control protein function allosterically. We design a novel protein domain that can be inserted into an identified allosteric site of a target protein. Using a drug that binds to our domain, we alter the function of the target protein. We successfully tested this methodology in vitro, in living cells and in zebrafish. We further demonstrate the transferability of our allosteric modulation methodology to other systems and extend it to become light-activatable.

  18. A Real-Time Telemetry Simulator of the IUS Spacecraft

    NASA Technical Reports Server (NTRS)

    Drews, Michael E.; Forman, Douglas A.; Baker, Damon M.; Khazoyan, Louis B.; Viazzo, Danilo

    1998-01-01

    A real-time telemetry simulator of the IUS spacecraft has recently entered operation to train Flight Control Teams for the launch of the AXAF telescope from the Shuttle. The simulator has proven to be a successful higher fidelity implementation of its predecessor, while affirming the rapid development methodology used in its design. Although composed of COTS hardware and software, the system simulates the full breadth of the mission: Launch, Pre-Deployment-Checkout, Burn Sequence, and AXAF/IUS separation. Realism is increased through patching the system into the operations facility to simulate IUS telemetry, Shuttle telemetry, and the Tracking Station link (commands and status message).

  19. The Development of Design Tools for Fault Tolerant Quantum Dot Cellular Automata Based Logic

    NASA Technical Reports Server (NTRS)

    Armstrong, Curtis D.; Humphreys, William M.

    2003-01-01

    We are developing software to explore the fault tolerance of quantum dot cellular automata gate architectures in the presence of manufacturing variations and device defects. The Topology Optimization Methodology using Applied Statistics (TOMAS) framework extends the capabilities of AQUINAS (A Quantum Interconnected Network Array Simulator) by adding front-end and back-end software and creating an environment that integrates all of these components. The front-end tools establish all simulation parameters, configure the simulation system, automate the Monte Carlo generation of simulation files, and execute the simulation of these files. The back-end tools perform automated data parsing, statistical analysis and report generation.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, La Tonya; Malczynski, Leonard

    A Powersim Studio implementation of the system dynamics’ ‘Molecules of Structure’. The original implementation was in Ventana’s Vensim language by James Hines. The molecules are fundamental constructs of the system dynamics simulation methodology.

  1. Methodological challenges to bridge the gap between regional climate and hydrology models

    NASA Astrophysics Data System (ADS)

    Bozhinova, Denica; José Gómez-Navarro, Juan; Raible, Christoph; Felder, Guido

    2017-04-01

    The frequency and severity of floods worldwide, together with their impacts, are expected to increase under climate change scenarios. It is therefore very important to gain insight into the physical mechanisms responsible for such events in order to constrain the associated uncertainties. Model simulations of the climate and hydrological processes are important tools that can provide insight into the underlying physical processes and thus enable an accurate assessment of the risks. Coupled together, they can provide a physically consistent picture that allows the phenomenon to be assessed in a comprehensive way. However, climate and hydrological models work at different temporal and spatial scales, so a number of methodological challenges need to be carefully addressed. An important issue pertains to the presence of biases in the simulation of precipitation. Climate models in general, and Regional Climate Models (RCMs) in particular, are affected by a number of systematic biases that limit their reliability. In many studies, most prominently assessments of changes due to climate change, such biases are minimised by applying the so-called delta approach, which focuses on changes while disregarding absolute values that are more affected by biases. However, this approach is not suitable in this scenario, as the absolute value of precipitation, rather than the change, is fed into the hydrological model. The bias therefore has to be removed beforehand, a complex matter for which various methodologies have been proposed. In this study, we apply and discuss the advantages and caveats of two different methodologies that correct the simulated precipitation to minimise differences with respect to an observational dataset: a linear fit (FIT) of the accumulated distributions and Quantile Mapping (QM). The target region is Switzerland, and the observational dataset is therefore provided by MeteoSwiss.
The RCM is the Weather Research and Forecasting model (WRF), driven at the boundaries by the Community Earth System Model (CESM). The raw simulation driven by CESM exhibits prominent biases that stand out in the evolution of the annual cycle and demonstrate that bias correction is mandatory in this type of study, rather than a minor correction that might be neglected. The simulation spans the period 1976-2005, although the correction is applied on a daily basis. Both methods lead to a corrected precipitation field that respects the temporal evolution of the simulated precipitation while mimicking the distribution of precipitation in the observations. Due to the nature of the two methodologies, there are important differences between the products of the two corrections, leading to datasets with different properties. FIT is generally more accurate in reproducing the tails of the distribution, i.e. extreme events, whereas the nature of QM renders it a general-purpose correction whose skill is distributed evenly across the full distribution of precipitation, including central values.
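    Empirical quantile mapping, the second of the two corrections, can be sketched in a nearest-rank form: a simulated value is mapped to the observed value occupying the same quantile. (The linear fit of accumulated distributions is not shown, and real implementations interpolate between ranks.)

```python
def quantile_map(value, sim_sample, obs_sample):
    """Nearest-rank empirical quantile mapping: locate `value` on the
    simulated CDF and return the observed value at the same quantile."""
    sim = sorted(sim_sample)
    obs = sorted(obs_sample)
    rank = sum(s <= value for s in sim)                   # CDF position in sim
    idx = min(len(obs) - 1, rank * len(obs) // len(sim))  # same quantile in obs
    return obs[idx]
```

    For a simulated sample with a constant wet bias relative to the observations, the mapping removes the bias while preserving each value's rank in the simulated series.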

  2. A system for automatic evaluation of simulation software

    NASA Technical Reports Server (NTRS)

    Ryan, J. P.; Hodges, B. C.

    1976-01-01

    Within the field of computer software, simulation and verification are complementary processes. Simulation methods can be used to verify software by performing variable range analysis. More general verification procedures, such as those described in this paper, can be implicitly viewed as attempts at modeling the end-product software. From the standpoint of software requirements methodology, each component of the verification system has some element of simulation to it. Conversely, general verification procedures can be used to analyze simulation software. A dynamic analyzer is described which can be used to obtain properly scaled variables for an analog simulation, which is first digitally simulated. In a similar way, it is thought that the other system components, and indeed the whole system itself, have the potential of being effectively used in a simulation environment.

  3. VEEP - Vehicle Economy, Emissions, and Performance program

    NASA Technical Reports Server (NTRS)

    Heimburger, D. A.; Metcalfe, M. A.

    1977-01-01

    VEEP is a general-purpose discrete event simulation program being developed to study the performance, fuel economy, and exhaust emissions of a vehicle modeled as a collection of its separate components. It is written in SIMSCRIPT II.5. The purpose of this paper is to present the design methodology, describe the simulation model and its components, and summarize the preliminary results. Topics include chief programmer team concepts, the SDDL design language, program portability, user-oriented design, the program's user command syntax, the simulation procedure, and model validation.

  4. Characterization of the faulted behavior of digital computers and fault tolerant systems

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Miner, Paul S.

    1989-01-01

    A development status evaluation is presented for efforts conducted at NASA-Langley since 1977, toward the characterization of the latent fault in digital fault-tolerant systems. Attention is given to the practical, high speed, generalized gate-level logic system simulator developed, as well as to the validation methodology used for the simulator, on the basis of faultable software and hardware simulations employing a prototype MIL-STD-1750A processor. After validation, latency tests will be performed.

  5. Gamma ray observatory dynamics simulator in Ada (GRODY)

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This experiment involved the parallel development of dynamics simulators for the Gamma Ray Observatory in both FORTRAN and Ada for the purpose of evaluating the applicability of Ada to the NASA/Goddard Space Flight Center's flight dynamics environment. The experiment successfully demonstrated that Ada is a viable, valuable technology for use in this environment. In addition to building a simulator, the Ada team evaluated training approaches, developed an Ada methodology appropriate to the flight dynamics environment, and established a baseline for evaluating future Ada projects.

  6. Multiscale simulation of molecular processes in cellular environments.

    PubMed

    Chiricotto, Mara; Sterpone, Fabio; Derreumaux, Philippe; Melchionna, Simone

    2016-11-13

    We describe the recent advances in studying biological systems via multiscale simulations. Our scheme is based on a coarse-grained representation of the macromolecules and a mesoscopic description of the solvent. The dual technique handles particles, the aqueous solvent and their mutual exchange of forces, resulting in a stable and accurate methodology allowing biosystems of unprecedented size to be simulated. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2016 The Author(s).

  7. Brownian dynamics simulation of protein diffusion in crowded environments

    NASA Astrophysics Data System (ADS)

    Mereghetti, Paolo; Wade, Rebecca C.

    2013-02-01

    High macromolecular concentrations are a distinguishing feature of living organisms. Understanding how the high concentration of solutes affects the dynamic properties of biological macromolecules is fundamental for the comprehension of biological processes in living systems. We first describe the development of a Brownian dynamics simulation methodology to investigate the dynamic and structural properties of protein solutions using atomic-detail protein structures. We then discuss insights obtained from applying this approach to simulation of solutions of a range of types of proteins.
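    The core update of a Brownian dynamics simulation is the Ermak-McCammon step: systematic drift down the force plus a Gaussian random displacement. The 1D sketch below (with invented parameters, not the atomic-detail method of the paper) checks it against free diffusion, where the mean-square displacement should grow as 2Dt.

```python
import math
import random

def bd_step(x, force, D, dt, kT, rng):
    """One Ermak-McCammon Brownian dynamics step in 1D: deterministic drift
    from the systematic force plus a Gaussian random displacement."""
    return x + (D / kT) * force * dt + math.sqrt(2.0 * D * dt) * rng.gauss(0.0, 1.0)

def msd_free_diffusion(n_particles=2000, n_steps=100, D=1.0, dt=0.01, seed=3):
    """Sanity check against free diffusion: with zero force, the mean-square
    displacement after time t = n_steps*dt should approach 2*D*t."""
    rng = random.Random(seed)
    disp2 = 0.0
    for _ in range(n_particles):
        x = 0.0
        for _ in range(n_steps):
            x = bd_step(x, 0.0, D, dt, 1.0, rng)
        disp2 += x * x
    return disp2 / n_particles
```

    Here 2*D*t = 2, so the ensemble-averaged squared displacement should land near 2.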

  8. Ballistic Simulation Method for Lithium Ion Batteries (BASIMLIB) Using Thick Shell Composites (TSC) in LS-DYNA

    DTIC Science & Technology

    2016-08-04

    BAllistic SImulation Method for Lithium Ion Batteries (BASIMLIB) using Thick Shell Composites (TSC) in LS-DYNA. Venkatesh Babu, Dr. Matt Castanier, Dr... Objective: the focus of this work is to develop a robust simulation methodology to model lithium-ion based batteries at module and full... A Lithium Iron Phosphate (LiFePO4) battery cell, module and pack was modeled in LS-DYNA using both Thin Shell Layer (TSL) and Thick Shell

  9. Price of gasoline: forecasting comparisons. [Box-Jenkins, econometric, and regression methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bopp, A.E.; Neri, J.A.

    Gasoline prices are simulated using three popular forecasting methodologies: a Box-Jenkins type method, an econometric method, and a regression method. One-period-ahead and 18-period-ahead comparisons are made. For the one-period-ahead comparison, a Box-Jenkins type time-series model simulates best, although all do well. However, for the 18-period simulation, the econometric and regression methods perform substantially better than the Box-Jenkins formulation. A rationale for and implications of these results are discussed. 11 references.
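    The contrast the study draws, with time-series methods strongest one period ahead and regression methods holding up over longer horizons, can be made concrete with two minimal least-squares forecasters. These are illustrative sketches, not reconstructions of the paper's models.

```python
def ar1_forecast(series):
    """One-step-ahead forecast from a fitted AR(1): y_t = a + b*y_{t-1}."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a + b * series[-1]

def trend_forecast(series, horizon=1):
    """Regression-on-time forecast, usable for multi-period horizons."""
    n = len(series)
    t = list(range(n))
    mt, my = sum(t) / n, sum(series) / n
    b = (sum((ti - mt) * (yi - my) for ti, yi in zip(t, series))
         / sum((ti - mt) ** 2 for ti in t))
    a = my - b * mt
    return a + b * (n - 1 + horizon)
```

    On a noiseless linear series both agree one step ahead, but only the trend regression extends naturally to an 18-period horizon.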

  10. Refinements to the Graves and Pitarka (2010) Broadband Ground-Motion Simulation Method

    DOE PAGES

    Graves, Robert; Pitarka, Arben

    2014-12-17

    This brief article describes refinements to the Graves and Pitarka (2010) broadband ground-motion simulation methodology (GP2010 hereafter) that have been implemented in version 14.3 of the Southern California Earthquake Center (SCEC) Broadband Platform (BBP). The updated version of our method on the current SCEC BBP is referred to as GP14.3. Here, our simulation technique is a hybrid approach that combines low- and high-frequency motions computed with different methods into a single broadband response.
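    The hybrid combination can be caricatured with complementary one-pole filters: keep the low-frequency content of the deterministic synthetic and the high-frequency content of the stochastic synthetic. This sketch is only conceptual; GP14.3 performs the crossover with properly matched filters at a chosen transition frequency.

```python
def combine_broadband(lf, hf, alpha=0.1):
    """Crossover combination sketch: low-pass the low-frequency synthetic
    (one-pole filter, smoothing factor alpha) and add the high-pass
    complement of the high-frequency synthetic."""
    out = []
    lp = lf[0]       # low-pass state for the deterministic part
    lp_hf = hf[0]    # low-pass state used to high-pass the stochastic part
    for a, b in zip(lf, hf):
        lp = lp + alpha * (a - lp)
        lp_hf = lp_hf + alpha * (b - lp_hf)
        out.append(lp + (b - lp_hf))   # low band of lf + high band of hf
    return out
```

    With two constant (purely low-frequency) inputs, the output follows the low-frequency signal only, since the high-pass branch removes the constant part of the second input.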

  11. A Simulation-Based Methodology for Developing a Standarized Design Template for Future Battle Command Training Centers

    DTIC Science & Technology

    2006-06-01

    Changes that run the gamut from space and staff levels to changes in training requirements to the unit composition of a particular...required of them, as well as a simulation tool that can identify the potential impacts on training as a result of such changes.

  12. The GRASP 3: Graphical Reliability Analysis Simulation Program. Version 3: A users' manual and modelling guide

    NASA Technical Reports Server (NTRS)

    Phillips, D. T.; Manseur, B.; Foster, J. W.

    1982-01-01

    Alternate definitions of system failure create complex analysis for which analytic solutions are available only for simple, special cases. The GRASP methodology is a computer simulation approach for solving all classes of problems in which both failure and repair events are modeled according to the probability laws of the individual components of the system.
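
    The simulation approach the abstract describes can be sketched for the simplest case of one repairable component (a hedged illustration, not GRASP itself: the exponential failure and repair laws and all rates below are assumed for the example):

```python
import random

# Draw alternating failure and repair times from each component's
# probability laws and estimate availability over a mission horizon.
def availability(mtbf, mttr, horizon, trials=2000, seed=1):
    rng = random.Random(seed)
    up_time = 0.0
    for _ in range(trials):
        t, up = 0.0, True
        while t < horizon:
            dt = rng.expovariate(1.0 / (mtbf if up else mttr))
            dt = min(dt, horizon - t)   # clip the last interval at the horizon
            if up:
                up_time += dt
            t += dt
            up = not up
    return up_time / (trials * horizon)

a = availability(mtbf=100.0, mttr=10.0, horizon=1000.0)
```

    For long horizons the estimate approaches the steady-state value mtbf/(mtbf+mttr), about 0.909 here; more elaborate system-failure definitions only change the bookkeeping inside the trial loop.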

  13. Monte Carlo simulation methodology for the reliability of aircraft structures under damage tolerance considerations

    NASA Astrophysics Data System (ADS)

    Rambalakos, Andreas

    Current federal aviation regulations in the United States and around the world mandate the need for aircraft structures to meet damage tolerance requirements throughout the service life. These requirements imply that the damaged aircraft structure must maintain adequate residual strength in order to sustain its integrity, which is accomplished through a continuous inspection program. The multifold objective of this research is to develop a methodology based on a direct Monte Carlo simulation process and to assess the reliability of aircraft structures. Initially, the structure is modeled as a parallel system with active redundancy comprised of elements with uncorrelated (statistically independent) strengths and subjected to an equal load distribution. Closed form expressions for the system capacity cumulative distribution function (CDF) are developed by expanding the current expression for the capacity CDF of a parallel system comprised of three elements to a parallel system comprised of up to six elements. These newly developed expressions will be used to check the accuracy of the implementation of a Monte Carlo simulation algorithm to determine the probability of failure of a parallel system comprised of an arbitrary number of statistically independent elements. The second objective of this work is to compute the probability of failure of a fuselage skin lap joint under static load conditions through a Monte Carlo simulation scheme by utilizing the residual strength of the fasteners subjected to various initial load distributions and then to a new unequal load distribution resulting from sequential fastener failures. The final and main objective of this thesis is to present a methodology for computing the resulting gradual deterioration of the reliability of an aircraft structural component by employing a direct Monte Carlo simulation approach.
The uncertainties associated with the time to crack initiation, the probability of crack detection, the exponent in the crack propagation rate (Paris equation) and the yield strength of the elements are considered in the analytical model. The structural component is assumed to consist of a prescribed number of elements. This Monte Carlo simulation methodology is used to determine the required non-periodic inspections so that the reliability of the structural component will not fall below a prescribed minimum level. A sensitivity analysis is conducted to determine the effect of three key parameters on the specification of the non-periodic inspection intervals: namely a parameter associated with the time to crack initiation, the applied nominal stress fluctuation and the minimum acceptable reliability level.
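
    The parallel-system part of the approach can be sketched directly (an illustration only: the Gaussian strength distribution and all numbers are assumed, not taken from the thesis). For equal load sharing among surviving brittle elements, the system capacity is the best load level the ordered strengths can jointly carry:

```python
import random

# For ascending-ordered strengths s[0..n-1], if every surviving element
# carries s[k], the n-k strongest survive, so the system capacity is
# max over k of (n-k)*s[k].
def system_capacity(strengths):
    s = sorted(strengths)
    n = len(s)
    return max((n - k) * s[k] for k in range(n))

def failure_probability(n_elems, load, trials=5000, seed=7):
    rng = random.Random(seed)
    fails = sum(
        system_capacity([rng.gauss(10.0, 2.0) for _ in range(n_elems)]) < load
        for _ in range(trials)
    )
    return fails / trials

pf = failure_probability(n_elems=6, load=40.0)
```

    A closed-form capacity CDF, where available, can be compared against `pf` to check the Monte Carlo implementation, which is exactly the role the closed-form three- to six-element expressions play in the thesis.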

  14. Iterated transportation simulations for Dallas and Portland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagel, K.; Simon, P.; Rickert, M.

    1998-09-02

    The goal of the TRansportation ANalysis and SIMulation System (TRANSIMS) is to combine the most important aspects of human decision-making related to transportation, from activities planning (sleep, work, eat, shop, ...) via modal and route planning to driving, into a single, consistent methodological and software framework. This is meant to combine the functionalities of activities-based travel demand generation, modal choice and route assignment, and micro-simulation. TRANSIMS attempts to employ advanced methodologies in all these modules. Yet, it is probably the overall framework that is the most important part of this attempt. It is, for example, possible to replace the TRANSIMS micro-simulation by another micro-simulation that uses the same input and generates the same output. TRANSIMS uses specific regions as examples in order to ensure that the technology is rooted in the real world. Until about the middle of 1997, an approximately five miles by five miles area in Dallas/Texas was used. Since then, TRANSIMS has moved to using data from Portland/Oregon; a case study for this region is planned to be completed by the end of the year 2000. In this paper the authors give short descriptions of these projects and give references to related publications.

  15. An engineering closure for heavily under-resolved coarse-grid CFD in large applications

    NASA Astrophysics Data System (ADS)

    Class, Andreas G.; Yu, Fujiang; Jordan, Thomas

    2016-11-01

    Even though high-performance computation allows a very detailed description of a wide range of scales in scientific computations, engineering simulations used for design studies commonly resolve only the large scales, thus speeding up simulation time. The coarse-grid CFD (CGCFD) methodology is developed for flows with repeated flow patterns, as often observed in heat exchangers or porous structures. It is proposed to use inviscid Euler equations on a very coarse numerical mesh. This coarse mesh need not conform to the geometry in all details. To reinstate the physics of the smaller scales, inexpensive subgrid models are employed. Subgrid models are systematically constructed by analyzing well-resolved, generic, representative simulations. By varying the flow conditions in these simulations, correlations are obtained. These provide, for each individual coarse-mesh cell, a volume force vector and a volume porosity. Moreover, for all vertices, surface porosities are derived. CGCFD is related to the immersed boundary method, as both exploit volume forces and non-body-conformal meshes. Yet, CGCFD differs with respect to the coarser mesh and the use of the Euler equations. We will describe the methodology based on a simple test case and the application of the method to a 127-pin wire-wrap fuel bundle.
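
    The correlation-building step can be sketched as a least-squares fit of a cell-wise drag law to resolved reference data (a hedged illustration: the quadratic drag form and all coefficients are invented, not taken from the paper):

```python
# Run resolved cases at several flow rates, record the momentum sink per
# cell, and fit a drag law f = a*u + b*u*|u| that the coarse-grid model
# then applies as a volume force.
def fit_drag_law(samples):
    """Least-squares fit of f = a*u + b*u*|u| through (u, f) samples."""
    # normal equations for the two-parameter linear-in-coefficients fit
    s_uu = sum(u * u for u, _ in samples)
    s_uq = sum(u * u * abs(u) for u, _ in samples)
    s_qq = sum((u * abs(u)) ** 2 for u, _ in samples)
    s_fu = sum(f * u for u, f in samples)
    s_fq = sum(f * u * abs(u) for u, f in samples)
    det = s_uu * s_qq - s_uq * s_uq
    a = (s_fu * s_qq - s_fq * s_uq) / det
    b = (s_fq * s_uu - s_fu * s_uq) / det
    return a, b

# synthetic "resolved simulation" data generated with a = 2.0, b = 0.5
data = [(u, 2.0 * u + 0.5 * u * abs(u)) for u in (0.5, 1.0, 1.5, 2.0, 3.0)]
a, b = fit_drag_law(data)
```

    Because the model is linear in its coefficients, noise-free data are recovered exactly; with real resolved-simulation data the same fit yields the per-cell correlation tables the abstract describes.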

  16. Statistical downscaling of GCM simulations to streamflow using relevance vector machine

    NASA Astrophysics Data System (ADS)

    Ghosh, Subimal; Mujumdar, P. P.

    2008-01-01

    General circulation models (GCMs), the climate models often used in assessing the impact of climate change, operate on a coarse scale, and thus the simulation results obtained from GCMs are not directly useful for hydrology at the comparatively smaller river-basin scale. The article presents a methodology of statistical downscaling based on sparse Bayesian learning and the Relevance Vector Machine (RVM) to model streamflow at river-basin scale for the monsoon period (June, July, August, September) using GCM-simulated climatic variables. NCEP/NCAR reanalysis data have been used for training the model to establish a statistical relationship between streamflow and climatic variables. The relationship thus obtained is used to project the future streamflow from GCM simulations. The statistical methodology involves principal component analysis, fuzzy clustering and RVM. Different kernel functions are used for comparison purposes. The model is applied to the Mahanadi river basin in India. The results obtained using RVM are compared with those of the state-of-the-art Support Vector Machine (SVM) to present the advantages of RVMs over SVMs. A decreasing trend is observed for the monsoon streamflow of the Mahanadi, due to high surface warming in the future, with the CCSR/NIES GCM and B2 scenario.
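
    The train-on-reanalysis, apply-to-GCM pattern can be illustrated with a much simpler kernel regression standing in for the RVM (a conceptual stand-in only; the RVM's sparse Bayesian machinery is not reproduced, and all numbers are synthetic):

```python
import math

# Gaussian-kernel (Nadaraya-Watson) regression: train on
# (predictor, streamflow) pairs, then evaluate at a new
# GCM-simulated predictor value.
def kernel_regress(train, x_new, bandwidth=1.0):
    """Kernel-weighted average of the training streamflows."""
    w = [math.exp(-((x - x_new) ** 2) / (2 * bandwidth ** 2)) for x, _ in train]
    return sum(wi * y for wi, (_, y) in zip(w, train)) / sum(w)

# pairs standing in for reanalysis predictors and observed monsoon flow
train = [(1.0, 100.0), (2.0, 150.0), (3.0, 210.0)]
flow = kernel_regress(train, 2.0)   # "projection" at a new predictor value
```

    The RVM replaces the fixed kernel weights with a sparse, probabilistic weighting learned from the data, which is what gives it an edge over the SVM in the study.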

  17. Simulation-based cutaneous surgical-skill training on a chicken-skin bench model in a medical undergraduate program.

    PubMed

    Denadai, Rafael; Saad-Hossne, Rogério; Martinhão Souto, Luís Ricardo

    2013-05-01

    Because of the ethical and medico-legal aspects involved in the training of cutaneous surgical skills on living patients, human cadavers and living animals, the search for alternative and effective forms of training simulation is necessary. To propose and describe an alternative methodology for teaching and learning the principles of cutaneous surgery in a medical undergraduate program by using a chicken-skin bench model. One instructor for every four students, teaching materials on cutaneous surgical skills, chicken trunks, wings, or thighs, a rigid platform support, needled threads, needle holders, surgical blades with scalpel handles, rat-tooth tweezers, scissors, and marking pens were necessary for training simulation. A proposal for simulation-based training on incision, suture, biopsy, and reconstruction techniques using a chicken-skin bench model, distributed in several sessions and with increasing levels of difficulty, was structured. Both feedback and objective evaluations, always directed to individual students, were also outlined. The teaching of the principles of cutaneous surgery using a chicken-skin bench model that is versatile, portable, easy to assemble, and inexpensive is an alternative and complementary option to the armamentarium of methods based on other bench models described.

  18. Kinetic Monte Carlo simulations for transient thermal fields: Computational methodology and application to the submicrosecond laser processes in implanted silicon.

    PubMed

    Fisicaro, G; Pelaz, L; Lopez, P; La Magna, A

    2012-09-01

    Pulsed laser irradiation of damaged solids promotes ultrafast nonequilibrium kinetics, on the submicrosecond scale, leading to microscopic modifications of the material state. Reliable theoretical predictions of this evolution can be achieved only by simulating particle interactions in the presence of large and transient gradients of the thermal field. We propose a kinetic Monte Carlo (KMC) method for the simulation of damaged systems in the extremely far-from-equilibrium conditions caused by the laser irradiation. The reference systems are nonideal crystals containing point defect excesses, an order of magnitude larger than the equilibrium density, due to a preirradiation ion implantation process. The thermal and, where it occurs, melting problem is solved within the phase-field methodology, and the numerical solutions for the space- and time-dependent thermal field were then dynamically coupled to the KMC code. The formalism, implementation, and related tests of our computational code are discussed in detail. As an application example we analyze the evolution of the defect system caused by P ion implantation in Si under nanosecond pulsed irradiation. The simulation results suggest a significant annihilation of the implantation damage, which can be well controlled by the laser fluence.
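
    The coupling of a KMC defect model to a transient thermal field can be sketched schematically (all parameters below are invented for illustration; the paper's phase-field thermal solver is replaced by a simple analytic cooling pulse):

```python
import math, random

K_B = 8.617e-5          # Boltzmann constant, eV/K

def temperature(t):     # stand-in for the external thermal solution
    return 300.0 + 1200.0 * math.exp(-t / 50e-9)

def anneal(n0, e_act=1.0, prefactor=1e12, dt=1e-10, t_end=200e-9, seed=3):
    """Fixed-timestep KMC: defect pairs annihilate with an Arrhenius
    rate evaluated at the instantaneous temperature T(t)."""
    rng = random.Random(seed)
    n, t = n0, 0.0
    while t < t_end and n >= 2:
        rate = n * prefactor * math.exp(-e_act / (K_B * temperature(t)))
        if rng.random() < min(1.0, rate * dt):
            n -= 2          # one vacancy-interstitial pair recombines
        t += dt
    return n

remaining = anneal(n0=200)
```

    Because the Arrhenius factor collapses as the pulse cools, almost all annihilation happens in the first tens of nanoseconds, which is the qualitative behavior that lets the laser fluence control the residual damage.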

  19. Application of Observing System Simulation Experiments (OSSEs) to determining science and user requirements for space-based missions

    NASA Astrophysics Data System (ADS)

    Atlas, R. M.

    2016-12-01

    Observing System Simulation Experiments (OSSEs) provide an effective method for evaluating the potential impact of proposed new observing systems, as well as for evaluating trade-offs in observing system design and for developing and assessing improved methodologies for assimilating new observations. As such, OSSEs can be an important tool for determining science and user requirements, and for incorporating these requirements into the planning for future missions. Detailed OSSEs have been conducted at NASA/GSFC and NOAA/AOML in collaboration with Simpson Weather Associates and operational data assimilation centers over the last three decades. These OSSEs correctly determined the quantitative potential for several proposed satellite observing systems to improve weather analysis and prediction prior to their launch, evaluated trade-offs in orbits, coverage and accuracy for space-based wind lidars, and were used in the development of the methodology that led to the first beneficial impacts of satellite surface winds on numerical weather prediction. In this talk, the speaker will summarize the development of OSSE methodology, early and current applications of OSSEs, and how OSSEs will evolve in order to enhance mission planning.
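
    The logic of an OSSE can be reduced to a toy scalar example (all error statistics invented): a "nature run" supplies truth, synthetic observations are drawn with prescribed errors, and the analysis error with and without the candidate observing system is compared.

```python
import random

def rmse(truth, estimates):
    return (sum((t - e) ** 2 for t, e in zip(truth, estimates)) / len(truth)) ** 0.5

def run_osse(n=4000, sig_bg=2.0, sig_obs=0.5, seed=11):
    rng = random.Random(seed)
    truth = [rng.gauss(0.0, 5.0) for _ in range(n)]                 # nature run
    background = [t + rng.gauss(0.0, sig_bg) for t in truth]        # model state
    obs = [t + rng.gauss(0.0, sig_obs) for t in truth]              # synthetic obs
    # inverse-variance (scalar Kalman) analysis combining background and obs
    w = sig_bg ** 2 / (sig_bg ** 2 + sig_obs ** 2)
    analysis = [b + w * (o - b) for b, o in zip(background, obs)]
    return rmse(truth, background), rmse(truth, analysis)

err_without, err_with = run_osse()
```

    The reduction from `err_without` to `err_with` is the simulated "impact" of the proposed observing system; real OSSEs do the same bookkeeping with a full forecast model and assimilation system.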

  20. Soil Moisture Content Estimation Based on Sentinel-1 and Auxiliary Earth Observation Products. A Hydrological Approach

    PubMed Central

    Alexakis, Dimitrios D.; Mexis, Filippos-Dimitrios K.; Vozinaki, Anthi-Eirini K.; Daliakopoulos, Ioannis N.; Tsanis, Ioannis K.

    2017-01-01

    A methodology for elaborating multi-temporal Sentinel-1 and Landsat 8 satellite images for estimating topsoil Soil Moisture Content (SMC) to support hydrological simulation studies is proposed. After pre-processing the remote sensing data, backscattering coefficient, Normalized Difference Vegetation Index (NDVI), thermal infrared temperature and incidence angle parameters are assessed for their potential to infer ground measurements of SMC, collected at the top 5 cm. A non-linear approach using Artificial Neural Networks (ANNs) is tested. The methodology is applied in Western Crete, Greece, where a SMC gauge network was deployed during 2015. The performance of the proposed algorithm is evaluated using leave-one-out cross validation and sensitivity analysis. ANNs prove to be the most efficient in SMC estimation yielding R2 values between 0.7 and 0.9. The proposed methodology is used to support a hydrological simulation with the HEC-HMS model, applied at the Keramianos basin which is ungauged for SMC. Results and model sensitivity highlight the contribution of combining Sentinel-1 SAR and Landsat 8 images for improving SMC estimates and supporting hydrological studies. PMID:28635625

  1. Eigenspace perturbations for uncertainty estimation of single-point turbulence closures

    NASA Astrophysics Data System (ADS)

    Iaccarino, Gianluca; Mishra, Aashwin Ananda; Ghili, Saman

    2017-02-01

    Reynolds-averaged Navier-Stokes (RANS) models represent the workhorse for predicting turbulent flows in complex industrial applications. However, RANS closures introduce a significant degree of epistemic uncertainty in predictions due to the potential lack of validity of the assumptions utilized in model formulation. Estimating this uncertainty is a fundamental requirement for building confidence in such predictions. We outline a methodology to estimate this structural uncertainty, incorporating perturbations to the eigenvalues and the eigenvectors of the modeled Reynolds stress tensor. The mathematical foundations of this framework are derived and explicated. Thence, this framework is applied to a set of separated turbulent flows, while compared to numerical and experimental data and contrasted against the predictions of the eigenvalue-only perturbation methodology. It is exhibited that for separated flows, this framework is able to yield significant enhancement over the established eigenvalue perturbation methodology in explaining the discrepancy against experimental observations and high-fidelity simulations. Furthermore, uncertainty bounds of potential engineering utility can be estimated by performing five specific RANS simulations, reducing the computational expenditure on such an exercise.
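
    The eigenvalue-perturbation step is commonly formulated in barycentric coordinates; a minimal sketch follows (a generic formulation from this literature, not the authors' full framework, which also perturbs eigenvectors): map the anisotropy eigenvalues to barycentric weights, shift the weights a fraction delta toward a limiting state (1C, 2C, or 3C turbulence), and map back.

```python
# Eigenvalues of the trace-free anisotropy tensor, ordered l1 >= l2 >= l3.
def to_barycentric(l1, l2, l3):
    return (l1 - l2, 2.0 * (l2 - l3), 3.0 * l3 + 1.0)

def from_barycentric(c1, c2, c3):
    l3 = (c3 - 1.0) / 3.0
    l2 = c2 / 2.0 + l3
    l1 = c1 + l2
    return (l1, l2, l3)

def perturb(eigs, corner, delta):
    """corner: 0 (1C), 1 (2C), or 2 (3C); delta in [0, 1]."""
    c = list(to_barycentric(*eigs))
    target = [0.0, 0.0, 0.0]
    target[corner] = 1.0
    c = [(1.0 - delta) * ci + delta * ti for ci, ti in zip(c, target)]
    return from_barycentric(*c)

iso = (0.0, 0.0, 0.0)                            # isotropic (3C) state
l1, l2, l3 = perturb(iso, corner=0, delta=0.3)   # shift toward 1C
```

    The barycentric weights sum to one by construction, so the perturbed eigenvalues remain trace-free and realizable; running the RANS solver at the unperturbed state plus the extreme perturbed states yields the small set of simulations from which the uncertainty bounds are assembled.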

  2. The Carbon Aerosol / Particles Nucleation with a Lidar: Numerical Simulations and Field Studies

    NASA Astrophysics Data System (ADS)

    Miffre, Alain; Anselmo, Christophe; Francis, Mirvatte; David, Gregory; Rairoux, Patrick

    2016-06-01

    In this contribution, we present the results of two recent papers [1,2] published in Optics Express, dedicated to the development of two new lidar methodologies. In [1], while the carbon aerosol (for example, soot particles) is recognized as a major uncertainty for climate and public health, we couple lidar remote sensing with Laser-Induced Incandescence (LII) to retrieve the vertical profile of the very low thermal radiation emitted by the carbon aerosol, in agreement with Planck's law, in an urban atmosphere over several hundred meters of altitude. In paper [2], awarded a June 2014 OSA Spotlight, we identify the optical requirements ensuring that an elastic lidar is sensitive to new-particle-formation events (NPF events) in the atmosphere, while, in the literature, the ingredients initiating nucleation are still being unraveled [3]. Both papers proceed with the same methodology, first identifying the optical requirements from numerical simulation (Planck's and Kirchhoff's laws in [1], Mie and T-matrix numerical codes in [2]), then presenting lidar field application case studies. We believe these new lidar methodologies may be useful for climate, geophysical, as well as fundamental purposes.

  3. Soil Moisture Content Estimation Based on Sentinel-1 and Auxiliary Earth Observation Products. A Hydrological Approach.

    PubMed

    Alexakis, Dimitrios D; Mexis, Filippos-Dimitrios K; Vozinaki, Anthi-Eirini K; Daliakopoulos, Ioannis N; Tsanis, Ioannis K

    2017-06-21

    A methodology for elaborating multi-temporal Sentinel-1 and Landsat 8 satellite images for estimating topsoil Soil Moisture Content (SMC) to support hydrological simulation studies is proposed. After pre-processing the remote sensing data, backscattering coefficient, Normalized Difference Vegetation Index (NDVI), thermal infrared temperature and incidence angle parameters are assessed for their potential to infer ground measurements of SMC, collected at the top 5 cm. A non-linear approach using Artificial Neural Networks (ANNs) is tested. The methodology is applied in Western Crete, Greece, where a SMC gauge network was deployed during 2015. The performance of the proposed algorithm is evaluated using leave-one-out cross validation and sensitivity analysis. ANNs prove to be the most efficient in SMC estimation yielding R² values between 0.7 and 0.9. The proposed methodology is used to support a hydrological simulation with the HEC-HMS model, applied at the Keramianos basin which is ungauged for SMC. Results and model sensitivity highlight the contribution of combining Sentinel-1 SAR and Landsat 8 images for improving SMC estimates and supporting hydrological studies.

  4. Structure-Preserving Variational Multiscale Modeling of Turbulent Incompressible Flow with Subgrid Vortices

    NASA Astrophysics Data System (ADS)

    Evans, John; Coley, Christopher; Aronson, Ryan; Nelson, Corey

    2017-11-01

    In this talk, a large eddy simulation methodology for turbulent incompressible flow will be presented which combines the best features of divergence-conforming discretizations and the residual-based variational multiscale approach to large eddy simulation. In this method, the resolved motion is represented using a divergence-conforming discretization, that is, a discretization that preserves the incompressibility constraint in a pointwise manner, and the unresolved fluid motion is explicitly modeled by subgrid vortices that lie within individual grid cells. The evolution of the subgrid vortices is governed by dynamical model equations driven by the residual of the resolved motion. Consequently, the subgrid vortices appropriately vanish for laminar flow and fully resolved turbulent flow. As the resolved velocity field and subgrid vortices are both divergence-free, the methodology conserves mass in a pointwise sense and admits discrete balance laws for energy, enstrophy, and helicity. Numerical results demonstrate the methodology yields improved results versus state-of-the-art eddy viscosity models in the context of transitional, wall-bounded, and rotational flow when a divergence-conforming B-spline discretization is utilized to represent the resolved motion.

  5. Identification of androgen receptor antagonists: In vitro investigation and classification methodology for flavonoid.

    PubMed

    Wu, Yang; Doering, Jon A; Ma, Zhiyuan; Tang, Song; Liu, Hongling; Zhang, Xiaowei; Wang, Xiaoxiang; Yu, Hongxia

    2016-09-01

    A tremendous gap exists between the number of potential endocrine disrupting chemicals (EDCs) possibly in the environment and the limitations of traditional regulatory testing. In this study, the anti-androgenic potencies of 21 flavonoids were analyzed in vitro, and another 32 flavonoids from the literature were selected as additional chemicals. Molecular dynamics simulations were employed to obtain four different separation approaches based on the different behaviors of ligands and receptors during the process of interaction. Specifically, the ligand-receptor complex, which highlighted the discriminating features of ligand escape or retention via the "mousetrap" mechanism, hydrogen bonds formed during simulation times, ligand stability, and the stability of helix-12 of the receptor were investigated. Together, a methodology was generated with which 87.5% of flavonoids could be discriminated as active versus inactive antagonists, and over 90% of inactive antagonists could be filtered out before QSAR study. This methodology could be used as a "proof of concept" to identify inactive anti-androgenic flavonoids, and could be beneficial for rapid risk assessment and regulation of multiple new chemicals for androgenicity. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. New systematic methodology for incorporating dynamic heat transfer modelling in multi-phase biochemical reactors.

    PubMed

    Fernández-Arévalo, T; Lizarralde, I; Grau, P; Ayesa, E

    2014-09-01

    This paper presents a new modelling methodology for dynamically predicting the heat produced or consumed in the transformations of any biological reactor using Hess's law. Starting from a complete description of model components stoichiometry and formation enthalpies, the proposed modelling methodology has integrated successfully the simultaneous calculation of both the conventional mass balances and the enthalpy change of reaction in an expandable multi-phase matrix structure, which facilitates a detailed prediction of the main heat fluxes in the biochemical reactors. The methodology has been implemented in a plant-wide modelling methodology in order to facilitate the dynamic description of mass and heat throughout the plant. After validation with literature data, as illustrative examples of the capability of the methodology, two case studies have been described. In the first one, a predenitrification-nitrification dynamic process has been analysed, with the aim of demonstrating the easy integration of the methodology in any system. In the second case study, the simulation of a thermal model for an ATAD has shown the potential of the proposed methodology for analysing the effect of ventilation and influent characterization. Copyright © 2014 Elsevier Ltd. All rights reserved.
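
    The Hess's-law step at the core of the methodology can be illustrated in a few lines (the reaction and species are a textbook example chosen for illustration, not from the paper; the values are standard-state formation enthalpies in kJ/mol):

```python
H_FORM = {"CH4": -74.8, "O2": 0.0, "CO2": -393.5, "H2O_l": -285.8}

def reaction_enthalpy(stoich):
    """Enthalpy change from formation enthalpies (Hess's law).
    stoich: {species: coefficient}, negative for reactants."""
    return sum(nu * H_FORM[sp] for sp, nu in stoich.items())

# CH4 + 2 O2 -> CO2 + 2 H2O(l)
dh = reaction_enthalpy({"CH4": -1, "O2": -2, "CO2": 1, "H2O_l": 2})
```

    The same bookkeeping, applied row by row to a stoichiometric matrix alongside the mass balances, is what lets the plant-wide model track heat fluxes through every biochemical transformation.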

  7. Development of an Integrated Team Training Design and Assessment Architecture to Support Adaptability in Healthcare Teams

    DTIC Science & Technology

    2017-10-01

    to patient safety by addressing key methodological and conceptual gaps in healthcare simulation-based team training. The investigators are developing...primary outcome of Aim 1a is a conceptually and methodologically sound training design architecture that supports the development and integration of team...should be delivered. This subtask was delayed by approximately 1 month and is now completed. Completed Evaluation of existing experimental dataset to

  8. A Chip and Pixel Qualification Methodology on Imaging Sensors

    NASA Technical Reports Server (NTRS)

    Chen, Yuan; Guertin, Steven M.; Petkov, Mihail; Nguyen, Duc N.; Novak, Frank

    2004-01-01

    This paper presents a qualification methodology for imaging sensors. In addition to overall chip reliability characterization based on the sensor's overall figures of merit, such as Dark Rate, Linearity, Dark Current Non-Uniformity, Fixed Pattern Noise and Photon Response Non-Uniformity, a simulation technique is proposed and used to project pixel reliability. The projected pixel reliability is directly related to imaging quality and provides additional sensor reliability information and performance control.

  9. The use of coupled atmospheric and hydrological models for water-resources management in headwater basins

    USGS Publications Warehouse

    Leavesley, G.; Hay, L.

    1998-01-01

    Coupled atmospheric and hydrological models provide an opportunity for the improved management of water resources in headwater basins. Issues currently limiting full implementation of coupled-model methodologies include (a) the degree of uncertainty in the accuracy of precipitation and other meteorological variables simulated by atmospheric models, and (b) the problem of discordant scales between atmospheric and hydrological models. Alternative methodologies being developed to address these issues are reviewed.

  10. An applicational process for dynamic balancing of turbomachinery shafting

    NASA Technical Reports Server (NTRS)

    Verhoff, Vincent G.

    1990-01-01

    The NASA Lewis Research Center has developed and implemented a time-efficient methodology for dynamically balancing turbomachinery shafting. This methodology minimizes costly facility downtime by using a balancing arbor (mandrel) that simulates the turbomachinery (rig) shafting. The need for precision dynamic balancing of turbomachinery shafting and for a dynamic balancing methodology is discussed in detail. Additionally, the inherent problems (and their causes and effects) associated with unbalanced turbomachinery shafting as a function of increasing shaft rotational speeds are discussed. Included are the design criteria concerning rotor weight differentials for rotors made of different materials that have similar parameters and shafting. The balancing methodology for applications where rotor replaceability is a requirement is also covered. This report is intended for use as a reference when designing, fabricating, and troubleshooting turbomachinery shafting.

  11. Determination of Time Dependent Virus Inactivation Rates

    NASA Astrophysics Data System (ADS)

    Chrysikopoulos, C. V.; Vogler, E. T.

    2003-12-01

    A methodology is developed for estimating temporally variable virus inactivation rate coefficients from experimental virus inactivation data. The methodology consists of a technique for slope estimation of normalized virus inactivation data in conjunction with a resampling parameter estimation procedure. The slope estimation technique is based on a relatively flexible geostatistical method known as universal kriging. Drift coefficients are obtained by nonlinear fitting of bootstrap samples and the corresponding confidence intervals are obtained by bootstrap percentiles. The proposed methodology yields more accurate time dependent virus inactivation rate coefficients than those estimated by fitting virus inactivation data to a first-order inactivation model. The methodology is successfully applied to a set of poliovirus batch inactivation data. Furthermore, the importance of accurate inactivation rate coefficient determination on virus transport in water saturated porous media is demonstrated with model simulations.
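
    The resampling half of the methodology can be sketched as follows (the kriging-based slope estimation is not reproduced; the data are synthetic and the first-order fit stands in for the drift estimation):

```python
import random

def fit_slope(pts):
    """Ordinary least-squares slope through (time, ln C/C0) points."""
    n = len(pts)
    mt = sum(t for t, _ in pts) / n
    my = sum(y for _, y in pts) / n
    den = sum((t - mt) ** 2 for t, _ in pts)
    return sum((t - mt) * (y - my) for t, y in pts) / den

def bootstrap_ci(pts, n_boot=2000, seed=5):
    """Percentile 95% confidence interval for the slope."""
    rng = random.Random(seed)
    slopes = []
    while len(slopes) < n_boot:
        sample = [rng.choice(pts) for _ in pts]
        if len({t for t, _ in sample}) > 1:   # need spread in time to fit
            slopes.append(fit_slope(sample))
    slopes.sort()
    return slopes[int(0.025 * n_boot)], slopes[int(0.975 * n_boot)]

# synthetic log-survival data with true inactivation slope -0.3
data = [(float(t), -0.3 * t + random.Random(t).gauss(0.0, 0.02))
        for t in range(1, 11)]
slope = fit_slope(data)
ci_lo, ci_hi = bootstrap_ci(data)
```

    Fitting the slope locally over a moving window instead of globally, as the kriging-based version effectively does, is what turns this into a time-dependent inactivation rate coefficient.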

  12. Use of Machine Learning Algorithms to Propose a New Methodology to Conduct, Critique and Validate Urban Scale Building Energy Modeling

    NASA Astrophysics Data System (ADS)

    Pathak, Maharshi

    City administrators and real-estate developers have been setting rather aggressive energy efficiency targets. This, in turn, has led building science research groups across the globe to focus on urban-scale building performance studies and the level of abstraction associated with simulations of the same. The increasing maturity of stakeholders towards energy efficiency and creating comfortable working environments has led researchers to develop methodologies and tools for addressing policy-driven interventions, whether for urban-level energy systems, buildings' operational optimization, or retrofit guidelines. Typically, these large-scale simulations are carried out by grouping buildings based on their design similarities, i.e., standardization of the buildings. Such an approach does not necessarily produce working inputs that make decision-making effective. To address this, a novel approach is proposed in the present study. The principal objective of this study is to propose, define and evaluate a methodology that utilizes machine learning algorithms to define representative building archetypes for Stock-level Building Energy Modeling (SBEM) based on an operational parameter database. The study uses "Phoenix-climate" CBECS-2012 survey microdata for analysis and validation. Using the database, parameter correlations are studied to understand the relation between input parameters and energy performance. Contrary to precedent, the study establishes that energy performance is better explained by non-linear models. The non-linear behavior is captured by advanced learning algorithms. Based on these algorithms, the buildings under study are grouped into meaningful clusters.
The cluster "medoid" (the actual building closest to the cluster centroid, so that each cluster is represented by a real building) is established statistically to identify the level of abstraction that is acceptable for whole-building energy simulations and, subsequently, for retrofit decision-making. Further, the methodology is validated by conducting Monte Carlo simulations on 13 key input simulation parameters. The sensitivity analysis of these 13 parameters is utilized to identify the optimum retrofits. From the sample analysis, the envelope parameters are found to be the most sensitive with respect to the EUI of the building, and thus retrofit packages should also be directed to maximize the energy usage reduction.
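
    The medoid idea can be shown with a tiny k-medoids loop (a hedged sketch: the study's feature set and algorithm settings are not given in this abstract, so one-dimensional "energy use intensity" values stand in for the operational parameters):

```python
# The medoid is the actual member closest to all others in its cluster,
# so each cluster is represented by a real building rather than an
# abstract centroid.
def k_medoids(points, k=2, iters=10):
    # deterministic spread-out initialization from the sorted values
    medoids = sorted(points)[:: max(1, len(points) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in medoids]
        for p in points:
            i = min(range(len(medoids)), key=lambda i: abs(p - medoids[i]))
            clusters[i].append(p)
        medoids = [min(c, key=lambda m: sum(abs(m - q) for q in c))
                   for c in clusters if c]       # drop empty clusters
    return sorted(medoids)

eui = [20.0, 22.0, 21.0, 80.0, 82.0, 85.0]       # hypothetical EUI values
reps = k_medoids(eui, k=2)
```

    Each returned representative is an actual member of the data set, which is exactly why a medoid building can be simulated directly as the archetype for its cluster.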

  13. Analysis and design of high-power and efficient, millimeter-wave power amplifier systems using zero degree combiners

    NASA Astrophysics Data System (ADS)

    Tai, Wei; Abbasi, Mortez; Ricketts, David S.

    2018-01-01

    We present the analysis and design of high-power millimetre-wave power amplifier (PA) systems using zero-degree combiners (ZDCs). The methodology presented optimises the PA device sizing and the number of combined unit PAs based on device load pull simulations, driver power consumption analysis and loss analysis of the ZDC. Our analysis shows that an optimal number of N-way combined unit PAs leads to the highest power-added efficiency (PAE) for a given output power. To illustrate our design methodology, we designed a 1-W PA system at 45 GHz using a 45 nm silicon-on-insulator process and showed that an 8-way combined PA has the highest PAE that yields simulated output power of 30.6 dBm and 31% peak PAE.
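
    The existence of an optimal N can be illustrated with a toy trade-off model (all parameters are invented, not from the paper): combining N unit PAs raises output power, but corporate combiner loss grows with the number of combining stages and the drivers add fixed overhead, so PAE peaks at an intermediate N.

```python
import math

def system_pae(n, p_unit_w=0.18, pdc_unit_w=0.45,
               loss_db_per_stage=0.6, p_driver_w=0.9):
    """Drain-efficiency-style PAE proxy for an n-way combined PA."""
    stages = math.ceil(math.log2(n)) if n > 1 else 0   # binary combining tree
    comb_eff = 10.0 ** (-loss_db_per_stage * stages / 10.0)
    p_out = n * p_unit_w * comb_eff                    # combined output power
    p_dc = n * pdc_unit_w + p_driver_w                 # unit PAs + driver overhead
    return p_out / p_dc

best_n = max((2, 4, 8, 16, 32), key=system_pae)        # interior optimum
```

    With these illustrative numbers the optimum lands at an intermediate way-count; the paper performs the same trade-off with load-pull data and measured ZDC loss instead of the invented constants above.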

  14. Methodologies for validating ray-based forward model using finite element method in ultrasonic array data simulation

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; Nixon, Andrew; Barber, Tom; Budyn, Nicolas; Bevan, Rhodri; Croxford, Anthony; Wilcox, Paul

    2018-04-01

    In this paper, a methodology for using a finite element (FE) model to validate a ray-based model in the simulation of full matrix capture (FMC) ultrasonic array data sets is proposed. The overall aim is to separate signal contributions from different interactions in the FE results to ease comparison with each individual component in the ray-based model results. This is achieved by combining the results from multiple FE models of the system of interest that include progressively more geometrical features while preserving the same mesh structure. It is shown that the proposed techniques allow the interactions from a large number of different ray paths to be isolated in the FE results and compared directly to the results from a ray-based forward model.

  15. Validation of a multi-phase plant-wide model for the description of the aeration process in a WWTP.

    PubMed

    Lizarralde, I; Fernández-Arévalo, T; Beltrán, S; Ayesa, E; Grau, P

    2018-02-01

    This paper introduces a new mathematical model built under the PC-PWM methodology to describe the aeration process in a full-scale WWTP. This methodology enables a systematic and rigorous incorporation of chemical and physico-chemical transformations into biochemical process models, particularly for the description of liquid-gas transfer to describe the aeration process. The mathematical model constructed is able to reproduce biological COD and nitrogen removal, liquid-gas transfer and chemical reactions. The capability of the model to describe the liquid-gas mass transfer has been tested by comparing simulated and experimental results in a full-scale WWTP. Finally, an exploration by simulation has been undertaken to show the potential of the mathematical model. Copyright © 2017 Elsevier Ltd. All rights reserved.
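
    The liquid-gas transfer at the heart of any aeration model is conventionally written with the two-film volumetric transfer coefficient, dS/dt = kLa (S_sat - S). A minimal sketch with explicit Euler integration and illustrative values (not the PC-PWM model or the paper's calibrated parameters):

    ```python
    def simulate_do(kla=4.0, s_sat=9.1, s0=2.0, dt=1e-3, t_end=1.0):
        """Dissolved-oxygen concentration (g/m3) after t_end hours, for a
        volumetric transfer coefficient kla (1/h), saturation s_sat, and
        initial concentration s0. Values are illustrative only."""
        s = s0
        for _ in range(int(t_end / dt)):
            s += dt * kla * (s_sat - s)   # two-film transfer driving force
        return s

    do_after_1h = simulate_do()
    ```

    The result can be checked against the analytic solution S_sat - (S_sat - S0) exp(-kLa t); a plant-wide model couples this same transfer term to the biochemical and chemical conversions the abstract lists.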

  16. Multi-Exciter Vibroacoustic Simulation of Hypersonic Flight Vibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    GREGORY,DANNY LYNN; CAP,JEROME S.; TOGAMI,THOMAS C.

    1999-11-11

Many aerospace structures must survive severe high-frequency, hypersonic, random vibration during their flights. The random vibrations are generated by the turbulent boundary layer developed along the exterior of the structures during flight. These environments have not been simulated very well in the past using a fixed-base, single-exciter input with an upper frequency range of 2 kHz. This study investigates the possibility of using acoustic and/or independently controlled multiple exciters to more accurately simulate hypersonic flight vibration. The test configuration, equipment, and methodology are described. Comparisons with actual flight measurements and previous single-exciter simulations are also presented.

  17. Efficient evaluation of wireless real-time control networks.

    PubMed

    Horvath, Peter; Yampolskiy, Mark; Koutsoukos, Xenofon

    2015-02-11

In this paper, we present a system simulation framework for the design and performance evaluation of complex wireless cyber-physical systems. We describe the simulator architecture and the specific developments that are required to simulate cyber-physical systems relying on multi-channel, multi-hop mesh networks. We introduce realistic and efficient physical layer models and a system simulation methodology, which provides statistically significant performance evaluation results with low computational complexity. The capabilities of the proposed framework are illustrated using the example of WirelessHART, a centralized, real-time, multi-hop mesh network designed for industrial control and monitoring applications.

  18. Electronic Design Automation: Integrating the Design and Manufacturing Functions

    NASA Technical Reports Server (NTRS)

    Bachnak, Rafic; Salkowski, Charles

    1997-01-01

    As the complexity of electronic systems grows, the traditional design practice, a sequential process, is replaced by concurrent design methodologies. A major advantage of concurrent design is that the feedback from software and manufacturing engineers can be easily incorporated into the design. The implementation of concurrent engineering methodologies is greatly facilitated by employing the latest Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and support virtual prototyping, rapid prototyping, and hardware-software co-design. This report presents recommendations for enhancing the electronic design and manufacturing capabilities and procedures at JSC based on a concurrent design methodology that employs EDA tools.

  19. Modeling and Characterization of Near-Crack-Tip Plasticity from Micro- to Nano-Scales

    NASA Technical Reports Server (NTRS)

    Glaessgen, Edward H.; Saether, Erik; Hochhalter, Jacob; Smith, Stephen W.; Ransom, Jonathan B.; Yamakov, Vesselin; Gupta, Vipul

    2010-01-01

    Methodologies for understanding the plastic deformation mechanisms related to crack propagation at the nano-, meso- and micro-length scales are being developed. These efforts include the development and application of several computational methods including atomistic simulation, discrete dislocation plasticity, strain gradient plasticity and crystal plasticity; and experimental methods including electron backscattered diffraction and video image correlation. Additionally, methodologies for multi-scale modeling and characterization that can be used to bridge the relevant length scales from nanometers to millimeters are being developed. The paper focuses on the discussion of newly developed methodologies in these areas and their application to understanding damage processes in aluminum and its alloys.

  20. Modeling and Characterization of Near-Crack-Tip Plasticity from Micro- to Nano-Scales

    NASA Technical Reports Server (NTRS)

    Glaessgen, Edward H.; Saether, Erik; Hochhalter, Jacob; Smith, Stephen W.; Ransom, Jonathan B.; Yamakov, Vesselin; Gupta, Vipul

    2011-01-01

Methodologies for understanding the plastic deformation mechanisms related to crack propagation at the nano-, meso- and micro-length scales are being developed. These efforts include the development and application of several computational methods including atomistic simulation, discrete dislocation plasticity, strain gradient plasticity and crystal plasticity; and experimental methods including electron backscattered diffraction and video image correlation. Additionally, methodologies for multi-scale modeling and characterization that can be used to bridge the relevant length scales from nanometers to millimeters are being developed. The paper focuses on the discussion of newly developed methodologies in these areas and their application to understanding damage processes in aluminum and its alloys.