Construction of dynamic stochastic simulation models using knowledge-based techniques
NASA Technical Reports Server (NTRS)
Williams, M. Douglas; Shiva, Sajjan G.
1990-01-01
Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).
Agent-based modeling: Methods and techniques for simulating human systems
Bonabeau, Eric
2002-01-01
Agent-based modeling is a powerful simulation modeling technique that has seen a number of applications in the last few years, including applications to real-world business problems. After the basic principles of agent-based simulation are briefly introduced, its four areas of application are discussed by using real-world applications: flow simulation, organizational simulation, market simulation, and diffusion simulation. For each category, one or several business applications are described and analyzed. PMID:12011407
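The imitation dynamics behind diffusion simulation can be sketched with a minimal, generic agent-based model (a Bass-style adoption rule; the code and all parameters below are illustrative, not taken from the article):

```python
import random

def simulate_diffusion(n_agents=100, p=0.01, q=0.3, steps=50, seed=42):
    """Agent-based diffusion: each agent adopts an innovation with a small
    spontaneous probability p plus an imitation term q scaled by the
    fraction of already-adopted agents (a Bass-style rule)."""
    rng = random.Random(seed)
    adopted = [False] * n_agents
    history = []
    for _ in range(steps):
        frac = sum(adopted) / n_agents  # social influence at this step
        for i in range(n_agents):
            if not adopted[i] and rng.random() < p + q * frac:
                adopted[i] = True
        history.append(sum(adopted))
    return history

history = simulate_diffusion()
print(history)  # cumulative adopters per step: slow start, then takeoff
```

The characteristic S-shaped adoption curve emerges from the agents' local imitation rule rather than from any aggregate equation, which is the usual argument for agent-based treatments of diffusion.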
Simulation verification techniques study
NASA Technical Reports Server (NTRS)
Schoonmaker, P. B.; Wenglinski, T. H.
1975-01-01
Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.
Simulations of multi-contrast x-ray imaging using near-field speckles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zdora, Marie-Christine (Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire, OX11 0DE, United Kingdom; Department of Physics & Astronomy, University College London, London, WC1E 6BT); Thibault, Pierre
2016-01-28
X-ray dark-field and phase-contrast imaging using near-field speckles is a novel technique that overcomes a limitation inherent in conventional absorption x-ray imaging, namely poor contrast for features of similar density. Speckle-based imaging yields a wealth of information with a simple setup that tolerates polychromatic and divergent beams, and with simple data acquisition and analysis procedures. Here, we present simulation software used to model image formation with the speckle-based technique, and we compare simulated results on a phantom sample with experimental synchrotron data. Thorough simulation of a speckle-based imaging experiment will help in understanding and optimising the technique itself.
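The core step of speckle-based analysis, recovering the local speckle displacement by cross-correlating sample and reference images, can be sketched as follows (a minimal NumPy illustration, not the simulation software described above):

```python
import numpy as np

rng = np.random.default_rng(0)
reference = rng.random((64, 64))  # synthetic near-field speckle pattern
# Placing a sample in the beam shifts the speckles; here by (2, 3) pixels.
sample = np.roll(reference, shift=(2, 3), axis=(0, 1))

# Circular cross-correlation via FFT; the peak location gives the speckle
# displacement, which in a real experiment maps to the local refraction
# angle (phase gradient) introduced by the sample.
corr = np.fft.ifft2(np.fft.fft2(sample) * np.conj(np.fft.fft2(reference))).real
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
print(dy, dx)  # recovered displacement: 2 3
```

In practice the correlation is done in small windows across the image to build displacement maps, from which phase-contrast and dark-field signals are extracted.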
Nonlinear relaxation algorithms for circuit simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saleh, R.A.
Circuit simulation is an important Computer-Aided Design (CAD) tool in the design of Integrated Circuits (IC). However, the standard techniques used in programs such as SPICE result in very long computer run times when applied to large problems. In order to reduce the overall run time, a number of new approaches to circuit simulation were developed and are described. These methods are based on nonlinear relaxation techniques and exploit the relative inactivity of large circuits. Simple waveform-processing techniques are described to determine the maximum possible speed improvement that can be obtained by exploiting this property of large circuits. Three simulation algorithms are described, two of which are based on the Iterated Timing Analysis (ITA) method and a third based on the Waveform-Relaxation Newton (WRN) method. New programs that incorporate these techniques were developed and used to simulate a variety of industrial circuits. The results from these simulations are provided. The techniques are shown to be much faster than the standard approach. In addition, a number of parallel aspects of these algorithms are described, and a general space-time model of parallel-task scheduling is developed.
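The basic idea of relaxation-based simulation, sweeping node-by-node voltage updates until the network settles, can be illustrated on a toy linear circuit (a Gauss-Seidel sketch on a three-resistor ladder; this is illustrative only, not the ITA or WRN algorithms of the dissertation):

```python
def solve_ladder(vs=5.0, g=(1e-3, 1e-3, 1e-3), tol=1e-9, max_iter=1000):
    """Gauss-Seidel nodal relaxation on a 3-resistor ladder:
    vs --R1-- n1 --R2-- n2 --R3-- gnd.  Each sweep re-solves Kirchhoff's
    current law at one node using the latest neighbor voltages."""
    g1, g2, g3 = g          # conductances 1/R1, 1/R2, 1/R3
    v1 = v2 = 0.0
    for _ in range(max_iter):
        v1_new = (g1 * vs + g2 * v2) / (g1 + g2)   # KCL at node 1
        v2_new = (g2 * v1_new) / (g2 + g3)         # KCL at node 2
        delta = max(abs(v1_new - v1), abs(v2_new - v2))
        v1, v2 = v1_new, v2_new
        if delta < tol:     # network has "settled"
            break
    return v1, v2

v1, v2 = solve_ladder()
print(round(v1, 4), round(v2, 4))  # voltage divider: 3.3333 1.6667
```

A full nonlinear relaxation simulator replaces each node update with a local Newton solve and, as the abstract notes, skips nodes whose voltages are inactive, which is where the speedup on large circuits comes from.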
Latif, Rana K; Bautista, Alexander F; Memon, Saima B; Smith, Elizabeth A; Wang, Chenxi; Wadhwa, Anupama; Carter, Mary B; Akca, Ozan
2012-03-01
Our goal was to determine whether simulation combined with didactic training improves sterile technique during ultrasound (US)-guided central venous catheter (CVC) insertion compared with didactic training alone among novices. We hypothesized that novices who receive combined didactic and simulation-based training would perform similarly to experienced residents in aseptic technique, knowledge, and perception of comfort during US-guided CVC insertion on a simulator. Seventy-two subjects were enrolled in a randomized, controlled trial of an educational intervention. Fifty-four novices were randomized into either the didactic group or the simulation combined with didactic group. Both groups received didactic training but the simulation combined with didactic group also received simulation-based CVC insertion training. Both groups were tested by demonstrating US-guided CVC insertion on a simulator. Aseptic technique was scored on 8 steps as "yes/no" and also using a 7-point Likert scale with 7 being "excellent technique" by a rater blinded to subject randomization. After initial testing, the didactic group was offered simulation-based training and retesting. Both groups also took a pre- and posttraining test of knowledge and rated their comfort with US and CVC insertion pre- and posttraining on a 5-point Likert scale. Subsequently, 18 experienced residents also took the test of knowledge, rated their comfort level, and were scored while performing aseptic US-guided CVC insertion using a simulator. The simulation combined with didactic group achieved a 167% (95% confidence interval [CI] 133%-167%) incremental increase in yes/no scores and 115% (CI 112%-127%) incremental increase in Likert scale ratings on aseptic technique compared with novices in the didactic group. 
Compared with experienced residents, simulation combined with didactic trained novices achieved an increase in aseptic scores with a 33.3% (CI 16.7%-50%) increase in yes/no ratings and a 20% (CI 13.3%-40%) increase in Likert scaled ratings, and scored 2.5-fold higher on the test of knowledge. There was a 3-fold increase in knowledge and 2-fold increase in comfort level among all novices (P < 0.001) after combined didactic and simulation-based training. Simulation combined with didactic training is superior to didactic training alone for acquisition of clinical skills such as US-guided CVC insertion. After combined didactic and simulation-based training, novices can outperform experienced residents in aseptic technique as well as in measurements of knowledge.
NASA Astrophysics Data System (ADS)
Lachinova, Svetlana L.; Vorontsov, Mikhail A.; Filimonov, Grigory A.; LeMaster, Daniel A.; Trippel, Matthew E.
2017-07-01
Computational efficiency and accuracy of wave-optics-based Monte-Carlo and brightness function numerical simulation techniques for incoherent imaging of extended objects through atmospheric turbulence are evaluated. Simulation results are compared with theoretical estimates based on known analytical solutions for the modulation transfer function of an imaging system and the long-exposure image of a Gaussian-shaped incoherent light source. It is shown that the accuracy of the two techniques is comparable over a wide range of path lengths and atmospheric turbulence conditions, whereas the brightness function technique is advantageous in terms of computational speed.
Kelay, Tanika; Chan, Kah Leong; Ako, Emmanuel; Yasin, Mohammad; Costopoulos, Charis; Gold, Matthew; Kneebone, Roger K; Malik, Iqbal S; Bello, Fernando
2017-01-01
Distributed Simulation is the concept of portable, high-fidelity immersive simulation. Here, it is used for the development of a simulation-based training programme for cardiovascular specialities. We present an evidence base for how accessible, portable and self-contained simulated environments can be effectively utilised for the modelling, development and testing of a complex training framework and assessment methodology. Iterative user feedback through mixed-methods evaluation techniques resulted in the implementation of the training programme. Four phases were involved in the development of our immersive simulation-based training programme: (1) initial conceptual stage for mapping structural criteria and parameters of the simulation training framework and scenario development (n = 16), (2) training facility design using Distributed Simulation, (3) test cases with clinicians (n = 8) and collaborative design, where evaluation and user feedback involved a mixed-methods approach featuring (a) quantitative surveys to evaluate the realism and perceived educational relevance of the simulation format and framework for training and (b) qualitative semi-structured interviews to capture detailed feedback including changes and scope for development. Refinements were made iteratively to the simulation framework based on user feedback, resulting in (4) transition towards implementation of the simulation training framework, involving consistent quantitative evaluation techniques for clinicians (n = 62). For comparative purposes, clinicians' initial quantitative mean evaluation scores for realism of the simulation training framework, realism of the training facility and relevance for training (n = 8) are presented longitudinally, alongside feedback throughout the development stages from concept to delivery, including the implementation stage (n = 62). Initially, mean evaluation scores fluctuated from low to average, rising incrementally.
This corresponded with the qualitative component, which augmented the quantitative findings; trainees' user feedback was used to perform iterative refinements to the simulation design and components (collaborative design), resulting in higher mean evaluation scores leading up to the implementation phase. Through application of innovative Distributed Simulation techniques, collaborative design, and consistent evaluation techniques from conceptual, development, and implementation stages, fully immersive simulation techniques for cardiovascular specialities are achievable and have the potential to be implemented more broadly.
Using cognitive task analysis to develop simulation-based training for medical tasks.
Cannon-Bowers, Jan; Bowers, Clint; Stout, Renee; Ricci, Katrina; Hildabrand, Annette
2013-10-01
Pressures to increase the efficacy and effectiveness of medical training are causing the Department of Defense to investigate the use of simulation technologies. This article describes a comprehensive cognitive task analysis technique that can be used to simultaneously generate training requirements, performance metrics, scenario requirements, and simulator/simulation requirements for medical tasks. On the basis of a variety of existing techniques, we developed a scenario-based approach that asks experts to perform the targeted task multiple times, with each pass probing a different dimension of the training development process. In contrast to many cognitive task analysis approaches, we argue that our technique can be highly cost effective because it is designed to accomplish multiple goals. The technique was pilot tested with expert instructors from a large military medical training command. These instructors were employed to generate requirements for two selected combat casualty care tasks: cricothyroidotomy and hemorrhage control. Results indicated that the technique is feasible to use and generates usable data to inform simulation-based training system design. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
Anonymity and Historical-Anonymity in Location-Based Services
NASA Astrophysics Data System (ADS)
Bettini, Claudio; Mascetti, Sergio; Wang, X. Sean; Freni, Dario; Jajodia, Sushil
The problem of protecting users' privacy in Location-Based Services (LBS) has been extensively studied recently, and several defense techniques have been proposed. In this contribution, we first present a categorization of privacy attacks and related defenses. Then, we consider the class of defense techniques that aim at providing privacy through anonymity, in particular algorithms achieving "historical k-anonymity" in the case of an adversary obtaining a trace of requests recognized as being issued by the same (anonymous) user. Finally, we investigate the issues involved in the experimental evaluation of anonymity-based defense techniques; we show that user movement simulations based on mostly random movements can lead to overestimating the privacy protection in some cases and to overprotective techniques in other cases. The above results are obtained by comparison with a more realistic simulation using an agent-based simulator and considering a specific deployment scenario.
Simulation-based learning: Just like the real thing
Lateef, Fatimah
2010-01-01
Simulation is a technique for practice and learning that can be applied to many different disciplines and trainees. It is a technique (not a technology) to replace and amplify real experiences with guided ones, often “immersive” in nature, that evoke or replicate substantial aspects of the real world in a fully interactive fashion. Simulation-based learning can be the way to develop health professionals’ knowledge, skills, and attitudes, whilst protecting patients from unnecessary risks. Simulation-based medical education can be a platform which provides a valuable tool in learning to mitigate ethical tensions and resolve practical dilemmas. Simulation-based training techniques, tools, and strategies can be applied in designing structured learning experiences, as well as be used as a measurement tool linked to targeted teamwork competencies and learning objectives. It has been widely applied in fields such as aviation and the military. In medicine, simulation offers good scope for training of interdisciplinary medical teams. The realistic scenarios and equipment allow for retraining and practice until one can master the procedure or skill. An increasing number of health care institutions and medical schools are now turning to simulation-based learning. Teamwork training conducted in the simulated environment may offer an additive benefit to traditional didactic instruction, enhance performance, and possibly also help reduce errors. PMID:21063557
Cost considerations in using simulations for medical training.
Fletcher, J D; Wind, Alexander P
2013-10-01
This article reviews simulation used for medical training, techniques for assessing simulation-based training, and cost analyses that can be included in such assessments. Simulation in medical training appears to take four general forms: human actors who are taught to simulate illnesses and ailments in standardized ways; virtual patients who are generally presented via computer-controlled, multimedia displays; full-body manikins that simulate patients using electronic sensors, responders, and controls; and part-task anatomical simulations of various body parts and systems. Techniques for assessing costs include benefit-cost analysis, return on investment, and cost-effectiveness analysis. Techniques for assessing the effectiveness of simulation-based medical training include the use of transfer effectiveness ratios and incremental transfer effectiveness ratios to measure transfer of knowledge and skill provided by simulation to the performance of medical procedures. Assessment of costs and simulation effectiveness can be combined with measures of transfer using techniques such as isoperformance analysis to identify ways of minimizing costs without reducing performance effectiveness or maximizing performance without increasing costs. In sum, economic analysis must be considered in training assessments if training budgets are to compete successfully with other requirements for funding. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
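The transfer effectiveness ratios mentioned above are commonly defined as the real-equipment training time saved per unit of simulator time (Roscoe's formulation); a small sketch with purely illustrative numbers:

```python
def transfer_effectiveness_ratio(t_control, t_experimental, t_simulator):
    """TER = (Y0 - Yx) / X: hours of real-equipment training saved per
    hour spent in the simulator (Y0: control group's real-equipment time,
    Yx: real-equipment time after X simulator hours)."""
    return (t_control - t_experimental) / t_simulator

def incremental_ter(real_time_by_sim_hours):
    """ITER between successive simulator-time increments: the marginal
    savings of each additional block of simulator practice."""
    pts = sorted(real_time_by_sim_hours.items())
    return [(y_prev - y) / (x - x_prev)
            for (x_prev, y_prev), (x, y) in zip(pts, pts[1:])]

# Illustrative: a control group needs 10 h on real equipment; after 8 h
# in the simulator a trained group needs only 6 h.
print(transfer_effectiveness_ratio(10.0, 6.0, 8.0))  # 0.5
print(incremental_ter({0: 10.0, 4: 7.0, 8: 6.0}))    # [0.75, 0.25]
```

The declining incremental ratio is what isoperformance-style analyses exploit: simulator time is added only while each extra hour still buys enough real-equipment savings to justify its cost.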
NASA Astrophysics Data System (ADS)
Shabliy, L. S.; Malov, D. V.; Bratchinin, D. S.
2018-01-01
This article describes a technique for simulating valves in the pneumatic-hydraulic system of a liquid-propellant rocket engine (LPRE). The technique is based on computational fluid dynamics (CFD). To demonstrate its capabilities, a simulation is performed of a differential valve used in the fuel-component supply pipes of a closed-cycle LPRE. The schematic and operating algorithm of this valve type are described in detail, as are the assumptions made in constructing the geometric model of the valve's hydraulic path. The calculation procedure for determining the valve's hydraulic characteristics is given, and specific hydraulic characteristics obtained from these calculations are presented. Some ways of using the described simulation technique to study the static and dynamic characteristics of elements of an LPRE pneumatic-hydraulic system are proposed.
Assessment of simulation fidelity using measurements of piloting technique in flight
NASA Technical Reports Server (NTRS)
Clement, W. F.; Cleveland, W. B.; Key, D. L.
1984-01-01
The U.S. Army and NASA joined together on a project to conduct a systematic investigation and validation of a ground-based piloted simulation of the Army/Sikorsky UH-60A helicopter. Flight testing was an integral part of the validation effort. Nap-of-the-Earth (NOE) piloting tasks which were investigated included the bob-up, the hover turn, the dash/quickstop, the sidestep, the dolphin, and the slalom. Results from the simulation indicate that the pilot's NOE task performance in the simulator is noticeably and quantifiably degraded when compared with the task performance results generated in flight test. The results of the flight test and ground-based simulation experiments support a unique rationale for the assessment of simulation fidelity: flight simulation fidelity should be judged quantitatively by measuring the pilot's control strategy and technique as induced by the simulator. A quantitative comparison is offered between the piloting technique observed in a flight simulator and that observed in flight test for the same tasks performed by the same pilots.
Methodology development for evaluation of selective-fidelity rotorcraft simulation
NASA Technical Reports Server (NTRS)
Lewis, William D.; Schrage, D. P.; Prasad, J. V. R.; Wolfe, Daniel
1992-01-01
This paper addresses the initial step toward the goal of establishing performance and handling qualities acceptance criteria for real-time rotorcraft simulators through a planned research effort to quantify the system capabilities of 'selective fidelity' simulators. Within this framework the simulator is classified based on the required task. The simulator is evaluated by separating the various subsystems (visual, motion, etc.) and applying corresponding fidelity constants based on the specific task. This methodology not only provides an assessment technique but also a means to determine the required levels of subsystem fidelity for a specific task.
Accurate low-cost methods for performance evaluation of cache memory systems
NASA Technical Reports Server (NTRS)
Laha, Subhasis; Patel, Janak H.; Iyer, Ravishankar K.
1988-01-01
Methods of simulation based on statistical techniques are proposed to decrease the need for large trace measurements and to predict true program behavior. Sampling techniques are applied while the address trace is collected from a workload, drastically reducing the space and time needed to collect the trace. Simulation techniques are developed to use the sampled data not only to predict the mean miss rate of the cache, but also to provide an empirical estimate of its actual distribution. Finally, a concept of primed cache is introduced to simulate large caches by the sampling-based method.
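The sampling and primed-cache ideas can be illustrated with a toy direct-mapped cache (a hypothetical synthetic trace and sampling plan, not the paper's actual workloads or estimators):

```python
def misses(trace, n_lines=16, skip=0):
    """Simulate a direct-mapped cache (block-granularity addresses) and
    count misses, ignoring the first `skip` accesses as cache priming."""
    lines = [None] * n_lines
    count = 0
    for i, addr in enumerate(trace):
        idx = addr % n_lines
        hit = lines[idx] == addr
        lines[idx] = addr
        if not hit and i >= skip:
            count += 1
    return count

# Deterministic synthetic trace: cyclic sweep over 20 blocks, so blocks
# 16-19 conflict with blocks 0-3 in a 16-line cache.
trace = [i % 20 for i in range(1000)]
full_rate = misses(trace) / len(trace)

# Sampling: simulate only five 100-access windows; each starts cold, so
# its first 20 accesses are discarded as priming before counting misses.
samples = [trace[o:o + 100] for o in (0, 200, 400, 600, 800)]
sampled_rate = (sum(misses(s, skip=20) for s in samples)
                / sum(len(s) - 20 for s in samples))
print(full_rate, sampled_rate)  # 0.412 vs 0.4
```

Only a fraction of the trace is ever simulated, yet the primed-window estimate lands close to the full-trace miss rate, which is the cost saving the abstract describes.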
Algorithms and architecture for multiprocessor based circuit simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deutsch, J.T.
Accurate electrical simulation is critical to the design of high performance integrated circuits. Logic simulators can verify function and give first-order timing information. Switch level simulators are more effective at dealing with charge sharing than standard logic simulators, but cannot provide accurate timing information or discover DC problems. Delay estimation techniques and cell level simulation can be used in constrained design methods, but must be tuned for each application, and circuit simulation must still be used to generate the cell models. None of these methods has the guaranteed accuracy that many circuit designers desire, and none can provide detailed waveform information. Detailed electrical-level simulation can predict circuit performance if devices and parasitics are modeled accurately. However, the computational requirements of conventional circuit simulators make it impractical to simulate current large circuits. In this dissertation, the implementation of Iterated Timing Analysis (ITA), a relaxation-based technique for accurate circuit simulation, on a special-purpose multiprocessor is presented. The ITA method is an SOR-Newton, relaxation-based method which uses event-driven analysis and selective trace to exploit the temporal sparsity of the electrical network. Because event-driven selective trace techniques are employed, this algorithm lends itself to implementation on a data-driven computer.
MESA: An Interactive Modeling and Simulation Environment for Intelligent Systems Automation
NASA Technical Reports Server (NTRS)
Charest, Leonard
1994-01-01
This report describes MESA, a software environment for creating applications that automate NASA mission operations. MESA enables intelligent automation by utilizing model-based reasoning techniques developed in the field of Artificial Intelligence. Model-based reasoning techniques are realized in MESA through native support for causal modeling and discrete-event simulation.
The National Shipbuilding Research Program, Computer Aided Process Planning for Shipyards
1986-08-01
Factory Simulation with Conventional Factory Planning Techniques Financial Justification of State-of-the-Art Investment: A Study Using CAPP I–5 T I T L...and engineer to order.” “Factory Simulation: Approach to Integration of Computer- Based Factory Simulation with Conventional Factory Planning Techniques
A demonstrative model of a lunar base simulation on a personal computer
NASA Technical Reports Server (NTRS)
1985-01-01
The initial demonstration model of a lunar base simulation is described. This initial model was developed at the personal computer level to demonstrate feasibility and technique before proceeding to a larger computer-based model. Lotus Symphony Version 1.1 software was used to build the demonstration model on a personal computer with an MS-DOS operating system. The personal computer-based model determined the applicability of lunar base modeling techniques developed at an LSPI/NASA workshop. In addition, the personal computer-based demonstration model defined a modeling structure that could be employed on a larger, more comprehensive VAX-based lunar base simulation. Refinement of this personal computer model and the development of a VAX-based model are planned in the near future.
NASA Astrophysics Data System (ADS)
Demir, I.
2013-12-01
Recent developments in web technologies make it easy to manage and visualize large data sets and to share them with the general public. Novel visualization techniques and dynamic user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The floodplain simulation system is a web-based 3D interactive flood simulation environment for creating real-world flooding scenarios. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create and modify predefined scenarios, control environmental parameters, and evaluate flood mitigation techniques. The web-based simulation system provides an environment in which children and adults can learn about flooding, flood damage, and the effects of development and human activity in the floodplain. The system provides various scenarios customized to fit the age and education level of the users. This presentation provides an overview of the web-based flood simulation system and demonstrates its capabilities for various flooding and land use scenarios.
Using Simulation to Improve Systems-Based Practices.
Gardner, Aimee K; Johnston, Maximilian; Korndorffer, James R; Haque, Imad; Paige, John T
2017-09-01
Ensuring the safe, effective management of patients requires efficient processes of care within a smoothly operating system in which highly reliable teams of talented, skilled health care providers are able to use the vast array of high-technology resources and intensive care techniques available. Simulation can play a unique role in exploring and improving the complex perioperative system by proactively identifying latent safety threats and mitigating their damage to ensure that all those who work in this critical health care environment can provide optimal levels of patient care. A panel of five experts from a wide range of institutions was brought together to discuss the added value of simulation-based training for improving systems-based aspects of the perioperative service line. Panelists shared the way in which simulation was demonstrated at their institutions. The themes discussed by each panel member were delineated into four avenues through which simulation-based techniques have been used. Simulation-based techniques are being used in (1) testing new clinical workspaces and facilities before they open to identify potential latent conditions; (2) practicing how to identify the deteriorating patient and escalate care in an effective manner; (3) performing prospective root cause analyses to address system weaknesses leading to sentinel events; and (4) evaluating the efficiency and effectiveness of the electronic health record in the perioperative setting. This focused review of simulation-based interventions to test and improve components of the perioperative microsystem, which includes literature that has emerged since the panel's presentation, highlights the broad-based utility of simulation-based technologies in health care. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.
Knowledge-based simulation using object-oriented programming
NASA Technical Reports Server (NTRS)
Sidoran, Karen M.
1993-01-01
Simulations have become a powerful mechanism for understanding and modeling complex phenomena. Their results have had substantial impact on a broad range of decisions in the military, government, and industry. Because of this, new techniques are continually being explored and developed to make them even more useful, understandable, extendable, and efficient. One such area of research is the application of the knowledge-based methods of artificial intelligence (AI) to the computer simulation field. The goal of knowledge-based simulation is to facilitate building simulations of greatly increased power and comprehensibility by making use of deeper knowledge about the behavior of the simulated world. One technique for representing and manipulating knowledge that has been enhanced by the AI community is object-oriented programming. Using this technique, the entities of a discrete-event simulation can be viewed as objects in an object-oriented formulation. Knowledge can be factual (i.e., attributes of an entity) or behavioral (i.e., how the entity is to behave in certain circumstances). Rome Laboratory's Advanced Simulation Environment (RASE) was developed as a research vehicle to provide an enhanced simulation development environment for building more intelligent, interactive, flexible, and realistic simulations. This capability will support current and future battle management research and provide a test of the object-oriented paradigm for use in large scale military applications.
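The object-oriented view described above, in which entities bundle factual attributes with behavioral knowledge, can be sketched with a tiny discrete-event queue (a generic, simplified illustration, not the RASE environment itself):

```python
import heapq

class Customer:
    """Entity whose factual knowledge is its arrival time and whose
    behavioral state (service start) is filled in as events fire."""
    def __init__(self, name, arrival):
        self.name, self.arrival, self.start = name, arrival, None

class Server:
    """Entity whose behavioral knowledge is how it reacts to an arrival:
    serve immediately if idle, otherwise make the customer wait."""
    def __init__(self, service_time=3.0):
        self.busy_until, self.service_time = 0.0, service_time
    def serve(self, customer, now):
        customer.start = max(now, self.busy_until)
        self.busy_until = customer.start + self.service_time
        return self.busy_until  # completion time

def run(arrivals):
    events, server, done = [], Server(), []
    for k, t in enumerate(arrivals):  # schedule arrival events
        heapq.heappush(events, (t, k, Customer(f"c{k}", t)))
    while events:                     # process events in time order
        now, _, cust = heapq.heappop(events)
        server.serve(cust, now)
        done.append(cust)
    return sum(c.start - c.arrival for c in done)  # total waiting time

print(run([0, 2, 4, 6, 8]))  # deterministic single-server queue: 10.0
```

Factual knowledge lives in attributes (`arrival`, `busy_until`) and behavioral knowledge in methods (`serve`), so richer entity behavior can be added without touching the event loop, which is the extensibility argument the abstract makes.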
Investigation of Propagation in Foliage Using Simulation Techniques
2011-12-01
The simulation models provide a rough approximation to radiowave propagation in an actual rainforest environment, and the path characteristics are assessed from the simulated results. (Report contents also cover the electrical properties of a forest and the objectives of the study.)
NASA Astrophysics Data System (ADS)
Petsev, Nikolai D.; Leal, L. Gary; Shell, M. Scott
2017-12-01
Hybrid molecular-continuum simulation techniques afford a number of advantages for problems in the rapidly burgeoning area of nanoscale engineering and technology, though they are typically quite complex to implement and limited to single-component fluid systems. We describe an approach for modeling multicomponent hydrodynamic problems spanning multiple length scales when using particle-based descriptions for both the finely resolved (e.g., molecular dynamics) and coarse-grained (e.g., continuum) subregions within an overall simulation domain. This technique is based on the multiscale methodology previously developed for mesoscale binary fluids [N. D. Petsev, L. G. Leal, and M. S. Shell, J. Chem. Phys. 144, 084115 (2016)], simulated using a particle-based continuum method known as smoothed dissipative particle dynamics. An important application of this approach is the ability to perform coupled molecular dynamics (MD) and continuum modeling of molecularly miscible binary mixtures. In order to validate this technique, we investigate multicomponent hybrid MD-continuum simulations at equilibrium, as well as non-equilibrium cases featuring concentration gradients.
Training and certification in endobronchial ultrasound-guided transbronchial needle aspiration
Konge, Lars; Nayahangan, Leizl Joy; Clementsen, Paul Frost
2017-01-01
Endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) plays a key role in the staging of lung cancer, which is crucial for allocation to surgical treatment. EBUS-TBNA is a complicated procedure, and simulation-based training is helpful in the first part of the long learning curve prior to performing the procedure on actual patients. New trainees should follow a structured training programme consisting of training on simulators to proficiency, as assessed with a validated test, followed by supervised practice on patients. Simulation-based training is superior to the traditional apprenticeship model and is recommended in the newest guidelines. EBUS-TBNA and oesophageal ultrasound-guided fine needle aspiration (EUS-FNA or EUS-B-FNA) are complementary, and the combined techniques are superior to either technique alone. It is therefore logical to learn and perform the two techniques in combination; however, for lung cancer staging only EBUS-TBNA simulators exist, and it is hoped that simulation-based training in EUS will become possible in the future. PMID:28840013
Nearest Neighbor Searching in Binary Search Trees: Simulation of a Multiprocessor System.
ERIC Educational Resources Information Center
Stewart, Mark; Willett, Peter
1987-01-01
Describes the simulation of a nearest neighbor searching algorithm for document retrieval using a pool of microprocessors. Three techniques are described which allow parallel searching of a binary search tree as well as a PASCAL-based system, PASSIM, which can simulate these techniques. Fifty-six references are provided. (Author/LRW)
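The serial baseline that such parallel tree searches start from can be sketched for the one-dimensional case: a root-to-leaf descent of a binary search tree finds the nearest stored key, because the splitting key at each node bounds how close anything in the unvisited subtree can be. This is an illustrative sketch in Python, not PASSIM code; all names are hypothetical.

```python
class Node:
    """A plain binary search tree node."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    """Standard BST insertion; returns the (possibly new) subtree root."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def nearest(root, query):
    """Nearest-neighbour search by descent. In one dimension the node key
    on the path is always at least as close as anything in the far subtree,
    so a single root-to-leaf walk suffices."""
    best = None
    node = root
    while node is not None:
        if best is None or abs(node.key - query) < abs(best - query):
            best = node.key
        node = node.left if query < node.key else node.right
    return best
```

A parallel version, as in the article, would instead dispatch the independent subtree probes of a richer (e.g. multi-dimensional or similarity-based) search to a pool of processors.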
Efficient techniques for wave-based sound propagation in interactive applications
NASA Astrophysics Data System (ADS)
Mehra, Ravish
Sound propagation techniques model the effect of the environment on sound waves and predict their behavior from point of emission at the source to the final point of arrival at the listener. Sound is a pressure wave produced by mechanical vibration of a surface that propagates through a medium such as air or water, and the problem of sound propagation can be formulated mathematically as a second-order partial differential equation called the wave equation. Accurate techniques based on solving the wave equation, also called the wave-based techniques, are too expensive computationally and memory-wise. Therefore, these techniques face many challenges in terms of their applicability in interactive applications including sound propagation in large environments, time-varying source and listener directivity, and high simulation cost for mid-frequencies. In this dissertation, we propose a set of efficient wave-based sound propagation techniques that solve these three challenges and enable the use of wave-based sound propagation in interactive applications. Firstly, we propose a novel equivalent source technique for interactive wave-based sound propagation in large scenes spanning hundreds of meters. It is based on the equivalent source theory used for solving radiation and scattering problems in acoustics and electromagnetics. Instead of using a volumetric or surface-based approach, this technique takes an object-centric approach to sound propagation. The proposed equivalent source technique generates realistic acoustic effects and takes orders of magnitude less runtime memory compared to prior wave-based techniques. Secondly, we present an efficient framework for handling time-varying source and listener directivity for interactive wave-based sound propagation. The source directivity is represented as a linear combination of elementary spherical harmonic sources. 
This spherical harmonic-based representation of source directivity can support analytical, data-driven, rotating or time-varying directivity functions at runtime. Unlike previous approaches, the listener directivity approach can be used to compute spatial audio (3D audio) for a moving, rotating listener at interactive rates. Lastly, we propose an efficient GPU-based time-domain solver for the wave equation that enables wave simulation up to the mid-frequency range in tens of minutes on a desktop computer. We demonstrate that by carefully mapping all the components of the wave simulator to the parallel processing capabilities of graphics processors, significant improvement in performance can be achieved compared to CPU-based simulators, while maintaining numerical accuracy. We validate these techniques with offline numerical simulations and measured data recorded in an outdoor scene. We present results of preliminary user evaluations conducted to study the impact of these techniques on users' immersion in virtual environments. We have integrated these techniques with the Half-Life 2 game engine, Oculus Rift head-mounted display, and Xbox game controller to enable users to experience high-quality acoustic effects and spatial audio in the virtual environment.
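The kind of time-domain wave solver described above can be illustrated in miniature by a 1-D leapfrog scheme (the dissertation's solver is GPU-based and three-dimensional; this NumPy sketch only shows the centred-difference update that such a solver parallelizes, with hypothetical names and parameters):

```python
import numpy as np

def step_wave(u_prev, u_curr, c, dx, dt):
    """One leapfrog update of the 1-D wave equation u_tt = c^2 u_xx,
    using second-order centred differences and fixed (u = 0) boundaries."""
    lap = np.zeros_like(u_curr)
    lap[1:-1] = (u_curr[2:] - 2.0 * u_curr[1:-1] + u_curr[:-2]) / dx**2
    return 2.0 * u_curr - u_prev + (c * dt) ** 2 * lap

# Propagate a Gaussian pulse; dt is chosen so the CFL condition c*dt/dx <= 1 holds.
x = np.linspace(0.0, 1.0, 201)
u0 = np.exp(-((x - 0.5) ** 2) / 0.001)
u_prev, u_curr = u0.copy(), u0.copy()   # zero initial velocity
for _ in range(100):
    u_prev, u_curr = u_curr, step_wave(u_prev, u_curr, c=1.0,
                                       dx=x[1] - x[0], dt=0.0025)
```

On a GPU the same stencil would be evaluated for every grid cell in parallel, which is the source of the speedup the dissertation reports.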
Simulation verification techniques study: Simulation self test hardware design and techniques report
NASA Technical Reports Server (NTRS)
1974-01-01
The final results of the hardware verification task are presented. The basic objectives of the various subtasks are reviewed, along with the ground rules under which the overall task was conducted and which shaped the approach taken in deriving techniques for hardware self-test. The results of the first subtask, the definition of simulation hardware, are presented. The hardware definition is based primarily on a brief review of the simulator configurations anticipated for the shuttle training program. The results of the survey of current self-test techniques are presented. The data sources considered in the search for current techniques are reviewed, and the results of the survey are presented in terms of the specific types of tests that are of interest for training simulator applications: readiness tests, fault isolation tests, and incipient fault detection techniques. The most applicable techniques were structured into software flows that are then referenced in discussions of techniques for specific subsystems.
Auditorium acoustics evaluation based on simulated impulse response
NASA Astrophysics Data System (ADS)
Wu, Shuoxian; Wang, Hongwei; Zhao, Yuezhe
2004-05-01
The impulse responses and other acoustical parameters of Huangpu Teenager Palace in Guangzhou were measured. Meanwhile, acoustical simulation and auralization based on the software ODEON were also carried out. A comparison between the parameters obtained from computer simulation and those from measurement is given. This case study shows that the auralization technique based on computer simulation can be used to predict the acoustical quality of a hall at its design stage.
The Numerical Technique for the Landslide Tsunami Simulations Based on Navier-Stokes Equations
NASA Astrophysics Data System (ADS)
Kozelkov, A. S.
2017-12-01
The paper presents an integral technique simulating all phases of a landslide-driven tsunami. The technique is based on the numerical solution of the system of Navier-Stokes equations for multiphase flows. The numerical algorithm uses a fully implicit approximation method, in which the equations of continuity and momentum conservation are coupled through implicit summands of pressure gradient and mass flow. The method we propose removes severe restrictions on the time step and allows simulation of tsunami propagation to arbitrarily large distances. The landslide origin is simulated as an individual phase being a Newtonian fluid with its own density and viscosity and separated from the water and air phases by an interface. The basic formulas of equation discretization and expressions for coefficients are presented, and the main steps of the computation procedure are described in the paper. To enable simulations of tsunami propagation across wide water areas, we propose a parallel algorithm of the technique implementation, which employs an algebraic multigrid method. The implementation of the multigrid method is based on the global level and cascade collection algorithms that impose no limitations on the paralleling scale and make this technique applicable to petascale systems. We demonstrate the possibility of simulating all phases of a landslide-driven tsunami, including its generation, propagation and uprush. The technique has been verified against the problems supported by experimental data. The paper describes the mechanism of incorporating bathymetric data to simulate tsunamis in real water areas of the world ocean. Results of comparison with the nonlinear dispersion theory, which has demonstrated good agreement, are presented for the case of a historical tsunami of volcanic origin on the Montserrat Island in the Caribbean Sea.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kotalczyk, G., E-mail: Gregor.Kotalczyk@uni-due.de; Kruis, F.E.
Monte Carlo simulations based on weighted simulation particles can solve a variety of population balance problems and thus allow the formulation of a solution framework for many chemical engineering processes. This study presents a novel concept for the calculation of coagulation rates of weighted Monte Carlo particles by introducing a family of transformations to non-weighted Monte Carlo particles. Tuning the accuracy (named 'stochastic resolution' in this paper) of those transformations allows the construction of a constant-number coagulation scheme. Furthermore, a parallel algorithm for the inclusion of newly formed Monte Carlo particles due to nucleation is presented in the scope of a constant-number scheme: the low-weight merging. This technique is found to create significantly less statistical simulation noise than the conventional technique (named 'random removal' in this paper). Both concepts are combined into a single GPU-based simulation method which is validated by comparison with the discrete-sectional simulation technique. Two test models describing constant-rate nucleation coupled to simultaneous coagulation in (1) the free-molecular regime or (2) the continuum regime are simulated for this purpose.
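The two insertion strategies named in the abstract can be sketched as follows. This is a guess at their mechanics, not the authors' code: particles are hypothetical `(weight, size)` tuples, 'random removal' overwrites a random particle, and 'low-weight merging' is assumed to merge the two lowest-weight particles while conserving total weight and total weighted mass before inserting the newcomer.

```python
import random

def insert_random_removal(particles, new, rng=random):
    """Keep the particle count constant by overwriting a randomly chosen
    particle with the newcomer ('random removal'); loses information."""
    particles[rng.randrange(len(particles))] = new
    return particles

def insert_low_weight_merging(particles, new):
    """Merge the two lowest-weight particles into one, conserving their
    total weight and total weighted mass, then insert the newcomer
    ('low-weight merging'); each particle is a (weight, size) tuple."""
    particles.sort(key=lambda p: p[0])
    (w1, x1), (w2, x2) = particles[0], particles[1]
    merged = (w1 + w2, (w1 * x1 + w2 * x2) / (w1 + w2))
    particles[0] = merged
    particles[1] = new
    return particles
```

The merging variant keeps every nucleated particle's statistical weight in the ensemble, which is consistent with the lower simulation noise reported in the study.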
Design of Energetic Ionic Liquids (Preprint)
2008-05-07
Simulations utilizing polarizable force fields, and mesoscale-level simulations of bulk ionic liquids based upon multiscale coarse-graining techniques. Simulations of the energetic ionic liquid 1-hydroxyethyl-4-amino-1,2,4-triazolium nitrate (HEATN): molecular dynamics (MD) simulations have been…
Innovative application of virtual display technique in virtual museum
NASA Astrophysics Data System (ADS)
Zhang, Jiankang
2017-09-01
A virtual museum displays and simulates the functions of a real museum on the Internet in the form of 3D virtual reality through interactive programs. Based on the Virtual Reality Modeling Language (VRML), building a virtual museum that interacts effectively with the offline museum relies on making full use of 3D panorama, virtual reality, and augmented reality techniques, and on innovatively applying dynamic environment modeling, real-time 3D graphics generation, system integration, and other key virtual reality techniques in the overall design of the virtual museum. The 3D panorama technique, also known as panoramic photography or virtual reality, is based on static images of reality. Virtual reality is a computer-simulated system that creates, and lets users experience, an interactive 3D dynamic visual world. Augmented reality, also known as mixed reality, simulates and mixes information (visual, sound, taste, touch, etc.) that is difficult for humans to experience directly in reality. These technologies make the virtual museum possible. It will not only bring better experience and convenience to the public, but also help improve the influence and cultural functions of the real museum.
Systems modeling and simulation applications for critical care medicine
2012-01-01
Critical care delivery is a complex, expensive, error-prone medical specialty and remains the focal point of major improvement efforts in healthcare delivery. Various modeling and simulation techniques offer unique opportunities to better understand the interactions between clinical physiology and care delivery. The novel insights gained from the systems perspective can then be used to develop and test new treatment strategies and make critical care delivery more efficient and effective. However, modeling and simulation applications in critical care remain underutilized. This article provides an overview of major computer-based simulation techniques as applied to critical care medicine. We provide three application examples of different simulation techniques: (a) a pathophysiological model of acute lung injury, (b) process modeling of critical care delivery, and (c) an agent-based model to study the interaction between pathophysiology and healthcare delivery. Finally, we identify certain challenges to, and opportunities for, future research in the area. PMID:22703718
ERIC Educational Resources Information Center
Dieckmann, Peter; Friis, Susanne Molin; Lippert, Anne; Ostergaard, Doris
2012-01-01
Introduction: This study describes (a) process goals, (b) success factors, and (c) barriers for optimizing simulation-based learning environments within the simulation setting model developed by Dieckmann. Methods: Seven simulation educators of different experience levels were interviewed using the Critical Incident Technique. Results: (a) The…
User modeling techniques for enhanced usability of OPSMODEL operations simulation software
NASA Technical Reports Server (NTRS)
Davis, William T.
1991-01-01
The PC based OPSMODEL operations software for modeling and simulation of space station crew activities supports engineering and cost analyses and operations planning. Using top-down modeling, the level of detail required in the data base can be limited to being commensurate with the results required of any particular analysis. To perform a simulation, a resource environment consisting of locations, crew definition, equipment, and consumables is first defined. Activities to be simulated are then defined as operations and scheduled as desired. These operations are defined within a 1000 level priority structure. The simulation on OPSMODEL, then, consists of the following: user defined, user scheduled operations executing within an environment of user defined resource and priority constraints. Techniques for prioritizing operations to realistically model a representative daily scenario of on-orbit space station crew activities are discussed. The large number of priority levels allows priorities to be assigned commensurate with the detail necessary for a given simulation. Several techniques for realistic modeling of day-to-day work carryover are also addressed.
NASA Astrophysics Data System (ADS)
Wang, P.; Becker, A. A.; Jones, I. A.; Glover, A. T.; Benford, S. D.; Vloeberghs, M.
2009-08-01
A virtual-reality real-time simulation of surgical operations that incorporates the inclusion of a hard tumour is presented. The software is based on the Boundary Element (BE) technique. A review of the BE formulation for real-time analysis of two-domain deformable objects, using the pre-solution technique, is presented. The two-domain BE software is incorporated into a surgical simulation system called VIRS to simulate the initiation of a cut on the surface of the soft tissue and the extension of the cut deeper until the tumour is reached.
Acceleration techniques for dependability simulation. M.S. Thesis
NASA Technical Reports Server (NTRS)
Barnette, James David
1995-01-01
As computer systems increase in complexity, the need to project system performance from the earliest design and development stages increases. We have to employ simulation for detailed dependability studies of large systems. However, as the complexity of the simulation model increases, the time required to obtain statistically significant results also increases. This paper discusses an approach that is application independent and can be readily applied to any process-based simulation model. Topics include background on classical discrete event simulation and techniques for random variate generation and statistics gathering to support simulation.
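Two of the classical ingredients named above, the discrete event list and random variate generation, can be sketched together in a minimal, hypothetical example (not from the thesis): a heap-ordered future-event list driving Poisson arrivals whose interarrival times come from inverse-transform sampling.

```python
import heapq
import math
import random

def exp_variate(rate, rng):
    """Inverse-transform sampling: if U ~ Uniform(0,1), then -ln(U)/rate
    is exponentially distributed with the given rate."""
    return -math.log(rng.random()) / rate

def simulate_arrivals(rate, horizon, rng):
    """Classical event-list simulation: pop the earliest event from a heap,
    process it, and schedule its successor. Here the only event type is a
    Poisson arrival, counted up to the time horizon."""
    events = [(exp_variate(rate, rng), "arrival")]
    count = 0
    while events:
        t, _kind = heapq.heappop(events)
        if t > horizon:
            break
        count += 1
        heapq.heappush(events, (t + exp_variate(rate, rng), "arrival"))
    return count
```

Acceleration techniques of the kind studied in the thesis attack exactly this loop, since for rare-event dependability studies most of its iterations contribute nothing to the statistics being gathered.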
Estimation variance bounds of importance sampling simulations in digital communication systems
NASA Technical Reports Server (NTRS)
Lu, D.; Yao, K.
1991-01-01
In practical applications of importance sampling (IS) simulation, two basic problems are encountered, that of determining the estimation variance and that of evaluating the proper IS parameters needed in the simulations. The authors derive new upper and lower bounds on the estimation variance which are applicable to IS techniques. The upper bound is simple to evaluate and may be minimized by the proper selection of the IS parameter. Thus, lower and upper bounds on the improvement ratio of various IS techniques relative to the direct Monte Carlo simulation are also available. These bounds are shown to be useful and computationally simple to obtain. Based on the proposed technique, one can readily find practical suboptimum IS parameters. Numerical results indicate that these bounding techniques are useful for IS simulations of linear and nonlinear communication systems with intersymbol interference in which bit error rate and IS estimation variances cannot be obtained readily using prior techniques.
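The flavor of IS estimation discussed here can be illustrated with a Gaussian tail probability (a hypothetical example, not the bounds derived in the paper): sampling from a mean-shifted density concentrates samples in the rare-event region, and reweighting by the likelihood ratio keeps the estimator unbiased.

```python
import math
import random

def is_tail_prob(threshold, shift, n, rng):
    """Importance-sampling estimate of P(X > threshold) for X ~ N(0,1),
    drawing from the biased density N(shift, 1) and reweighting each hit
    by the likelihood ratio w(y) = phi(y) / phi(y - shift)."""
    total = 0.0
    for _ in range(n):
        y = rng.gauss(shift, 1.0)
        if y > threshold:
            # For unit-variance Gaussians the ratio simplifies to
            # exp(-shift*y + shift^2/2).
            total += math.exp(-shift * y + 0.5 * shift * shift)
    return total / n
```

Choosing the IS parameter (here the mean shift) is exactly the problem the paper's variance bounds address: a shift near the threshold makes almost every sample a "hit" while the weights stay moderate, whereas direct Monte Carlo would need on the order of 1/P samples to see a single event.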
NASA Technical Reports Server (NTRS)
Bedewi, Nabih E.; Yang, Jackson C. S.
1987-01-01
Identification of the system parameters of a randomly excited structure may be treated using a variety of statistical techniques. Of all these techniques, the Random Decrement is unique in that it provides the homogeneous component of the system response. Using this quality, a system identification technique was developed based on a least-squares fit of the signatures to estimate the mass, damping, and stiffness matrices of a linear randomly excited system. The mathematics of the technique is presented in addition to the results of computer simulations conducted to demonstrate the prediction of the response of the system and the random forcing function initially introduced to excite the system.
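The core of the Random Decrement signature extraction can be sketched as follows (an illustrative level-crossing variant with hypothetical names, not the authors' implementation): segments of the response that start wherever the signal up-crosses a trigger level are ensemble-averaged, so zero-mean random forcing contributions cancel and an estimate of the homogeneous (free-decay) response remains.

```python
def random_decrement(signal, trigger, length):
    """Average fixed-length segments of `signal` that begin at up-crossings
    of `trigger`. The average estimates the free-decay response from the
    common initial condition imposed by the trigger."""
    segments = []
    for i in range(1, len(signal) - length):
        if signal[i - 1] < trigger <= signal[i]:
            segments.append(signal[i:i + length])
    n = len(segments)
    return [sum(seg[k] for seg in segments) / n for k in range(length)]
```

A least-squares fit of such signatures to the assumed linear model is then what yields the mass, damping, and stiffness estimates described in the abstract.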
NASA Astrophysics Data System (ADS)
Gao, Xiatian; Wang, Xiaogang; Jiang, Binhao
2017-10-01
UPSF (Universal Plasma Simulation Framework) is a new plasma simulation code designed for maximum flexibility by using cutting-edge techniques supported by the C++17 standard. Through the use of metaprogramming techniques, UPSF provides arbitrary-dimensional data structures and methods to support various kinds of plasma simulation models, such as Vlasov, particle-in-cell (PIC), fluid, and Fokker-Planck models, together with their variants and hybrid methods. Through C++ metaprogramming, a single code can be applied to systems of arbitrary dimension with no loss of performance. UPSF can also automatically parallelize the distributed data structure and accelerate matrix and tensor operations with BLAS. A three-dimensional particle-in-cell code has been developed based on UPSF. Two test cases, Landau damping and the Weibel instability for the electrostatic and electromagnetic cases respectively, are presented to show the validity and performance of the UPSF code.
Liu, Xin
2014-01-01
This study describes a deterministic method for simulating the first-order scattering in a medical computed tomography scanner. The method was developed based on a physics model of x-ray photon interactions with matter and a ray tracing technique. The results from simulated scattering were compared to the ones from an actual scattering measurement. Two phantoms with homogeneous and heterogeneous material distributions were used in the scattering simulation and measurement. It was found that the simulated scatter profile was in agreement with the measurement result, with an average difference of 25% or less. Finally, tomographic images with artifacts caused by scatter were corrected based on the simulated scatter profiles. The image quality improved significantly.
Quality assurance paradigms for artificial intelligence in modelling and simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oren, T.I.
1987-04-01
New classes of quality assurance concepts and techniques are required for the advanced knowledge-processing paradigms (such as artificial intelligence, expert systems, or knowledge-based systems) and the complex problems that only simulative systems can cope with. A systematization of quality assurance problems is given, along with examples of traditional and cognizant quality assurance techniques in traditional and cognizant modelling and simulation.
NASA Technical Reports Server (NTRS)
Moin, Parviz; Spalart, Philippe R.
1987-01-01
The use of simulation data bases for the examination of turbulent flows is an effective research tool. Studies of the structure of turbulence have been hampered by the limited number of probes and the impossibility of measuring all desired quantities. Also, flow visualization is confined to the observation of passive markers with limited field of view and contamination caused by time-history effects. Computer flow fields are a new resource for turbulence research, providing all the instantaneous flow variables in three-dimensional space. Simulation data bases also provide much-needed information for phenomenological turbulence modeling. Three dimensional velocity and pressure fields from direct simulations can be used to compute all the terms in the transport equations for the Reynolds stresses and the dissipation rate. However, only a few, geometrically simple flows have been computed by direct numerical simulation, and the inventory of simulation does not fully address the current modeling needs in complex turbulent flows. The availability of three-dimensional flow fields also poses challenges in developing new techniques for their analysis, techniques based on experimental methods, some of which are used here for the analysis of direct-simulation data bases in studies of the mechanics of turbulent flows.
NASA Astrophysics Data System (ADS)
Miller, D. J.; Zhang, Z.; Ackerman, A. S.; Platnick, S. E.; Cornet, C.
2016-12-01
A remote sensing cloud retrieval simulator, created by coupling an LES cloud model with vector radiative transfer (RT) models, is an ideal framework for assessing cloud remote sensing techniques. This simulator serves as a tool for understanding bi-spectral and polarimetric retrievals by comparing them directly to LES cloud properties (retrieval closure comparison) and for comparing the retrieval techniques to one another. Our simulator utilizes the DHARMA LES [Ackerman et al., 2004] with cloud properties based on marine boundary layer (MBL) clouds observed during the DYCOMS-II and ATEX field campaigns. The cloud reflectances are produced by vectorized RT models based on polarized doubling-adding and Monte Carlo techniques (PDA, MCPOL). Retrievals are performed using techniques as similar as possible to those implemented on the corresponding well-known instruments: polarimetric retrievals are based on techniques implemented for polarimeters (POLDER, AirMSPI, and RSP), and bi-spectral retrievals are performed using the Nakajima-King LUT method utilized on a number of spectral instruments (MODIS and VIIRS). Retrieval comparisons focus on cloud droplet effective radius (re), effective variance (ve), and cloud optical thickness (τ). This work explores the sensitivities of these two retrieval techniques to various observational limitations, such as spatial resolution/cloud inhomogeneity, the impact of 3D radiative effects, and angular resolution requirements. With future remote sensing missions like NASA's Aerosols/Clouds/Ecosystems (ACE) planning to feature advanced polarimetric instruments, it is important to understand how these retrieval techniques compare to one another. The cloud retrieval simulator we have developed allows us to probe these important questions in a realistic test bed.
Advanced computer graphic techniques for laser range finder (LRF) simulation
NASA Astrophysics Data System (ADS)
Bedkowski, Janusz; Jankowski, Stanislaw
2008-11-01
This paper shows advanced computer graphics techniques for laser range finder (LRF) simulation. The LRF is a common sensor for unmanned ground vehicles, autonomous mobile robots and security applications. The cost of the measurement system is extremely high; therefore a simulation tool was designed. The simulation gives an opportunity to execute algorithms such as obstacle avoidance [1], SLAM for robot localization [2], detection of vegetation and water obstacles in the surroundings of the robot chassis [3], and LRF measurement in a crowd of people [1]. The Axis-Aligned Bounding Box (AABB) technique and an alternative technique based on CUDA (NVIDIA Compute Unified Device Architecture) are presented.
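The AABB test at the heart of such an LRF simulator is usually the classic "slab" method: intersect the ray's parameter interval with each pair of axis-aligned slabs and check that the intervals overlap. A minimal sketch (hypothetical names; nonzero direction components assumed, and no CUDA acceleration shown):

```python
def ray_aabb(origin, direction, box_min, box_max):
    """Slab test: a ray hits an axis-aligned bounding box iff the parameter
    intervals in which it lies inside each slab pair overlap. Returns the
    entry distance along the ray, or None on a miss."""
    t_near, t_far = float("-inf"), float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        t1, t2 = (lo - o) / d, (hi - o) / d
        if t1 > t2:
            t1, t2 = t2, t1                 # order the slab intersections
        t_near, t_far = max(t_near, t1), min(t_far, t2)
    if t_near <= t_far and t_far >= 0.0:
        return max(t_near, 0.0)             # clamp hits behind the origin
    return None
```

Each simulated laser beam is one such ray; sweeping the beam over the scene's bounding boxes yields the range image the real sensor would produce.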
Physics-based approach to haptic display
NASA Technical Reports Server (NTRS)
Brown, J. Michael; Colgate, J. Edward
1994-01-01
This paper addresses the implementation of complex multiple-degree-of-freedom virtual environments for haptic display. We suggest that a physics-based approach to rigid body simulation is appropriate for hand tool simulation, but that currently available simulation techniques are not sufficient to guarantee successful implementation. We discuss the desirable features of a virtual environment simulation, specifically highlighting the importance of stability guarantees.
Modelling and Simulation for Requirements Engineering and Options Analysis
2010-05-01
should be performed to work successfully in the domain; and process-based techniques model the processes that occur in the work domain (DRDC Toronto CR 2010-049). Can the current technique for developing simulation models for assessments…
"Intelligent" Computer Assisted Instruction (CAI) Applications. Interim Report.
ERIC Educational Resources Information Center
Brown, John Seely; And Others
Interim work is documented describing efforts to modify computer techniques used to recognize and process English language requests to an instructional simulator. The conversion from a hand-coded to a table-driven technique is described in detail. Other modifications to a simulation-based computer-assisted instruction program to allow a gaming…
Physics-based interactive volume manipulation for sharing surgical process.
Nakao, Megumi; Minato, Kotaro
2010-05-01
This paper presents a new set of techniques by which surgeons can interactively manipulate patient-specific volumetric models for sharing surgical process. To handle physical interaction between the surgical tools and organs, we propose a simple surface-constraint-based manipulation algorithm to consistently simulate common surgical manipulations such as grasping, holding and retraction. Our computation model is capable of simulating soft-tissue deformation and incision in real time. We also present visualization techniques in order to rapidly visualize time-varying, volumetric information on the deformed image. This paper demonstrates the success of the proposed methods in enabling the simulation of surgical processes, and the ways in which this simulation facilitates preoperative planning and rehearsal.
2014-03-01
The purpose of the study was to determine if the use of a simulator is at least as effective in marksmanship training as traditional dry fire techniques. A between-groups study with a… Naval commands could use the information to effectively maintain gun qualifications for inport duty section watch bills and constant anti…
Translating the simulation of procedural drilling techniques for interactive neurosurgical training.
Stredney, Don; Rezai, Ali R; Prevedello, Daniel M; Elder, J Bradley; Kerwin, Thomas; Hittle, Bradley; Wiet, Gregory J
2013-10-01
Through previous efforts we have developed a fully virtual environment to provide procedural training of otologic surgical technique. The virtual environment is based on high-resolution volumetric data of the regional anatomy. These volumetric data help drive an interactive multisensory, i.e., visual (stereo), aural (stereo), and tactile, simulation environment. Subsequently, we have extended our efforts to support the training of neurosurgical procedural technique as part of the Congress of Neurological Surgeons simulation initiative. Our objective is to deliberately study the integration of simulation technologies into the neurosurgical curriculum and to determine their efficacy in teaching minimally invasive cranial and skull base approaches. We discuss issues of biofidelity and our methods of providing objective, quantitative and automated assessment for the residents. We conclude with a discussion of our experiences, reporting preliminary formative pilot studies and proposed approaches to take the simulation to the next level through additional validation studies. We have presented our efforts to translate an otologic simulation environment for use in the neurosurgical curriculum. We have demonstrated the initial proof of principle and defined the steps to integrate and validate the system as an adjuvant to the neurosurgical curriculum.
An earth imaging camera simulation using wide-scale construction of reflectance surfaces
NASA Astrophysics Data System (ADS)
Murthy, Kiran; Chau, Alexandra H.; Amin, Minesh B.; Robinson, M. Dirk
2013-10-01
Developing and testing advanced ground-based image processing systems for earth-observing remote sensing applications presents a unique challenge that requires advanced imagery simulation capabilities. This paper presents an earth-imaging multispectral framing camera simulation system called PayloadSim (PaySim) capable of generating terabytes of photorealistic simulated imagery. PaySim leverages previous work in 3-D scene-based image simulation, adding a novel method for automatically and efficiently constructing 3-D reflectance scenes by draping tiled orthorectified imagery over a geo-registered Digital Elevation Map (DEM). PaySim's modeling chain is presented in detail, with emphasis given to the techniques used to achieve computational efficiency. These techniques as well as cluster deployment of the simulator have enabled tuning and robust testing of image processing algorithms, and production of realistic sample data for customer-driven image product development. Examples of simulated imagery of Skybox's first imaging satellite are shown.
eLearning techniques supporting problem based learning in clinical simulation.
Docherty, Charles; Hoy, Derek; Topp, Helena; Trinder, Kathryn
2005-08-01
This paper details the results of the first phase of a project using eLearning to support students' learning within a simulated environment. The locus was a purpose-built clinical simulation laboratory (CSL) where the School's philosophy of problem based learning (PBL) was being challenged by lecturers using traditional teaching methods. The solution adopted was a student-centred, problem based approach to the acquisition of clinical skills, using high-quality learning objects embedded within web pages to substitute for lecturers providing instruction and demonstration. This encouraged student nurses to explore, analyse and make decisions within the safety of a clinical simulation. Learning was facilitated through network communications and reflection on video performances of self and others. Evaluations were positive, with students demonstrating increased satisfaction with PBL, improved performance in exams, and increased self-efficacy in the performance of nursing activities. These results indicate that eLearning techniques can help students acquire clinical skills in the safety of a simulated environment within the context of a problem based learning curriculum.
The Business Flight Simulator.
ERIC Educational Resources Information Center
Dwyer, P.; Simpson, D.
1989-01-01
The authors describe a simulation program based on a workshop approach designed for postsecondary business students. Features and benefits of the workshop technique are discussed. The authors cover practical aspects of designing and implementing simulation workshops. (CH)
Liu, Heng-Liang; Lin, Chun-Li; Sun, Ming-Tsung; Chang, Yen-Hsiang
2010-06-01
This study investigates micro-crack propagation at the enamel/adhesive interface using finite element (FE) submodeling and element death techniques. A three-dimensional (3D) FE macro-model of the enamel/adhesive/ceramic subjected to shear bond testing was generated and analyzed. A 3D micro-model with interfacial bonding structure was constructed at the upper enamel/adhesive interface where the stress concentration was found from the macro-model results. The morphology of this interfacial bonding structure (i.e., resin tag) was assigned based on resin tag geometry and enamel rod arrangement from a scanning electron microscopy micrograph. The boundary conditions for the micro-model were determined from the macro-model results. A custom iterative code combined with the element death technique was used to calculate the micro-crack propagation. Parallel experiments were performed to validate this FE simulation. The stress concentration within the adhesive occurred mainly at the upper corner near the enamel/adhesive interface and the resin tag base. A simulated fracture path was found at the resin tag base along the enamel/adhesive interface. A morphological observation of the fracture patterns obtained from in vitro testing corresponded with the simulation results. This study shows that the FE submodeling and element death techniques could be used to simulate the 3D micro-stress pattern and the crack propagation noted at the enamel/adhesive interface.
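The element death iteration can be illustrated with a deliberately simplified load-sharing model. This is not the paper's FE formulation: equal load sharing stands in for the stress redistribution a solver would compute. Elements whose stress exceeds their strength are deactivated, and the loop repeats until no further failures occur, mimicking crack propagation:

```python
def element_death(strengths, total_load):
    """Iterative 'element death' sketch: live elements share the load
    equally; any element whose stress exceeds its strength is removed
    (its stiffness zeroed) and the load redistributes over the rest.
    Returns the indices of surviving elements."""
    alive = set(range(len(strengths)))
    while alive:
        stress = total_load / len(alive)           # equal load sharing
        failed = {i for i in alive if stress > strengths[i]}
        if not failed:
            break                                  # converged: no new failures
        alive -= failed                            # "kill" failed elements
    return sorted(alive)

# Five elements with strengths 1..5 under a total load of 6:
survivors = element_death([1.0, 2.0, 3.0, 4.0, 5.0], 6.0)
```

In a real FE setting the "stress" step is a full solve and the killed elements trace out the simulated fracture path.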
Large Terrain Modeling and Visualization for Planets
NASA Technical Reports Server (NTRS)
Myint, Steven; Jain, Abhinandan; Cameron, Jonathan; Lim, Christopher
2011-01-01
Physics-based simulations are actively used in the design, testing, and operations phases of surface and near-surface planetary space missions. One of the challenges in real-time simulation is handling large multi-resolution terrain data sets within models as well as for visualization. In this paper, we describe special techniques that we have developed for visualization, paging, and data storage for dealing with these large data sets. The visualization technique uses a real-time GPU-based continuous level-of-detail method that delivers multiple frames per second even for planetary-scale terrain models.
Visualizing Time-Varying Phenomena In Numerical Simulations Of Unsteady Flows
NASA Technical Reports Server (NTRS)
Lane, David A.
1996-01-01
Streamlines, contour lines, vector plots, and volume slices (cutting planes) are commonly used for flow visualization. These techniques are sometimes referred to as instantaneous flow visualization techniques because calculations are based on an instant of the flowfield in time. Although instantaneous flow visualization techniques are effective for depicting phenomena in steady flows, they sometimes do not adequately depict time-varying phenomena in unsteady flows. Streaklines and timelines are effective visualization techniques for depicting vortex shedding, vortex breakdown, and shock waves in unsteady flows. These techniques are examples of time-dependent flow visualization techniques, which are based on many instants of the flowfield in time. This paper describes the algorithms for computing streaklines and timelines. Using numerically simulated unsteady flows, streaklines and timelines are compared with streamlines, contour lines, and vector plots. It is shown that streaklines and timelines reveal vortex shedding and vortex breakdown more clearly than instantaneous flow visualization techniques.
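A minimal streakline computation, following the description above (the toy velocity field is illustrative): particles are injected continuously at a fixed seed point and each is advected through the time-varying field, so the resulting point set depends on many instants of the flow rather than one:

```python
import math

def velocity(x, y, t):
    """Toy unsteady 2-D field: uniform flow plus a time-varying crossflow."""
    return 1.0, 0.5 * math.sin(t)

def streakline(seed, t_end, dt=0.01, inject_every=10):
    """Inject a particle at `seed` every `inject_every` steps and advect
    all particles through the time-varying field (forward Euler).
    The particle positions at t_end trace the streakline."""
    particles = []
    steps = int(round(t_end / dt))
    for n in range(steps):
        t = n * dt
        if n % inject_every == 0:
            particles.append(list(seed))      # continuous dye release
        for p in particles:                   # advect every particle
            u, v = velocity(p[0], p[1], t)
            p[0] += u * dt
            p[1] += v * dt
    return particles

line = streakline(seed=(0.0, 0.0), t_end=2.0)
```

A streamline, by contrast, would integrate through the field frozen at a single time, which is exactly why it can miss vortex shedding.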
Knowledge-based simulation for aerospace systems
NASA Technical Reports Server (NTRS)
Will, Ralph W.; Sliwa, Nancy E.; Harrison, F. Wallace, Jr.
1988-01-01
Knowledge-based techniques, which offer many features that are desirable in the simulation and development of aerospace vehicle operations, exhibit many similarities to traditional simulation packages. The eventual solution of these systems' current symbolic-processing/numeric-processing interface problem will lead to continuous and discrete-event simulation capabilities in a single language, such as TS-PROLOG. Qualitative, totally symbolic simulation methods are noted to possess several intrinsic characteristics that are especially revelatory of the system being simulated, and to be capable of ensuring that all possible behaviors are considered.
Petsev, Nikolai Dimitrov; Leal, L. Gary; Shell, M. Scott
2017-12-21
Hybrid molecular-continuum simulation techniques afford a number of advantages for problems in the rapidly burgeoning area of nanoscale engineering and technology, though they are typically quite complex to implement and limited to single-component fluid systems. We describe an approach for modeling multicomponent hydrodynamic problems spanning multiple length scales when using particle-based descriptions for both the finely-resolved (e.g. molecular dynamics) and coarse-grained (e.g. continuum) subregions within an overall simulation domain. This technique is based on the multiscale methodology previously developed for mesoscale binary fluids [N. D. Petsev, L. G. Leal, and M. S. Shell, J. Chem. Phys. 144, 84115 (2016)], simulated using a particle-based continuum method known as smoothed dissipative particle dynamics (SDPD). An important application of this approach is the ability to perform coupled molecular dynamics (MD) and continuum modeling of molecularly miscible binary mixtures. In order to validate this technique, we investigate multicomponent hybrid MD-continuum simulations at equilibrium, as well as non-equilibrium cases featuring concentration gradients.
Application of Discrete Fracture Modeling and Upscaling Techniques to Complex Fractured Reservoirs
NASA Astrophysics Data System (ADS)
Karimi-Fard, M.; Lapene, A.; Pauget, L.
2012-12-01
During the last decade, an important effort has been made to improve data acquisition (seismic and borehole imaging) and workflows for reservoir characterization, which has greatly benefited the description of fractured reservoirs. However, the geological models resulting from the interpretations need to be validated or calibrated against dynamic data. Flow modeling in fractured reservoirs remains a challenge due to the difficulty of representing mass transfers at different heterogeneity scales. The majority of existing approaches are based on a dual-continuum representation, where the fracture network and the matrix are represented separately and their interactions are modeled using transfer functions. These models are usually based on an idealized representation of the fracture distribution, which makes the integration of real data difficult. In recent years, owing to increases in computer power, discrete fracture modeling (DFM) techniques have become popular. In these techniques the fractures are represented explicitly, allowing the direct use of data. In this work we consider the DFM technique developed by Karimi-Fard et al. [1], which is based on an unstructured finite-volume discretization. The mass flux between two adjacent control-volumes is evaluated using an optimized two-point flux approximation. The result of the discretization is a list of control-volumes with the associated pore-volumes and positions, and a list of connections with the associated transmissibilities. Fracture intersections are simplified using a connectivity transformation which contributes considerably to the efficiency of the methodology. In addition, the method is designed for general-purpose simulators, and any connectivity-based simulator can be used for flow simulations. The DFM technique is either used standalone or as part of an upscaling technique. Upscaling techniques are required for large reservoirs where the explicit representation of all fractures and faults is not possible.
Karimi-Fard et al. [2] have developed an upscaling technique based on the DFM representation. The original version of this technique was developed to construct a dual-porosity model from a discrete fracture description. This technique has been extended and generalized so it can be applied to a wide range of problems, from reservoirs with few or no fractures to highly fractured reservoirs. In this work, we present the application of these techniques to two three-dimensional fractured reservoirs constructed using real data. The first model contains more than 600 medium- and large-scale fractures. The fractures are not always connected, which requires a general modeling technique. The reservoir has 50 wells (injectors and producers), and water flooding simulations are performed. The second test case is a larger reservoir with sparsely distributed faults. Single-phase simulations are performed with 5 producing wells. [1] Karimi-Fard M., Durlofsky L.J., and Aziz K. 2004. An efficient discrete-fracture model applicable for general-purpose reservoir simulators. SPE Journal, 9(2): 227-236. [2] Karimi-Fard M., Gong B., and Durlofsky L.J. 2006. Generation of coarse-scale continuum flow models from detailed fracture characterizations. Water Resources Research, 42(10): W10423.
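The connection-list output of such a discretization maps directly onto a linear system. The sketch below assembles a two-point flux matrix from a list of connections and transmissibilities; it is a minimal single-phase, incompressible example, not the cited implementation:

```python
import numpy as np

def assemble_tpfa(n_cells, connections, trans):
    """Assemble the flow matrix from a connection list using the
    two-point flux approximation: flux_ij = T_ij * (p_i - p_j)."""
    A = np.zeros((n_cells, n_cells))
    for (i, j), T in zip(connections, trans):
        A[i, i] += T
        A[j, j] += T
        A[i, j] -= T
        A[j, i] -= T
    return A

# Three control-volumes in a line, 0 -- 1 -- 2, unit transmissibilities
conns = [(0, 1), (1, 2)]
A = assemble_tpfa(3, conns, [1.0, 1.0])
# Fix p = 1 in cell 0 and p = 0 in cell 2 (Dirichlet by row replacement)
b = np.zeros(3)
A[0] = [1.0, 0.0, 0.0]; b[0] = 1.0
A[2] = [0.0, 0.0, 1.0]; b[2] = 0.0
p = np.linalg.solve(A, b)
```

Because only the connection list is needed, the same assembly works whether a connection joins two matrix cells, two fracture cells, or a matrix cell and a fracture cell, which is what makes the representation simulator-agnostic.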
Sachetto Oliveira, Rafael; Martins Rocha, Bernardo; Burgarelli, Denise; Meira, Wagner; Constantinides, Christakis; Weber Dos Santos, Rodrigo
2018-02-01
The use of computer models as a tool for the study and understanding of the complex phenomena of cardiac electrophysiology has gained increasing importance. At the same time, the increased complexity of the biophysical processes translates into complex computational and mathematical models. To speed up cardiac simulations and to allow more precise and realistic uses, 2 different techniques have been traditionally exploited: parallel computing and sophisticated numerical methods. In this work, we combine a modern parallel computing technique based on multicore and graphics processing units (GPUs) and a sophisticated numerical method based on a new space-time adaptive algorithm. We evaluate each technique alone and in different combinations: multicore and GPU; multicore, GPU, and space adaptivity; multicore, GPU, space adaptivity, and time adaptivity. All the techniques and combinations were evaluated under different scenarios: 3D simulations on slabs, 3D simulations on a ventricular mouse mesh, i.e., complex geometry, sinus-rhythm, and arrhythmic conditions. Our results suggest that multicore and GPU accelerate the simulations by an approximate factor of 33×, whereas the speedups attained by the space-time adaptive algorithms were approximately 48×. Nevertheless, by combining all the techniques, we obtained speedups that ranged between 165× and 498×. The tested methods were able to reduce the execution time of a simulation by more than 498× for a complex cellular model in a slab geometry and by 165× in a realistic heart geometry simulating spiral waves. The proposed methods will allow faster and more realistic simulations in a feasible time with no significant loss of accuracy. Copyright © 2017 John Wiley & Sons, Ltd.
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.; Cunningham, Kevin; Hill, Melissa A.
2013-01-01
Flight test and modeling techniques were developed for efficiently identifying global aerodynamic models that can be used to accurately simulate stall, upset, and recovery on large transport airplanes. The techniques were developed and validated in a high-fidelity fixed-base flight simulator using a wind-tunnel aerodynamic database, realistic sensor characteristics, and a realistic flight deck representative of a large transport aircraft. Results demonstrated that aerodynamic models for stall, upset, and recovery can be identified rapidly and accurately using relatively simple piloted flight test maneuvers. Stall maneuver predictions and comparisons of identified aerodynamic models with data from the underlying simulation aerodynamic database were used to validate the techniques.
Computer Simulation as an Aid for Management of an Information System.
ERIC Educational Resources Information Center
Simmonds, W. H.; And Others
The aim of this study was to develop methods, based upon computer simulation, of designing information systems and illustrate the use of these methods by application to an information service. The method developed is based upon Monte Carlo and discrete event simulation techniques and is described in an earlier report - Sira report R412 Organizing…
Allen, Edwin B; Walls, Richard T; Reilly, Frank D
2008-02-01
This study investigated the effects of interactive instructional techniques in a web-based peripheral nervous system (PNS) component of a first year medical school human anatomy course. Existing data from 9 years of instruction involving 856 students were used to determine (1) the effect of web-based interactive instructional techniques on written exam item performance and (2) differences between student opinions of the benefit level of five different types of interactive learning objects used. The interactive learning objects included Patient Case studies, review Games, Simulated Interactive Patients (SIP), Flashcards, and unit Quizzes. Exam item analysis scores were found to be significantly higher (p < 0.05) for students receiving the instructional treatment incorporating the web-based interactive learning objects than for students not receiving this treatment. Questionnaires using a five-point Likert scale were analysed to determine student opinion ratings of the interactive learning objects. Students reported favorably on the benefit level of all learning objects. Students rated the benefit level of the Simulated Interactive Patients (SIP) highest, and this rating was significantly higher (p < 0.05) than all other learning objects. This study suggests that web-based interactive instructional techniques improve student exam performance. Students indicated a strong acceptance of Simulated Interactive Patient learning objects.
Peter, Emanuel K; Shea, Joan-Emma; Pivkin, Igor V
2016-05-14
In this paper, we present a coarse replica exchange molecular dynamics (REMD) approach based on kinetic Monte Carlo (kMC). The new development can significantly reduce the number of replicas and the computational cost needed to enhance sampling in protein simulations. We introduce 2 different methods which differ primarily in the exchange scheme between the parallel ensembles. We apply this approach to the folding of 2 different β-stranded peptides: the C-terminal β-hairpin fragment of GB1 and TrpZip4. Additionally, we use the new simulation technique to study the folding of TrpCage, a small fast-folding α-helical peptide. Subsequently, we apply the new methodology to conformational changes in signaling of the light-oxygen-voltage (LOV) sensitive domain from Avena sativa (AsLOV2). Our results agree well with data reported in the literature. In simulations of dialanine, we compare the statistical sampling of the 2 techniques with conventional REMD and analyze their performance. The new techniques can reduce the computational cost of REMD significantly and can be used in enhanced sampling simulations of biomolecules.
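The exchange step shared by REMD variants can be sketched as follows. The kMC-based scheme in the paper changes when and between which ensembles exchanges are attempted, not the acceptance rule itself, which is shown here in its standard temperature-REMD (Metropolis) form:

```python
import math
import random

def try_exchange(E_i, E_j, T_i, T_j, kB=1.0, rng=random.random):
    """Metropolis criterion for swapping the configurations of two
    replicas with potential energies E_i, E_j at temperatures T_i, T_j:
    accept with probability min(1, exp[(beta_i - beta_j)(E_i - E_j)])."""
    beta_i = 1.0 / (kB * T_i)
    beta_j = 1.0 / (kB * T_j)
    delta = (beta_i - beta_j) * (E_i - E_j)
    return delta >= 0.0 or rng() < math.exp(delta)
```

For example, moving a high-energy configuration from a cold replica to a hotter one (and vice versa) is always accepted, which is how barrier crossings sampled at high temperature feed back into the low-temperature ensemble.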
ERIC Educational Resources Information Center
Ohio State Univ., Columbus. National Center for Research in Vocational Education.
One of a series of performance-based teacher education learning packages focusing upon specific professional competencies of vocational teachers, this learning module deals with employing simulation techniques. It consists of an introduction and four learning experiences. Covered in the first learning experience are various types of simulation…
Iterative repair for scheduling and rescheduling
NASA Technical Reports Server (NTRS)
Zweben, Monte; Davis, Eugene; Deale, Michael
1991-01-01
An iterative repair search method called constraint-based simulated annealing is described. Simulated annealing is a hill-climbing search technique capable of escaping local minima. The utility of the constraint-based framework is shown by comparing search performance with and without the constraint framework on a suite of randomly generated problems. Results of applying the technique to the NASA Space Shuttle ground processing problem are also shown. These experiments show that the search method scales to complex, real-world problems and exhibits interesting anytime behavior.
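Constraint-based simulated annealing can be illustrated on a toy scheduling problem; this sketch is ours, not the paper's system. The cost is the number of constraint violations, each repair moves one task, and cost-increasing repairs are accepted with a temperature-dependent probability so the search can escape local minima:

```python
import math
import random

def conflicts(schedule):
    """Cost = number of violated constraints (pairs of tasks sharing a slot)."""
    slots = {}
    for task, slot in enumerate(schedule):
        slots.setdefault(slot, []).append(task)
    return sum(len(v) * (len(v) - 1) // 2 for v in slots.values())

def repair(schedule, n_slots, T=2.0, cooling=0.995, seed=0):
    """Iterative repair with a simulated-annealing acceptance rule."""
    rng = random.Random(seed)
    sched = list(schedule)
    cost = conflicts(sched)
    while cost > 0 and T > 1e-3:
        task = rng.randrange(len(sched))       # pick a task to move
        old_slot = sched[task]
        sched[task] = rng.randrange(n_slots)   # candidate repair
        d_cost = conflicts(sched) - cost
        if d_cost <= 0 or rng.random() < math.exp(-d_cost / T):
            cost += d_cost                     # accept (possibly uphill)
        else:
            sched[task] = old_slot             # reject and undo
        T *= cooling
    return sched, cost

sched, cost = repair([0, 0, 0, 0], n_slots=4)
```

Because the loop can be stopped at any temperature and still return the best schedule found so far, the method naturally exhibits the anytime behavior noted in the abstract.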
NASA Astrophysics Data System (ADS)
Lee, Seungjoon; Kevrekidis, Ioannis G.; Karniadakis, George Em
2017-09-01
Exascale-level simulations require fault-resilient algorithms that are robust against repeated and expected software and/or hardware failures during computations, which may render the simulation results unsatisfactory. If each processor can share some global information about the simulation from a coarse, limited-accuracy but relatively inexpensive auxiliary simulator, we can effectively fill in the missing spatial data at the required times on the fly using a statistical learning technique, multi-level Gaussian process regression; this was demonstrated in previous work [1]. Building on that work, we also employ another (nonlinear) statistical learning technique, Diffusion Maps, which detects computational redundancy in time and hence accelerates the simulation by projective time integration, giving the overall computation a "patch dynamics" flavor. Furthermore, we are now able to perform information fusion with multi-fidelity and heterogeneous data (including stochastic data). Finally, we set the foundations of a new framework in CFD, called patch simulation, that combines information fusion techniques from, in principle, multiple fidelity and resolution simulations (and even experiments) with a new adaptive timestep refinement technique. We present two benchmark problems (the heat equation and the Navier-Stokes equations) to demonstrate the new capability that statistical learning tools can bring to traditional scientific computing algorithms. For each problem, we rely on heterogeneous and multi-fidelity data, either from a coarse simulation of the same equation or from a stochastic, particle-based, more "microscopic" simulation. We consider, as such "auxiliary" models, a Monte Carlo random walk for the heat equation and a dissipative particle dynamics (DPD) model for the Navier-Stokes equations. 
More broadly, in this paper we demonstrate the symbiotic and synergistic combination of statistical learning, domain decomposition, and scientific computing in exascale simulations.
Numerical simulation of steady cavitating flow of viscous fluid in a Francis hydroturbine
NASA Astrophysics Data System (ADS)
Panov, L. V.; Chirkov, D. V.; Cherny, S. G.; Pylev, I. M.; Sotnikov, A. A.
2012-09-01
A numerical technique was developed for the simulation of cavitating flows through the flow passage of a hydraulic turbine. The technique is based on the solution of the steady 3D Navier-Stokes equations with a liquid phase transfer equation. An approach for setting boundary conditions meeting the requirements of the cavitation testing standard was suggested. Four different models of evaporation and condensation were compared. Numerical simulations for turbines of different specific speeds were compared with experiment.
CFAVC scheme for high frequency series resonant inverter-fed domestic induction heating system
NASA Astrophysics Data System (ADS)
Nagarajan, Booma; Reddy Sathi, Rama
2016-01-01
This article presents an investigation of constant frequency asymmetric voltage cancellation control in an AC-AC resonant converter-fed domestic induction heating system. Conventional fixed-frequency control techniques used in high-frequency converters lead to non-zero-voltage switching operation and reduced output power. The proposed control technique produces higher output power than conventional fixed-frequency control strategies. In this technique, zero-voltage-switching operation is maintained across different duty cycles to reduce switching losses. A complete analysis of the induction heating power supply system with asymmetric voltage cancellation control is presented in this article. Simulation and experimental studies of a constant frequency asymmetric voltage cancellation (CFAVC)-controlled full-bridge series resonant inverter are performed. Time-domain simulation results for the open and closed loop of the system are obtained using MATLAB. The simulation results demonstrate control of voltage and power over a wide range. A PID controller-based closed-loop system achieves voltage regulation under step changes in load. Hardware implementation of the system under CFAVC control is done using an embedded controller. The simulation and experimental results validate the performance of the CFAVC control technique for the series resonant induction cooking system.
A Review of Endoscopic Simulation: Current Evidence on Simulators and Curricula.
King, Neil; Kunac, Anastasia; Merchant, Aziz M
2016-01-01
Upper and lower endoscopy is an important tool that is being utilized more frequently by general surgeons. Training in therapeutic endoscopic techniques has become a mandatory requirement for general surgery residency programs in the United States. The Fundamentals of Endoscopic Surgery has been developed to train and assess competency in these advanced techniques. Simulation has been shown to shorten the learning curve and increase the skill of trainees in other surgical disciplines. Several types of endoscopy simulators are commercially available; mechanical trainers, animal-based models, and virtual-reality (computer-based) simulators each have benefits and limitations, but all have been shown to improve trainees' endoscopic skills. Endoscopic simulators will play a critical role as part of a comprehensive curriculum designed to train the next generation of surgeons. We reviewed recent literature related to the various types of endoscopic simulators and their use in an educational curriculum, and discuss the relevant findings. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Applying Parallel Processing Techniques to Tether Dynamics Simulation
NASA Technical Reports Server (NTRS)
Wells, B. Earl
1996-01-01
The focus of this research has been to determine the effectiveness of applying parallel processing techniques to a sizable real-world problem, the simulation of the dynamics associated with a tether which connects two objects in low earth orbit, and to explore the degree to which the parallelization process can be automated through the creation of new software tools. The goal has been to utilize this specific application problem as a base to develop more generally applicable techniques.
Model-Based GN and C Simulation and Flight Software Development for Orion Missions beyond LEO
NASA Technical Reports Server (NTRS)
Odegard, Ryan; Milenkovic, Zoran; Henry, Joel; Buttacoli, Michael
2014-01-01
For Orion missions beyond low Earth orbit (LEO), the Guidance, Navigation, and Control (GN&C) system is being developed using a model-based approach for simulation and flight software. Lessons learned from the development of GN&C algorithms and flight software for the Orion Exploration Flight Test One (EFT-1) vehicle have been applied to the development of further capabilities for Orion GN&C beyond EFT-1. Continuing the use of a Model-Based Development (MBD) approach with the Matlab®/Simulink® tool suite, the process for GN&C development and analysis has been largely improved. Furthermore, a model-based simulation environment in Simulink, rather than an external C-based simulation, greatly eases the process for development of flight algorithms. The benefits seen by employing lessons learned from EFT-1 are described, as well as the approach for implementing additional MBD techniques. Also detailed are the key enablers for improvements to the MBD process, including enhanced configuration management techniques for model-based software systems, automated code and artifact generation, and automated testing and integration.
ERIC Educational Resources Information Center
Fauzi, Ahmad; Bundu, Patta; Tahmir, Suradi
2016-01-01
The bridge simulator is a fundamental and vital tool for ensuring that seafarers possess the required standardized competence. By using the bridge simulator technique, a reality-based study can be presented easily and delivered to students on an ongoing basis in their classroom or study place. Afterwards, the validity…
Expanded Processing Techniques for EMI Systems
2012-07-01
possible to perform better target detection using physics-based algorithms and the entire data set, rather than simulating a simpler data set and mapping…
Figure 4.25: Plots of simulated MetalMapper data for two oblate spheroidal targets
Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems
NASA Technical Reports Server (NTRS)
Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael
2013-01-01
The on-going transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and will most likely cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet, safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human-factors studies and simulation-based techniques will fall short given the complexity of the ATS, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework), and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique on the Überlingen accident, since it exemplifies authority problems that arise when conflicting advice is received from human and automated systems.
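The contrast between sampling one simulation trace and exhaustively covering non-deterministic behaviors is the essence of model checking. A minimal explicit-state sketch (ours, not Brahms) over a toy two-aircraft state space:

```python
from collections import deque

def check_safety(initial, successors, is_safe):
    """Explicit-state model checking sketch: breadth-first exploration
    of *all* nondeterministic behaviors, returning a counterexample
    trace if any reachable state violates the safety property.
    (A single simulation run would sample just one path.)"""
    frontier = deque([(initial, [initial])])
    seen = {initial}
    while frontier:
        state, trace = frontier.popleft()
        if not is_safe(state):
            return trace                       # counterexample found
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, trace + [nxt]))
    return None                                # property holds everywhere

# Toy scenario: two aircraft altitudes in {0..3}; a controller may step
# either aircraft up or down; unsafe if both occupy the same flight level.
def successors(s):
    a, b = s
    return [(a + 1, b), (a - 1, b), (a, b + 1), (a, b - 1)]

def bounded(succ, lo=0, hi=3):
    def f(s):
        return [t for t in succ(s) if all(lo <= x <= hi for x in t)]
    return f

cex = check_safety((0, 2), bounded(successors), lambda s: s[0] != s[1])
```

The returned trace is the kind of off-nominal scenario a verification tool hands back for human analysis; BFS guarantees it is a shortest path to the violation.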
Numerical Simulation of Delamination Growth in Composite Materials
NASA Technical Reports Server (NTRS)
Camanho, P. P.; Davila, C. G.; Ambur, D. R.
2001-01-01
The use of decohesion elements for the simulation of delamination in composite materials is reviewed. The test methods available to measure the interfacial fracture toughness used in the formulation of decohesion elements are described initially. After a brief presentation of the virtual crack closure technique, the technique most widely used to simulate delamination growth, the formulation of interfacial decohesion elements is described. Problems related with decohesion element constitutive equations, mixed-mode crack growth, element numerical integration and solution procedures are discussed. Based on these investigations, it is concluded that the use of interfacial decohesion elements is a promising technique that avoids the need for a pre-existing crack and pre-defined crack paths, and that these elements can be used to simulate both delamination onset and growth.
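The constitutive behavior of a decohesion element is commonly a bilinear traction-separation law. The sketch below uses illustrative parameter values (penalty stiffness, onset and final separations), not material data from the paper:

```python
def cohesive_traction(delta, K=1.0e6, delta0=1e-4, deltaf=1e-3):
    """Bilinear traction-separation law typical of decohesion elements:
    linear elastic up to delta0 (delamination onset), then linear
    softening to complete decohesion at deltaf. Parameter values are
    illustrative only."""
    if delta <= 0.0:
        return 0.0                             # closed interface
    if delta <= delta0:
        return K * delta                       # intact (penalty stiffness)
    if delta < deltaf:
        # Scalar damage variable d grows from 0 at delta0 to 1 at deltaf
        d = (deltaf * (delta - delta0)) / (delta * (deltaf - delta0))
        return (1.0 - d) * K * delta           # softening branch
    return 0.0                                 # fully delaminated

t_peak = cohesive_traction(1e-4)               # interface strength K*delta0
```

The area under the curve equals the interfacial fracture toughness measured by the test methods mentioned above, which is how those experiments calibrate the element.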
Shao, Yu; Wang, Shumin
2016-12-01
The numerical simulation of acoustic scattering from elastic objects near a water-sand interface is critical to underwater target identification. Frequency-domain methods are computationally expensive, especially for large-scale broadband problems. A numerical technique is proposed to enable the efficient use of finite-difference time-domain method for broadband simulations. By incorporating a total-field/scattered-field boundary, the simulation domain is restricted inside a tightly bounded region. The incident field is further synthesized by the Fourier transform for both subcritical and supercritical incidences. Finally, the scattered far field is computed using a half-space Green's function. Numerical examples are further provided to demonstrate the accuracy and efficiency of the proposed technique.
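A 1-D sketch of the total-field/scattered-field idea (free space, Courant number 1, following the standard textbook formulation rather than the paper's underwater setup): correction terms at a single node confine the incident pulse to the total-field region, so the scattered-field region stays near zero in the absence of a scatterer:

```python
import math

def fdtd_tfsf(steps=250, size=200, tfsf=50):
    """1-D FDTD with a total-field/scattered-field boundary at node
    `tfsf`: nodes below tfsf hold only the scattered field."""
    imp0 = 377.0                       # impedance of free space
    ez = [0.0] * size
    hy = [0.0] * size
    src = lambda q: math.exp(-((q - 40.0) / 8.0) ** 2)  # incident Ez pulse
    for q in range(steps):
        for m in range(size - 1):      # magnetic-field update
            hy[m] += (ez[m + 1] - ez[m]) / imp0
        hy[tfsf - 1] -= src(q) / imp0  # TF/SF correction for Hy
        for m in range(1, size):       # electric-field update
            ez[m] += (hy[m] - hy[m - 1]) * imp0
        # TF/SF correction for Ez: the half-step time and half-cell
        # space offsets combine to one full step at Courant number 1
        ez[tfsf] += src(q + 1.0)
    return ez

ez = fdtd_tfsf()
```

With a scatterer present, whatever leaks back across the boundary is by construction the scattered field, which is exactly the quantity the far-field transform (here, the half-space Green's function step) needs.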
NASA Astrophysics Data System (ADS)
Jumadi, Nur Anida; Beng, Gan Kok; Ali, Mohd Alauddin Mohd; Zahedi, Edmond; Morsin, Marlia
2017-09-01
The implementation of a surface-based Monte Carlo simulation technique for oxygen saturation (SaO2) calibration curve estimation is demonstrated in this paper. Generally, the calibration curve is estimated either empirically, using animals as experimental subjects, or derived from mathematical equations. However, determining the calibration curve using animals is time consuming and requires expertise to conduct the experiment. Alternatively, optical simulation techniques have been used widely in the biomedical optics field owing to their capability to exhibit real tissue behavior. The mathematical relationship between optical density (OD) and optical density ratios (ODR) associated with SaO2 during systole and diastole is used as the basis for obtaining the theoretical calibration curve. Optical properties corresponding to systolic and diastolic behavior were applied to the tissue model to mimic the optical properties of the tissues. Based on the absorbed ray flux at the detectors, the OD and ODR were successfully calculated. The simulated optical density ratios at every 20% interval of SaO2 are presented, with a maximum error of 2.17% when compared with a previous numerical simulation technique (MC model). The findings reveal the potential of the proposed method for extended calibration curve studies using other wavelength pairs.
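The OD/ODR bookkeeping can be sketched as follows. The detector fluxes and the final linear calibration (SpO2 ≈ 110 − 25·R) are illustrative stand-ins, a commonly cited empirical approximation rather than values from the paper:

```python
import math

def optical_density(i_diastole, i_systole):
    """OD from detected intensities: at systole the extra arterial
    blood absorbs more light, so i_systole < i_diastole."""
    return math.log10(i_diastole / i_systole)

def odr(red_dia, red_sys, ir_dia, ir_sys):
    """Optical density ratio: OD at the red wavelength over OD at the
    infrared wavelength -- the x-axis of the SaO2 calibration curve."""
    return optical_density(red_dia, red_sys) / optical_density(ir_dia, ir_sys)

# Illustrative detector fluxes (hypothetical numbers, not from the paper):
r = odr(red_dia=1.00, red_sys=0.95, ir_dia=1.00, ir_sys=0.90)
spo2 = 110.0 - 25.0 * r    # common empirical linear calibration (approximate)
```

A Monte Carlo tissue model replaces the measured fluxes here with simulated absorbed ray flux at the detectors, which is what lets the calibration curve be built without animal experiments.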
Demonstration of innovative techniques for work zone safety data analysis
DOT National Transportation Integrated Search
2009-07-15
Based upon the results of the simulator data analysis, additional future research can be identified to validate the driving simulator in terms of similarities with Ohio work zones. For instance, the speeds observed in the simulator were greater f...
Dshell++: A Component Based, Reusable Space System Simulation Framework
NASA Technical Reports Server (NTRS)
Lim, Christopher S.; Jain, Abhinandan
2009-01-01
This paper describes the multi-mission Dshell++ simulation framework for high fidelity, physics-based simulation of spacecraft, robotic manipulation and mobility systems. Dshell++ is a C++/Python library which uses modern script driven object-oriented techniques to allow component reuse and a dynamic run-time interface for complex, high-fidelity simulation of spacecraft and robotic systems. The goal of the Dshell++ architecture is to manage the inherent complexity of physics-based simulations while supporting component model reuse across missions. The framework provides several features that support a large degree of simulation configurability and usability.
Post-coronagraphic tip-tilt sensing for vortex phase masks: The QACITS technique
NASA Astrophysics Data System (ADS)
Huby, E.; Baudoz, P.; Mawet, D.; Absil, O.
2015-12-01
Context. Small inner working angle coronagraphs, such as the vortex phase mask, are essential to exploit the full potential of ground-based telescopes in the context of exoplanet detection and characterization. However, the drawback of this attractive feature is a high sensitivity to pointing errors, which degrades the performance of the coronagraph. Aims: We propose a tip-tilt retrieval technique based on the analysis of the final coronagraphic image, hereafter called Quadrant Analysis of Coronagraphic Images for Tip-tilt Sensing (QACITS). Methods: Under the assumption of small phase aberrations, we show that the behavior of the vortex phase mask can be simply described from the entrance pupil to the Lyot stop plane with Zernike polynomials. This convenient formalism is used to establish the theoretical basis of the QACITS technique. We performed simulations to demonstrate the validity and limits of the technique, including the case of a centrally obstructed pupil. Results: The QACITS technique principle is validated with experimental results in the case of an unobstructed circular aperture, as well as simulations in the presence of a central obstruction. The typical configuration of the Keck telescope (24% central obstruction) has been simulated with additional high order aberrations. In these conditions, our simulations show that the QACITS technique is still adapted to centrally obstructed pupils and performs tip-tilt retrieval with a precision of 5 × 10^-2 λ/D when wavefront errors amount to λ/14 rms, and 10^-2 λ/D for λ/70 rms errors (with λ the wavelength and D the pupil diameter). Conclusions: We have developed and demonstrated a tip-tilt sensing technique for vortex coronagraphs. The implementation of the QACITS technique is based on the analysis of the scientific image and does not require any modification of the original setup.
Current facilities equipped with a vortex phase mask can thus directly benefit from this technique to improve the contrast performance close to the axis.
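The quadrant analysis at the heart of QACITS starts from the normalized differential intensities between halves of the coronagraphic image; the technique then inverts a model of the vortex response (omitted here) to recover the tip-tilt. A minimal sketch of the estimator inputs, with hypothetical naming:

```python
def quadrant_differentials(image):
    """Normalized differential intensities along x and y for a 2D image
    given as a list of rows (even dimensions assumed)."""
    ny, nx = len(image), len(image[0])
    total = float(sum(sum(row) for row in image))
    right = sum(row[i] for row in image for i in range(nx // 2, nx))
    top = sum(sum(row) for row in image[:ny // 2])
    dx = (2 * right - total) / total   # (right - left) / total flux
    dy = (2 * top - total) / total     # (top - bottom) / total flux
    return dx, dy
```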
Modeling of Tool-Tissue Interactions for Computer-Based Surgical Simulation: A Literature Review
Misra, Sarthak; Ramesh, K. T.; Okamura, Allison M.
2009-01-01
Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in robot-assisted surgery for pre- and intra-operative planning. Accurate modeling of the interaction between surgical instruments and organs has been recognized as a key requirement in the development of high-fidelity surgical simulators. Researchers have attempted to model tool-tissue interactions in a wide variety of ways, which can be broadly classified as (1) linear elasticity-based, (2) nonlinear (hyperelastic) elasticity-based finite element (FE) methods, and (3) other techniques that are not based on FE methods or continuum mechanics. Realistic modeling of organ deformation requires populating the model with real tissue data (which are difficult to acquire in vivo) and simulating organ response in real time (which is computationally expensive). Further, it is challenging to account for connective tissue supporting the organ, friction, and topological changes resulting from tool-tissue interactions during invasive surgical procedures. Overcoming such obstacles will not only help us to model tool-tissue interactions in real time, but also enable realistic force feedback to the user during surgical simulation. This review paper classifies the existing research on tool-tissue interactions for surgical simulators specifically based on the modeling techniques employed and the kind of surgical operation being simulated, in order to inform and motivate future research on improved tool-tissue interaction models. PMID:20119508
Large Eddy Simulations using oodlesDST
2016-01-01
Research Agency DST-Group-TR-3205 ABSTRACT The oodlesDST code is based on OpenFOAM software and performs Large Eddy Simulations of ... maritime platforms using a variety of simulation techniques. He is currently using OpenFOAM software to perform both Reynolds Averaged Navier-Stokes
Risk Reduction and Training using Simulation Based Tools - 12180
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hall, Irin P.
2012-07-01
Process Modeling and Simulation (M and S) has been used for many years in manufacturing and similar domains, as part of an industrial engineer's tool box. Traditionally, however, this technique has been employed in small, isolated projects where models were created from scratch, often making it time and cost prohibitive. Newport News Shipbuilding (NNS) has recognized the value of this predictive technique and what it offers in terms of risk reduction, cost avoidance and on-schedule performance of highly complex work. To facilitate implementation, NNS has been maturing a process and the software to rapidly deploy and reuse M and S based decision support tools in a variety of environments. Some examples of successful applications by NNS of this technique in the nuclear domain are a reactor refueling simulation based tool, a fuel handling facility simulation based tool and a tool for dynamic radiation exposure tracking. The next generation of M and S applications includes expanding simulation based tools into immersive and interactive training. The applications discussed here take a tool box approach to creating simulation based decision support tools for maximum utility and return on investment. This approach involves creating a collection of simulation tools that can be used individually or integrated together for a larger application. The refueling simulation integrates with the fuel handling facility simulation to understand every aspect and dependency of the fuel handling evolutions. This approach translates nicely to other complex domains where real system experimentation is not feasible, such as nuclear fuel lifecycle and waste management. Similar concepts can also be applied to different types of simulation techniques.
For example, a process simulation of liquid waste operations may be useful to streamline and plan operations, while a chemical model of the liquid waste composition is an important tool for making decisions with respect to waste disposition. Integrating these tools into a larger virtual system provides a tool for making larger strategic decisions. The key to integrating and creating these virtual environments is the software and the process used to build them. Although they are important steps toward the use of simulation based tools in the nuclear domain, the applications described here represent only a small cross section of the possible benefits. The next generation of applications will likely focus on situational awareness and adaptive planning. Situational awareness refers to the ability to visualize in real time the state of operations. Some useful tools in this area are Geographic Information Systems (GIS), which help monitor and analyze geographically referenced information. Combined with such situational awareness capability, simulation tools can serve as the platform for adaptive planning tools. These are the tools that allow the decision maker to react to the changing environment in real time by synthesizing massive amounts of data into easily understood information. For the nuclear domains, this may mean creation of Virtual Nuclear Systems, from Virtual Waste Processing Plants to Virtual Nuclear Reactors. (authors)
Approaches to Classroom-Based Computational Science.
ERIC Educational Resources Information Center
Guzdial, Mark
Computational science includes the use of computer-based modeling and simulation to define and test theories about scientific phenomena. The challenge for educators is to develop techniques for implementing computational science in the classroom. This paper reviews some previous work on the use of simulation alone (without modeling), modeling…
Simulation of an Asynchronous Machine by using a Pseudo Bond Graph
NASA Astrophysics Data System (ADS)
Romero, Gregorio; Felez, Jesus; Maroto, Joaquin; Martinez, M. Luisa
2008-11-01
For engineers, computer simulation is a basic tool, since it enables them to understand how systems work without actually needing to see them. They can learn how systems behave in different circumstances and optimize their design at considerably less cost in time and money than if they had to carry out tests on a physical system. However, if computer simulation is to be reliable, it is essential for the simulation model to be validated. There is a wide range of commercial products on the market for electrical domain simulation (SPICE, LabVIEW, PSCAD, Dymola, Simulink, Simplorer, ...). These are powerful tools, but they require the engineer to have a thorough knowledge of the electrical field. This paper shows an alternative methodology for simulating an asynchronous machine using the multidomain Bond Graph technique and applying it in any program that permits the simulation of models based on this technique; no extraordinary knowledge of Bond Graphs or of the electrical field is required to understand the process.
CALM: Complex Adaptive System (CAS)-Based Decision Support for Enabling Organizational Change
NASA Astrophysics Data System (ADS)
Adler, Richard M.; Koehn, David J.
Guiding organizations through transformational changes such as restructuring or adopting new technologies is a daunting task. Such changes generate workforce uncertainty, fear, and resistance, reducing morale, focus and performance. Conventional project management techniques fail to mitigate these disruptive effects, because social and individual changes are non-mechanistic, organic phenomena. CALM (for Change, Adaptation, Learning Model) is an innovative decision support system for enabling change based on CAS principles. CALM provides a low risk method for validating and refining change strategies that combines scenario planning techniques with "what-if" behavioral simulation. In essence, CALM "test drives" change strategies before rolling them out, allowing organizations to practice and learn from virtual rather than actual mistakes. This paper describes the CALM modeling methodology, including our metrics for measuring organizational readiness to respond to change and other major CALM scenario elements: prospective change strategies; alternate futures; and key situational dynamics. We then describe CALM's simulation engine for projecting scenario outcomes and its associated analytics. CALM's simulator unifies diverse behavioral simulation paradigms including: adaptive agents; system dynamics; Monte Carlo; event- and process-based techniques. CALM's embodiment of CAS dynamics helps organizations reduce risk and improve confidence and consistency in critical strategies for enabling transformations.
Experimental analysis of computer system dependability
NASA Technical Reports Server (NTRS)
Iyer, Ravishankar K.; Tang, Dong
1993-01-01
This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
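Importance sampling, mentioned above as a way to accelerate Monte Carlo simulation, can be illustrated with a toy dependability-style problem: estimating a rare failure probability P(X > t) for X ~ Exp(1). Sampling from a heavier-tailed proposal and re-weighting by the likelihood ratio concentrates effort on the rare region. The distributions and parameters below are illustrative, not taken from the paper:

```python
import math, random

def rare_event_prob_is(threshold=8.0, n=50000, lam=0.125, seed=1):
    """Importance-sampling estimate of P(X > threshold) for X ~ Exp(1).
    Draws come from a heavier-tailed Exp(lam) proposal and are re-weighted
    by the likelihood ratio target/proposal. Exact answer: exp(-threshold)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.expovariate(lam)      # proposal draw, mean 1/lam
        if x > threshold:             # indicator of the rare event
            total += math.exp(-x) / (lam * math.exp(-lam * x))
    return total / n
```

Direct Monte Carlo would see this event only about once per exp(threshold) ≈ 3000 samples; the proposal makes it common, and the weights keep the estimate unbiased.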
Niroomandi, S; Alfaro, I; Cueto, E; Chinesta, F
2012-01-01
Model reduction techniques have been shown to constitute a valuable tool for real-time simulation in surgical environments and other fields. However, some limitations imposed by real-time constraints have not yet been overcome. One such limitation is the severe time restriction (a resolution frequency of 500 Hz) that precludes the use of Newton-like schemes for solving the non-linear models usually employed to describe biological tissues. In this work we present a technique able to deal with geometrically non-linear models, based on model reduction techniques together with an efficient non-linear solver. Examples of the performance of the technique are given. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Simulations and Games: Use and Barriers in Higher Education
ERIC Educational Resources Information Center
Lean, Jonathan; Moizer, Jonathan; Towler, Michael; Abbey, Caroline
2006-01-01
This article explores the use of simulations and games in tertiary education. It examines the extent to which academics use different simulation-based teaching approaches and how they perceive the barriers to adopting such techniques. Following a review of the extant literature, a typology of simulations is constructed. A staff survey within a UK…
What can virtual patient simulation offer mental health nursing education?
Guise, V; Chambers, M; Välimäki, M
2012-06-01
This paper discusses the use of simulation in nursing education and training, including potential benefits and barriers associated with its use. In particular, it addresses the hitherto scant application of diverse simulation devices and dedicated simulation scenarios in psychiatric and mental health nursing. It goes on to describe a low-cost, narrative-based virtual patient simulation technique which has the potential for wide application within health and social care education. An example of the implementation of this technology in a web-based pilot course for acute mental health nurses is given. This particular virtual patient technique is a simulation type ideally suited to promoting essential mental health nursing skills such as critical thinking, communication and decision making. Furthermore, it is argued that it is particularly amenable to e-learning and blended learning environments, as well as being an apt tool where multilingual simulations are required. The continued development, implementation and evaluation of narrative virtual patient simulations across a variety of health and social care programmes would help ascertain their success as an educational tool. © 2011 Blackwell Publishing.
ERIC Educational Resources Information Center
Koka, Andre
2017-01-01
This study examined the effectiveness of a brief theory-based intervention on muscular strength among adolescents in a physical education setting. The intervention adopted a process-based mental simulation technique. The self-reported frequency of practising for and actual levels of abdominal muscular strength/endurance as one component of…
Acoustic Parametric Array for Identifying Standoff Targets
NASA Astrophysics Data System (ADS)
Hinders, M. K.; Rudd, K. E.
2010-02-01
An integrated simulation method for investigating nonlinear sound beams and 3D acoustic scattering from any combination of complicated objects is presented. A standard finite-difference simulation method is used to model pulsed nonlinear sound propagation from a source to a scattering target via the KZK equation. Then, a parallel 3D acoustic simulation method based on the finite integration technique is used to model the acoustic wave interaction with the target. Any combination of objects and material layers can be placed into the 3D simulation space to study the resulting interaction. Several example simulations are presented to demonstrate the simulation method and 3D visualization techniques. The combined simulation method is validated by comparing experimental and simulation data, and a demonstration of how this combined simulation method assisted in the development of a nonlinear acoustic concealed weapons detector is also presented.
Sui, Yuan; Pan, Jun J; Qin, Hong; Liu, Hao; Lu, Yun
2017-12-01
Laparoscopic surgery (LS), also referred to as minimally invasive surgery, is a modern surgical technique which is widely applied. The fulcrum effect makes LS a non-intuitive motor skill with a steep learning curve. A hybrid model of tetrahedrons and a multi-layer triangular mesh are constructed to simulate the deformable behavior of the rectum and surrounding tissues in the Position-Based Dynamics (PBD) framework. A heat-conduction based electric-burn technique is employed to simulate the electrocautery procedure. The simulator has been applied for laparoscopic rectum cancer surgery training. From the experimental results, trainees can operate in real time with high degrees of stability and fidelity. A preliminary study was performed to evaluate the realism and usefulness. This prototype simulator has been tested and verified by colorectal surgeons through a pilot study. They believed both the visual and the haptic performance of the simulation are realistic and helpful to enhance laparoscopic skills. Copyright © 2017 John Wiley & Sons, Ltd.
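The Position-Based Dynamics framework referenced above resolves constraints by projecting particle positions directly rather than integrating forces. A minimal sketch of the distance-constraint projection at its core (velocities, collisions, and the paper's hybrid tetrahedron/triangle model are omitted):

```python
import math

def pbd_step(positions, inv_mass, constraints, iterations=10):
    """One PBD constraint-projection pass over distance constraints given
    as (i, j, rest_length). Positions are mutable [x, y, z] lists; points
    with inv_mass 0 are pinned."""
    for _ in range(iterations):
        for i, j, rest in constraints:
            dx = [positions[j][k] - positions[i][k] for k in range(3)]
            d = math.sqrt(sum(c * c for c in dx))
            w = inv_mass[i] + inv_mass[j]
            if d == 0.0 or w == 0.0:
                continue
            # move both endpoints along the constraint direction,
            # weighted by inverse mass, to restore the rest length
            corr = (d - rest) / (d * w)
            for k in range(3):
                positions[i][k] += inv_mass[i] * corr * dx[k]
                positions[j][k] -= inv_mass[j] * corr * dx[k]
    return positions
```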
Space construction base control system
NASA Technical Reports Server (NTRS)
Kaczynski, R. F.
1979-01-01
Several approaches for an attitude control system are studied and developed for a large space construction base that is structurally flexible. Digital simulations were obtained using the following techniques: (1) the multivariable Nyquist array method combined with closed loop pole allocation, and (2) the linear quadratic regulator method. Equations for the three-axis simulation using the multilevel control method were generated and are presented. Several alternate control approaches are also described. A technique is demonstrated for obtaining the dynamic structural properties of a vehicle which is constructed of two or more submodules of known dynamic characteristics.
An adaptive front tracking technique for three-dimensional transient flows
NASA Astrophysics Data System (ADS)
Galaktionov, O. S.; Anderson, P. D.; Peters, G. W. M.; van de Vosse, F. N.
2000-01-01
An adaptive technique, based on both surface stretching and surface curvature analysis, for tracking strongly deforming fluid volumes in three-dimensional flows is presented. The efficiency and accuracy of the technique are demonstrated for two- and three-dimensional flow simulations. For the two-dimensional test example, the results are compared with results obtained using a different tracking approach based on the advection of a passive scalar. Although for both techniques roughly the same structures are found, the resolution for the front tracking technique is much higher. In the three-dimensional test example, a spherical blob is tracked in a chaotic mixing flow. For this problem, the accuracy of the adaptive tracking is demonstrated by the volume conservation for the advected blob. Adaptive front tracking is suitable for simulation of the initial stages of fluid mixing, where the interfacial area can grow exponentially with time. The efficiency of the algorithm significantly benefits from parallelization of the code.
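The surface-stretching half of the adaptive criterion can be sketched in 2D: wherever advection has stretched a front segment beyond a length threshold, midpoints are inserted to restore resolution. The paper's curvature-based criterion and full 3D surface handling are omitted from this sketch:

```python
import math

def refine_front(points, max_len=0.1):
    """Adaptive refinement of a 2D front given as a polyline of (x, y)
    tuples: subdivide any segment longer than max_len with equally
    spaced intermediate points."""
    out = [points[0]]
    for a, b in zip(points, points[1:]):
        d = math.dist(a, b)
        n_extra = max(0, math.ceil(d / max_len) - 1)  # points to insert
        for k in range(1, n_extra + 1):
            t = k / (n_extra + 1)
            out.append((a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1])))
        out.append(b)
    return out
```

In a full tracker this refinement runs after every advection step, so the exponentially growing interface keeps a bounded segment length.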
NASA Technical Reports Server (NTRS)
Walker, Carrie K.
1991-01-01
A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.
Kim, K B; Shanyfelt, L M; Hahn, D W
2006-01-01
Dense-medium scattering is explored in the context of providing a quantitative measurement of turbidity, with specific application to corneal haze. A multiple-wavelength scattering technique is proposed to make use of two-color scattering response ratios, thereby providing a means for data normalization. A combination of measurements and simulations is reported to assess this technique, including light-scattering experiments for a range of polystyrene suspensions. Monte Carlo (MC) simulations were performed using a multiple-scattering algorithm based on full Mie scattering theory. The simulations were in excellent agreement with the polystyrene suspension experiments, thereby validating the MC model. The MC model was then used to simulate multiwavelength scattering in a corneal tissue model. Overall, the proposed multiwavelength scattering technique appears to be a feasible approach to quantify dense-medium scattering such as the manifestation of corneal haze, although more complex modeling of keratocyte scattering, and animal studies, are necessary.
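A toy slab version of a multiple-scattering Monte Carlo conveys the mechanism: photons take exponentially distributed free paths and scatter, and the transmitted fraction falls as optical depth grows. The paper's model uses full Mie phase functions and corneal geometry; everything below (isotropic scattering, no absorption, 1D depth tracking) is a simplified illustration:

```python
import math, random

def transmitted_fraction(mu_s, thickness, n_photons=20000, seed=2):
    """Fraction of photons exiting the far side of a scattering slab.
    mu_s is the scattering coefficient (1/length); free paths are drawn
    from Exp(mu_s) and each scattering event randomizes the direction
    cosine uniformly (isotropic)."""
    rng = random.Random(seed)
    out = 0
    for _ in range(n_photons):
        z, uz = 0.0, 1.0                 # depth and direction cosine
        for _ in range(1000):            # safety cap on scattering events
            z += uz * rng.expovariate(mu_s)
            if z >= thickness:
                out += 1                 # transmitted
                break
            if z <= 0.0:
                break                    # backscattered out of the slab
            uz = rng.uniform(-1.0, 1.0)  # isotropic redirect
    return out / n_photons
```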
Translating the Simulation of Procedural Drilling Techniques for Interactive Neurosurgical Training
Stredney, Don; Rezai, Ali R.; Prevedello, Daniel M.; Elder, J. Bradley; Kerwin, Thomas; Hittle, Bradley; Wiet, Gregory J.
2014-01-01
Background Through previous and concurrent efforts, we have developed a fully virtual environment to provide procedural training of otologic surgical technique. The virtual environment is based on high-resolution volumetric data of the regional anatomy. This volumetric data helps drive an interactive multi-sensory, i.e., visual (stereo), aural (stereo), and tactile simulation environment. Subsequently, we have extended our efforts to support the training of neurosurgical procedural technique as part of the CNS simulation initiative. Objective The goal of this multi-level development is to deliberately study the integration of simulation technologies into the neurosurgical curriculum and to determine their efficacy in teaching minimally invasive cranial and skull base approaches. Methods We discuss issues of biofidelity as well as our methods to provide objective, quantitative automated assessment for the residents. Results We conclude with a discussion of our experiences by reporting on preliminary formative pilot studies and proposed approaches to take the simulation to the next level through additional validation studies. Conclusion We have presented our efforts to translate an otologic simulation environment for use in the neurosurgical curriculum. We have demonstrated the initial proof of principles and define the steps to integrate and validate the system as an adjuvant to the neurosurgical curriculum. PMID:24051887
Webster, Victoria A; Nieto, Santiago G; Grosberg, Anna; Akkus, Ozan; Chiel, Hillel J; Quinn, Roger D
2016-10-01
In this study, new techniques for approximating the contractile properties of cells in biohybrid devices using Finite Element Analysis (FEA) have been investigated. Many current techniques for modeling biohybrid devices use individual cell forces to simulate the cellular contraction. However, such techniques result in long simulation runtimes. In this study we investigated the effect of the use of thermal contraction on simulation runtime. The thermal contraction model was significantly faster than models using individual cell forces, making it beneficial for rapidly designing or optimizing devices. Three techniques, Stoney's Approximation, a Modified Stoney's Approximation, and a Thermostat Model, were explored for calibrating thermal expansion/contraction parameters (TECPs) needed to simulate cellular contraction using thermal contraction. The TECP values were calibrated by using published data on the deflections of muscular thin films (MTFs). Using these techniques, TECP values that suitably approximate experimental deflections can be determined by using experimental data obtained from cardiomyocyte MTFs. Furthermore, a sensitivity analysis was performed in order to investigate the contribution of individual variables, such as elastic modulus and layer thickness, to the final calibrated TECP for each calibration technique. Additionally, the TECP values are applicable to other types of biohybrid devices. Two non-MTF models were simulated based on devices reported in the existing literature. Copyright © 2016 Elsevier Ltd. All rights reserved.
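Stoney's approximation itself is compact enough to state directly: the stress in a thin contractile film (here, the cell layer) is inferred from the curvature it induces in a much thicker substrate, and a thermal expansion/contraction parameter can then be chosen to reproduce that stress thermally. The parameter values and the unit temperature drop below are illustrative, not the paper's calibrated numbers:

```python
def stoney_film_stress(curvature, substrate_modulus, substrate_poisson,
                       substrate_thickness, film_thickness):
    """Stoney's approximation: film stress from substrate curvature.
    Units: Pa for moduli/stress, m for thicknesses, 1/m for curvature."""
    return (substrate_modulus * substrate_thickness ** 2 * curvature /
            (6.0 * film_thickness * (1.0 - substrate_poisson)))

def tecp_from_stress(stress, film_modulus, delta_t=1.0):
    """TECP (expansion coefficient) that reproduces the same film stress
    under a unit temperature drop: alpha = sigma / (E_f * dT)."""
    return stress / (film_modulus * delta_t)
```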
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vencels, Juris; Delzanno, Gian Luca; Johnson, Alec
2015-06-01
A spectral method for kinetic plasma simulations based on the expansion of the velocity distribution function in a variable number of Hermite polynomials is presented. The method is based on a set of non-linear equations that is solved to determine the coefficients of the Hermite expansion satisfying the Vlasov and Poisson equations. In this paper, we first show that this technique combines the fluid and kinetic approaches into one framework. Second, we present an adaptive strategy to increase and decrease the number of Hermite functions dynamically during the simulation. The technique is applied to the Landau damping and two-stream instability test problems. Performance results show 21% and 47% savings in total simulation time in the Landau and two-stream instability test cases, respectively.
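A sketch of the Hermite-expansion idea: the velocity distribution is represented by its coefficients in an orthonormal Hermite-function basis, and an adaptive solver can grow or shrink the number of retained functions by watching the tail coefficients. This toy version only projects a given f(v) onto the basis by quadrature; the paper instead solves for the coefficients from the Vlasov-Poisson system:

```python
import math

def hermite_functions(v, n_max):
    """Orthonormal Hermite functions psi_0 .. psi_{n_max-1} at point v,
    built with the stable three-term recurrence."""
    psi = [math.pi ** -0.25 * math.exp(-0.5 * v * v)]
    if n_max > 1:
        psi.append(math.sqrt(2.0) * v * psi[0])
    for n in range(1, n_max - 1):
        psi.append(math.sqrt(2.0 / (n + 1)) * v * psi[n]
                   - math.sqrt(n / (n + 1)) * psi[n - 1])
    return psi

def hermite_coefficients(f, n_max=16, v_max=10.0, n_quad=2000):
    """Project a distribution f(v) onto the first n_max Hermite functions
    by trapezoidal quadrature on [-v_max, v_max]. An adaptive solver
    would raise or lower n_max as the tail coefficients change."""
    h = 2.0 * v_max / n_quad
    coeffs = [0.0] * n_max
    for k in range(n_quad + 1):
        v = -v_max + k * h
        w = h if 0 < k < n_quad else h / 2.0   # trapezoid weights
        psi = hermite_functions(v, n_max)
        fv = f(v)
        for n in range(n_max):
            coeffs[n] += w * fv * psi[n]
    return coeffs
```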
NASA Technical Reports Server (NTRS)
Goel, Narendra S.; Rozehnal, Ivan; Thompson, Richard L.
1991-01-01
A computer-graphics-based model, named DIANA, is presented for generation of objects of arbitrary shape and for calculating bidirectional reflectances and scattering from them, in the visible and infrared region. The computer generation is based on a modified Lindenmayer system approach which makes it possible to generate objects of arbitrary shapes and to simulate their growth, dynamics, and movement. Rendering techniques are used to display an object on a computer screen with appropriate shading and shadowing and to calculate the scattering and reflectance from the object. The technique is illustrated with scattering from canopies of simulated corn plants.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hsu, Christina M. L.; Palmeri, Mark L.; Department of Anesthesiology, Duke University Medical Center, Durham, North Carolina 27710
2013-04-15
Purpose: The authors previously reported on a three-dimensional computer-generated breast phantom, based on empirical human image data, including a realistic finite-element based compression model that was capable of simulating multimodality imaging data. The computerized breast phantoms are a hybrid of two phantom generation techniques, combining empirical breast CT (bCT) data with flexible computer graphics techniques. However, to date, these phantoms have been based on single human subjects. In this paper, the authors report on a new method to generate multiple phantoms, simulating additional subjects from the limited set of original dedicated breast CT data. The authors developed an image morphing technique to construct new phantoms by gradually transitioning between two human subject datasets, with the potential to generate hundreds of additional pseudoindependent phantoms from the limited bCT cases. The authors conducted a preliminary subjective assessment with a limited number of observers (n = 4) to illustrate how realistic the simulated images generated with the pseudoindependent phantoms appeared. Methods: Several mesh-based geometric transformations were developed to generate distorted breast datasets from the original human subject data. Segmented bCT data from two different human subjects were used as the 'base' and 'target' for morphing. Several combinations of transformations were applied to morph between the 'base' and 'target' datasets such as changing the breast shape, rotating the glandular data, and changing the distribution of the glandular tissue. Following the morphing, regions of skin and fat were assigned to the morphed dataset in order to appropriately assign mechanical properties during the compression simulation. The resulting morphed breast was compressed using a finite element algorithm and simulated mammograms were generated using techniques described previously.
Sixty-two simulated mammograms, generated from morphing three human subject datasets, were used in a preliminary observer evaluation where four board certified breast radiologists with varying amounts of experience ranked the level of realism (from 1 = 'fake' to 10 = 'real') of the simulated images. Results: The morphing technique was able to successfully generate new and unique morphed datasets from the original human subject data. The radiologists evaluated the realism of simulated mammograms generated from the morphed and unmorphed human subject datasets and scored the realism with an average ranking of 5.87 ± 1.99, confirming that overall the phantom image datasets appeared more 'real' than 'fake.' Moreover, there was not a significant difference (p > 0.1) between the realism of the unmorphed datasets (6.0 ± 1.95) compared to the morphed datasets (5.86 ± 1.99). Three of the four observers had overall average rankings of 6.89 ± 0.89, 6.9 ± 1.24, 6.76 ± 1.22, whereas the fourth observer ranked them noticeably lower at 2.94 ± 0.7. Conclusions: This work presents a technique that can be used to generate a suite of realistic computerized breast phantoms from a limited number of human subjects. This suite of flexible breast phantoms can be used for multimodality imaging research to provide a known truth while concurrently producing realistic simulated imaging data.
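The core morphing step, stripped of the shape and glandular-distribution transforms described above, is a vertex-wise interpolation between two corresponded datasets: t = 0 reproduces the base subject, t = 1 the target, and intermediate t yields a new pseudoindependent phantom. A minimal sketch:

```python
def morph(base_pts, target_pts, t):
    """Linear vertex-wise morph between two meshes with matching vertex
    correspondence (lists of (x, y, z) tuples)."""
    return [tuple((1.0 - t) * b + t * g for b, g in zip(bp, tp))
            for bp, tp in zip(base_pts, target_pts)]
```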
Analytical evaluation of two motion washout techniques
NASA Technical Reports Server (NTRS)
Young, L. R.
1977-01-01
Practical tools were developed which extend the state of the art of moving base flight simulation for research and training purposes. The use of visual and vestibular cues to minimize the actual motion of the simulator itself was a primary consideration. The investigation consisted of optimum programming of motion cues based on a physiological model of the vestibular system to yield 'ideal washout logic' for any given simulator constraints.
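The generic idea behind washout logic can be sketched with a classical first-order high-pass filter on the commanded acceleration: motion onsets pass through to the platform, but sustained accelerations are "washed out" so the simulator drifts back toward neutral. The paper's "ideal washout logic" is derived from a vestibular model and the simulator's constraints; the `washout_highpass` helper and its parameters below are illustrative assumptions only.

```python
def washout_highpass(accel_cmds, dt, tau):
    """First-order high-pass (washout) filter on commanded acceleration.

    Passes transient motion cues to the platform but attenuates
    sustained acceleration, so the platform returns toward its neutral
    position -- the generic idea behind washout logic.  This classical
    filter is a minimal illustration, not the paper's vestibular-model
    optimum.
    """
    out = []
    y_prev = 0.0
    x_prev = 0.0
    alpha = tau / (tau + dt)  # discrete high-pass coefficient
    for x in accel_cmds:
        y = alpha * (y_prev + x - x_prev)
        out.append(y)
        y_prev, x_prev = y, x
    return out

# A sustained 1 m/s^2 step command: the filtered platform command
# initially follows the step, then decays toward zero.
step = [1.0] * 200
filtered = washout_highpass(step, dt=0.05, tau=0.5)
```

The decay rate (set by `tau`) is exactly the trade the paper optimizes: a fast washout saves actuator travel but risks false vestibular cues, while a slow washout exceeds the motion envelope.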
Biomechanical testing simulation of a cadaver spine specimen: development and evaluation study.
Ahn, Hyung Soo; DiAngelo, Denis J
2007-05-15
This article describes a computer model of a cadaver cervical spine specimen and virtual biomechanical testing. The goal was to develop a graphics-oriented, multibody model of a cadaver cervical spine and to build a virtual laboratory simulator for biomechanical testing using physics-based dynamic simulation techniques. Physics-based computer simulations apply the laws of physics to solid bodies with defined material properties. This technique can be used to create a virtual simulator for the biomechanical testing of a human cadaver spine. An accurate virtual model and simulation would complement tissue-based in vitro studies by providing a consistent test bed with minimal variability and by reducing cost. The geometry of the cervical vertebrae was created from computed tomography images. Joints linking adjacent vertebrae were modeled as a triple-joint complex, comprising intervertebral disc joints in the anterior region, 2 facet joints in the posterior region, and the surrounding ligament structure. A virtual laboratory simulation of an in vitro testing protocol was performed to evaluate the model responses during flexion, extension, and lateral bending. For kinematic evaluation, the rotation of each motion segment unit, coupling behaviors, and 3-dimensional helical axes of motion were analyzed. The simulation results were consistent with the findings of in vitro tests and published data. For kinetic evaluation, the forces of the intervertebral discs and facet joints of each segment were determined and visually animated. This methodology produced a realistic visualization of the in vitro experiment, and allowed for analyses of the kinematics and kinetics of the cadaver cervical spine. With graphical illustrations and animation features, this modeling technique provides vivid and intuitive information.
Optimization of Turbine Blade Design for Reusable Launch Vehicles
NASA Technical Reports Server (NTRS)
Shyy, Wei
1998-01-01
To facilitate design optimization of turbine blade shape for reusable launch vehicles, appropriate techniques need to be developed to process and estimate the characteristics of the design variables and the response of the output with respect to variations of the design variables. The purpose of this report is to offer insight into developing appropriate techniques for supporting such design and optimization needs. Neural network and polynomial-based techniques are applied to process aerodynamic data obtained from computational simulations of flows around a two-dimensional airfoil and a generic three-dimensional wing/blade. For the two-dimensional airfoil, a two-layered radial-basis network is designed and trained. The performances of two different design functions for radial-basis networks are compared: one based on an accuracy requirement, the other on a limit on the network size. While the number of neurons needed to satisfactorily reproduce the information depends on the size of the data, the neural network technique is shown to be more accurate for large data sets (up to 765 simulations have been used) than the polynomial-based response surface method. For the three-dimensional wing/blade case, smaller aerodynamic data sets (between 9 and 25 simulations) are considered, and both the neural network and the polynomial-based response surface techniques improve their performance as the data size increases. It is found that, while the relative performance of two different network types, a radial-basis network and a back-propagation network, depends on the number of input data, the number of iterations required for the radial-basis network is less than that for the back-propagation network.
NASA Technical Reports Server (NTRS)
Coon, Craig R.; Cardullo, Frank M.; Zaychik, Kirill B.
2014-01-01
The ability to develop highly advanced simulators is a critical need with the potential to significantly impact the aerospace industry. The industry is advancing at an ever-increasing pace, and flight simulators must keep up with this development. In order to address both current problems and potential advancements in flight simulator techniques, several aspects of the current control law technology of the National Aeronautics and Space Administration (NASA) Langley Research Center's Cockpit Motion Facility (CMF) motion base simulator were examined. Linear models based upon hardware data were first investigated to ensure that the most accurate models are used. This research identified both system improvements in the bandwidth and more reliable linear models. Advancements in the compensator design were developed and verified through multiple techniques. Position error rate feedback, acceleration feedback, and force feedback were all analyzed in the heave direction using the nonlinear model of the hardware. Improvements were made using the position error rate feedback technique. The acceleration feedback compensator also provided noteworthy improvement, while attempts at implementing a force feedback compensator proved unsuccessful.
A parallel algorithm for switch-level timing simulation on a hypercube multiprocessor
NASA Technical Reports Server (NTRS)
Rao, Hariprasad Nannapaneni
1989-01-01
The parallel approach to speeding up simulation is studied, specifically the simulation of digital LSI MOS circuitry on the Intel iPSC/2 hypercube. The simulation algorithm is based on RSIM, an event-driven switch-level simulator that incorporates a linear transistor model for simulating digital MOS circuits. Parallel processing techniques based on the concepts of Virtual Time and rollback are utilized so that portions of the circuit may be simulated on separate processors, in parallel, for as large an increase in speed as possible. A partitioning algorithm is also developed in order to subdivide the circuit for parallel processing.
Skin fluorescence model based on the Monte Carlo technique
NASA Astrophysics Data System (ADS)
Churmakov, Dmitry Y.; Meglinski, Igor V.; Piletsky, Sergey A.; Greenhalgh, Douglas A.
2003-10-01
A novel Monte Carlo technique for simulating the spatial fluorescence distribution within human skin is presented. The computational model of skin takes into account the spatial distribution of fluorophores, which follows the packing of the collagen fibers, whereas in the epidermis and stratum corneum the distribution of fluorophores is assumed to be homogeneous. The results of simulation suggest that the distribution of auto-fluorescence is significantly suppressed in the NIR spectral region, while the fluorescence of a sensor layer embedded in the epidermis is localized at the adjusted depth. The model is also able to simulate the skin fluorescence spectra.
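The core machinery of such a simulation is a weighted-photon random walk: photons take exponentially distributed steps, deposit part of their weight at each interaction, and scatter onward. The one-dimensional homogeneous slab below is a toy version of that machinery, not the layered skin model of the abstract; the `mc_depth_dose` helper and its optical coefficients are assumptions for illustration.

```python
import math
import random

def mc_depth_dose(n_photons, mu_a, mu_s, thickness, seed=1):
    """Minimal 1-D weighted-photon Monte Carlo in a homogeneous slab.

    Photons take steps drawn from an exponential distribution with
    rate mu_t = mu_a + mu_s, deposit a fraction mu_a/mu_t of their
    weight at each interaction, and scatter with a uniformly resampled
    direction cosine.  Returns the absorption-weighted mean depth.
    Coefficients are illustrative, not skin optics.
    """
    rng = random.Random(seed)
    mu_t = mu_a + mu_s
    absorbed_depth = 0.0
    absorbed_weight = 0.0
    for _ in range(n_photons):
        z, mu_z, w = 0.0, 1.0, 1.0          # depth, direction cosine, weight
        while w > 1e-4:
            step = -math.log(rng.random()) / mu_t
            z += mu_z * step
            if z < 0.0 or z > thickness:
                break                        # photon escapes the slab
            dep = w * mu_a / mu_t            # partial absorption
            absorbed_weight += dep
            absorbed_depth += dep * z
            w -= dep
            mu_z = rng.uniform(-1.0, 1.0)    # isotropic 1-D scattering
    return absorbed_depth / absorbed_weight

mean_depth = mc_depth_dose(2000, mu_a=0.5, mu_s=5.0, thickness=2.0)
```

A fluorescence model adds a second stage: weight absorbed by a fluorophore is re-emitted at the fluorescence wavelength and propagated again with that wavelength's coefficients, which is how depth-resolved fluorescence maps like those in the abstract are accumulated.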
Szostek, Kamil; Piórkowski, Adam
2016-10-01
Ultrasound (US) imaging is one of the most popular techniques used in clinical diagnosis, mainly due to the lack of adverse effects on patients and the simplicity of US equipment. However, the characteristics of the medium cause US imaging to reconstruct the examined tissues imprecisely. The artifacts are the results of wave phenomena, i.e. diffraction or refraction, and should be recognized during examination to avoid misinterpretation of an US image. Currently, US training is based on teaching materials and simulators, and ultrasound simulation has become an active research area in medical computer science. Many US simulators are limited by the complexity of the wave phenomena, leading to intensive, sophisticated computation that makes it difficult for systems to operate in real time. To achieve the required frame rate, the vast majority of simulators simplify or neglect wave diffraction and refraction. The following paper proposes a solution for an ultrasound simulator based on methods known in geophysics. To improve simulation quality, a wavefront construction method was adapted which takes the refraction phenomena into account. This technique uses ray tracing and velocity averaging to construct wavefronts in the simulation. Instead of a geological medium, real CT scans are applied. This approach can produce more realistic projections of pathological findings and is also capable of providing real-time simulation.
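Ray-traced wavefront construction with refraction rests on an acoustic Snell's-law step at each interface crossing. A 2-D sketch of that step might look like the following; the `refract_2d` helper and the sound speeds are assumptions for illustration, not the paper's implementation.

```python
import math

def refract_2d(d, n, c1, c2):
    """Refract a 2-D unit ray direction d at an interface with unit
    normal n, passing from sound speed c1 into sound speed c2.

    Acoustic Snell's law: sin(theta_t)/sin(theta_i) = c2/c1.  Returns
    the refracted unit direction, or None on total internal reflection.
    This is the per-interface building block of ray-traced wavefront
    construction.
    """
    cos_i = -(d[0] * n[0] + d[1] * n[1])
    ratio = c2 / c1
    sin2_t = ratio * ratio * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None                          # total internal reflection
    cos_t = math.sqrt(1.0 - sin2_t)
    k = ratio * cos_i - cos_t
    return (ratio * d[0] + k * n[0], ratio * d[1] + k * n[1])

# Ray hitting a horizontal interface 30 degrees from the normal,
# speeding up from 1540 m/s (soft-tissue-like) to 3000 m/s (bone-like):
d = (math.sin(math.radians(30)), -math.cos(math.radians(30)))
t = refract_2d(d, (0.0, 1.0), 1540.0, 3000.0)
```

Repeating this step along each ray, and averaging velocities between traced rays, yields the bent wavefronts that reproduce refraction artifacts in the simulated image.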
GPU-based efficient realistic techniques for bleeding and smoke generation in surgical simulators.
Halic, Tansel; Sankaranarayanan, Ganesh; De, Suvranu
2010-12-01
In actual surgery, smoke and bleeding due to cauterization processes provide important visual cues to the surgeon, which have been proposed as factors in surgical skill assessment. While several virtual reality (VR)-based surgical simulators have incorporated the effects of bleeding and smoke generation, they are not realistic due to the requirement of real-time performance. To be interactive, the visual update must be performed at a minimum of 30 Hz and haptic (touch) information must be refreshed at 1 kHz. Simulation of smoke and bleeding is, therefore, either ignored or simulated using highly simplified techniques, since other computationally intensive processes compete for the available Central Processing Unit (CPU) resources. In this study we developed a novel low-cost method to generate realistic bleeding and smoke in VR-based surgical simulators, which outsources the computations to the graphics processing unit (GPU), thus freeing up the CPU for other time-critical tasks. This method is independent of the complexity of the organ models in the virtual environment. User studies were performed using 20 subjects to determine the visual quality of the simulations compared to real surgical videos. The smoke and bleeding simulation were implemented as part of a laparoscopic adjustable gastric banding (LAGB) simulator. For the bleeding simulation, the original implementation using the shader did not incur noticeable overhead. However, for smoke generation, an input/output (I/O) bottleneck was observed and two different methods were developed to overcome this limitation. Based on our benchmark results, a buffered approach performed better than a pipelined approach and could support up to 15 video streams in real time. Human subject studies showed that the visual realism of the simulations was as good as in real surgery (median rating of 4 on a 5-point Likert scale).
Based on the performance results and subject study, both bleeding and smoke simulations were concluded to be efficient, highly realistic and well suited to VR-based surgical simulators.
GPU-based Efficient Realistic Techniques for Bleeding and Smoke Generation in Surgical Simulators
Halic, Tansel; Sankaranarayanan, Ganesh; De, Suvranu
2010-01-01
Background In actual surgery, smoke and bleeding due to cautery processes provide important visual cues to the surgeon, which have been proposed as factors in surgical skill assessment. While several virtual reality (VR)-based surgical simulators have incorporated the effects of bleeding and smoke generation, they are not realistic due to the requirement of real-time performance. To be interactive, the visual update must be performed at a minimum of 30 Hz and haptic (touch) information must be refreshed at 1 kHz. Simulation of smoke and bleeding is, therefore, either ignored or simulated using highly simplified techniques, since other computationally intensive processes compete for the available CPU resources. Methods In this work, we develop a novel low-cost method to generate realistic bleeding and smoke in VR-based surgical simulators which outsources the computations to the graphics processing unit (GPU), thus freeing up the CPU for other time-critical tasks. This method is independent of the complexity of the organ models in the virtual environment. User studies were performed using 20 subjects to determine the visual quality of the simulations compared to real surgical videos. Results The smoke and bleeding simulation were implemented as part of a Laparoscopic Adjustable Gastric Banding (LAGB) simulator. For the bleeding simulation, the original implementation using the shader did not incur noticeable overhead. However, for smoke generation, an I/O (input/output) bottleneck was observed and two different methods were developed to overcome this limitation. Based on our benchmark results, a buffered approach performed better than a pipelined approach and could support up to 15 video streams in real time. Human subject studies showed that the visual realism of the simulations was as good as in real surgery (median rating of 4 on a 5-point Likert scale).
Conclusions Based on the performance results and subject study, both bleeding and smoke simulations were concluded to be efficient, highly realistic and well suited to VR-based surgical simulators. PMID:20878651
NASA Technical Reports Server (NTRS)
Connor, S. A.; Wierwille, W. W.
1983-01-01
A comparison of the sensitivity and intrusion of twenty pilot workload assessment techniques was conducted using a psychomotor loading task in a three-degree-of-freedom moving-base aircraft simulator. The twenty techniques included opinion measures, spare mental capacity measures, physiological measures, eye behavior measures, and primary task performance measures. The primary task was an instrument landing system (ILS) approach and landing. All measures were recorded between the outer marker and the middle marker on the approach. Three levels (low, medium, and high) of psychomotor load were obtained by the combined manipulation of wind-gust disturbance level and simulated aircraft pitch stability. Six instrument-rated pilots participated in four sessions lasting approximately three hours each.
Estimating School Efficiency: A Comparison of Methods Using Simulated Data.
ERIC Educational Resources Information Center
Bifulco, Robert; Bretschneider, Stuart
2001-01-01
Uses simulated data to assess the adequacy of two econometric and linear-programming techniques (data-envelopment analysis and corrected ordinary least squares) for measuring performance-based school reform. In complex data sets (simulated to contain measurement error and endogeneity), these methods are inadequate efficiency measures. (Contains 40…
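One of the two techniques compared, corrected ordinary least squares (COLS), can be sketched in a few lines: fit an OLS production frontier, shift the intercept up by the largest residual so the line envelops the data, and score each unit by its distance below the shifted frontier. The single-input version and the school data below are hypothetical illustrations, not the study's specification.

```python
import math

def cols_efficiency(x, y):
    """Corrected Ordinary Least Squares (COLS) efficiency scores.

    Fit a one-input linear production frontier by OLS, shift the
    intercept up by the largest residual so the fitted line envelops
    all observations, and score each unit as exp(residual - max
    residual): 1.0 means 'on the frontier'.  A toy single-input
    version of the econometric technique the abstract compares
    against data-envelopment analysis.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    shift = max(resid)                     # envelop the data
    return [math.exp(r - shift) for r in resid]

# Hypothetical outputs for five schools with inputs 1..5:
scores = cols_efficiency([1, 2, 3, 4, 5], [1.0, 2.2, 2.9, 4.3, 5.0])
```

The study's point is that measurement error and endogeneity contaminate exactly the residuals this method interprets as inefficiency, which is why such scores can mislead under realistic data conditions.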
Efficient morse decompositions of vector fields.
Chen, Guoning; Mischaikow, Konstantin; Laramee, Robert S; Zhang, Eugene
2008-01-01
Existing topology-based vector field analysis techniques rely on the ability to extract individual trajectories such as fixed points, periodic orbits, and separatrices, which are sensitive to noise and to errors introduced by simulation and interpolation. This can make such vector field analysis unsuitable for rigorous interpretation. We advocate the use of Morse decompositions, which are robust with respect to perturbations, to encode the topological structures of a vector field in the form of a directed graph, called a Morse connection graph (MCG). While an MCG exists for every vector field, it need not be unique. Previous techniques for computing MCGs, while fast, are overly conservative and usually result in MCGs that are too coarse to be useful in applications. To address this issue, we present a new technique for performing Morse decomposition based on the concept of tau-maps, which typically provides finer MCGs than existing techniques. Furthermore, the choice of tau provides a natural tradeoff between the fineness of the MCGs and the computational cost. We provide efficient implementations of Morse decomposition based on tau-maps, which include the use of forward and backward mapping techniques and an adaptive approach to constructing better approximations of the images of the triangles in the meshes used for simulation. Furthermore, we propose the use of spatial tau-maps in addition to the original temporal tau-maps. These techniques provide additional tradeoffs between the quality of the MCGs and the speed of computation. We demonstrate the utility of our technique with various examples in the plane and on surfaces, including engine simulation data sets.
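Once the tau-map is discretized, computing the decomposition is a graph problem: nodes are mesh cells, each cell gets edges to the cells its image under the time-tau flow overlaps, the nontrivial strongly connected components approximate the Morse sets, and the condensation DAG is the MCG. The toy `cell_map` and the Tarjan-based `condensation` helper below are illustrative assumptions, not the authors' implementation.

```python
def condensation(graph):
    """Strongly connected components of a directed graph (dict:
    node -> set of successors) and the DAG between them.

    In a tau-map Morse decomposition, nodes are mesh cells, edges take
    each cell to the cells its time-tau image overlaps, nontrivial
    SCCs approximate the Morse sets, and the DAG is the Morse
    connection graph (MCG).  Tarjan's algorithm, recursive for brevity.
    """
    index, low, on_stack, stack = {}, {}, set(), []
    comp_of, comps, counter = {}, [], [0]

    def strongconnect(v):
        index[v] = low[v] = counter[0]; counter[0] += 1
        stack.append(v); on_stack.add(v)
        for w in graph.get(v, ()):
            if w not in index:
                strongconnect(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:             # root of an SCC
            comp = set()
            while True:
                w = stack.pop(); on_stack.discard(w)
                comp.add(w); comp_of[w] = len(comps)
                if w == v:
                    break
            comps.append(comp)

    for v in graph:
        if v not in index:
            strongconnect(v)
    dag = {i: set() for i in range(len(comps))}
    for v, succs in graph.items():
        for w in succs:
            if comp_of[v] != comp_of[w]:
                dag[comp_of[v]].add(comp_of[w])
    return comps, dag

# Toy cell map: transient cells 0 and 1 feed a recurrent pair {2, 3}
# (a periodic-orbit-like Morse set); cell 4 is a fixed cell.
cell_map = {0: {1}, 1: {2}, 2: {3}, 3: {2}, 4: {4}}
comps, mcg = condensation(cell_map)
```

Larger tau shrinks each cell's image relative to the flow, which prunes spurious edges and yields the finer MCGs the abstract claims, at the cost of longer trajectory integration per cell.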
NASA Astrophysics Data System (ADS)
Barnes, Brian C.; Leiter, Kenneth W.; Becker, Richard; Knap, Jaroslaw; Brennan, John K.
2017-07-01
We describe the development, accuracy, and efficiency of an automation package for molecular simulation, the large-scale atomic/molecular massively parallel simulator (LAMMPS) integrated materials engine (LIME). Heuristics and algorithms employed for equation of state (EOS) calculation using a particle-based model of a molecular crystal, hexahydro-1,3,5-trinitro-s-triazine (RDX), are described in detail. The simulation method for the particle-based model is energy-conserving dissipative particle dynamics, but the techniques used in LIME are generally applicable to molecular dynamics simulations with a variety of particle-based models. The newly created tool set is tested through use of its EOS data in plate impact and Taylor anvil impact continuum simulations of solid RDX. The coarse-grain model results from LIME provide an approach to bridge the scales from atomistic simulations to continuum simulations.
Tensor-product preconditioners for a space-time discontinuous Galerkin method
NASA Astrophysics Data System (ADS)
Diosady, Laslo T.; Murman, Scott M.
2014-10-01
A space-time discontinuous Galerkin spectral element discretization is presented for direct numerical simulation of the compressible Navier-Stokes equations. An efficient solution technique based on a matrix-free Newton-Krylov method is presented. A diagonalized alternating direction implicit preconditioner is extended to a space-time formulation using entropy variables. The effectiveness of this technique is demonstrated for the direct numerical simulation of turbulent flow in a channel.
Finite Element Analysis of Lamb Waves Acting within a Thin Aluminum Plate
2007-09-01
… signal to avoid time aliasing … % LambWaveMode — Lamb wave mode to simulate; use proper phase velocity curve … % thickness — thickness of … Analysis of the simulated signal response data demonstrated that elevated temperatures delay wave propagation, although the delays are minimal at the … Echo Techniques: Ultrasonic NDE techniques are based on the propagation and reflection of elastic waves, with the assumption that damage in the
Modified Dual Three-Pulse Modulation technique for single-phase inverter topology
NASA Astrophysics Data System (ADS)
Sree Harsha, N. R.; Anitha, G. S.; Sreedevi, A.
2016-01-01
In a recent paper, a new modulation technique called Dual Three-Pulse Modulation (DTPM) was proposed to improve the efficiency of the power converters of electric/hybrid/fuel-cell vehicles. It was simulated in PSIM 9.0.4 and uses analog multiplexers to generate the modulating signals for the DC/DC converter and inverter. The circuit used is complex, and many other simulation software packages do not support analog multiplexers. Also, the DTPM technique produces modulating signals for the converter, which are needed in order to produce the modulating signals for the inverter; hence, it cannot be used efficiently to switch the valves of a stand-alone inverter. We propose a new method to generate the modulating signals to switch the MOSFETs of a single-phase DTPM-based stand-alone inverter. The proposed circuits are simulated in Multisim 12.0. We also show an alternate way to switch a DC/DC converter in the manner depicted by the DTPM technique, both in simulation (MATLAB/Simulink) and in hardware. The circuitry is relatively simple and can be used for further investigations of the DTPM technique.
NASA Technical Reports Server (NTRS)
Larson, Melora; Israelsson, Ulf E.
1995-01-01
There has been a recent increase in interest, both experimental and theoretical, in the study of liquid helium very near the lambda transition in the presence of a heat current. In traditional ground-based experiments there are gravitationally induced pressure variations in any macroscopic helium sample that limit how closely the transition can be approached. We have taken advantage of the finite magnetic susceptibility of ⁴He to build a magnetostrictive low-gravity simulator. The simulator consists of a superconducting magnet with a field profile shaped to counteract the force of gravity on a helium sample. When the magnet is operated with B·dB/dz = 21 T²/cm at the location of the cell, the gravitationally induced pressure variations will be canceled to within 1% over a volume 0.5 cm in height and 0.5 cm in diameter. This technique for canceling the pressure variations in a long sample cell allows the lambda transition to be studied much closer in reduced temperature and under a wider range of applied heat currents than is possible using other ground-based techniques. Preliminary results using this low-gravity simulator and the limitations of the magnetostrictive technique in comparison to space-based experiments will be presented.
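The cancellation condition can be checked with a one-line force balance: the magnetic force density on a weakly magnetic fluid is (χ/μ₀)·B·dB/dz, and setting its magnitude equal to the weight density ρg gives the required field-gradient product. The susceptibility and density below are literature-order estimates for liquid helium near the lambda point, assumed for illustration rather than taken from the abstract.

```python
import math

MU0 = 4.0 * math.pi * 1e-7      # vacuum permeability, T*m/A

def required_b_dbdz(chi_v, rho, g=9.81):
    """Field-gradient product B*dB/dz (T^2/m) needed so the magnetic
    force density (chi_v/mu0)*B*dB/dz on a diamagnetic fluid cancels
    its weight rho*g.  chi_v is the (negative) SI volume
    susceptibility; rho is the mass density in kg/m^3.
    """
    return MU0 * rho * g / abs(chi_v)

# Assumed order-of-magnitude values for liquid helium-4 near T_lambda:
chi_he4 = -8.6e-7               # SI volume susceptibility (estimate)
rho_he4 = 145.0                 # kg/m^3 (estimate)
b_dbdz = required_b_dbdz(chi_he4, rho_he4)   # T^2/m
b_dbdz_per_cm = b_dbdz / 100.0               # T^2/cm
```

With these assumed inputs the result comes out near 21 T²/cm, consistent with the operating point quoted in the abstract; the smallness of χ is why such a large field-gradient product, and hence a superconducting magnet, is needed.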
Sun, Tie Gang; Xiao, Rong Bo; Cai, Yun Nan; Wang, Yao Wu; Wu, Chang Guang
2016-08-01
Quantitative assessment of the urban thermal environment has become a focus for urban climate and environmental science since the concept of the urban heat island was proposed. With the continual development of spatial information and computer simulation technology, substantial progress has been made on quantitative assessment techniques and methods for the urban thermal environment. These techniques have developed from statistical analysis of the urban-scale thermal environment using historical weather-station data toward dynamic simulation and forecasting of the thermal environment at various scales. This study reviewed the progress of ground meteorological observation, thermal infrared remote sensing, and numerical simulation. Moreover, the potential advantages and disadvantages, the applicability, and the development trends of these techniques were summarized, aiming to add fundamental knowledge for understanding urban thermal environment assessment and optimization.
Modeling the Performance of Direct-Detection Doppler Lidar Systems in Real Atmospheres
NASA Technical Reports Server (NTRS)
McGill, Matthew J.; Hart, William D.; McKay, Jack A.; Spinhirne, James D.
1999-01-01
Previous modeling of the performance of spaceborne direct-detection Doppler lidar systems has assumed extremely idealized atmospheric models. Here we develop a technique for modeling the performance of these systems in a more realistic atmosphere, based on actual airborne lidar observations. The resulting atmospheric model contains cloud and aerosol variability that is absent in other simulations of spaceborne Doppler lidar instruments. To produce a realistic simulation of daytime performance, we include solar radiance values that are based on actual measurements and are allowed to vary as the viewing scene changes. Simulations are performed for two types of direct-detection Doppler lidar systems: the double-edge and the multi-channel techniques. Both systems were optimized to measure winds from Rayleigh backscatter at 355 nm. Simulations show that the measurement uncertainty during daytime is degraded by only about 10-20% compared to nighttime performance, provided a proper solar filter is included in the instrument design.
McGill, M J; Hart, W D; McKay, J A; Spinhirne, J D
1999-10-20
Previous modeling of the performance of spaceborne direct-detection Doppler lidar systems assumed extremely idealized atmospheric models. Here we develop a technique for modeling the performance of these systems in a more realistic atmosphere, based on actual airborne lidar observations. The resulting atmospheric model contains cloud and aerosol variability that is absent in other simulations of spaceborne Doppler lidar instruments. To produce a realistic simulation of daytime performance, we include solar radiance values that are based on actual measurements and are allowed to vary as the viewing scene changes. Simulations are performed for two types of direct-detection Doppler lidar system: the double-edge and the multichannel techniques. Both systems were optimized to measure winds from Rayleigh backscatter at 355 nm. Simulations show that the measurement uncertainty during daytime is degraded by only approximately 10-20% compared with nighttime performance, provided that a proper solar filter is included in the instrument design.
Wood lens design philosophy based on a binary additive manufacturing technique
NASA Astrophysics Data System (ADS)
Marasco, Peter L.; Bailey, Christopher
2016-04-01
Using additive manufacturing techniques in optical engineering to construct a gradient index (GRIN) optic may overcome a number of limitations of GRIN technology. Such techniques are maturing quickly, yielding additional design degrees of freedom for the engineer. How best to employ these degrees of freedom is not completely clear at this time. This paper describes a preliminary design philosophy, including assumptions, pertaining to a particular printing technique for GRIN optics. It includes an analysis based on simulation and initial component measurement.
NASA Astrophysics Data System (ADS)
Spangehl, Thomas; Schröder, Marc; Bodas-Salcedo, Alejandro; Glowienka-Hense, Rita; Hense, Andreas; Hollmann, Rainer; Dietzsch, Felix
2017-04-01
Decadal climate predictions are commonly evaluated focusing on geophysical parameters such as temperature, precipitation or wind speed using observational datasets and reanalysis. Alternatively, satellite based radiance measurements combined with satellite simulator techniques to deduce virtual satellite observations from the numerical model simulations can be used. The latter approach enables an evaluation in the instrument's parameter space and has the potential to reduce uncertainties on the reference side. Here we present evaluation methods focusing on forward operator techniques for the Special Sensor Microwave Imager (SSM/I). The simulator is developed as an integrated part of the CFMIP Observation Simulator Package (COSP). On the observational side the SSM/I and SSMIS Fundamental Climate Data Record (FCDR) released by CM SAF (http://dx.doi.org/10.5676/EUM_SAF_CM/FCDR_MWI/V002) is used, which provides brightness temperatures for different channels and covers the period from 1987 to 2013. The simulator is applied to hindcast simulations performed within the MiKlip project (http://fona-miklip.de) which is funded by the BMBF (Federal Ministry of Education and Research in Germany). Probabilistic evaluation results are shown based on a subset of the hindcast simulations covering the observational period.
Knowledge Based Cloud FE Simulation of Sheet Metal Forming Processes.
Zhou, Du; Yuan, Xi; Gao, Haoxiang; Wang, Ailing; Liu, Jun; El Fakir, Omer; Politis, Denis J; Wang, Liliang; Lin, Jianguo
2016-12-13
The use of Finite Element (FE) simulation software to adequately predict the outcome of sheet metal forming processes is crucial to enhancing the efficiency and lowering the development time of such processes, whilst reducing costs involved in trial-and-error prototyping. Recent focus on the substitution of steel components with aluminum alloy alternatives in the automotive and aerospace sectors has increased the need to simulate the forming behavior of such alloys for ever more complex component geometries. However these alloys, and in particular their high strength variants, exhibit limited formability at room temperature, and high temperature manufacturing technologies have been developed to form them. Consequently, advanced constitutive models are required to reflect the associated temperature and strain rate effects. Simulating such behavior is computationally very expensive using conventional FE simulation techniques. This paper presents a novel Knowledge Based Cloud FE (KBC-FE) simulation technique that combines advanced material and friction models with conventional FE simulations in an efficient manner thus enhancing the capability of commercial simulation software packages. The application of these methods is demonstrated through two example case studies, namely: the prediction of a material's forming limit under hot stamping conditions, and the tool life prediction under multi-cycle loading conditions.
Knowledge Based Cloud FE Simulation of Sheet Metal Forming Processes
Zhou, Du; Yuan, Xi; Gao, Haoxiang; Wang, Ailing; Liu, Jun; El Fakir, Omer; Politis, Denis J.; Wang, Liliang; Lin, Jianguo
2016-01-01
The use of Finite Element (FE) simulation software to adequately predict the outcome of sheet metal forming processes is crucial to enhancing the efficiency and lowering the development time of such processes, whilst reducing costs involved in trial-and-error prototyping. Recent focus on the substitution of steel components with aluminum alloy alternatives in the automotive and aerospace sectors has increased the need to simulate the forming behavior of such alloys for ever more complex component geometries. However these alloys, and in particular their high strength variants, exhibit limited formability at room temperature, and high temperature manufacturing technologies have been developed to form them. Consequently, advanced constitutive models are required to reflect the associated temperature and strain rate effects. Simulating such behavior is computationally very expensive using conventional FE simulation techniques. This paper presents a novel Knowledge Based Cloud FE (KBC-FE) simulation technique that combines advanced material and friction models with conventional FE simulations in an efficient manner thus enhancing the capability of commercial simulation software packages. The application of these methods is demonstrated through two example case studies, namely: the prediction of a material's forming limit under hot stamping conditions, and the tool life prediction under multi-cycle loading conditions. PMID:28060298
Patel, Ravi G.; Desjardins, Olivier; Kong, Bo; ...
2017-09-01
Here, we present a verification study of three simulation techniques for fluid–particle flows, including an Euler–Lagrange approach (EL) inspired by Jackson's seminal work on fluidized particles, a quadrature-based moment method based on the anisotropic Gaussian closure (AG), and the traditional two-fluid model (TFM). We perform simulations of two problems: particles in frozen homogeneous isotropic turbulence (HIT) and cluster-induced turbulence (CIT). For verification, we evaluate various techniques for extracting statistics from EL and study the convergence properties of the three methods under grid refinement. The convergence is found to depend on the simulation method and on the problem, with CIT simulations posing fewer difficulties than HIT. Specifically, EL converges under refinement for both HIT and CIT, but statistics exhibit dependence on the postprocessing parameters. For CIT, AG produces similar results to EL. For HIT, converging both TFM and AG poses challenges. Overall, extracting converged, parameter-independent Eulerian statistics remains a challenge for all methods.
Medical Team Training Programs in Health Care
2005-01-01
simulator-based programs and classroom-based programs. Specifically, we examine the purpose and strategy of each and then review the reported empirical...evidence. In addition, for three of four classroom-based programs we report the results from a series of course observations, curriculum reviews...the-art simulators, whereas others primarily use classroom techniques. Despite these differences, all are heavily inspired by CRM and share the common
A Framework for Simulating Turbine-Based Combined-Cycle Inlet Mode-Transition
NASA Technical Reports Server (NTRS)
Le, Dzu K.; Vrnak, Daniel R.; Slater, John W.; Hessel, Emil O.
2012-01-01
A simulation framework based on the memory-mapped-files (MMF) technique was created to run multiple numerical processes in locked time-steps and exchange I/O data synchronously among them to simulate system dynamics. This simulation scheme is currently used to study the complex interactions between inlet flow dynamics, variable-geometry actuation mechanisms, and flow controls during the transition from supersonic to hypersonic conditions and vice versa. A study of mode-transition control for a high-speed inlet wind-tunnel model with this MMF-based framework is presented to illustrate the scheme and demonstrate its usefulness in simulating supersonic and hypersonic inlet dynamics and controls, as well as other types of complex systems.
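The lock-stepped memory-mapped-file exchange described above can be sketched as follows. This is a minimal single-process illustration: a real framework would map the same file from separate processes and synchronize on the step counter, and the two-field record layout is an assumption, not the framework's actual interface.

```python
# Minimal sketch of memory-mapped-file (MMF) data exchange between
# simulation processes. Illustrative only: a real framework maps the same
# file from separate processes and synchronizes on the step counter.
import mmap
import struct
import tempfile

RECORD = struct.Struct("<id")  # (step number, state value) -- assumed layout

def make_shared_buffer():
    """Create a file-backed buffer that cooperating processes could map."""
    f = tempfile.TemporaryFile()
    f.truncate(RECORD.size)           # mmap requires a non-empty file
    return f, mmap.mmap(f.fileno(), RECORD.size)

def publish(buf, step, value):
    """Producer side: write this time-step's output into the shared buffer."""
    buf.seek(0)
    buf.write(RECORD.pack(step, value))

def poll(buf):
    """Consumer side: read the latest record back in lock-step."""
    buf.seek(0)
    return RECORD.unpack(buf.read(RECORD.size))

f, buf = make_shared_buffer()
publish(buf, 1, 42.5)
step, value = poll(buf)
```

In a real deployment each solver process would map the same named file and advance only when its peers' step counters catch up, which is what produces the locked time-steps.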
NASA Astrophysics Data System (ADS)
Ibraheem, Omveer, Hasan, N.
2010-10-01
A new hybrid stochastic search technique is proposed for the design of a suboptimal AGC regulator for a two-area interconnected non-reheat thermal power system incorporating a DC link in parallel with the AC tie-line. The technique hybridizes a genetic algorithm (GA) with simulated annealing (SA) to form a GASA-based regulator. GASA has been successfully applied to constrained feedback control problems where other PI-based techniques have often failed. The main idea in this scheme is to seek a feasible PI-based suboptimal solution at each sampling time; the feasible solution decreases the cost function rather than fully minimizing it.
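A minimal sketch of a GASA-style hybrid follows, assuming a toy quadratic cost in place of the real AGC performance index; the two-gain parameterization and all constants are illustrative, not the paper's.

```python
# Minimal, illustrative hybrid GA + simulated-annealing (GASA) search for a
# pair of PI gains (Kp, Ki). The quadratic cost stands in for the real AGC
# performance index; all constants are assumptions.
import math
import random

def cost(gains):
    kp, ki = gains
    return (kp - 1.5) ** 2 + (ki - 0.4) ** 2  # toy index, optimum at (1.5, 0.4)

def gasa(pop_size=20, generations=60, temp=1.0, cooling=0.95, seed=0):
    rng = random.Random(seed)
    pop = [(rng.uniform(0, 3), rng.uniform(0, 1)) for _ in range(pop_size)]
    best = min(pop, key=cost)
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            a, b = rng.sample(pop, 2)                            # GA: parents
            child = tuple((x + y) / 2 for x, y in zip(a, b))     # crossover
            child = tuple(x + rng.gauss(0, 0.1) for x in child)  # mutation
            parent = min(a, b, key=cost)
            # SA: accept a worse child with temperature-dependent probability
            d = cost(child) - cost(parent)
            if d < 0 or rng.random() < math.exp(-d / max(temp, 1e-9)):
                nxt.append(child)
            else:
                nxt.append(parent)
        pop = nxt
        temp *= cooling                 # anneal: accept fewer bad moves
        best = min(best, *pop, key=cost)
    return best

kp, ki = gasa()
```

The SA acceptance rule keeps the GA from stagnating early, while cooling steers the later generations toward pure improvement — the "seek a feasible, cost-decreasing solution" behavior the abstract describes.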
Self-Tuning of Design Variables for Generalized Predictive Control
NASA Technical Reports Server (NTRS)
Lin, Chaung; Juang, Jer-Nan
2000-01-01
Three techniques are introduced to determine the order and control weighting for the design of a generalized predictive controller. These techniques are based on the application of fuzzy logic, genetic algorithms, and simulated annealing to conduct an optimal search on specific performance indexes or objective functions. Fuzzy logic is found to be feasible for real-time and on-line implementation due to its smooth and quick convergence. On the other hand, genetic algorithms and simulated annealing are applicable for initial estimation of the model order and control weighting, and for final fine-tuning within a small region of the solution space. Several numerical simulations for a multiple-input and multiple-output system are given to illustrate the techniques developed in this paper.
Logan, Heather; Wolfaardt, Johan; Boulanger, Pierre; Hodgetts, Bill; Seikaly, Hadi
2013-06-19
It is important to understand the perceived value of surgical design and simulation (SDS) amongst surgeons, as this will influence its implementation in clinical settings. The purpose of the present study was to examine the application of the convergent interview technique in the field of surgical design and simulation and evaluate whether the technique would uncover new perceptions of virtual surgical planning (VSP) and medical models not discovered by other qualitative case-based techniques. Five surgeons were asked to participate in the study. Each participant was interviewed following the convergent interview technique. After each interview, the interviewer interpreted the information by seeking agreements and disagreements among the interviewees in order to understand the key concepts in the field of SDS. Fifteen important issues were extracted from the convergent interviews. In general, the convergent interview was an effective technique in collecting information about the perception of clinicians. The study identified three areas where the technique could be improved upon for future studies in the SDS field.
2013-01-01
Background It is important to understand the perceived value of surgical design and simulation (SDS) amongst surgeons, as this will influence its implementation in clinical settings. The purpose of the present study was to examine the application of the convergent interview technique in the field of surgical design and simulation and evaluate whether the technique would uncover new perceptions of virtual surgical planning (VSP) and medical models not discovered by other qualitative case-based techniques. Methods Five surgeons were asked to participate in the study. Each participant was interviewed following the convergent interview technique. After each interview, the interviewer interpreted the information by seeking agreements and disagreements among the interviewees in order to understand the key concepts in the field of SDS. Results Fifteen important issues were extracted from the convergent interviews. Conclusion In general, the convergent interview was an effective technique in collecting information about the perception of clinicians. The study identified three areas where the technique could be improved upon for future studies in the SDS field. PMID:23782771
NASA Astrophysics Data System (ADS)
Zhang, Huibin; Wang, Yuqiao; Chen, Haoran; Zhao, Yongli; Zhang, Jie
2017-12-01
In software defined optical networks (SDON), the centralized control plane may encounter numerous intrusion threats which compromise the security level of provisioned services. In this paper, the issue of control plane security is studied and two machine-learning-based control plane intrusion detection techniques are proposed for SDON, with properly selected features such as bandwidth, route length, etc. We validate the feasibility and efficiency of the proposed techniques by simulations. Results show that an accuracy of 83% for intrusion detection can be achieved with the proposed machine-learning-based techniques.
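The feature-based classification idea can be illustrated with a toy nearest-centroid rule on the two features the paper names; the data, the centroid rule itself, and all values are stand-ins, not the paper's actual models.

```python
# Illustrative sketch of feature-based intrusion classification of the kind
# the paper describes. A nearest-centroid rule on two assumed features
# (bandwidth, route length) stands in for the paper's ML models.
def centroid(samples):
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(len(samples[0])))

def train(normal, intrusion):
    """Learn one centroid per class from labeled feature vectors."""
    return {"normal": centroid(normal), "intrusion": centroid(intrusion)}

def classify(model, x):
    """Assign x to the class whose centroid is nearest (squared distance)."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(model, key=lambda label: dist2(model[label], x))

# (bandwidth in Gb/s, route length in hops) -- synthetic training data
model = train(normal=[(10, 3), (12, 4), (9, 3)],
              intrusion=[(40, 9), (35, 8), (42, 10)])
label = classify(model, (38, 9))  # resembles the intrusion cluster
```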
Virtual reality based surgery simulation for endoscopic gynaecology.
Székely, G; Bajka, M; Brechbühler, C; Dual, J; Enzler, R; Haller, U; Hug, J; Hutter, R; Ironmonger, N; Kauer, M; Meier, V; Niederer, P; Rhomberg, A; Schmid, P; Schweitzer, G; Thaler, M; Vuskovic, V; Tröster, G
1999-01-01
Virtual reality (VR) based surgical simulator systems offer elegant possibilities to both enrich and enhance traditional education in endoscopic surgery. However, while a wide range of VR simulator systems have been proposed and realized in the past few years, most of these systems fall far short of providing a reasonably realistic surgical environment. We explore the basic approaches underlying the current limits of realism and seek to extend them, based on a description and analysis of the most important components of a VR-based endoscopic simulator. The feasibility of the proposed techniques is demonstrated on a first modular prototype system implementing the basic algorithms for VR training in gynaecologic laparoscopy.
3D Fiber Orientation Simulation for Plastic Injection Molding
NASA Astrophysics Data System (ADS)
Lin, Baojiu; Jin, Xiaoshi; Zheng, Rong; Costa, Franco S.; Fan, Zhiliang
2004-06-01
Glass fiber reinforced polymer is widely used in products made by injection molding. The distribution of fiber orientation inside plastic parts has a direct effect on the quality of molded parts. Using computer simulation to predict the fiber orientation distribution is one of the most efficient ways to assist engineers in performing warpage analysis and finding a good design solution for producing high-quality plastic parts. Fiber orientation simulation software based on 2-1/2D (midplane/dual-domain mesh) techniques has been used in industry for a decade. However, the 2-1/2D technique is based on the planar Hele-Shaw approximation and is not suitable when the geometry has complex three-dimensional features that cannot be well approximated by 2D shells. Recently, full 3D simulation software for fiber orientation has been developed and integrated into the Moldflow Plastics Insight 3D simulation software. The theory behind this new 3D fiber orientation calculation module is described in this paper. Several examples are also presented to show the benefit of using 3D fiber orientation simulation.
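As a hedged illustration of the physics underlying fiber orientation simulation, the following integrates Jeffery's equation for a single rigid fiber in simple shear. Production codes like the one described evolve orientation tensors with closure approximations rather than single fibers; the flow, shape factor, and step sizes here are assumptions.

```python
# Illustrative sketch: evolving a single rigid fiber's orientation in simple
# shear with Jeffery's equation, a building block behind fiber orientation
# simulation (real codes evolve orientation tensors with closures instead).
import math

def jeffery_step(p, L, lam, dt):
    """One explicit Euler step of dp/dt = W.p + lam*(D.p - (p.D.p) p)."""
    # Split the velocity gradient L into rate-of-strain D and vorticity W.
    D = [[0.5 * (L[i][j] + L[j][i]) for j in range(3)] for i in range(3)]
    W = [[0.5 * (L[i][j] - L[j][i]) for j in range(3)] for i in range(3)]
    Dp = [sum(D[i][j] * p[j] for j in range(3)) for i in range(3)]
    Wp = [sum(W[i][j] * p[j] for j in range(3)) for i in range(3)]
    pDp = sum(p[i] * Dp[i] for i in range(3))
    q = [p[i] + dt * (Wp[i] + lam * (Dp[i] - pDp * p[i])) for i in range(3)]
    norm = math.sqrt(sum(x * x for x in q))
    return [x / norm for x in q]  # keep p a unit vector

# Simple shear u = (gamma*y, 0, 0); slender fiber (shape factor lam near 1).
L = [[0.0, 1.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]
p = [0.0, 1.0, 0.0]  # start perpendicular to the flow
for _ in range(2000):
    p = jeffery_step(p, L, lam=0.95, dt=0.01)
# The fiber tumbles in the shear (x-y) plane, spending most of each
# Jeffery orbit nearly aligned with the flow direction.
```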
Cryotherapy simulator for localized prostate cancer.
Hahn, James K; Manyak, Michael J; Jin, Ge; Kim, Dongho; Rewcastle, John; Kim, Sunil; Walsh, Raymond J
2002-01-01
Cryotherapy is a treatment modality that uses a technique to selectively freeze tissue and thereby cause controlled tissue destruction. The procedure involves placement of multiple small diameter probes through the perineum into the prostate tissue at selected spatial intervals. Transrectal ultrasound is used to properly position the cylindrical probes before activation of the liquid Argon cooling element, which lowers the tissue temperature below -40 degrees Centigrade. Tissue effect is monitored by transrectal ultrasound changes as well as thermocouples placed in the tissue. The computer-based cryotherapy simulation system mimics the major surgical steps involved in the procedure. The simulated real-time ultrasound display is generated from 3-D ultrasound datasets where the interaction of the ultrasound with the instruments as well as the frozen tissue is simulated by image processing. The thermal and mechanical simulations of the tissue are done using a modified finite-difference/finite-element method optimized for real-time performance. The simulator developed is a part of a comprehensive training program, including a computer-based learning system and hands-on training program with a proctor, designed to familiarize the physician with the technique and equipment involved.
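The thermal side of such a simulator can be sketched with a one-dimensional explicit finite-difference model of heat diffusion from a probe held at -40 degrees; the grid size, step count, and diffusion number below are illustrative assumptions, not the simulator's parameters.

```python
# Illustrative finite-difference sketch of the kind of thermal model a
# cryotherapy simulator uses: 1D heat diffusion from a cold probe held at
# -40 C into tissue at body temperature. Parameters are assumptions.
def simulate_cooling(n=50, steps=400, alpha=0.1):
    """Explicit scheme T_i += alpha*(T_{i-1} - 2*T_i + T_{i+1}); stable for alpha <= 0.5."""
    T = [37.0] * n          # tissue initially at 37 C
    T[0] = -40.0            # probe boundary held at -40 C
    for _ in range(steps):
        new = T[:]
        for i in range(1, n - 1):
            new[i] = T[i] + alpha * (T[i - 1] - 2 * T[i] + T[i + 1])
        new[0] = -40.0      # fixed probe temperature
        T = new
    return T

T = simulate_cooling()
frozen = sum(1 for t in T if t < 0.0)  # nodes cooled below freezing
```

A real simulator solves this in 3D with multiple probes and tissue-dependent properties, but the stability constraint (here alpha <= 0.5) is the same consideration that drives the "optimized for real-time performance" remark above.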
NASA Astrophysics Data System (ADS)
Trivedi, Nitin; Kumar, Manoj; Haldar, Subhasis; Deswal, S. S.; Gupta, Mridula; Gupta, R. S.
2017-09-01
A charge plasma technique based dopingless (DL) accumulation mode (AM) junctionless (JL) cylindrical surrounding gate (CSG) MOSFET is proposed and extensively investigated. The proposed device has no physical junction at the source-to-channel and channel-to-drain interfaces, and the complete silicon pillar is undoped. The high free-electron density (induced N+ region) is created by keeping the work function of the source/drain metal contacts lower than that of undoped silicon. Fabrication complexity is thus drastically reduced by removing the need for high-temperature doping techniques. The electrical/analog characteristics of the proposed device have been extensively investigated using numerical simulation with the ATLAS-3D device simulator and are compared with a conventional junctionless cylindrical surrounding gate (JL-CSG) MOSFET of identical dimensions. The results show that the proposed device is more immune to short-channel effects than the conventional JL-CSG MOSFET and is suitable for faster switching applications due to its higher I_ON/I_OFF ratio.
Fast simulation of electromagnetic and hadronic showers in SpaCal calorimeter at the H1 experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raičević, Nataša, E-mail: raicevic@mail.desy.de; Glazov, Alexandre
2016-03-25
The fast simulation of showers induced by electrons (positrons) in the H1 lead/scintillating-fiber calorimeter, SpaCal, based on a shower library technique was presented previously. In this paper we show results on the linearity and uniformity of the reconstructed electron/positron cluster energy in the electromagnetic section of SpaCal for simulations based on the shower library and on the GFLASH shower parametrisation. The shapes of clusters originating from photon and hadron candidates in SpaCal are analysed and the experimental distributions compared with the two simulations.
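The shower library idea can be sketched as follows: pre-store sampled deposit profiles binned by particle energy, then look one up and scale it at simulation time instead of simulating the shower in full. The bin edges, layer count, and random profiles are toy assumptions, not H1/SpaCal data.

```python
# Illustrative sketch of the shower-library technique: pre-store sample
# deposit profiles per energy bin; at simulation time, draw one and scale.
# All values are invented, not H1/SpaCal data.
import bisect
import random

def build_library(bin_energies, showers_per_bin=100, seed=1):
    """Pre-generate deposit profiles (fractions per layer) for each energy bin."""
    rng = random.Random(seed)
    lib = {}
    for e in bin_energies:
        profiles = []
        for _ in range(showers_per_bin):
            raw = [rng.random() for _ in range(5)]   # 5 calorimeter layers
            s = sum(raw)
            profiles.append([x / s for x in raw])    # normalize to unit sum
        lib[e] = profiles
    return lib

def fast_shower(lib, bins, energy, rng):
    """Pick the energy bin at or above `energy` (clamped), draw a profile, scale."""
    i = min(bisect.bisect_left(bins, energy), len(bins) - 1)
    profile = rng.choice(lib[bins[i]])
    return [energy * f for f in profile]

bins = [1.0, 5.0, 10.0, 30.0]          # GeV bin edges (assumed)
lib = build_library(bins)
deposit = fast_shower(lib, bins, 7.2, random.Random(2))
```

The speed-up comes from replacing a per-particle GEANT-style transport with a table lookup; the trade-off, as the abstract notes, is that linearity and uniformity of the scaled library must be validated against full simulation.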
Interactive physically-based sound simulation
NASA Astrophysics Data System (ADS)
Raghuvanshi, Nikunj
The realization of interactive, immersive virtual worlds requires the ability to present a realistic audio experience that convincingly complements their visual rendering. Physical simulation is a natural way to achieve such realism, enabling deeply immersive virtual worlds. However, physically-based sound simulation is very computationally expensive owing to the high-frequency, transient oscillations underlying audible sounds. The increasing computational power of desktop computers has served to reduce the gap between required and available computation, and it has become possible to bridge this gap further by using a combination of algorithmic improvements that exploit the physical, as well as perceptual, properties of audible sounds. My thesis is a step in this direction. My dissertation concentrates on developing real-time techniques for both sub-problems of sound simulation: synthesis and propagation. Sound synthesis is concerned with generating the sounds produced by objects due to elastic surface vibrations upon interaction with the environment, such as collisions. I present novel techniques that exploit human auditory perception to simulate scenes with hundreds of sounding objects undergoing impact and rolling in real time. Sound propagation is the complementary problem of modeling the high-order scattering and diffraction of sound in an environment as it travels from source to listener. I discuss my work on a novel numerical acoustic simulator (ARD) that is a hundred times faster and consumes ten times less memory than a high-accuracy finite-difference technique, allowing acoustic simulations on previously intractable spaces, such as a cathedral, on a desktop computer.
Lastly, I present my work on interactive sound propagation that leverages my ARD simulator to render the acoustics of arbitrary static scenes for multiple moving sources and a moving listener in real time, while accounting for scene-dependent effects such as low-pass filtering and smooth attenuation behind obstructions, reverberation, scattering from complex geometry, and sound focusing. This is enabled by a novel compact representation that takes a thousand times less memory than a direct scheme, reducing memory footprints to fit within available main memory. To the best of my knowledge, this is the only technique and system in existence to demonstrate auralization of physical wave-based effects in real time on large, complex 3D scenes.
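As a hedged illustration of the kind of time-domain numerical acoustics this work builds on, here is a one-dimensional finite-difference wave-equation solver. ARD itself is far more sophisticated; this is a toy version of the baseline technique it accelerates, with invented grid and step sizes.

```python
# Illustrative 1D finite-difference solver for the acoustic wave equation
# u_tt = c^2 u_xx, the kind of high-accuracy baseline ARD is compared to.
def fdtd_1d(n=200, steps=300, c=1.0, dt=0.5, dx=1.0):
    """Leapfrog update; stable while the CFL number c*dt/dx <= 1."""
    r2 = (c * dt / dx) ** 2
    prev = [0.0] * n
    curr = [0.0] * n
    curr[n // 2] = 1.0                       # initial pressure pulse
    for _ in range(steps):
        nxt = [0.0] * n                      # u = 0 at both ends (rigid walls)
        for i in range(1, n - 1):
            nxt[i] = (2 * curr[i] - prev[i]
                      + r2 * (curr[i - 1] - 2 * curr[i] + curr[i + 1]))
        prev, curr = curr, nxt
    return curr

u = fdtd_1d()
```

The cost scales with grid points times time-steps, and audible frequencies force fine grids — which is exactly why the dissertation's hundred-fold speed-up over such schemes matters.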
Automatic Parameter Tuning for the Morpheus Vehicle Using Particle Swarm Optimization
NASA Technical Reports Server (NTRS)
Birge, B.
2013-01-01
A high-fidelity simulation based on the PC-based Trick framework has been developed for Johnson Space Center's Morpheus test bed flight vehicle. Development follows an iterative loop: refining and testing the hardware, refining the software, comparing the software simulation to hardware performance, and adjusting either or both to extract the best performance from the hardware as well as the most realistic representation of the hardware from the software. A Particle Swarm Optimization (PSO) based technique has been developed that increases the speed and accuracy of this iterative development cycle. Parameters in software can be automatically tuned to make the simulation match real-world subsystem data from test flights. Special considerations for scale, linearity, and discontinuities can be all but ignored with this technique, allowing fast turnaround both for simulation tune-up to match hardware changes and during the test and validation phase to help identify hardware issues. Software models with insufficient control authority to match hardware test data can be immediately identified, and using this technique requires little to no specialized knowledge of optimization, freeing model developers to concentrate on spacecraft engineering. Integration of the PSO into the Morpheus development cycle is discussed, along with a case study highlighting the tool's effectiveness.
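A minimal PSO sketch follows, with a quadratic error standing in for the real simulation-versus-telemetry mismatch; the swarm constants are conventional defaults, not Morpheus values.

```python
# Minimal particle swarm optimization (PSO) sketch of the kind used to tune
# simulation parameters against flight data; the quadratic error stands in
# for the real simulation-vs-telemetry mismatch. Constants are assumptions.
import random

def error(params):
    # Toy objective: distance from the "true" hardware parameters (2.0, -1.0).
    x, y = params
    return (x - 2.0) ** 2 + (y + 1.0) ** 2

def pso(n=15, iters=100, w=0.7, c1=1.5, c2=1.5, seed=3):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5), rng.uniform(-5, 5)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    gbest = min(pbest, key=error)[:]            # swarm's best position
    for _ in range(iters):
        for i in range(n):
            for d in range(2):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if error(pos[i]) < error(pbest[i]):
                pbest[i] = pos[i][:]
                if error(pbest[i]) < error(gbest):
                    gbest = pbest[i][:]
    return gbest

best = pso()
```

Because PSO only ever evaluates the objective, it needs no gradients or smoothness — which is why scale, linearity, and discontinuities "can be all but ignored" as the abstract states.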
An object-oriented simulator for 3D digital breast tomosynthesis imaging system.
Seyyedi, Saeed; Cengiz, Kubra; Kamasak, Mustafa; Yildirim, Isa
2013-01-01
Digital breast tomosynthesis (DBT) is an innovative imaging modality that provides 3D reconstructed images of the breast for detecting breast cancer. Projections obtained with an X-ray source moving over a limited angular interval are used to reconstruct a 3D image of the breast. Several reconstruction algorithms are available for DBT imaging. The filtered back projection algorithm has traditionally been used to reconstruct images from projections. Iterative reconstruction algorithms such as the algebraic reconstruction technique (ART) were later developed. Recently, compressed-sensing-based methods have been proposed for the tomosynthesis imaging problem. We have developed an object-oriented simulator for a 3D digital breast tomosynthesis (DBT) imaging system using the C++ programming language. The simulator is capable of running different iterative and compressed-sensing-based reconstruction methods on 3D digital tomosynthesis data sets and phantom models. A user-friendly graphical user interface (GUI) helps users select and run the desired methods on the designed phantom models or real data sets. The simulator has been tested on a phantom study that simulates the breast tomosynthesis imaging problem. Results obtained with various methods, including the algebraic reconstruction technique (ART) and total-variation-regularized reconstruction (ART+TV), are presented. Reconstruction results are compared both visually and quantitatively by evaluating the methods' mean structural similarity (MSSIM) values.
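The ART iteration such a simulator implements is Kaczmarz's method: cycle through the projection equations a_i · x = b_i and project the current estimate onto each. A minimal sketch on a toy two-pixel "scanner" follows; the system matrix and measurements are invented for illustration.

```python
# Illustrative sketch of the algebraic reconstruction technique (ART, i.e.
# Kaczmarz's method): repeatedly project the estimate onto each ray equation.
def art(A, b, iterations=50):
    n = len(A[0])
    x = [0.0] * n
    for _ in range(iterations):
        for a_i, b_i in zip(A, b):
            dot = sum(a * xi for a, xi in zip(a_i, x))
            norm2 = sum(a * a for a in a_i)
            if norm2 == 0.0:
                continue
            step = (b_i - dot) / norm2          # residual of this ray equation
            x = [xi + step * a for xi, a in zip(x, a_i)]
    return x

# Tiny 2-pixel object with densities (3, 1), observed by two rays.
A = [[1.0, 0.0],   # ray 1 passes through pixel 0 only
     [1.0, 1.0]]   # ray 2 passes through both pixels
b = [3.0, 4.0]     # measured line integrals
x = art(A, b)      # converges toward [3.0, 1.0]
```

Real DBT systems have millions of rays and voxels and an inconsistent, underdetermined system, but the per-row projection step is the same.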
An Object-Oriented Simulator for 3D Digital Breast Tomosynthesis Imaging System
Cengiz, Kubra
2013-01-01
Digital breast tomosynthesis (DBT) is an innovative imaging modality that provides 3D reconstructed images of the breast for detecting breast cancer. Projections obtained with an X-ray source moving over a limited angular interval are used to reconstruct a 3D image of the breast. Several reconstruction algorithms are available for DBT imaging. The filtered back projection algorithm has traditionally been used to reconstruct images from projections. Iterative reconstruction algorithms such as the algebraic reconstruction technique (ART) were later developed. Recently, compressed-sensing-based methods have been proposed for the tomosynthesis imaging problem. We have developed an object-oriented simulator for a 3D digital breast tomosynthesis (DBT) imaging system using the C++ programming language. The simulator is capable of running different iterative and compressed-sensing-based reconstruction methods on 3D digital tomosynthesis data sets and phantom models. A user-friendly graphical user interface (GUI) helps users select and run the desired methods on the designed phantom models or real data sets. The simulator has been tested on a phantom study that simulates the breast tomosynthesis imaging problem. Results obtained with various methods, including the algebraic reconstruction technique (ART) and total-variation-regularized reconstruction (ART+TV), are presented. Reconstruction results are compared both visually and quantitatively by evaluating the methods' mean structural similarity (MSSIM) values. PMID:24371468
NASA Astrophysics Data System (ADS)
Wan, Qianwen; Panetta, Karen; Agaian, Sos
2017-05-01
Autonomous facial recognition systems are widely used in real-life applications such as homeland border security, law enforcement identification and authentication, and video-based surveillance analysis. Issues like low image quality, non-uniform illumination, and variations in pose and facial expression can impair the performance of recognition systems. To address the non-uniform illumination challenge, we present a novel robust autonomous facial recognition system inspired by the human visual system, based on the so-called logarithmic image visualization technique. In this paper, the proposed method, for the first time, couples the logarithmic image visualization technique with the local binary pattern to perform discriminative feature extraction for facial recognition. The Yale database, the Yale-B database, and the AT&T database are used to test accuracy and efficiency in computer simulation. The extensive computer simulation demonstrates the method's efficiency, accuracy, and robustness to illumination variation.
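The local binary pattern step can be sketched directly: each pixel gets an 8-bit code by thresholding its 3x3 neighbors against the center. The clockwise neighbor ordering below is one common convention, an assumption rather than the paper's exact choice.

```python
# Illustrative sketch of the local binary pattern (LBP) descriptor that the
# paper couples with logarithmic image visualization.
def lbp_code(image, r, c):
    """8-bit LBP code for pixel (r, c); neighbors read clockwise from top-left."""
    center = image[r][c]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if image[r + dr][c + dc] >= center:   # neighbor at least as bright?
            code |= 1 << bit
    return code

image = [
    [9, 9, 9],
    [1, 5, 1],
    [1, 1, 1],
]
code = lbp_code(image, 1, 1)  # only the top-row neighbors exceed the center
```

A full pipeline would compute these codes over the whole (illumination-normalized) image and histogram them per region to form the feature vector.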
Choi, Kyongsik; Chon, James W; Gu, Min; Lee, Byoungho
2007-08-20
In this paper, a simple confocal laser scanning microscopy (CLSM) image mapping technique based on finite-difference time-domain (FDTD) calculation is proposed and evaluated for characterization of a subwavelength-scale three-dimensional (3D) void structure fabricated inside a polymer matrix. The FDTD simulation adopts a focused Gaussian incident beam, Berenger's perfectly matched layer absorbing boundary condition, and the angular spectrum analysis method. Through the well-matched simulation and experimental results for the xz-scanned 3D void structure, we characterize the exact position and the topological shape factor of the subwavelength-scale void structure, which was fabricated by a tightly focused ultrashort pulse laser. The proposed FDTD-based CLSM image mapping technique can be widely applied, from 3D near-field microscopic imaging, optical trapping, and evanescent-wave phenomena to state-of-the-art bio- and nanophotonics.
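As a hedged illustration of the FDTD method itself, here is a one-dimensional Yee-scheme sketch. The paper's solver is 3D with a focused Gaussian beam and Berenger PML boundaries; this toy uses a soft point source and simple reflecting boundaries, and all sizes are invented.

```python
# Illustrative 1D Yee-scheme FDTD sketch: staggered E and H updates with a
# soft Gaussian source. A real CLSM-mapping solver is 3D with PML boundaries.
import math

def yee_1d(n=200, steps=250, courant=0.5):
    ez = [0.0] * n
    hy = [0.0] * n
    for t in range(steps):
        for i in range(n - 1):            # H update (staggered half-step)
            hy[i] += courant * (ez[i + 1] - ez[i])
        for i in range(1, n):             # E update
            ez[i] += courant * (hy[i] - hy[i - 1])
        ez[n // 2] += math.exp(-((t - 30) ** 2) / 100.0)  # soft Gaussian source
    return ez

ez = yee_1d()
```

The untouched edge fields (ez[0] fixed at zero) act as perfectly reflecting walls; replacing them with an absorbing layer is exactly what the PML boundary condition mentioned in the abstract does.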
Weber, Erin L; Leland, Hyuma A; Azadgoli, Beina; Minneti, Michael; Carey, Joseph N
2017-08-01
Rehearsal is an essential part of mastering any technical skill. The efficacy of surgical rehearsal is currently limited by low fidelity simulation models. Fresh cadaver models, however, offer maximal surgical simulation. We hypothesize that preoperative surgical rehearsal using fresh tissue surgical simulation will improve resident confidence and serve as an important adjunct to current training methods. Preoperative rehearsal of surgical procedures was performed by plastic surgery residents using fresh cadavers in a simulated operative environment. Rehearsal was designed to mimic the clinical operation, complete with a surgical technician to assist. A retrospective, web-based survey was used to assess resident perception of pre- and post-procedure confidence, preparation, technique, speed, safety, and anatomical knowledge on a 5-point scale (1= not confident, 5= very confident). Twenty-six rehearsals were performed by 9 residents (PGY 1-7) an average of 4.7±2.1 days prior to performance of the scheduled operation. Surveys demonstrated a median pre-simulation confidence score of 2 and a post-rehearsal score of 4 (P<0.01). The perceived improvement in confidence and performance was greatest when simulation was performed within 3 days of the scheduled case. All residents felt that cadaveric simulation was better than standard preparation methods of self-directed reading or discussion with other surgeons. All residents believed that their technique, speed, safety, and anatomical knowledge improved as a result of simulation. Fresh tissue-based preoperative surgical rehearsal was effectively implemented in the residency program. Resident confidence and perception of technique improved. Survey results suggest that cadaveric simulation is beneficial for all levels of residents. We believe that implementation of preoperative surgical rehearsal is an effective adjunct to surgical training at all skill levels in the current environment of decreased work hours.
Protein free energy landscapes from long equilibrium simulations
NASA Astrophysics Data System (ADS)
Piana-Agostinetti, Stefano
Many computational techniques based on molecular dynamics (MD) simulation can be used to generate data to aid in the construction of protein free energy landscapes with atomistic detail. Unbiased, long, equilibrium MD simulations, although computationally very expensive, are particularly appealing, as they can provide direct kinetic and thermodynamic information on the transitions between the states that populate a protein free energy surface. Even the results generated by this direct technique, however, can be challenging to analyze and interpret. I will discuss approaches we have employed, using equilibrium MD simulation data, to obtain descriptions of the free energy landscapes of proteins ranging in size from tens to thousands of amino acids.
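One standard way equilibrium data are turned into a landscape is to histogram a reaction coordinate and apply F(x) = -kT ln P(x); a minimal sketch follows, with synthetic Gaussian samples standing in for an MD trajectory and all parameters assumed.

```python
# Illustrative sketch: estimating a free energy profile from equilibrium
# samples via F(x) = -kT ln P(x). Synthetic samples stand in for a real
# MD trajectory's reaction-coordinate time series.
import math
import random

def free_energy_profile(samples, bins=20, lo=-3.0, hi=3.0, kT=1.0):
    counts = [0] * bins
    width = (hi - lo) / bins
    for x in samples:
        i = math.floor((x - lo) / width)
        if 0 <= i < bins:
            counts[i] += 1
    total = sum(counts)
    # F is defined up to an additive constant; unvisited bins get None.
    return [(-kT * math.log(c / total)) if c else None for c in counts]

rng = random.Random(0)
samples = [rng.gauss(0.0, 1.0) for _ in range(10000)]  # harmonic "landscape"
F = free_energy_profile(samples)  # minimum sits near the well at x = 0
```

Real analyses must additionally verify that the trajectory has equilibrated and that rarely visited states are sampled enough for the logarithm to be meaningful — part of the interpretation challenge the abstract raises.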
NASA Technical Reports Server (NTRS)
Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.
1982-01-01
A variety of artificial intelligence techniques that could be used in NASA space applications and robotics were evaluated. The techniques studied were decision tree manipulators, problem solvers, rule-based systems, logic programming languages, representation language languages, and expert systems. The overall structure of a robotic simulation tool was defined and a framework for that tool developed. Nonlinear and linearized dynamics equations were formulated for n-link manipulator configurations. A framework for the robotic simulation was established which uses validated manipulator component models connected according to a user-defined configuration.
De Wilde, David; Trachet, Bram; De Meyer, Guido; Segers, Patrick
2016-09-06
Low and oscillatory wall shear stresses (WSS) near aortic bifurcations have been linked to the onset of atherosclerosis. In previous work, we calculated detailed WSS patterns in the carotid bifurcation of mice using a Fluid-structure interaction (FSI) approach. We subsequently fed the animals a high-fat diet and linked the results of the FSI simulations to those of atherosclerotic plaque location on a within-subject basis. However, these simulations were based on boundary conditions measured under anesthesia, while active mice might experience different hemodynamics. Moreover, the FSI technique for mouse-specific simulations is both time- and labor-intensive, and might be replaced by simpler and easier Computational Fluid Dynamics (CFD) simulations. The goal of the current work was (i) to compare WSS patterns based on anesthesia conditions to those representing active resting and exercising conditions; and (ii) to compare WSS patterns based on FSI simulations to those based on steady-state and transient CFD simulations. For each of the 3 computational techniques (steady state CFD, transient CFD, FSI) we performed 5 simulations: 1 for anesthesia, 2 for conscious resting conditions and 2 more for conscious active conditions. The inflow, pressure and heart rate were scaled according to representative in vivo measurements obtained from literature. When normalized by the maximal shear stress value, shear stress patterns were similar for the 3 computational techniques. For all activity levels, steady state CFD led to an overestimation of WSS values, while FSI simulations yielded a clear increase in WSS reversal at the outer side of the sinus of the external carotid artery that was not visible in transient CFD-simulations. Furthermore, the FSI simulations in the highest locomotor activity state showed a flow recirculation zone in the external carotid artery that was not present under anesthesia. This recirculation went hand in hand with locally increased WSS reversal. 
Our data show that FSI simulations are not necessary to obtain normalized WSS patterns, but indispensable to assess the oscillatory behavior of the WSS in mice. Flow recirculation and WSS reversal at the external carotid artery may occur during high locomotor activity while they are not present under anesthesia. These phenomena might thus influence plaque formation to a larger extent than what was previously assumed.
Exploration of Force Transition in Stability Operations Using Multi-Agent Simulation
2006-09-01
risk, mission failure risk, and time in the context of the operational threat environment. The Pythagoras Multi-Agent Simulation and Data Farming...Subject terms: Stability Operations, Peace Operations, Data Farming, Pythagoras, Agent-Based Model, Multi-Agent Simulation...the operational threat environment. The Pythagoras Multi-Agent Simulation and Data Farming techniques are used to investigate force-level
Fast Simulation of Electromagnetic Showers in the ATLAS Calorimeter: Frozen Showers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barberio, E.; /Melbourne U.; Boudreau, J.
2011-11-29
One of the most time-consuming processes in simulating pp interactions in the ATLAS detector at the LHC is the simulation of electromagnetic showers in the calorimeter. In order to speed up event simulation, several parametrisation methods are available in ATLAS. In this paper we present a short description of the frozen shower technique, together with some recent benchmarks and a comparison with full simulation. The expected high rate of proton-proton collisions in the ATLAS detector at the LHC requires large samples of simulated (Monte Carlo) events to study various physics processes. A detailed simulation of particle reactions ('full simulation') in the ATLAS detector is based on GEANT4 and is very accurate. However, due to the complexity of the detector, high particle multiplicity, and GEANT4 itself, the average CPU time spent simulating a typical QCD event in a pp collision is 20 or more minutes on modern computers. During detector simulation the largest share of time is spent in the calorimeters (up to 70%), most of which is required for electromagnetic particles in the electromagnetic (EM) part of the calorimeters. This is the motivation for fast simulation approaches, which reduce the simulation time without affecting the accuracy. Several of the fast simulation methods available within the ATLAS simulation framework (the standard Athena-based simulation program) are discussed here, with a focus on the novel frozen shower library (FS) technique. Results obtained with FS are presented as well.
Real-time simulation of biological soft tissues: a PGD approach.
Niroomandi, S; González, D; Alfaro, I; Bordeu, F; Leygue, A; Cueto, E; Chinesta, F
2013-05-01
We introduce a novel approach for the numerical simulation of nonlinear, hyperelastic soft tissues at the kilohertz feedback rates necessary for haptic rendering. This approach is based upon the use of proper generalized decomposition (PGD) techniques, a generalization of proper orthogonal decomposition (POD). Proper generalized decomposition can be considered a means of a priori model order reduction and provides a physics-based meta-model without the need for prior computer experiments. The suggested strategy is thus composed of an offline phase, in which a general meta-model is computed, and an online evaluation phase in which results are obtained in real time. Results are provided that show the potential of the proposed technique, together with benchmark tests that show the accuracy of the method.
NASA Technical Reports Server (NTRS)
Stankovic, Ana V.
2003-01-01
Professor Stankovic will be developing and refining Simulink-based models of the PM alternator and comparing the simulation results with experimental measurements taken from the unit. Her first task is to validate the models using the experimental data. Her next task is to develop alternative control techniques for the application of the Brayton Cycle PM alternator in a nuclear electric propulsion vehicle. The control techniques will first be simulated using the validated models, then tried experimentally with hardware available at NASA. Testing and simulation of a 2 kW PM synchronous generator with diode bridge output is described. The parameters of a synchronous PM generator have been measured and used in simulation. Test procedures have been developed to verify the PM generator model with diode bridge output. Experimental and simulation results are in excellent agreement.
Model-Based Economic Evaluation of Treatments for Depression: A Systematic Literature Review.
Kolovos, Spyros; Bosmans, Judith E; Riper, Heleen; Chevreul, Karine; Coupé, Veerle M H; van Tulder, Maurits W
2017-09-01
An increasing number of model-based studies that evaluate the cost effectiveness of treatments for depression are being published. These studies have different characteristics and use different simulation methods. We aimed to systematically review model-based studies evaluating the cost effectiveness of treatments for depression and examine which modelling technique is most appropriate for simulating the natural course of depression. The literature search was conducted in the databases PubMed, EMBASE and PsycInfo between 1 January 2002 and 1 October 2016. Studies were eligible if they used a health economic model with quality-adjusted life-years or disability-adjusted life-years as an outcome measure. Data related to various methodological characteristics were extracted from the included studies. The available modelling techniques were evaluated based on 11 predefined criteria. This methodological review included 41 model-based studies, of which 21 used decision trees (DTs), 15 used cohort-based state-transition Markov models (CMMs), two used individual-based state-transition models (ISMs), and three used discrete-event simulation (DES) models. Just over half of the studies (54%) evaluated antidepressants compared with a control condition. The data sources, time horizons, cycle lengths, perspectives adopted and number of health states/events all varied widely between the included studies. DTs scored positively in four of the 11 criteria, CMMs in five, ISMs in six, and DES models in seven. There were substantial methodological differences between the studies. Since the individual history of each patient is important for the prognosis of depression, DES and ISM simulation methods may be more appropriate than the others for a pragmatic representation of the course of depression. However, direct comparisons between the available modelling techniques are necessary to yield firm conclusions.
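The cohort-based state-transition Markov models (CMMs) counted above can be illustrated with a minimal sketch. The states, transition probabilities, cycle length and utility weights below are hypothetical placeholders, not values from any reviewed study:

```python
import numpy as np

# Minimal cohort-based state-transition Markov model (CMM) sketch.
# States: 0 = depressed, 1 = remission, 2 = dead.
# Transition probabilities per 3-month cycle are illustrative only.
P = np.array([
    [0.60, 0.35, 0.05],   # from depressed
    [0.15, 0.80, 0.05],   # from remission
    [0.00, 0.00, 1.00],   # dead is absorbing
])
utilities = np.array([0.6, 0.85, 0.0])  # QALY weight per state per year

cohort = np.array([1.0, 0.0, 0.0])      # whole cohort starts depressed
cycle_years = 0.25                       # 3-month cycle length
total_qalys = 0.0
for _ in range(40):                      # 10-year time horizon
    total_qalys += cohort @ utilities * cycle_years
    cohort = cohort @ P                  # advance the cohort one cycle
print(round(total_qalys, 3))
```

Comparing two treatments would amount to running this loop with two different transition matrices and differencing the accumulated QALYs and costs.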
NASA Astrophysics Data System (ADS)
Jonker, C. M.; Snoep, J. L.; Treur, J.; Westerhoff, H. V.; Wijngaards, W. C. A.
Within the areas of Computational Organisation Theory and Artificial Intelligence, techniques have been developed to simulate and analyse dynamics within organisations in society. Usually these modelling techniques are applied to factories and to the internal organisation of their process flows, thus obtaining models of complex organisations at various levels of aggregation. The dynamics in living cells are often interpreted in terms of well-organised processes, a bacterium being considered a (micro)factory. This suggests that organisation modelling techniques may also benefit their analysis. Using the example of Escherichia coli it is shown how indeed agent-based organisational modelling techniques can be used to simulate and analyse E.coli's intracellular dynamics. Exploiting the abstraction levels entailed by this perspective, a concise model is obtained that is readily simulated and analysed at the various levels of aggregation, yet shows the cell's essential dynamic patterns.
Finite Element Modelling and Analysis of Conventional Pultrusion Processes
NASA Astrophysics Data System (ADS)
Akishin, P.; Barkanov, E.; Bondarchuk, A.
2015-11-01
Pultrusion is one of many composite manufacturing techniques and one of the most efficient methods for producing fiber reinforced polymer composite parts with a constant cross-section. Numerical simulation is helpful for understanding the manufacturing process and developing scientific means for the pultrusion tooling design. Numerical technique based on the finite element method has been developed for the simulation of pultrusion processes. It uses the general purpose finite element software ANSYS Mechanical. It is shown that the developed technique predicts the temperature and cure profiles, which are in good agreement with those published in the open literature.
Interactive, graphical processing unit-based evaluation of evacuation scenarios at the state scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perumalla, Kalyan S; Aaby, Brandon G; Yoginath, Srikanth B
2011-01-01
In large-scale scenarios, transportation modeling and simulation is severely constrained by simulation time. For example, few real-time simulators scale to evacuation traffic scenarios at the level of an entire state, such as Louisiana (approximately 1 million links) or Florida (2.5 million links). New simulation approaches are needed to overcome the severe computational demands of conventional (microscopic or mesoscopic) modeling techniques. Here, a new modeling and execution methodology is explored that holds the potential to provide a tradeoff among the level of behavioral detail, the scale of the transportation network, and real-time execution capabilities. A novel, field-based modeling technique and its implementation on graphical processing units are presented. Although additional research with input from domain experts is needed for refining and validating the models, the techniques reported here afford an interactive experience at very large scales of multi-million road segments. Illustrative experiments on a few state-scale networks are described based on an implementation of this approach in a software system called GARFIELD. Current modeling capabilities and implementation limitations are described, along with possible use cases and future research.
Results from Binary Black Hole Simulations in Astrophysics Applications
NASA Technical Reports Server (NTRS)
Baker, John G.
2007-01-01
Present and planned gravitational wave observatories are opening a new astronomical window on the sky. A key source of gravitational waves is the merger of two black holes. The Laser Interferometer Space Antenna (LISA), in particular, is expected to observe these events with signal-to-noise ratios in the thousands. To fully reap the scientific benefits of these observations requires a detailed understanding, based on numerical simulations, of the predictions of General Relativity for the waveform signals. New techniques for simulating binary black hole mergers, introduced two years ago, have led to dramatic advances in applied numerical simulation work. Over the last two years, numerical relativity researchers have made tremendous strides in understanding the late stages of binary black hole mergers. Simulations have been applied to test much of the basic physics of binary black hole interactions, showing robust results for merger waveform predictions, and illuminating such phenomena as spin precession. Calculations have shown that merging systems can be kicked at up to 2500 km/s by the thrust from asymmetric emission. Recently, long-lasting simulations of ten or more orbits allow tests of post-Newtonian (PN) approximation results for radiation from the last orbits of the binary's inspiral. Already, analytic waveform models based on PN techniques with incorporated information from numerical simulations may be adequate for observations with current ground-based observatories. As new advances in simulations continue to rapidly improve our theoretical understanding of these systems, it seems certain that high-precision predictions will be available in time for LISA and other advanced ground-based instruments.
Shear wave elastography using Wigner-Ville distribution: a simulated multilayer media study.
Bidari, Pooya Sobhe; Alirezaie, Javad; Tavakkoli, Jahan
2016-08-01
Shear Wave Elastography (SWE) is a quantitative ultrasound-based imaging modality for distinguishing normal and abnormal tissue types by estimating the local viscoelastic properties of the tissue. These properties have been estimated in many studies by propagating an ultrasound shear wave within the tissue and estimating parameters such as the speed of the wave. The vast majority of the proposed techniques are based on the cross-correlation of consecutive ultrasound images. In this study, we propose a new method of wave detection based on time-frequency (TF) analysis of the ultrasound signal. The proposed method is a modified version of the Wigner-Ville Distribution (WVD) technique. The TF components of the wave are detected in an ultrasound wave propagating within a simulated multilayer tissue, and the local properties are estimated based on the detected waves. Image processing techniques such as Alternative Sequential Filters (ASF) and the Circular Hough Transform (CHT) have been utilized to improve the estimation of the TF components. This method has been applied to simulated data from the Wave3000™ software (CyberLogic Inc., New York, NY). The data simulate the propagation of an acoustic radiation force impulse within a two-layer tissue with slightly different viscoelastic properties between the layers. By analyzing the local TF components of the wave, we estimate the longitudinal and shear elasticities and viscosities of the media. This work shows that our proposed method is capable of distinguishing between different layers of a tissue.
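The discrete Wigner-Ville distribution underlying the proposed TF analysis can be sketched as follows. This is the textbook WVD of an analytic signal, not the authors' modified version, and the chirp test signal is illustrative:

```python
import numpy as np

def wigner_ville(x):
    """Discrete Wigner-Ville distribution of an analytic signal x.
    Returns an (N, N) time-frequency array (rows = time, cols = frequency)."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        taumax = min(n, N - 1 - n)            # largest symmetric lag at time n
        tau = np.arange(-taumax, taumax + 1)
        kernel = np.zeros(N, dtype=complex)
        kernel[tau % N] = x[n + tau] * np.conj(x[n - tau])  # instantaneous autocorrelation
        W[n] = np.real(np.fft.fft(kernel))    # FFT over lag -> frequency axis
    return W

# A chirp's WVD concentrates energy along its rising instantaneous frequency.
t = np.arange(128)
sig = np.exp(1j * 2 * np.pi * (0.1 + 0.001 * t) * t)  # analytic linear chirp
W = wigner_ville(sig)
```

For a single linear chirp the ridge of `W` tracks the instantaneous frequency; detecting that ridge is the role the ASF and CHT post-processing steps play in the paper.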
NASA Astrophysics Data System (ADS)
Lee, Youngjin; Lee, Amy Candy; Kim, Hee-Joung
2016-09-01
Recently, significant effort has been devoted to the development of photon counting detectors (PCDs) based on CdTe for applications in X-ray imaging systems. The motivation for developing PCDs is higher image quality. In particular, the K-edge subtraction (KES) imaging technique using a PCD can improve image quality and is useful for increasing the contrast resolution of a target material by utilizing a contrast agent. Based on this technique, we present an idea for an improved K-edge log-subtraction (KELS) imaging technique. The KELS imaging technique based on PCDs can be realized by using different subtraction energy widths of the energy window. In this study, the effects of the KELS imaging technique and the subtraction energy width of the energy window were investigated with respect to contrast, standard deviation, and contrast-to-noise ratio (CNR) using a Monte Carlo simulation. We simulated a CdTe-based PCD X-ray imaging system and a polymethylmethacrylate (PMMA) phantom containing various iodine contrast agents. To acquire KELS images, images of the phantom were acquired over different energy ranges above and below the K-edge absorption energy of iodine (33.2 keV). According to the results, the contrast and standard deviation decreased when the subtraction energy width of the energy window was increased. Also, the CNR using the KELS imaging technique was higher than that of images acquired using the whole energy range. In particular, the maximum differences in CNR between the whole-energy-range and KELS images using 1, 2, and 3 mm diameter iodine contrast agents were 11.33, 8.73, and 8.29 times, respectively. Additionally, the optimum subtraction energy width of the energy window was 5, 4, and 3 keV for the 1, 2, and 3 mm diameter iodine contrast agents, respectively. In conclusion, we successfully established an improved KELS imaging technique and optimized the subtraction energy width of the energy window; based on our results, we recommend using this technique for high image quality.
An interactive driving simulation for driver control and decision-making research
NASA Technical Reports Server (NTRS)
Allen, R. W.; Hogge, J. R.; Schwartz, S. H.
1975-01-01
Display techniques and equations of motion for a relatively simple fixed base car simulation are described. The vehicle dynamics include simplified lateral (steering) and longitudinal (speed) degrees of freedom. Several simulator tasks are described which require a combination of operator control and decision making, including response to wind gust inputs, curved roads, traffic signal lights, and obstacles. Logic circuits are used to detect speeding, running red lights, and crashes. A variety of visual and auditory cues are used to give the driver appropriate performance feedback. The simulated equations of motion are reviewed and the technique for generating the line drawing CRT roadway display is discussed. On-line measurement capabilities and experimenter control features are presented, along with previous and current research results demonstrating simulation capabilities and applications.
Quantum simulation of an ultrathin body field-effect transistor with channel imperfections
NASA Astrophysics Data System (ADS)
Vyurkov, V.; Semenikhin, I.; Filippov, S.; Orlikovsky, A.
2012-04-01
An efficient program for the all-quantum simulation of nanometer field-effect transistors is elaborated. The model is based on the Landauer-Buttiker approach. Our calculation of transmission coefficients employs a transfer-matrix technique involving arbitrary-precision (multiprecision) arithmetic to cope with evanescent modes. Modified in this way, the transfer-matrix technique turns out to be much faster in practical simulations than the scattering-matrix technique. Results of the simulation demonstrate the impact of realistic channel imperfections (random charged centers and wall roughness) on transistor characteristics. The Landauer-Buttiker approach is extended to incorporate calculation of the noise at an arbitrary temperature. We also validate the ballistic Landauer-Buttiker approach for the usual situation in which heavily doped contacts must be included in the simulation region.
A Novel Approach to Visualizing Dark Matter Simulations.
Kaehler, R; Hahn, O; Abel, T
2012-12-01
In the last decades, cosmological N-body dark matter simulations have enabled ab initio studies of the formation of structure in the Universe. Gravity amplified small density fluctuations generated shortly after the Big Bang, leading to the formation of galaxies in the cosmic web. These calculations have led to a growing demand for methods to analyze time-dependent particle-based simulations. Rendering methods for such N-body simulation data usually employ some kind of splatting approach via point-based rendering primitives and approximate the spatial distributions of physical quantities using kernel interpolation techniques common in SPH (Smoothed Particle Hydrodynamics) codes. This paper proposes three GPU-assisted rendering approaches, based on a new, more accurate method to compute the physical densities of dark matter simulation data. It uses full phase-space information to generate a tetrahedral tessellation of the computational domain, with mesh vertices defined by the simulation's dark matter particle positions. Over time the mesh is deformed by gravitational forces, causing the tetrahedral cells to warp and overlap. The new methods are well suited to visualize the cosmic web. In particular they preserve caustics, regions of high density that emerge when several streams of dark matter particles share the same location in space, indicating the formation of structures like sheets, filaments and halos. We demonstrate the superior image quality of the new approaches in a comparison with three standard rendering techniques for N-body simulation data.
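The density estimate at the heart of the tessellation method can be sketched for a single cell: each tetrahedron carries a fixed mass, so its density is that mass divided by its current (deformed) volume. The function and test vertices below are illustrative:

```python
import numpy as np

def tetra_density(verts, mass=1.0):
    """Density carried by one tessellation tetrahedron: its fixed mass
    divided by its current volume. verts is a (4, 3) array of the four
    dark-matter particle positions that define the cell."""
    v0, v1, v2, v3 = np.asarray(verts, dtype=float)
    # Volume of a tetrahedron = |det of its three edge vectors| / 6.
    volume = abs(np.linalg.det(np.stack([v1 - v0, v2 - v0, v3 - v0]))) / 6.0
    return mass / volume

# As gravity compresses a cell, its volume shrinks and its density rises.
unit = [[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]]        # volume 1/6
squeezed = [[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 0.1]]  # volume 1/60
print(tetra_density(unit), tetra_density(squeezed))
```

A caustic corresponds to the limit where the edge vectors become coplanar: the determinant goes through zero, the volume vanishes, and the estimated density diverges.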
MATLAB Simulation of Gradient-Based Neural Network for Online Matrix Inversion
NASA Astrophysics Data System (ADS)
Zhang, Yunong; Chen, Ke; Ma, Weimu; Li, Xiao-Dong
This paper investigates the simulation of a gradient-based recurrent neural network for online solution of the matrix-inverse problem. Several important techniques are employed as follows to simulate such a neural system. 1) Kronecker product of matrices is introduced to transform a matrix-differential-equation (MDE) to a vector-differential-equation (VDE); i.e., finally, a standard ordinary-differential-equation (ODE) is obtained. 2) MATLAB routine "ode45" is introduced to solve the transformed initial-value ODE problem. 3) In addition to various implementation errors, different kinds of activation functions are simulated to show the characteristics of such a neural network. Simulation results substantiate the theoretical analysis and efficacy of the gradient-based neural network for online constant matrix inversion.
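A minimal sketch of steps 1) and 2) follows. The gradient dynamics dX/dt = -γ Aᵀ(AX − I) are a standard form for such networks (an assumption here, as is the test matrix), and a plain explicit Euler loop stands in for MATLAB's "ode45":

```python
import numpy as np

# Kronecker-product trick: the matrix differential equation (MDE)
#     dX/dt = -gamma * A^T (A X - I)
# vectorizes, via vec(A^T A X) = (I kron A^T A) vec(X), into a standard
# vector ODE (VDE). A simple explicit Euler loop integrates it here.
rng = np.random.default_rng(0)
n = 4
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)            # well-conditioned invertible test matrix
gamma, dt, steps = 1.0, 1e-3, 20000

M = np.kron(np.eye(n), A.T @ A)        # VDE system matrix (n^2 x n^2)
b = gamma * (A.T).flatten(order="F")   # gamma * vec(A^T), column-major vec
x = np.zeros(n * n)                    # start from X(0) = 0
for _ in range(steps):
    x = x + dt * (-gamma * M @ x + b)  # one Euler step of the VDE

X = x.reshape((n, n), order="F")       # un-vectorize the network state
print(np.linalg.norm(A @ X - np.eye(n)))  # residual near zero: X ~= A^{-1}
```

The fixed point satisfies AᵀA X = Aᵀ, i.e. X = A⁻¹, which is why the network state converges to the matrix inverse online.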
A Fully Distributed Approach to the Design of a KBIT/SEC VHF Packet Radio Network,
1984-02-01
...topological change and consequent out-moded routing data. Algorithm development has been aided by computer simulation using a finite state machine technique to model a realistic network of up to fifty nodes. ...use of computer-based equipment in weapons systems and their associated sensors and command and control elements, and the trend from voice to data...
An analytical method to simulate the H I 21-cm visibility signal for intensity mapping experiments
NASA Astrophysics Data System (ADS)
Sarkar, Anjan Kumar; Bharadwaj, Somnath; Marthi, Visweshwar Ram
2018-01-01
Simulations play a vital role in testing and validating H I 21-cm power spectrum estimation techniques. Conventional methods use techniques like N-body simulations to simulate the sky signal which is then passed through a model of the instrument. This makes it necessary to simulate the H I distribution in a large cosmological volume, and incorporate both the light-cone effect and the telescope's chromatic response. The computational requirements may be particularly large if one wishes to simulate many realizations of the signal. In this paper, we present an analytical method to simulate the H I visibility signal. This is particularly efficient if one wishes to simulate a large number of realizations of the signal. Our method is based on theoretical predictions of the visibility correlation which incorporate both the light-cone effect and the telescope's chromatic response. We have demonstrated this method by applying it to simulate the H I visibility signal for the upcoming Ooty Wide Field Array Phase I.
Maljovec, D.; Liu, S.; Wang, B.; ...
2015-07-14
Here, dynamic probabilistic risk assessment (DPRA) methodologies couple system simulator codes (e.g., RELAP and MELCOR) with simulation controller codes (e.g., RAVEN and ADAPT). Whereas system simulator codes model system dynamics deterministically, simulation controller codes introduce both deterministic (e.g., system control logic and operating procedures) and stochastic (e.g., component failures and parameter uncertainties) elements into the simulation. Typically, a DPRA is performed by sampling values of a set of parameters and simulating the system behavior for that specific set of parameter values. For complex systems, a major challenge in using DPRA methodologies is analyzing the large number of scenarios generated, where clustering techniques are typically employed to better organize and interpret the data. In this paper, we focus on the analysis of two nuclear simulation datasets that are part of the risk-informed safety margin characterization (RISMC) boiling water reactor (BWR) station blackout (SBO) case study. We provide domain experts a software tool that encodes traditional and topological clustering techniques within an interactive analysis and visualization environment for understanding the structures of such high-dimensional nuclear simulation datasets. We demonstrate through our case study that both types of clustering techniques complement each other for enhanced structural understanding of the data.
Augmented reality intravenous injection simulator based 3D medical imaging for veterinary medicine.
Lee, S; Lee, J; Lee, A; Park, N; Lee, S; Song, S; Seo, A; Lee, H; Kim, J-I; Eom, K
2013-05-01
Augmented reality (AR) is a technology which enables users to see the real world, with virtual objects superimposed upon or composited with it. AR simulators have been developed and used in human medicine, but not in veterinary medicine. The aim of this study was to develop an AR intravenous (IV) injection simulator to train veterinary and pre-veterinary students to perform canine venipuncture. Computed tomographic (CT) images of a beagle dog were scanned using a 64-channel multidetector. The CT images were transformed into volumetric data sets using an image segmentation method and were converted into a stereolithography format for creating 3D models. An AR-based interface was developed for an AR simulator for IV injection. Veterinary and pre-veterinary student volunteers were randomly assigned to an AR-trained group or a control group trained using more traditional methods (n = 20/group; n = 8 pre-veterinary students and n = 12 veterinary students in each group) and their proficiency at IV injection technique in live dogs was assessed after training was completed. Students were also asked to complete a questionnaire which was administered after using the simulator. The group that was trained using an AR simulator were more proficient at IV injection technique using real dogs than the control group (P ≤ 0.01). The students agreed that they learned the IV injection technique through the AR simulator. Although the system used in this study needs to be modified before it can be adopted for veterinary educational use, AR simulation has been shown to be a very effective tool for training medical personnel. Using the technology reported here, veterinary AR simulators could be developed for future use in veterinary education. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.
Mitigating randomness of consumer preferences under certain conditional choices
NASA Astrophysics Data System (ADS)
Bothos, John M. A.; Thanos, Konstantinos-Georgios; Papadopoulou, Eirini; Daveas, Stelios; Thomopoulos, Stelios C. A.
2017-05-01
Agent-based crowd behaviour constitutes a significant field of research that has drawn a lot of attention in recent years. Agent-based crowd simulation techniques have been used extensively to forecast the behaviour of larger or smaller crowds under certain given conditions, influenced by specific cognition models and behavioural rules and norms imposed from the beginning. Our research employs conditional event algebra, statistical methodology and agent-based crowd simulation techniques to develop a behavioural econometric model of the selection of certain economic behaviour by a consumer who faces a spectrum of potential choices when moving and acting in a multiplex mall. More specifically, we try to analyse the influence of demographic, economic, social and cultural factors on the economic behaviour of a certain individual, and then we try to link this behaviour with the general behaviour of crowds of consumers in multiplex malls using agent-based crowd simulation techniques. We then run our model using Generalized Least Squares and Maximum Likelihood methods to come up with the most probable forecast estimations regarding the agent's behaviour. Our model is indicative of the formation of consumers' spectra of choices in multiplex malls under the condition of predefined preferences and can be used as a guide for further research in this area.
Review of the systems biology of the immune system using agent-based models.
Shinde, Snehal B; Kurhekar, Manish P
2018-06-01
The immune system is an inherent protection system in vertebrate animals including human beings that exhibit properties such as self-organisation, self-adaptation, learning, and recognition. It interacts with the other allied systems such as the gut and lymph nodes. There is a need for immune system modelling to know about its complex internal mechanism, to understand how it maintains the homoeostasis, and how it interacts with the other systems. There are two types of modelling techniques used for the simulation of features of the immune system: equation-based modelling (EBM) and agent-based modelling. Owing to certain shortcomings of the EBM, agent-based modelling techniques are being widely used. This technique provides various predictions for disease causes and treatments; it also helps in hypothesis verification. This study presents a review of agent-based modelling of the immune system and its interactions with the gut and lymph nodes. The authors also review the modelling of immune system interactions during tuberculosis and cancer. In addition, they also outline the future research directions for the immune system simulation through agent-based techniques such as the effects of stress on the immune system, evolution of the immune system, and identification of the parameters for a healthy immune system.
Extending the Distributed Lag Model framework to handle chemical mixtures.
Bello, Ghalib A; Arora, Manish; Austin, Christine; Horton, Megan K; Wright, Robert O; Gennings, Chris
2017-07-01
Distributed Lag Models (DLMs) are used in environmental health studies to analyze the time-delayed effect of an exposure on an outcome of interest. Given the increasing need for analytical tools for evaluation of the effects of exposure to multi-pollutant mixtures, this study attempts to extend the classical DLM framework to accommodate and evaluate multiple longitudinally observed exposures. We introduce 2 techniques for quantifying the time-varying mixture effect of multiple exposures on an outcome of interest. Lagged WQS, the first technique, is based on Weighted Quantile Sum (WQS) regression, a penalized regression method that estimates mixture effects using a weighted index. We also introduce Tree-based DLMs, a nonparametric alternative for assessment of lagged mixture effects. This technique is based on the Random Forest (RF) algorithm, a nonparametric, tree-based estimation technique that has shown excellent performance in a wide variety of domains. In a simulation study, we tested the feasibility of these techniques and evaluated their performance in comparison to standard methodology. Both methods exhibited relatively robust performance, accurately capturing pre-defined non-linear functional relationships in different simulation settings. Further, we applied these techniques to data on perinatal exposure to environmental metal toxicants, with the goal of evaluating the effects of exposure on neurodevelopment. Our methods identified critical neurodevelopmental windows showing significant sensitivity to metal mixtures. Copyright © 2017 Elsevier Inc. All rights reserved.
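The basic distributed-lag setup can be sketched with a linear stand-in (ordinary least squares rather than the paper's WQS or random-forest estimators; the lag-weight curve and sample sizes are invented for illustration):

```python
import numpy as np

# Minimal distributed-lag sketch: the outcome depends on an exposure measured
# at several lags, through a lag-weight curve that the regression must recover.
rng = np.random.default_rng(1)
n_subjects, n_lags = 500, 6
true_weights = np.array([0.0, 0.1, 0.4, 0.3, 0.1, 0.0])  # critical window at lags 2-3

X = rng.standard_normal((n_subjects, n_lags))        # exposure at each lag
y = X @ true_weights + 0.1 * rng.standard_normal(n_subjects)  # outcome + noise

est, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(est, 2))   # estimated lag-weight curve, close to true_weights
```

The "critical window" identified by the paper's methods corresponds to the lags with the largest recovered weights; WQS replaces the unconstrained coefficients with a constrained weighted index, and the tree-based variant lets the lag effects be non-linear.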
Modelling Technique for Demonstrating Gravity Collapse Structures in Jointed Rock.
ERIC Educational Resources Information Center
Stimpson, B.
1979-01-01
Described is a base-friction modeling technique for studying the development of collapse structures in jointed rocks. A moving belt beneath weak material is designed to simulate gravity. A description is given of the model frame construction. (Author/SA)
Experiment T002: Manual navigation sightings
NASA Technical Reports Server (NTRS)
Smith, D.
1971-01-01
Navigation-type measurements through the window of the stabilized Gemini 12 spacecraft by the use of a hand-held sextant are reported. The major objectives were as follows: (1) to evaluate the ability of the crewmen to make accurate navigational measurements by the use of simple instruments in an authentic space flight environment; (2) to evaluate the operational feasibility of the measurement techniques by the use of the pressure suit with the helmet off and with the helmet on and the visor closed; (3) to evaluate operational problems associated with the spacecraft environment; and (4) to validate ground based simulation techniques by comparison of the inflight results with base line data obtained by the pilot by the use of simulators and celestial targets from ground based observatories.
NASA Astrophysics Data System (ADS)
Zhang, Rong-Hua; Tao, Ling-Jiang; Gao, Chuan
2017-09-01
Large uncertainties exist in real-time predictions of the 2015 El Niño event, which have systematic intensity biases that are strongly model-dependent. It is critically important to characterize those model biases so they can be reduced appropriately. In this study, the conditional nonlinear optimal perturbation (CNOP)-based approach was applied to an intermediate coupled model (ICM) equipped with a four-dimensional variational data assimilation technique. The CNOP-based approach was used to quantify prediction errors that can be attributed to initial conditions (ICs) and model parameters (MPs). Two key MPs were considered in the ICM: one represents the intensity of the thermocline effect, and the other represents the relative coupling intensity between the ocean and atmosphere. Two experiments were performed to illustrate the effects of error correction: one with a standard simulation and another with an optimized simulation in which errors in the ICs and MPs derived from the CNOP-based approach were optimally corrected. The results indicate that simulations of the 2015 El Niño event can be effectively improved by using CNOP-derived error corrections. In particular, the El Niño intensity in late 2015 was adequately captured when simulations were started from early 2015. Quantitatively, the Niño3.4 SST index simulated in Dec. 2015 increased to 2.8 °C in the optimized simulation, compared with only 1.5 °C in the standard simulation. The feasibility and effectiveness of using the CNOP-based technique to improve ENSO simulations are demonstrated in the context of the 2015 El Niño event. The limitations and further applications are also discussed.
Spectrum simulation in DTSA-II.
Ritchie, Nicholas W M
2009-10-01
Spectrum simulation is a useful practical and pedagogical tool. Particularly with complex samples or trace constituents, a simulation can help to understand the limits of the technique and the instrument parameters for the optimal measurement. DTSA-II, software for electron probe microanalysis, provides both easy-to-use and flexible tools for simulating common and less common sample geometries and materials. Analytical models based on ϕ(ρz) curves provide quick simulations of simple samples. Monte Carlo models based on electron and X-ray transport provide more sophisticated models of arbitrarily complex samples. DTSA-II provides a broad range of simulation tools in a framework with many different interchangeable physical models. In addition, DTSA-II provides tools for visualizing, comparing, manipulating, and quantifying simulated and measured spectra.
Genetic Algorithms and Their Application to the Protein Folding Problem
1993-12-01
...and symbolic methods, random methods such as Monte Carlo simulation and simulated annealing, distance geometry, and molecular dynamics. Many of these ...calculated energies with those obtained using the molecular simulation software package called CHARMm. 9) Test both the simple and parallel simple genetic ...homology-based, and simplification techniques. 3.21 Molecular Dynamics. Perhaps the most natural approach is to actually simulate the folding process. This...
Membrane Insertion Profiles of Peptides Probed by Molecular Dynamics Simulations
2008-07-17
In-Chul Yeh, Mark A. Olson, Michael S. Lee, and Anders ... A methodology based on molecular dynamics simulation techniques to probe the insertion profiles of small peptides across the membrane interface is described.
Scientific Visualization and Simulation for Multi-dimensional Marine Environment Data
NASA Astrophysics Data System (ADS)
Su, T.; Liu, H.; Wang, W.; Song, Z.; Jia, Z.
2017-12-01
With growing attention on the ocean and the rapid development of marine detection, there are increasing demands for realistic simulation and interactive visualization of the marine environment in real time. Based on advanced technology such as GPU rendering, CUDA parallel computing and rapid grid-oriented strategies, a series of efficient and high-quality visualization methods, which can deal with large-scale and multi-dimensional marine data in different environmental circumstances, is proposed in this paper. Firstly, a high-quality seawater simulation is realized by an FFT algorithm, bump mapping and texture animation technology. Secondly, large-scale multi-dimensional marine hydrological environmental data are visualized by 3D interactive technologies and volume rendering techniques. Thirdly, seabed terrain data are simulated with an improved Delaunay algorithm, a surface reconstruction algorithm, a dynamic LOD algorithm and GPU programming techniques. Fourthly, seamless real-time modelling of both ocean and land on a digital globe is achieved with WebGL to meet the requirements of web-based applications. The experiments suggest that these methods not only achieve a satisfying marine environment simulation effect, but also meet the rendering requirements of global multi-dimensional marine data. Additionally, a simulation system for underwater oil spills is established with the OSG 3D rendering engine. It is integrated with the marine visualization methods mentioned above, and shows movement processes, physical parameters, and current velocity and direction for different types of deep-water oil spill particles (oil particles, hydrate particles, gas particles, etc.) dynamically and simultaneously in multiple dimensions. With such an application, valuable reference and decision-making information can be provided for understanding the progress of an oil spill in deep water, which is helpful for ocean disaster forecasting, warning and emergency response.
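The FFT-based seawater synthesis mentioned first can be sketched as follows: a height field generated as the inverse FFT of spectrally weighted random amplitudes. This is a generic Tessendorf-style sketch with an illustrative Phillips-like spectrum, not the paper's GPU implementation:

```python
import numpy as np

# FFT ocean-surface sketch: draw random complex amplitudes shaped by a
# wavenumber spectrum, then inverse-FFT them into a spatial height field.
rng = np.random.default_rng(2)
N = 64                                    # grid resolution
k = np.fft.fftfreq(N) * 2 * np.pi         # wavenumbers per axis
kx, ky = np.meshgrid(k, k)
kmag = np.hypot(kx, ky)
kmag[0, 0] = 1.0                          # avoid division by zero at DC

# Phillips-like spectrum shape (illustrative parameters, not calibrated).
spectrum = np.exp(-1.0 / (kmag * 10.0) ** 2) / kmag**4
spectrum[0, 0] = 0.0                      # no DC component -> zero-mean surface

amp = np.sqrt(spectrum / 2) * (rng.standard_normal((N, N))
                               + 1j * rng.standard_normal((N, N)))
height = np.real(np.fft.ifft2(amp)) * N   # taking the real part gives a real field
print(height.shape)
```

Animating the surface amounts to advancing each amplitude's phase by its dispersion-relation frequency each frame and repeating the inverse FFT, which is what makes the technique fast enough for real-time rendering.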
Signal Processing Studies of a Simulated Laser Doppler Velocimetry-Based Acoustic Sensor
1990-10-17
investigated using spectral correlation methods. Results indicate that it may be possible to extend demonstrated LDV-based acoustic sensor sensitivities using higher order processing techniques. (Author)
Huang, Hsuan-Ming; Hsiao, Ing-Tsung
2017-01-01
Over the past decade, image quality in low-dose computed tomography has been greatly improved by various compressive sensing- (CS-) based reconstruction methods. However, these methods have some disadvantages including high computational cost and slow convergence rate. Many different speed-up techniques for CS-based reconstruction algorithms have been developed. The purpose of this paper is to propose a fast reconstruction framework that combines a CS-based reconstruction algorithm with several speed-up techniques. First, total difference minimization (TDM) was implemented using the soft-threshold filtering (STF). Second, we combined TDM-STF with the ordered subsets transmission (OSTR) algorithm for accelerating the convergence. To further speed up the convergence of the proposed method, we applied the power factor and the fast iterative shrinkage thresholding algorithm to OSTR and TDM-STF, respectively. Results obtained from simulation and phantom studies showed that many speed-up techniques could be combined to greatly improve the convergence speed of a CS-based reconstruction algorithm. More importantly, the increased computation time (≤10%) was minor as compared to the acceleration provided by the proposed method. In this paper, we have presented a CS-based reconstruction framework that combines several acceleration techniques. Both simulation and phantom studies provide evidence that the proposed method has the potential to satisfy the requirement of fast image reconstruction in practical CT.
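The soft-threshold filtering (STF) step at the heart of the TDM-STF scheme can be illustrated compactly. The operator below is the standard soft-threshold shrinkage; the toy image and threshold value are assumptions for demonstration, not the paper's reconstruction pipeline.

```python
import numpy as np

# Sketch of the soft-threshold operator used in TDM-STF-style CS
# reconstruction: shrink difference magnitudes toward zero, which promotes
# sparsity of the image gradient (total-variation-like regularization).
def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

img = np.array([[0.0, 0.0, 0.0],
                [0.0, 5.0, 0.0],
                [0.0, 0.0, 0.0]])
dx = np.diff(img, axis=1)              # horizontal image differences
dx_shrunk = soft_threshold(dx, 1.0)    # small differences go to zero
```

In the full algorithm this shrinkage alternates with data-fidelity updates (here, the ordered-subsets transmission step), which is where the acceleration techniques apply.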
Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems
NASA Technical Reports Server (NTRS)
Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris
2010-01-01
Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-based system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.
Validation of highly reliable, real-time knowledge-based systems
NASA Technical Reports Server (NTRS)
Johnson, Sally C.
1988-01-01
Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.
NASA Astrophysics Data System (ADS)
Riah, Zoheir; Sommet, Raphael; Nallatamby, Jean C.; Prigent, Michel; Obregon, Juan
2004-05-01
We present in this paper a set of coherent tools for noise characterization and physics-based analysis of noise in semiconductor devices. This noise toolbox relies on a low frequency noise measurement setup with special high current capabilities thanks to an accurate and original calibration. It relies also on a simulation tool based on the drift diffusion equations and the linear perturbation theory, associated with the Green's function technique. This physics-based noise simulator has been implemented successfully in the Scilab environment and is specifically dedicated to HBTs. Some results are given and compared to those existing in the literature.
An electrical circuit model for simulation of indoor radon concentration.
Musavi Nasab, S M; Negarestani, A
2013-01-01
In this study, a new model based on electric circuit theory was introduced to simulate the behaviour of indoor radon concentration. In this model, a voltage source simulates radon generation in the walls, conductivity simulates migration through the walls, and the voltage across a capacitor simulates the radon concentration in a room. The simulation considers migration of radon through walls by a diffusion mechanism in one-dimensional geometry. Data reported for a typical Greek house were employed to examine the application of this simulation technique to the behaviour of indoor radon.
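The circuit analogy reduces to a first-order ODE: the source "charges" the room while ventilation and decay "discharge" it. A minimal sketch with illustrative rate constants (not the paper's fitted values):

```python
# Minimal sketch of the RC-circuit analogy: radon entry through the wall
# (a "resistor") charges the room "capacitor", while ventilation and
# radioactive decay discharge it. All rate constants are illustrative.
S = 10.0        # radon entry rate (Bq m^-3 h^-1), like a source current
lam = 0.5       # removal rate: ventilation + decay (h^-1)
dt = 0.01       # time step (h)

C = 0.0         # indoor concentration, like capacitor voltage
for _ in range(3000):           # 30 h of forward-Euler integration
    C += dt * (S - lam * C)

steady = S / lam                # analytic steady state, 20 Bq m^-3 here
```

The "RC time constant" 1/lam sets how fast the room concentration responds to changes in ventilation, which is exactly the behaviour the circuit model captures.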
NASA Astrophysics Data System (ADS)
Karimabadi, Homa
2012-03-01
Recent advances in simulation technology and hardware are enabling breakthrough science where many longstanding problems can now be addressed for the first time. In this talk, we focus on kinetic simulations of the Earth's magnetosphere and magnetic reconnection process which is the key mechanism that breaks the protective shield of the Earth's dipole field, allowing the solar wind to enter the Earth's magnetosphere. This leads to the so-called space weather where storms on the Sun can affect space-borne and ground-based technological systems on Earth. The talk will consist of three parts: (a) overview of a new multi-scale simulation technique where each computational grid is updated based on its own unique timestep, (b) Presentation of a new approach to data analysis that we refer to as Physics Mining which entails combining data mining and computer vision algorithms with scientific visualization to extract physics from the resulting massive data sets. (c) Presentation of several recent discoveries in studies of space plasmas including the role of vortex formation and resulting turbulence in magnetized plasmas.
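The local-timestep idea in part (a) can be sketched as follows: each cell gets a CFL-limited step, rounded down to a power-of-two fraction of the largest step so that cells resynchronize at common time levels. All numbers are illustrative; the actual kinetic scheme is considerably more involved.

```python
import numpy as np

# Sketch of the multi-scale idea: each grid cell advances with its own
# timestep chosen from a local CFL-like condition, instead of the single
# global (smallest) step. Values are illustrative.
dx = 1.0
speed = np.array([1.0, 2.0, 8.0, 0.5])      # local signal speed per cell
cfl = 0.5
local_dt = cfl * dx / speed                  # per-cell stable timestep

# Round each dt down to a power-of-two fraction of the largest, so cells
# can be synchronized at common time levels:
dt_max = local_dt.max()
levels = np.ceil(np.log2(dt_max / local_dt)).astype(int)
dt_used = dt_max / 2 ** levels
```

Cells in quiet regions then take one big step while cells near sharp gradients take many small ones, which is the source of the speedup.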
NASA Astrophysics Data System (ADS)
Kreis, Karsten; Kremer, Kurt; Potestio, Raffaello; Tuckerman, Mark E.
2017-12-01
Path integral-based methodologies play a crucial role for the investigation of nuclear quantum effects by means of computer simulations. However, these techniques are significantly more demanding than corresponding classical simulations. To reduce this numerical effort, we recently proposed a method, based on a rigorous Hamiltonian formulation, which restricts the quantum modeling to a small but relevant spatial region within a larger reservoir where particles are treated classically. In this work, we extend this idea and show how it can be implemented along with state-of-the-art path integral simulation techniques, including path-integral molecular dynamics, which allows for the calculation of quantum statistical properties, and ring-polymer and centroid molecular dynamics, which allow the calculation of approximate quantum dynamical properties. To this end, we derive a new integration algorithm that also makes use of multiple time-stepping. The scheme is validated via adaptive classical-path-integral simulations of liquid water. Potential applications of the proposed multiresolution method are diverse and include efficient quantum simulations of interfaces as well as complex biomolecular systems such as membranes and proteins.
An analysis of airline landing flare data based on flight and training simulator measurements
NASA Technical Reports Server (NTRS)
Heffley, R. K.; Schulman, T. M.; Clement, T. M.
1982-01-01
Landings by experienced airline pilots transitioning to the DC-10, performed in flight and on a simulator, were analyzed and compared using a pilot-in-the-loop model of the landing maneuver. By solving for the effective feedback gains and pilot compensation which described landing technique, it was possible to discern fundamental differences in pilot behavior between the actual aircraft and the simulator. These differences were then used to infer simulator fidelity in terms of specific deficiencies and to quantify the effectiveness of training on the simulator as compared to training in flight. While training on the simulator, pilots exhibited larger effective lag in commanding the flare. The inability to compensate adequately for this lag was associated with hard or inconsistent landings. To some degree this deficiency was carried into flight, thus resulting in a slightly different and inferior landing technique than exhibited by pilots trained exclusively on the actual aircraft.
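The core identification step, solving for an effective feedback gain from recorded data, can be sketched with a toy model in which commanded sink rate is proportional to height. The gain value, noise level, and linear structure are assumptions for illustration, not the study's actual pilot-in-the-loop model.

```python
import numpy as np

# Sketch of extracting an effective pilot feedback gain from landing data:
# assume sink rate proportional to height (hdot = -k*h, an exponential
# flare) and solve for k by least squares from noisy measurements.
rng = np.random.default_rng(1)
k_true = 0.6
h = np.linspace(15.0, 1.0, 200)                   # height samples (m)
hdot = -k_true * h + rng.normal(0, 0.05, h.size)  # noisy measured sink rate

k_est = -np.linalg.lstsq(h[:, None], hdot, rcond=None)[0][0]
```

Comparing gains fitted from simulator landings against gains fitted from flight landings is what exposes the effective-lag differences the abstract describes.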
NVSIM: UNIX-based thermal imaging system simulator
NASA Astrophysics Data System (ADS)
Horger, John D.
1993-08-01
For several years the Night Vision and Electronic Sensors Directorate (NVESD) has been using an internally developed forward looking infrared (FLIR) simulation program. In response to interest in the simulation part of these projects by other organizations, NVESD has been working on a new version of the simulation, NVSIM, that will be made generally available to the FLIR-using community. NVSIM uses basic FLIR specification data, high-resolution thermal input imagery, and spatial-domain image processing techniques to produce simulated image outputs from a broad variety of FLIRs. It is being built around modular programming techniques to allow simpler addition of more sensor effects. The modularity also allows selective inclusion and exclusion of individual sensor effects at run time. The simulation has been written in the industry-standard ANSI C programming language under the widely used UNIX operating system to make it easily portable to a wide variety of computer platforms.
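The spatial-domain processing NVSIM applies can be illustrated with the simplest possible sensor-effect chain: blur by a kernel standing in for the sensor's optical/detector response, then additive detector noise. The kernel, noise level, and scene are assumptions, not NVESD's models.

```python
import numpy as np

# Sketch of spatial-domain FLIR simulation: blur a pristine thermal image
# with a small kernel (stand-in for the sensor blur), then add noise.
def convolve2d_same(img, kernel):
    kh, kw = kernel.shape
    pad = np.pad(img, ((kh // 2,) * 2, (kw // 2,) * 2), mode="edge")
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(pad[i:i + kh, j:j + kw] * kernel)
    return out

scene = np.zeros((16, 16))
scene[8, 8] = 100.0                          # hot point target
blur = np.ones((3, 3)) / 9.0                 # toy sensor blur kernel
rng = np.random.default_rng(2)
simulated = convolve2d_same(scene, blur) + rng.normal(0, 0.1, scene.shape)
```

A modular design, as the abstract notes, lets each effect (blur, noise, sampling) be a separate stage that can be included or excluded at run time.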
Enríquez, Diego; Lamborizio, María J; Firenze, Lorena; Jaureguizar, María de la P; Díaz Pumará, Estanislao; Szyld, Edgardo
2017-08-01
To evaluate the performance of resident physicians in diagnosing and treating a case of anaphylaxis, six months after participating in simulation training exercises. Initially, a group of pediatric residents were trained using simulation techniques in the management of critical pediatric cases. Based on their performance in this exercise, participants were assigned to one of 3 groups. At six months post-training, 4 residents were randomly chosen from each group to be re-tested, using the same performance measure as previously used. During the initial training session, 56 of 72 participants (78%) correctly identified and treated the case. Six months after the initial training, all 12 (100%) resident physicians who were re-tested successfully diagnosed and treated the simulated anaphylaxis case. The training through simulation techniques allowed correction or optimization of the treatment of simulated anaphylaxis cases in resident physicians evaluated after 6 months of the initial training.
Understanding Islamist political violence through computational social simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watkins, Jennifer H; Mackerrow, Edward P; Patelli, Paolo G
Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data are not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.
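The agent-based half of the comparison can be illustrated with a deliberately generic adoption model: agents on a ring adopt a state under peer influence, producing aggregate dynamics that a system-dynamics model would represent only as a single fraction. The network, rule, and parameters are assumptions for illustration only.

```python
import random

# Generic agent-based sketch in the spirit of the paper's comparison:
# individual agents adopt a state when a neighbor has, whereas a
# system-dynamics model would track only the aggregate fraction.
random.seed(3)
N = 100
adopted = [False] * N
adopted[0] = True                      # a single initial adopter

for step in range(200):
    i = random.randrange(N)            # pick a random agent
    left = adopted[(i - 1) % N]
    right = adopted[(i + 1) % N]
    if left or right:                  # peer influence on a ring network
        adopted[i] = True

fraction = sum(adopted) / N            # the aggregate a SD model would track
```

The agent-level view exposes heterogeneity and network structure; the system-dynamics view trades that detail for analytic transparency, which is exactly the trade-off the paper discusses.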
Methods and measurements in real-time air traffic control system simulation.
DOT National Transportation Integrated Search
1983-04-01
The major purpose of this work was to assess dynamic simulation of air traffic control systems as a technique for evaluating such systems in a statistically sound and objective manner. A large set of customarily used measures based on the system missi...
NASA and CFD - Making investments for the future
NASA Technical Reports Server (NTRS)
Hessenius, Kristin A.; Richardson, P. F.
1992-01-01
From a NASA perspective, CFD is a new tool for fluid flow simulation and prediction with virtually none of the inherent limitations of other ground-based simulation techniques. A primary goal of NASA's CFD research program is to develop efficient and accurate computational techniques for utilization in the design and analysis of aerospace vehicles. The program in algorithm development has systematically progressed through the hierarchy of engineering simplifications of the Navier-Stokes equations, starting with the inviscid formulations such as transonic small disturbance, full potential, and Euler.
New Flutter Analysis Technique for Time-Domain Computational Aeroelasticity
NASA Technical Reports Server (NTRS)
Pak, Chan-Gi; Lung, Shun-Fat
2017-01-01
A new time-domain approach for computing flutter speed is presented. Based on the time-history result of aeroelastic simulation, the unknown unsteady aerodynamics model is estimated using a system identification technique. The full aeroelastic model is generated via coupling the estimated unsteady aerodynamic model with the known linear structure model. The critical dynamic pressure is computed and used in the subsequent simulation until the convergence of the critical dynamic pressure is achieved. The proposed method is applied to a benchmark cantilevered rectangular wing.
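The system-identification step, estimating a dynamic model from a time history, can be sketched with an AR(2) fit to a decaying oscillation: the fitted discrete poles yield modal frequency and damping. The signal parameters are illustrative; the paper's coupled aeroelastic model is far richer.

```python
import numpy as np

# Sketch of time-domain flutter analysis: fit an AR(2) model to a decaying
# response and recover frequency and damping from its discrete poles.
dt = 0.01
wn, zeta = 2 * np.pi * 5.0, 0.03            # 5 Hz mode, 3% damping
t = np.arange(0, 2, dt)
wd = wn * np.sqrt(1 - zeta ** 2)
y = np.exp(-zeta * wn * t) * np.cos(wd * t)  # simulated time history

# Least-squares AR(2) fit: y[n] = a1*y[n-1] + a2*y[n-2]
A = np.column_stack([y[1:-1], y[:-2]])
a1, a2 = np.linalg.lstsq(A, y[2:], rcond=None)[0]

r = np.roots([1.0, -a1, -a2])[0]            # one of the complex pole pair
s = np.log(r) / dt                          # map back to continuous time
zeta_est = -s.real / abs(s)                 # recovered damping ratio
```

Tracking how the recovered damping crosses zero as dynamic pressure increases is the essence of locating the flutter boundary in the time domain.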
NASA Astrophysics Data System (ADS)
Yamana, Teresa K.; Eltahir, Elfatih A. B.
2011-02-01
This paper describes the use of satellite-based estimates of rainfall to force the Hydrology, Entomology and Malaria Transmission Simulator (HYDREMATS), a hydrology-based mechanistic model of malaria transmission. We first examined the temporal resolution of rainfall input required by HYDREMATS. Simulations conducted over Banizoumbou village in Niger showed that for reasonably accurate simulation of mosquito populations, the model requires rainfall data with at least 1 h resolution. We then investigated whether HYDREMATS could be effectively forced by satellite-based estimates of rainfall instead of ground-based observations. The Climate Prediction Center morphing technique (CMORPH) precipitation estimates distributed by the National Oceanic and Atmospheric Administration are available at a 30 min temporal resolution and 8 km spatial resolution. We compared mosquito populations simulated by HYDREMATS when the model is forced by adjusted CMORPH estimates and by ground observations. The results demonstrate that adjusted rainfall estimates from satellites can be used with a mechanistic model to accurately simulate the dynamics of mosquito populations.
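The data-preparation side of this workflow, aggregating 30-min satellite estimates to the model's required 1-h resolution and bias-adjusting against ground totals, can be sketched as follows. The values and the simple multiplicative adjustment are assumptions for illustration.

```python
import numpy as np

# Sketch of preparing satellite rainfall for the hydrology model: sum
# 30-min CMORPH-style accumulations into 1-h totals, then apply a simple
# multiplicative gauge adjustment. Values and factor are illustrative.
half_hourly = np.array([0.0, 1.2, 0.8, 0.0, 2.0, 0.4])   # mm per 30 min
hourly = half_hourly.reshape(-1, 2).sum(axis=1)          # mm per hour

gauge_total, sat_total = 5.0, hourly.sum()               # season totals
adjusted = hourly * (gauge_total / sat_total)            # bias correction
```

The key point from the abstract is that the aggregation must stop at 1 h: coarser rainfall input degrades the simulated pool dynamics that drive mosquito breeding.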
Simulation-Based Valuation of Transactive Energy Systems
Huang, Qiuhua; McDermott, Tom; Tang, Yingying; ...
2018-05-18
Transactive Energy (TE) has been recognized as a promising technique for integrating responsive loads and distributed energy resources as well as advancing grid modernization. To help the industry better understand the value of TE and compare different TE schemes in a systematic and transparent manner, a comprehensive simulation-based TE valuation method is developed. The method has the following salient features: 1) it formally defines the valuation scenarios, use cases, baseline and valuation metrics; 2) an open-source simulation platform for transactive energy systems has been developed by integrating transmission, distribution and building simulators, and plug-in TE and non-TE agents, through the Framework for Network Co-Simulation (FNCS); 3) transparency and flexibility of the valuation are enhanced through separation of simulation and valuation, base valuation metrics and final valuation metrics. Finally, a valuation example based on the Smart Grid Interoperability Panel (SGIP) Use Case 1 is provided to demonstrate the developed TE simulation program and the valuation method.
EMU Suit Performance Simulation
NASA Technical Reports Server (NTRS)
Cowley, Matthew S.; Benson, Elizabeth; Harvill, Lauren; Rajulu, Sudhakar
2014-01-01
Introduction: Designing a planetary suit is very complex and often requires difficult trade-offs between performance, cost, mass, and system complexity. To verify that new suit designs meet requirements, full prototypes must be built and tested with human subjects. However, numerous design iterations will occur before the hardware meets those requirements. Traditional draw-prototype-test paradigms for research and development are prohibitively expensive with today's shrinking Government budgets. Personnel at NASA are developing modern simulation techniques that focus on a human-centric design paradigm. These new techniques make use of virtual prototype simulations and fully adjustable physical prototypes of suit hardware. This is extremely advantageous and enables comprehensive design down-selections to be made early in the design process. Objectives: The primary objective was to test modern simulation techniques for evaluating the human performance component of two EMU suit concepts, pivoted and planar style hard upper torso (HUT). Methods: This project simulated variations in EVA suit shoulder joint design and subject anthropometry and then measured the differences in shoulder mobility caused by the modifications. These estimations were compared to human-in-the-loop test data gathered during past suited testing using four subjects (two large males, two small females). Results: Results demonstrated that EVA suit modeling and simulation are feasible design tools for evaluating and optimizing suit design based on simulated performance. The suit simulation model was found to be advantageous in its ability to visually represent complex motions and volumetric reach zones in three dimensions, giving designers a faster and deeper comprehension of suit component performance vs. human performance. 
Suit models were able to discern differing movement capabilities between EMU HUT configurations, generic suit fit concerns, and specific suit fit concerns for crewmembers based on individual anthropometry.
Agent-Based Simulations for Project Management
NASA Technical Reports Server (NTRS)
White, J. Chris; Sholtes, Robert M.
2011-01-01
Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique being used at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.
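The paradigm shift described, duration as an output of a resource-based model rather than an input, can be shown in miniature. The productivity penalty for added staff is an assumed illustration of a management corrective action, not the ViaSim tool's actual logic.

```python
# Sketch of the resource-based idea: task duration is an output computed
# from work content and assigned resources, not an input, so management
# actions (adding people) change the schedule. Numbers are illustrative.
def duration(work_hours, n_people, productivity=1.0):
    """Hours to finish a task given staffing and per-person productivity."""
    return work_hours / (n_people * productivity)

baseline = duration(120.0, 2)          # 60 h with two people
# Corrective action when the task falls behind: assign a third person,
# at reduced per-person productivity to reflect coordination overhead.
corrected = duration(120.0, 3, productivity=0.9)
```

Under CPM, `baseline` would simply be entered as a fixed number; here changing staffing or productivity mid-simulation re-derives the schedule, which is the point of the resource-based model.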
NASA Astrophysics Data System (ADS)
Deng, Bo; Shi, Yaoyao
2017-11-01
Tape winding is an effective way to fabricate rotationally symmetric composite parts. Nevertheless, some inevitable defects seriously influence the performance of winding products. One of the crucial ways to assess the quality of fiber-reinforced composite products is to examine their void content, since significant improvement in mechanical properties can be achieved by minimizing void defects. Two methods, finite element analysis and experimental testing, were applied in this study to investigate how voids form during composite tape winding. Based on the theories of interlayer intimate contact and the Domain Superposition Technique (DST), a three-dimensional model of prepreg tape voids was built in SolidWorks. The ABAQUS simulation software was then used to simulate how the void content changes with pressure and temperature. Finally, a series of experiments was performed to determine the accuracy of the model-based predictions. The results showed that the model is effective for predicting the void content in the composite tape winding process.
Nonlinear dynamic macromodeling techniques for audio systems
NASA Astrophysics Data System (ADS)
Ogrodzki, Jan; Bieńkowski, Piotr
2015-09-01
This paper develops a modelling method and a model identification technique for nonlinear dynamic audio systems. Identification is performed with a behavioral approach based on polynomial approximation, making use of the Discrete Fourier Transform and the Harmonic Balance Method. A model of an audio system is first created and identified, and then simulated in real time using an algorithm of low computational complexity. The algorithm consists of real-time emulation of the system response rather than simulation of the system itself. The proposed software is written in Python using object-oriented programming techniques, and the code is optimized for a multithreaded environment.
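The DFT/Harmonic Balance identification idea can be demonstrated on a memoryless cubic nonlinearity: harmonic amplitudes read off the DFT determine the polynomial coefficients exactly. The coefficients and drive level below are illustrative; the paper's behavioral models also capture dynamics.

```python
import numpy as np

# Sketch of behavioral identification via the DFT: drive y = c1*x + c3*x^3
# with a sine, read harmonic amplitudes off the spectrum, and invert the
# harmonic-balance relations for the coefficients. Values are illustrative.
c1, c3, A = 2.0, 0.5, 1.0
n = np.arange(1024)
x = A * np.sin(2 * np.pi * 8 * n / 1024)      # 8 cycles per record
y = c1 * x + c3 * x ** 3

Y = np.fft.rfft(y) / 1024
H1 = 2 * abs(Y[8])                            # fundamental amplitude
H3 = 2 * abs(Y[24])                           # 3rd-harmonic amplitude

# sin^3 identity gives: H1 = c1*A + (3/4)*c3*A^3,  H3 = (1/4)*c3*A^3
c3_est = 4 * H3 / A ** 3
c1_est = (H1 - 0.75 * c3_est * A ** 3) / A
```

Once the polynomial is identified, real-time emulation only has to evaluate it per sample, which is why the run-time algorithm can have low computational complexity.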
A Technique for Measuring Rotorcraft Dynamic Stability in the 40 by 80 Foot Wind Tunnel
NASA Technical Reports Server (NTRS)
Gupta, N. K.; Bohn, J. G.
1977-01-01
An on-line technique is described for the measurement of tilt rotor aircraft dynamic stability in the Ames 40- by 80-Foot Wind Tunnel. The technique is based on advanced system identification methodology and uses the instrumental variables approach. It is particularly applicable to real-time estimation problems with limited amounts of noise-contaminated data. Several simulations are used to evaluate the algorithm. Estimated natural frequencies and damping ratios are compared with simulation values. The algorithm is also applied to wind tunnel data in an off-line mode. The results are used to develop preliminary guidelines for effective use of the algorithm.
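The instrumental-variables idea behind the on-line technique can be shown in its simplest scalar form: with a noise-corrupted regressor, ordinary least squares is biased toward zero, while an instrument correlated with the true signal but not the measurement noise stays consistent. The data-generating model is an assumption for illustration.

```python
import numpy as np

# Scalar instrumental-variables sketch: OLS on a noisy regressor is biased
# (attenuated), the IV estimate beta = (z'y)/(z'x) is consistent.
rng = np.random.default_rng(4)
n = 20000
x_true = rng.normal(size=n)
y = 1.5 * x_true + rng.normal(0, 0.1, n)       # true gain is 1.5
x_meas = x_true + rng.normal(0, 0.5, n)        # noise-contaminated regressor
z = x_true + rng.normal(0, 0.5, n)             # instrument: independent noise

beta_ols = (x_meas @ y) / (x_meas @ x_meas)    # biased toward zero (~1.2)
beta_iv = (z @ y) / (z @ x_meas)               # consistent (~1.5)
```

This robustness to noise-contaminated data is why the instrumental-variables approach suits real-time estimation from short, noisy wind-tunnel records.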
Machine learning for autonomous crystal structure identification.
Reinhart, Wesley F; Long, Andrew W; Howard, Michael P; Ferguson, Andrew L; Panagiotopoulos, Athanassios Z
2017-07-21
We present a machine learning technique to discover and distinguish relevant ordered structures from molecular simulation snapshots or particle tracking data. Unlike other popular methods for structural identification, our technique requires no a priori description of the target structures. Instead, we use nonlinear manifold learning to infer structural relationships between particles according to the topology of their local environment. This graph-based approach yields unbiased structural information which allows us to quantify the crystalline character of particles near defects, grain boundaries, and interfaces. We demonstrate the method by classifying particles in a simulation of colloidal crystallization, and show that our method identifies structural features that are missed by standard techniques.
Opto-electronic characterization of third-generation solar cells.
Neukom, Martin; Züfle, Simon; Jenatsch, Sandra; Ruhstaller, Beat
2018-01-01
We present an overview of opto-electronic characterization techniques for solar cells including light-induced charge extraction by linearly increasing voltage, impedance spectroscopy, transient photovoltage, charge extraction and more. Guidelines for the interpretation of experimental results are derived based on charge drift-diffusion simulations of solar cells with common performance limitations. It is investigated how nonidealities like charge injection barriers, traps and low mobilities among others manifest themselves in each of the studied cell characterization techniques. Moreover, comprehensive parameter extraction for an organic bulk-heterojunction solar cell comprising PCDTBT:PC70BM is demonstrated. The simulations reproduce measured results of 9 different experimental techniques. Parameter correlation is minimized due to the combination of various techniques. Thereby a route to comprehensive and accurate parameter extraction is identified.
Feasibility of track-based multiple scattering tomography
NASA Astrophysics Data System (ADS)
Jansen, H.; Schütze, P.
2018-04-01
We present a tomographic technique making use of a gigaelectronvolt electron beam for the determination of the material budget distribution of centimeter-sized objects by means of simulations and measurements. In both cases, the trajectory of electrons traversing a sample under test is reconstructed using a pixel beam-telescope. The width of the deflection angle distribution of electrons undergoing multiple Coulomb scattering at the sample is estimated. Basing the sinogram on position-resolved estimators enables the reconstruction of the original sample using an inverse radon transform. We exemplify the feasibility of this tomographic technique via simulations of two structured cubes—made of aluminium and lead—and via an in-beam measured coaxial adapter. The simulations yield images with FWHM edge resolutions of (177 ± 13) μm and a contrast-to-noise ratio of 5.6 ± 0.2 (7.8 ± 0.3) for aluminium (lead) compared to air. The tomographic reconstruction of a coaxial adapter serves as experimental evidence of the technique and yields a contrast-to-noise ratio of 15.3 ± 1.0 and a FWHM edge resolution of (117 ± 4) μm.
Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang
2016-04-12
Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.
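A heavily simplified 1-D sequential indicator simulation conveys the flavor of the method: nodes are visited in random order and each binary indicator is drawn from a probability conditioned on already-simulated neighbors. The inverse-distance weighting here is a placeholder assumption; the paper's STSIS instead conditions on a non-separable spatiotemporal semivariogram.

```python
import random

# Much-simplified 1-D sketch of sequential indicator simulation: visit
# grid nodes in random order and draw each binary indicator ("PM2.5 above
# threshold") from a probability blended between the global proportion and
# the nearest already-simulated neighbor.
random.seed(5)
N = 200
p_global = 0.3                      # marginal exceedance probability
sim = [None] * N

for i in random.sample(range(N), N):            # random simulation path
    known = [(abs(i - j), v) for j, v in enumerate(sim) if v is not None]
    if known:
        d, v = min(known)                       # nearest simulated node
        w = 1.0 / (1.0 + d)                     # closer neighbors weigh more
        p = w * v + (1 - w) * p_global
    else:
        p = p_global
    sim[i] = 1 if random.random() < p else 0

proportion = sum(sim) / N
```

Repeating the procedure many times yields an ensemble of equally probable maps, and the spread across realizations is what quantifies the mapping uncertainty.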
From Simulation to Real Robots with Predictable Results: Methods and Examples
NASA Astrophysics Data System (ADS)
Balakirsky, S.; Carpin, S.; Dimitoglou, G.; Balaguer, B.
From a theoretical perspective, one may easily argue (as we will in this chapter) that simulation accelerates the algorithm development cycle. However, in practice many in the robotics development community share the sentiment that “Simulation is doomed to succeed” (Brooks, R., Matarić, M., Robot Learning, Kluwer Academic Press, Hingham, MA, 1993, p. 209). This comes in large part from the fact that many simulation systems are brittle; they do a fair-to-good job of simulating the expected, and fail to simulate the unexpected. It is the authors' belief that a simulation system is only as good as its models, and that deficiencies in these models lead to the majority of these failures. This chapter will attempt to address these deficiencies by presenting a systematic methodology with examples for the development of both simulated mobility models and sensor models for use with one of today's leading simulation engines. Techniques for using simulation for algorithm development leading to real-robot implementation will be presented, as well as opportunities for involvement in international robotics competitions based on these techniques.
A Method for Generating Reduced-Order Linear Models of Multidimensional Supersonic Inlets
NASA Technical Reports Server (NTRS)
Chicatelli, Amy; Hartley, Tom T.
1998-01-01
Simulation of high speed propulsion systems may be divided into two categories, nonlinear and linear. The nonlinear simulations are usually based on multidimensional computational fluid dynamics (CFD) methodologies and tend to provide high resolution results that show the fine detail of the flow. Consequently, these simulations are large, numerically intensive, and run much slower than real-time. The linear simulations are usually based on large lumping techniques that are linearized about a steady-state operating condition. These simplistic models often run at or near real-time but do not always capture the detailed dynamics of the plant. Under a grant sponsored by the NASA Lewis Research Center, Cleveland, Ohio, a new method has been developed that can be used to generate improved linear models for control design from multidimensional steady-state CFD results. This CFD-based linear modeling technique provides a small perturbation model that can be used for control applications and real-time simulations. It is important to note the utility of the modeling procedure; all that is needed to obtain a linear model of the propulsion system is the geometry and steady-state operating conditions from a multidimensional CFD simulation or experiment. This research represents a beginning step in establishing a bridge between the controls discipline and the CFD discipline so that the control engineer is able to effectively use multidimensional CFD results in control system design and analysis.
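The small-perturbation step at the heart of such linear modeling can be illustrated generically: given any model xdot = f(x, u), central finite differences about an operating point yield the Jacobians of a linear model xdot = A·dx + B·du. The two-state "plant" below is invented for illustration; the paper's method extracts the same kind of linear model from multidimensional CFD fields rather than from an analytic f.

```python
def linearize(f, x0, u0, eps=1e-6):
    """Central finite-difference Jacobians A = df/dx, B = df/du about (x0, u0)."""
    n, m = len(x0), len(u0)

    def col(fun, vec, k):
        hi, lo = list(vec), list(vec)
        hi[k] += eps
        lo[k] -= eps
        fh, fl = fun(hi), fun(lo)
        return [(a - b) / (2 * eps) for a, b in zip(fh, fl)]

    a_cols = [col(lambda x: f(x, u0), x0, j) for j in range(n)]
    b_cols = [col(lambda u: f(x0, u), u0, j) for j in range(m)]
    A = [list(row) for row in zip(*a_cols)]      # transpose columns into rows
    B = [list(row) for row in zip(*b_cols)]
    return A, B

# toy two-state "inlet" dynamics (invented): xdot = [-2*x0 + u0, x0 - 3*x1]
f = lambda x, u: [-2.0 * x[0] + u[0], x[0] - 3.0 * x[1]]
A, B = linearize(f, [0.0, 0.0], [0.0])
```

For this linear toy model the procedure recovers A = [[-2, 0], [1, -3]] and B = [[1], [0]] exactly; for a CFD-derived steady state the same differencing is applied to perturbed flow solutions.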
Simulation of Thermographic Responses of Delaminations in Composites with Quadrupole Method
NASA Technical Reports Server (NTRS)
Winfree, William P.; Zalameda, Joseph N.; Howell, Patricia A.; Cramer, K. Elliott
2016-01-01
The application of the quadrupole method for simulating thermal responses of delaminations in carbon fiber reinforced epoxy composite materials is presented. The method solves for the flux at the interface containing the delamination. From the interface flux, the temperature at the surface is calculated. While the results presented are for single sided measurements with flash heating, expansion of the technique to arbitrary temporal flux heating or through transmission measurements is simple. The quadrupole method is shown to have two distinct advantages relative to finite element or finite difference techniques. First, it is straightforward to incorporate arbitrarily shaped delaminations into the simulation. Second, the quadrupole method enables calculation of the thermal response at only the times of interest. This, combined with a significant reduction in the number of degrees of freedom for the same simulation quality, results in a reduction of the computation time by at least an order of magnitude. Therefore, it is a more viable technique for model-based inversion of thermographic data. Results for simulations of delaminations in composites are presented and compared to measurements and finite element method results.
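For the simplest geometry, a homogeneous slab with an insulated rear face, the quadrupole formalism reduces to a 2x2 transfer matrix in the Laplace domain, and the front-face flash response is θ(p) = Q·cosh(qe)/(λq·sinh(qe)) with q = sqrt(p/a). The sketch below evaluates that expression and inverts it numerically with the Gaver-Stehfest algorithm; the material values are assumed, epoxy-composite-like numbers (not taken from the paper), and the delamination interface itself is omitted.

```python
import math

def stehfest_invert(F, t, N=10):
    """Gaver-Stehfest numerical inversion of a Laplace transform F(p) at time t."""
    ln2t = math.log(2.0) / t
    total = 0.0
    for k in range(1, N + 1):
        v = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            v += (j ** (N // 2) * math.factorial(2 * j)
                  / (math.factorial(N // 2 - j) * math.factorial(j)
                     * math.factorial(j - 1) * math.factorial(k - j)
                     * math.factorial(2 * j - k)))
        total += (-1) ** (k + N // 2) * v * F(k * ln2t)
    return ln2t * total

# Homogeneous slab, rear face insulated, flash of energy Q on the front face.
# Assumed material values (illustrative only):
e, lam, a, Q = 1e-3, 0.5, 3e-7, 1000.0   # thickness m, W/m/K, m^2/s, J/m^2
rho_c = lam / a                          # volumetric heat capacity, J/m^3/K

def theta_front(p):                      # quadrupole solution in the Laplace domain
    q = math.sqrt(p / a)
    return Q * math.cosh(q * e) / (lam * q * math.sinh(q * e))

T_late = stehfest_invert(theta_front, 5.0)   # front-face temperature rise at t = 5 s
```

At late times the response approaches the adiabatic limit Q/(ρc·e), a standard sanity check on flash-thermography models; note that the Laplace-domain solution is evaluated only at the times of interest, which is the computational advantage cited above.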
Husak, Gregory J.; Michaelsen, Joel; Kyriakidis, P.; Verdin, James P.; Funk, Chris; Galu, Gideon
2011-01-01
Probabilistic forecasts are produced by a variety of outlets to help predict rainfall, and other meteorological events, for periods of 1 month or more. Such forecasts are expressed as probabilities of a rainfall event, e.g. being in the upper, middle, or lower third of the relevant distribution of rainfall in the region. The impact of these forecasts on the expectation for the event is not always clear or easily conveyed. This article proposes a technique based on Monte Carlo simulation for adjusting existing climatologic statistical parameters to match forecast information, resulting in new parameters defining the probability of events for the forecast interval. The resulting parameters are shown to approximate the forecasts with reasonable accuracy. To show the value of the technique as an application for seasonal rainfall, it is used with the consensus forecast developed for the Greater Horn of Africa for the 2009 March-April-May season. An alternative, analytical approach is also proposed and discussed in comparison to the simulation-based technique.
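A minimal version of the simulation-based adjustment might look like the following: draw from the climatological distribution, re-weight the draws by the forecast tercile probabilities, and re-fit the distribution's parameters to the weighted sample. A normal climatology and all numeric values are assumptions made for illustration; the article works with its own climatologic parameterization.

```python
import random
import statistics
from statistics import NormalDist

def forecast_adjusted_params(mu, sigma, probs, n=50000, seed=7):
    """Monte-Carlo re-fit of a normal climatology to tercile forecast probabilities.

    probs = (P_below, P_normal, P_above) for the lower/middle/upper third of the
    climatological distribution. Returns (mu', sigma') fitted to the re-weighted
    sample -- a sketch of the simulation idea, not the article's implementation."""
    clim = NormalDist(mu, sigma)
    t1, t2 = clim.inv_cdf(1 / 3), clim.inv_cdf(2 / 3)   # climatological terciles
    rng = random.Random(seed)
    top = max(probs)
    samples = []
    while len(samples) < n:
        x = rng.gauss(mu, sigma)
        tercile = 0 if x < t1 else (1 if x < t2 else 2)
        # accept with probability proportional to the forecast weight of the tercile
        if rng.random() < probs[tercile] / top:
            samples.append(x)
    return statistics.mean(samples), statistics.stdev(samples)

# wetter-than-normal consensus: 20/30/50 instead of the climatological thirds
mu2, sd2 = forecast_adjusted_params(100.0, 20.0, (0.2, 0.3, 0.5))
```

Re-fitting a normal to the weighted sample only approximates the forecast terciles, consistent with the article's observation that the resulting parameters match the forecasts with reasonable accuracy rather than exactly.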
Yaguchi, A; Nagase, K; Ishikawa, M; Iwasaka, T; Odagaki, M; Hosaka, H
2006-01-01
Computer simulation and myocardial cell models were used to evaluate a low-energy defibrillation technique. A generated spiral wave, considered to be a mechanism of fibrillation, and fibrillation itself were investigated using two myocardial sheet models: a two-dimensional computer simulation model and a two-dimensional experimental model. A new defibrillation technique is desired that has few side effects on cardiac muscle, such as those induced by the current passing into the patient's body. The purpose of the present study is to conduct a basic investigation into an efficient defibrillation method. In order to evaluate the defibrillation method, the propagation of excitation in the myocardial sheet is measured during the normal state and during fibrillation, respectively. The advantages of the low-energy defibrillation technique are then discussed based on the stimulation timing.
Improving Simulated Annealing by Recasting it as a Non-Cooperative Game
NASA Technical Reports Server (NTRS)
Wolpert, David; Bandari, Esfandiar; Tumer, Kagan
2001-01-01
The game-theoretic field of COllective INtelligence (COIN) concerns the design of computer-based players engaged in a non-cooperative game so that as those players pursue their self-interests, a pre-specified global goal for the collective computational system is achieved "as a side-effect". Previous implementations of COIN algorithms have outperformed conventional techniques by up to several orders of magnitude, on domains ranging from telecommunications control to optimization in congestion problems. Recent mathematical developments have revealed that these previously developed game-theory-motivated algorithms were based on only two of the three factors determining performance. Consideration of only the third factor would instead lead to conventional optimization techniques like simulated annealing that have little to do with non-cooperative games. In this paper we present an algorithm based on all three terms at once. This algorithm can be viewed as a way to modify simulated annealing by recasting it as a non-cooperative game, with each variable replaced by a player. This recasting allows us to leverage the intelligent behavior of the individual players to substantially improve the exploration step of the simulated annealing. Experiments are presented demonstrating that this recasting improves simulated annealing by several orders of magnitude for spin glass relaxation and bin-packing.
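For contrast with the game-theoretic COIN approach, here is the conventional simulated-annealing baseline it modifies, applied to a small bin-packing instance (one of the paper's test domains). The instance, cooling schedule, and cost function are invented for illustration; COIN would replace this single global Metropolis walk with per-variable players pursuing private utilities.

```python
import math
import random

def anneal_binpack(items, capacity, n_bins, steps=30000, t0=2.0, cool=0.9997, seed=3):
    """Conventional simulated annealing for bin packing: minimise total overflow."""
    rng = random.Random(seed)
    assign = [rng.randrange(n_bins) for _ in items]

    def cost(a):
        loads = [0.0] * n_bins
        for w, b in zip(items, a):
            loads[b] += w
        return sum(max(0.0, load - capacity) for load in loads)

    cur, temp = cost(assign), t0
    for _ in range(steps):
        i = rng.randrange(len(items))
        old = assign[i]
        assign[i] = rng.randrange(n_bins)        # propose moving one item
        new = cost(assign)
        # Metropolis rule: accept downhill moves, uphill with prob exp(-d/T)
        if new > cur and rng.random() >= math.exp((cur - new) / temp):
            assign[i] = old                      # reject the move
        else:
            cur = new
        temp *= cool                             # geometric cooling schedule
    return assign, cur

items = [4, 8, 1, 4, 2, 1] * 5                   # 30 items, total weight 100
assign, overflow = anneal_binpack(items, capacity=12, n_bins=10)
```

In the COIN recasting, each variable (here, each item's bin assignment) becomes a player whose private utility is shaped so that greedy individual moves still drive down the global cost, which is what improves the exploration step relative to this baseline.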
Detection of chemical warfare simulants using Raman excitation at 1064 nm
NASA Astrophysics Data System (ADS)
Dentinger, Claire; Mabry, Mark W.; Roy, Eric G.
2014-05-01
Raman spectroscopy is a powerful technique for material identification. The technique is sensitive to primary and higher-order molecular structure and can be used to identify unknown materials by comparison with spectral reference libraries. Additionally, miniaturization of opto-electronic components has permitted development of portable Raman analyzers that are field deployable. Raman scattering is a relatively weak effect compared to a competing phenomenon, fluorescence. Even a moderate amount of fluorescence background interference can easily prevent identification of unknown materials. A long wavelength Raman system is less likely to induce fluorescence from a wide variety of materials than a higher energy visible laser system. Compounds such as methyl salicylate (MS), diethyl malonate (DEM), and dimethyl methylphosphonate (DMMP) are used as chemical warfare agent (CWA) simulants for development of analytical detection strategies. Field detection of these simulants, however, poses unique challenges because threat identification must be made quickly without the turnaround time usually required for a laboratory-based analysis. Fortunately, these CWA simulants are good Raman scatterers, and field-based detection using portable Raman instruments is promising. Measurements of the CWA simulants were made using a 1064 nm based portable Raman spectrometer. The longer wavelength excitation laser was chosen over visible laser systems because it is less likely to induce fluorescence and is suitable for a wider range of materials. To more closely mimic real-world measurement situations, different sample presentations were investigated.
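The library-comparison step mentioned above can be sketched with a toy matcher: score an unknown spectrum against each reference by normalised correlation and report the best hit. The synthetic "spectra" and peak positions below are invented placeholders, not real Raman bands of DMMP, MS, or DEM.

```python
import math

def peak_spectrum(centers, n=200):
    """Synthetic spectrum: a sum of Gaussian peaks on an arbitrary shift axis."""
    return [sum(math.exp(-(x - c) ** 2 / 50.0) for c in centers) for x in range(n)]

def identify(spectrum, library):
    """Best library match by normalised correlation (cosine similarity)."""
    def norm_corr(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        return dot / (sum(a * a for a in u) ** 0.5 * sum(b * b for b in v) ** 0.5)
    return max(library, key=lambda name: norm_corr(spectrum, library[name]))

library = {                       # invented peak positions, not real Raman bands
    "DMMP": peak_spectrum([40, 90, 150]),
    "MS":   peak_spectrum([30, 120]),
    "DEM":  peak_spectrum([60, 100, 170]),
}
# "measured" spectrum: the DMMP reference plus a small deterministic ripple
measured = [v + 0.05 * math.sin(1.7 * x)
            for x, v in enumerate(peak_spectrum([40, 90, 150]))]
best = identify(measured, library)
```

Real instruments add baseline correction and hit-quality thresholds on top of this, but the core idea, correlation against a curated reference library, is the same.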
Image based SAR product simulation for analysis
NASA Technical Reports Server (NTRS)
Domik, G.; Leberl, F.
1987-01-01
SAR product simulation serves to predict SAR image gray values for various flight paths. Input typically consists of a digital elevation model and backscatter curves. A new method of product simulation is described that also employs a real SAR input image for image simulation. This can be denoted as 'image-based simulation'. Different methods to perform this SAR prediction are presented and advantages and disadvantages discussed. Ascending and descending orbit images from NASA's SIR-B experiment were used for verification of the concept: input images from ascending orbits were converted into images from a descending orbit; the results are compared to the available real imagery to verify that the prediction technique produces meaningful image data.
Assessment of simulation fidelity using measurements of piloting technique in flight
NASA Technical Reports Server (NTRS)
Ferguson, S. W.; Clement, W. F.; Cleveland, W. B.; Key, D. L.
1984-01-01
The U.S. Army and NASA have undertaken the systematic validation of a ground-based piloted simulator for the UH-60A helicopter. The results of previous handling quality and task performance flight tests for this helicopter have been used as a data base for evaluating the fidelity of the present simulation, which is being conducted at the NASA Ames Research Center's Vertical Motion Simulator. Such nap-of-the-earth piloting tasks as pop-up, hover turn, dash/quick stop, sidestep, dolphin, and slalom have been investigated. It is noted that pilot simulator performance is significantly and quantifiably degraded in comparison with flight test results for the same tasks.
Ishikawa, Shun; Okamoto, Shogo; Isogai, Kaoru; Akiyama, Yasuhiro; Yanagihara, Naomi; Yamada, Yoji
2015-01-01
Robots that simulate patients suffering from joint resistance caused by biomechanical and neural impairments are used to aid the training of physical therapists in manual examination techniques. However, there are few methods for assessing such robots. This article proposes two types of assessment measures based on typical judgments of clinicians. One of the measures involves the evaluation of how well the simulator presents different severities of a specified disease. Experienced clinicians were requested to rate the simulated symptoms in terms of severity, and the consistency of their ratings was used as a performance measure. The other measure involves the evaluation of how well the simulator presents different types of symptoms. In this case, the clinicians were requested to classify the simulated resistances in terms of symptom type, and the average ratios of their answers were used as performance measures. For both types of assessment measures, a higher index implied higher agreement among the experienced clinicians that subjectively assessed the symptoms based on typical symptom features. We applied these two assessment methods to a patient knee robot and achieved positive appraisals. The assessment measures have potential for use in comparing several patient simulators for training physical therapists, rather than as absolute indices for developing a standard. PMID:25923719
NASA Technical Reports Server (NTRS)
Williams, D. H.
1983-01-01
A simulation study was undertaken to evaluate two time-based self-spacing techniques for in-trail following during terminal area approach. An electronic traffic display was provided in the weather radarscope location. The displayed self-spacing cues allowed the simulated aircraft to follow and to maintain spacing on another aircraft which was being vectored by air traffic control (ATC) for landing in a high-density terminal area. Separation performance data indicate the information provided on the traffic display was adequate for the test subjects to accurately follow the approach path of another aircraft without the assistance of ATC. The time-based technique with a constant-delay spacing criterion produced the most satisfactory spacing performance. Pilot comments indicate the workload associated with the self-separation task was very high and that additional spacing command information and/or aircraft autopilot functions would be desirable for operational implementation of the self-spacing task.
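The constant-delay spacing criterion that performed best can be sketched as a one-dimensional along-track law: the follower continually steers its speed toward the point where the lead aircraft was τ seconds earlier. The gain, speeds, and update rate below are invented for illustration and are not the study's display or control parameters.

```python
def self_space(lead_track, tau, dt, gain=0.3, v0=70.0):
    """Constant-time-delay self-spacing, 1-D along-track sketch.

    lead_track: lead aircraft along-track positions sampled every dt seconds.
    The follower's speed command is the nominal speed plus a correction
    proportional to the error from the lead's position tau seconds ago."""
    delay = int(tau / dt)
    x, hist = 0.0, []
    for k in range(len(lead_track)):
        target = lead_track[max(0, k - delay)]   # where the lead was tau s ago
        v = v0 + gain * (target - x)             # speed cue from the spacing error
        x += v * dt
        hist.append(x)
    return hist

dt, tau = 1.0, 60.0                              # 1 s updates, 60 s spacing delay
lead = [70.0 * t for t in range(600)]            # lead at a steady 70 m/s
follow = self_space(lead, tau, dt)
gap_time = (lead[-1] - follow[-1]) / 70.0        # settled spacing, in seconds
```

After the initial transient the follower settles about τ seconds behind the lead, which is the behavior the constant-delay criterion is designed to produce.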
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mian, Muhammad Umer, E-mail: umermian@gmail.com; Khir, M. H. Md.; Tang, T. B.
Pre-fabrication behavioural and performance analysis with computer-aided design (CAD) tools is a common and cost-effective practice. In light of this, we present a simulation methodology for a dual-mass oscillator based 3 degree-of-freedom (3-DoF) MEMS gyroscope. The 3-DoF gyroscope is modeled through lumped parameter models using equivalent circuit elements. These equivalent circuits consist of elementary components that are counterparts of their respective mechanical components, used to design and fabricate the 3-DoF MEMS gyroscope. The complete design of the equivalent circuit model, the mathematical modeling, and the simulation are presented in this paper. Behaviors of the equivalent lumped models derived for the proposed device design are simulated in MEMSPRO T-SPICE software. Simulations are carried out with design specifications following the design rules of the MetalMUMPS fabrication process. Drive mass resonant frequencies simulated by this technique are 1.59 kHz and 2.05 kHz, respectively, which are close to the resonant frequencies found by the analytical formulation of the gyroscope. The lumped equivalent circuit modeling technique proved to be a time-efficient technique for the analysis of complex MEMS devices like 3-DoF gyroscopes, and an alternative to the complex and time-consuming coupled-field finite element analysis (FEA) previously used.
Tercero, C.; Ikeda, S.; Ooe, K.; Fukuda, T.; Arai, F.; Negoro, M.; Takahashi, I.; Kwon, G.
2012-01-01
In the domain of endovascular neurosurgery, the measurement of tissue integrity is needed for simulator-based training and for the development of new intravascular instruments and treatment techniques. In vitro evaluation of tissue manipulation can be achieved using photoelastic stress analysis and vasculature modeling with photoelastic materials. In this research we constructed two types of vasculature models of saccular aneurysms for differentiating embolization techniques according to their respect for tissue integrity, measured as the stress within the blood vessel model wall. In an aneurysm model with a 5 mm dome diameter, embolization using MicroPlex 10 (Complex 1D, with 4 mm diameter loops) produced a maximum area of 3.97 mm2 with stress above 1 kPa. This area increased to 5.50 mm2 when the dome was touched deliberately with the release mechanism of the coil, and to 4.87 mm2 for an embolization using Micrusphere (Spherical 18 Platinum Coil). In a similar way, trans-cell stent-assisted coil embolization was compared to a human blood pressure simulation using a model of a wide-necked saccular aneurysm with a 7 mm diameter. The area with stress above 1 kPa was below 1 mm2 for the pressure simulation and peaked at 3.79 mm2 during the trans-cell insertion of the micro-catheter and at 8.92 mm2 during the embolization. The presented results show that this measurement system is useful for identifying techniques that compromise tissue integrity, for comparing and studying coils and embolization techniques for a specific vasculature morphology, and for comparing their natural stress variations such as those produced by blood pressure. PMID:23217635
Brydges, Ryan; Hatala, Rose; Mylopoulos, Maria
2016-07-01
Simulation-based training is currently embedded in most health professions education curricula. Without evidence for how trainees think about their simulation-based learning, some training techniques may not support trainees' learning strategies. This study explored how residents think about and self-regulate learning during a lumbar puncture (LP) training session using a simulator. In 2010, 20 of 45 postgraduate year 1 internal medicine residents attended a mandatory procedural skills training boot camp. Independently, residents practiced the entire LP skill on a part-task trainer using a clinical LP tray and proper sterile technique. We interviewed participants regarding how they thought about and monitored their learning processes, and then we conducted a thematic analysis of the interview data. The analysis suggested that participants considered what they could and could not learn from the simulator; they developed their self-confidence by familiarizing themselves with the LP equipment and repeating the LP algorithmic steps. Participants articulated an idiosyncratic model of learning they used to interpret the challenges and successes they experienced. Participants reported focusing on obtaining cerebrospinal fluid and memorizing the "routine" version of the LP procedure. They did not report much thinking about their learning strategies (eg, self-questioning). During simulation-based training, residents described assigning greater weight to achieving procedural outcomes and tended to think that the simulated task provided them with routine, generalizable skills. Over this typical 1-hour session, trainees did not appear to consider their strategic mindfulness (ie, awareness and use of learning strategies).
NASA Technical Reports Server (NTRS)
Brooks, D. R.
1980-01-01
Orbit dynamics of the solar occultation technique for satellite measurements of the Earth's atmosphere are described. A one-year mission is simulated and the orbit and mission design implications are discussed in detail. Geographical coverage capabilities are examined parametrically for a range of orbit conditions. The hypothetical mission is used to produce a simulated one-year data base of solar occultation measurements; each occultation event is assumed to produce a single number, or 'measurement', and some statistical properties of the data set are examined. A simple model is fitted to the data to demonstrate a procedure for examining global distributions of atmospheric constituents with the solar occultation technique.
Tackling sampling challenges in biomolecular simulations.
Barducci, Alessandro; Pfaendtner, Jim; Bonomi, Massimiliano
2015-01-01
Molecular dynamics (MD) simulations are a powerful tool to give an atomistic insight into the structure and dynamics of proteins. However, the time scales accessible in standard simulations, which often do not match those in which interesting biological processes occur, limit their predictive capabilities. Many advanced sampling techniques have been proposed over the years to overcome this limitation. This chapter focuses on metadynamics, a method based on the introduction of a time-dependent bias potential to accelerate sampling and recover equilibrium properties of a few descriptors that are able to capture the complexity of a process at a coarse-grained level. The theory of metadynamics and its combination with other popular sampling techniques such as the replica exchange method is briefly presented. Practical applications of these techniques to the study of the Trp-Cage miniprotein folding are also illustrated. The examples contain a guide for performing these calculations with PLUMED, a plugin to perform enhanced sampling simulations in combination with many popular MD codes.
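The essence of metadynamics, depositing Gaussian hills along the trajectory so the accumulating bias fills visited free-energy basins, fits in a short script. The double-well potential, hill parameters, and time step below are illustrative choices, not values from any Trp-Cage study; a production run would use PLUMED with an MD code rather than this overdamped 1-D toy.

```python
import math
import random

def metadynamics_1d(grad_V, steps=30000, dt=1e-3, temp=1.0,
                    hill_h=0.25, hill_w=0.3, stride=150, seed=2):
    """Overdamped Langevin walker on a 1-D landscape with a history-dependent
    bias built from Gaussian hills deposited every `stride` steps."""
    rng = random.Random(seed)
    x, hills, crossed = -1.0, [], False          # start in the left well

    def grad_bias(x):
        g = 0.0
        for c in hills:                          # d/dx of the sum of Gaussian hills
            d = x - c
            g -= hill_h * d / hill_w ** 2 * math.exp(-d * d / (2 * hill_w ** 2))
        return g

    for step in range(steps):
        force = -grad_V(x) - grad_bias(x)
        x += force * dt + math.sqrt(2.0 * temp * dt) * rng.gauss(0.0, 1.0)
        if step % stride == 0:
            hills.append(x)                      # deposit a new hill at the walker
        if x > 1.0:
            crossed = True                       # reached the right-hand well
    return crossed, len(hills)

# double well V(x) = 4 (x^2 - 1)^2: minima at +/-1, a 4 kT barrier at x = 0
grad_V = lambda x: 16.0 * x * (x * x - 1.0)
crossed, n_hills = metadynamics_1d(grad_V)
```

Without the bias, a 4 kT barrier makes well-to-well crossings rare on this time scale; with hills accumulating, the starting well fills and the walker escapes, which is the accelerated-sampling effect the chapter describes. The sum of deposited hills also provides an estimate of the underlying free-energy profile.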
NASA Technical Reports Server (NTRS)
Craun, Robert W.; Acosta, Diana M.; Beard, Steven D.; Leonard, Michael W.; Hardy, Gordon H.; Weinstein, Michael; Yildiz, Yildiray
2013-01-01
This paper describes the maturation of a control allocation technique designed to assist pilots in the recovery from pilot induced oscillations (PIOs). The Control Allocation technique to recover from Pilot Induced Oscillations (CAPIO) is designed to enable next generation high efficiency aircraft designs. Energy efficient next generation aircraft require feedback control strategies that will enable lowering the actuator rate limit requirements for optimal airframe design. One of the common issues flying with actuator rate limits is PIOs caused by the phase lag between the pilot inputs and control surface response. CAPIO utilizes real-time optimization for control allocation to eliminate phase lag in the system caused by control surface rate limiting. System impacts of the control allocator were assessed through a piloted simulation evaluation of a non-linear aircraft simulation in the NASA Ames Vertical Motion Simulator. Results indicate that CAPIO helps reduce oscillatory behavior, including the severity and duration of PIOs, introduced by control surface rate limiting.
NASA Technical Reports Server (NTRS)
Bakuckas, J. G.; Tan, T. M.; Lau, A. C. W.; Awerbuch, J.
1993-01-01
A finite element-based numerical technique has been developed to simulate damage growth in unidirectional composites. This technique incorporates elastic-plastic analysis, micromechanics analysis, failure criteria, and a node splitting and node force relaxation algorithm to create crack surfaces. Any combination of fiber and matrix properties can be used. One of the salient features of this technique is that damage growth can be simulated without pre-specifying a crack path. In addition, multiple damage mechanisms in the forms of matrix cracking, fiber breakage, fiber-matrix debonding and plastic deformation are capable of occurring simultaneously. The prevailing failure mechanism and the damage (crack) growth direction are dictated by the instantaneous near-tip stress and strain fields. Once the failure mechanism and crack direction are determined, the crack is advanced via the node splitting and node force relaxation algorithm. Simulations of the damage growth process in center-slit boron/aluminum and silicon carbide/titanium unidirectional specimens were performed. The simulation results agreed quite well with the experimental observations.
Developing integrated patient pathways using hybrid simulation
NASA Astrophysics Data System (ADS)
Zulkepli, Jafri; Eldabi, Tillal
2016-10-01
Integrated patient pathways span several departments: healthcare, which includes emergency care and the inpatient ward; intermediate care, where patients stay for a maximum of two weeks while an assessment team identifies the most suitable onward care; and social care. Intermediate care was introduced in Western countries to reduce the rate at which patients, especially elderly patients, remain in hospital. This type of care setting is now being considered in some other countries, including Malaysia. Therefore, to assess the advantages of introducing this type of integrated healthcare setting, we suggest developing a model using simulation techniques. We argue that a single simulation technique is not sufficient to represent this type of patient pathway, and we therefore suggest developing the model using hybrid techniques, i.e. System Dynamics (SD) and Discrete Event Simulation (DES). Based on the hybrid model results, we argue that the output is a viable reference for the decision-making process.
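The hybrid idea can be illustrated with a toy in pure Python: a System Dynamics stock (a continuously growing referral rate, integrated with an Euler step) drives a Discrete Event Simulation of a bed-limited intermediate-care ward (admissions, lengths of stay, discharge events on a heap). All rates, capacities, and lengths of stay are invented; a real study would calibrate them to the care setting discussed above.

```python
import heapq
import random

def hybrid_sim(days=100, beds=20, seed=5):
    """Hybrid SD + DES sketch: a continuous demand stock feeding a
    discrete-event, bed-limited intermediate-care ward."""
    rng = random.Random(seed)
    demand = 2.0                        # SD stock: expected referrals per day
    occupied, discharges = 0, []        # DES state: beds in use, discharge times
    admitted = turned_away = 0
    for day in range(days):
        demand += 0.01 * demand         # SD step (Euler, dt = 1 day): 1%/day growth
        while discharges and discharges[0] <= day:   # DES: process today's discharges
            heapq.heappop(discharges)
            occupied -= 1
        # DES arrivals: binomial approximation of Poisson at the SD-driven rate
        arrivals = sum(1 for _ in range(int(demand * 2)) if rng.random() < 0.5)
        for _ in range(arrivals):
            if occupied < beds:         # admit if a bed is free
                occupied += 1
                admitted += 1
                heapq.heappush(discharges, day + rng.randint(7, 14))  # stay <= 2 weeks
            else:
                turned_away += 1        # no capacity: patient remains in hospital
    return admitted, turned_away

admitted, turned_away = hybrid_sim()
```

The SD half captures slow aggregate dynamics (demand growth), the DES half captures individual patients and capacity constraints, and the coupling point, the SD level setting the DES arrival rate each day, is what neither technique models well on its own.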
NASA Astrophysics Data System (ADS)
Rajshekhar, G.; Gorthi, Sai Siva; Rastogi, Pramod
2010-04-01
For phase estimation in digital holographic interferometry, a high-order instantaneous moments (HIM) based method was recently developed which relies on piecewise polynomial approximation of phase and subsequent evaluation of the polynomial coefficients using the HIM operator. A crucial step in the method is mapping the polynomial coefficient estimation to single-tone frequency determination for which various techniques exist. The paper presents a comparative analysis of the performance of the HIM operator based method in using different single-tone frequency estimation techniques for phase estimation. The analysis is supplemented by simulation results.
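The single-tone step that the HIM method reduces to can be sketched directly: take a zero-padded DFT of the signal and read the frequency off the magnitude peak. This is the plain periodogram estimator, one of the candidate techniques such a comparison would cover, with invented signal parameters.

```python
import cmath
import math
import random

def single_tone_freq(signal, zoom=8):
    """Frequency (cycles/sample) of a single complex tone, estimated from the
    peak of a zero-padded DFT magnitude spectrum."""
    n = len(signal)
    m = n * zoom                                 # zero-padding refines the grid
    best_k, best_mag = 0, -1.0
    for k in range(m):
        s = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / m) for t in range(n))
        if abs(s) > best_mag:
            best_k, best_mag = k, abs(s)
    f = best_k / m
    return f if f < 0.5 else f - 1.0             # map to [-0.5, 0.5)

# a noisy complex tone at 0.123 cycles/sample (illustrative values)
rng = random.Random(0)
f0, n = 0.123, 64
sig = [cmath.exp(2j * math.pi * f0 * t) + 0.1 * rng.gauss(0, 1) for t in range(n)]
f_hat = single_tone_freq(sig)
```

Finer estimators (parabolic peak interpolation, MUSIC, ESPRIT) trade accuracy against computational cost, which is the kind of trade-off a comparative analysis of single-tone techniques examines.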
Using cognitive architectures to study issues in team cognition in a complex task environment
NASA Astrophysics Data System (ADS)
Smart, Paul R.; Sycara, Katia; Tang, Yuqing
2014-05-01
Cognitive social simulation is a computer simulation technique that aims to improve our understanding of the dynamics of socially-situated and socially-distributed cognition. This makes cognitive social simulation techniques particularly appealing as a means to undertake experiments into team cognition. The current paper reports on the results of an ongoing effort to develop a cognitive social simulation capability that can be used to undertake studies into team cognition using the ACT-R cognitive architecture. This capability is intended to support simulation experiments using a team-based problem solving task, which has been used to explore the effect of different organizational environments on collective problem solving performance. The functionality of the ACT-R-based cognitive social simulation capability is presented and a number of areas of future development work are outlined. The paper also describes the motivation for adopting cognitive architectures in the context of social simulation experiments and presents a number of research areas where cognitive social simulation may be useful in developing a better understanding of the dynamics of team cognition. These include the use of cognitive social simulation to study the role of cognitive processes in determining aspects of communicative behavior, as well as the impact of communicative behavior on the shaping of task-relevant cognitive processes (e.g., the social shaping of individual and collective memory as a result of communicative exchanges). We suggest that the ability to perform cognitive social simulation experiments in these areas will help to elucidate some of the complex interactions that exist between cognitive, social, technological and informational factors in the context of team-based problem-solving activities.
Confidence Intervals from Realizations of Simulated Nuclear Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Younes, W.; Ratkiewicz, A.; Ressler, J. J.
2017-09-28
Various statistical techniques are discussed that can be used to assign a level of confidence in the prediction of models that depend on input data with known uncertainties and correlations. The particular techniques reviewed in this paper are: 1) random realizations of the input data using Monte-Carlo methods, 2) the construction of confidence intervals to assess the reliability of model predictions, and 3) resampling techniques to impose statistical constraints on the input data based on additional information. These techniques are illustrated with a calculation of the keff value based on the 235U(n,f) and 239Pu(n,f) cross sections.
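The first two techniques, random realizations of correlated input data and a percentile confidence interval on the model prediction, can be sketched as follows. The "k-eff" here is a stand-in linear surrogate with invented means, uncertainties, and correlation, not an actual criticality calculation.

```python
import random

def mc_confidence_interval(model, means, sds, corr, n=20000, level=0.95, seed=11):
    """Percentile confidence interval on a model output from correlated Gaussian
    realizations of two input data (2x2 Cholesky factor written out by hand)."""
    rng = random.Random(seed)
    outs = []
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        x1 = means[0] + sds[0] * z1
        x2 = means[1] + sds[1] * (corr * z1 + (1.0 - corr ** 2) ** 0.5 * z2)
        outs.append(model(x1, x2))
    outs.sort()
    lo = outs[int((1.0 - level) / 2.0 * n)]      # 2.5th percentile
    hi = outs[int((1.0 + level) / 2.0 * n) - 1]  # 97.5th percentile
    return lo, hi

# toy linear "k-eff" surrogate in two correlated cross sections (values assumed)
k_eff = lambda s_u, s_pu: 0.6 * s_u + 0.4 * s_pu
lo, hi = mc_confidence_interval(k_eff, means=(1.2, 1.8), sds=(0.05, 0.08), corr=0.3)
```

The positive correlation widens the interval relative to independent inputs; the resampling step (technique 3) would additionally re-weight realizations that conflict with the extra constraining information.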
Flexible multibody simulation of automotive systems with non-modal model reduction techniques
NASA Astrophysics Data System (ADS)
Shiiba, Taichi; Fehr, Jörg; Eberhard, Peter
2012-12-01
The stiffness of the body structure of an automobile has a strong relationship with its noise, vibration, and harshness (NVH) characteristics. In this paper, the effect of the stiffness of the body structure upon ride quality is discussed with flexible multibody dynamics. In flexible multibody simulation, the local elastic deformation of the vehicle has traditionally been described with modal shape functions. Recently, linear model reduction techniques from system dynamics and mathematics came into focus as a way to find more sophisticated elastic shape functions. In this work, the NVH-relevant states of a racing kart are simulated, whereas the elastic shape functions are calculated with modern model reduction techniques like moment matching by projection on Krylov subspaces, singular value decomposition-based reduction techniques, and combinations of those. The whole elastic multibody vehicle model consisting of tyres, steering, axle, etc. is considered, and an excitation with vibration characteristics over a wide frequency range is evaluated in this paper. The accuracy and the calculation performance of these modern model reduction techniques are investigated, including a comparison with the modal reduction approach.
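A minimal example of the moment-matching idea: build an orthonormal basis for the Krylov subspace span{b, Ab, ...} with the Arnoldi iteration and project the state-space model onto it. The tridiagonal "structure" matrix below is a toy stand-in for a body-structure finite element model, and NumPy is assumed to be available.

```python
import numpy as np

def krylov_reduce(A, b, c, r):
    """Project (A, b, c) onto span{b, Ab, ..., A^(r-1) b} (one-sided Arnoldi)."""
    n = len(b)
    V = np.zeros((n, r))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(1, r):
        w = A @ V[:, j - 1]
        for i in range(j):                       # modified Gram-Schmidt step
            w -= (V[:, i] @ w) * V[:, i]
        V[:, j] = w / np.linalg.norm(w)
    return V.T @ A @ V, V.T @ b, c @ V           # reduced (Ar, br, cr)

n = 50                                           # toy chain, stand-in for an FE model
A = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1))
b = np.zeros(n); b[0] = 1.0                      # force input at one end
c = b.copy()                                     # collocated output
Ar, br, cr = krylov_reduce(A, b, c, r=10)
```

With this one-sided (Galerkin) projection the order-10 model reproduces the first 10 Markov parameters c·A^k·b of the full order-50 model, which is the moment-matching property that such reduction techniques exploit when building elastic shape functions.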
NASA Technical Reports Server (NTRS)
Grantham, W. D.; Deal, P. L.
1974-01-01
A fixed-base simulator study was conducted to determine the minimum acceptable level of longitudinal stability for a representative turbofan STOL (short take-off and landing) transport airplane during the landing approach. Real-time digital simulation techniques were used. The computer was programed with equations of motion for six degrees of freedom, and the aerodynamic inputs were based on measured wind-tunnel data. The primary piloting task was an instrument approach to a breakout at a 60-m (200-ft) ceiling.
NASA Astrophysics Data System (ADS)
Mukhtar, Maseeh; Thiel, Bradley
2018-03-01
In fabrication, overlay measurements of semiconductor device patterns have conventionally been performed using optical methods, beginning with image-based box-in-box techniques and moving to the more recent diffraction-based overlay (DBO). Alternatively, use of SEM overlay is under consideration for in-device overlay. The two main application spaces are the measurement of features from multiple mask levels on the same surface and the measurement of buried features. Modern CD-SEMs are adept at measuring overlay for cases where all features are on the surface; in order to measure overlay of buried features, HV-SEM is needed. Gate-to-fin and BEOL overlay are important use cases for this technique. A JMONSEL simulation exercise was performed for these two cases using 10 nm line/space gratings with a graduated increase in depth of burial. The backscattered energy loss results of these simulations were used to calculate the sensitivity of measurements of buried features versus electron dosage for an array of electron beam voltages.
A typology of educationally focused medical simulation tools.
Alinier, Guillaume
2007-10-01
The concept of simulation as an educational tool in healthcare is not a new idea but its use has really blossomed over the last few years. This enthusiasm is partly driven by an attempt to increase patient safety and also because the technology is becoming more affordable and advanced. Simulation is becoming more commonly used for initial training purposes as well as for continuing professional development, but people often have very different perceptions of the definition of the term simulation, especially in an educational context. This highlights the need for a clear classification not only of the technology available but also of the method and teaching approach employed. The aims of this paper are to discuss the current range of simulation approaches and to propose a clear typology of simulation teaching aids. Commonly used simulation techniques have been identified and discussed in order to create a classification that reports simulation techniques, their usual mode of delivery, the skills they can address, the facilities required, their typical use, and their pros and cons. This paper presents a clear classification scheme of educational simulation tools and techniques with six different technological levels. They are, respectively: written simulations, three-dimensional models, screen-based simulators, standardized patients, intermediate fidelity patient simulators, and interactive patient simulators. This typology allows the accurate description of the simulation technology and the teaching methods applied. Thus valid comparisons of educational tools can be made as to their potential effectiveness and verisimilitude at different training stages. The proposed typology of simulation methodologies available for educational purposes provides a helpful guide for educators and participants which should help them to realise the potential learning outcomes at different technological simulation levels in relation to the training approach employed.
It should also be a useful resource for simulation users who are trying to improve their educational practice.
Geothermal reservoir simulation
NASA Technical Reports Server (NTRS)
Mercer, J. W., Jr.; Faust, C.; Pinder, G. F.
1974-01-01
The prediction of long-term geothermal reservoir performance and the environmental impact of exploiting this resource are two important problems associated with the utilization of geothermal energy for power production. Our research effort addresses these problems through numerical simulation. Computer codes based on the solution of partial differential equations using finite-element techniques are being prepared to simulate multiphase energy transport, energy transport in fractured porous reservoirs, well bore phenomena, and subsidence.
Determination of mixed mode (I/II) SIFs of cracked orthotropic materials
NASA Astrophysics Data System (ADS)
Chakraborty, D.; Chakraborty, Debaleena; Murthy, K. S. R. K.
2018-05-01
Strain gage techniques have been successfully but sparsely used for the determination of stress intensity factors (SIFs) of orthotropic materials. For mode I cases, a few works have been reported on the strain gage based determination of the mode I SIF of orthotropic materials. However, for mixed mode (I/II) cases, neither a theoretical development of a strain gage based technique nor any recommended guidelines for the minimum number of strain gages and their locations had been reported in the literature for the determination of mixed mode SIFs. The authors recently presented the first theoretical formulation for using strain gages to determine mixed mode SIFs of orthotropic materials [1]. Based on these formulations, the present paper discusses a finite element (FE) based numerical simulation of the proposed strain gage technique employing [902/0]10S carbon-epoxy laminates with a slant edge crack. An FE based procedure is also presented for determining the optimal radial locations of the strain gages a priori, before actual experiments. To substantiate the efficacy of the proposed technique, numerical simulations of strain gage based determination of mixed mode SIFs have been conducted. Results show that it is possible to accurately determine the mixed mode SIFs of orthotropic laminates when the strain gages are placed within the optimal radial locations estimated using the present formulation.
NASA Astrophysics Data System (ADS)
Lucas, Charles E.; Walters, Eric A.; Jatskevich, Juri; Wasynczuk, Oleg; Lamm, Peter T.
2003-09-01
In this paper, a new technique useful for the numerical simulation of large-scale systems is presented. This approach enables the overall system simulation to be formed by the dynamic interconnection of the various interdependent simulations, each representing a specific component or subsystem such as control, electrical, mechanical, hydraulic, or thermal. Each simulation may be developed separately using possibly different commercial-off-the-shelf simulation programs thereby allowing the most suitable language or tool to be used based on the design/analysis needs. These subsystems communicate the required interface variables at specific time intervals. A discussion concerning the selection of appropriate communication intervals is presented herein. For the purpose of demonstration, this technique is applied to a detailed simulation of a representative aircraft power system, such as that found on the Joint Strike Fighter (JSF). This system is comprised of ten component models each developed using MATLAB/Simulink, EASY5, or ACSL. When the ten component simulations were distributed across just four personal computers (PCs), a greater than 15-fold improvement in simulation speed (compared to the single-computer implementation) was achieved.
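The role of the communication interval can be illustrated with a deliberately simple two-subsystem co-simulation (made-up first-order thermal subsystems, not the JSF power system): each subsystem integrates with its own internal step, and the interface variables are exchanged only every H seconds.

```python
def cosimulate(H, h, t_end=5.0, k=1.0):
    """Loosely coupled co-simulation of two first-order subsystems.

    Each subsystem advances with its own internal step h, using the
    partner's interface value frozen at the last communication point;
    values are exchanged every communication interval H.
    """
    T1, T2 = 1.0, 0.0                   # initial subsystem states
    n_macro = int(round(t_end / H))
    n_micro = int(round(H / h))
    for _ in range(n_macro):
        T1_held, T2_held = T1, T2       # interface variables exchanged here
        for _ in range(n_micro):        # each side integrates independently
            T1 += h * (-k * (T1 - T2_held))
            T2 += h * (-k * (T2 - T1_held))
    return T1, T2

loose = cosimulate(H=0.5, h=0.01)   # infrequent data exchange
tight = cosimulate(H=0.01, h=0.01)  # exchange every internal step
print(loose, tight)                 # both relax toward the equilibrium (0.5, 0.5)
```

Shortening H recovers the tightly coupled solution; choosing H too large relative to the coupling dynamics degrades accuracy or stability, which is the trade-off the communication-interval discussion addresses.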
NASA Technical Reports Server (NTRS)
Grantham, W. D.; Nguyen, L. T.; Patton, J. M., Jr.; Deal, P. L.; Champine, R. A.; Carter, C. R.
1972-01-01
A fixed-base simulator study was conducted to determine the flight characteristics of a representative STOL transport having a high wing and equipped with an external-flow jet flap in combination with four high-bypass-ratio fan-jet engines during the approach and landing. Real-time digital simulation techniques were used. The computer was programed with equations of motion for six degrees of freedom and the aerodynamic inputs were based on measured wind-tunnel data. A visual display of a STOL airport was provided for simulation of the flare and touchdown characteristics. The primary piloting task was an instrument approach to a breakout at a 200-ft ceiling with a visual landing.
Technology for Transient Simulation of Vibration during Combustion Process in Rocket Thruster
NASA Astrophysics Data System (ADS)
Zubanov, V. M.; Stepanov, D. V.; Shabliy, L. S.
2018-01-01
The article describes a technology for the simulation of transient combustion processes in a rocket thruster, for determination of the vibration frequencies that occur during combustion. The engine operates on gaseous propellants: oxygen and hydrogen. Combustion simulation was performed using the ANSYS CFX software. Three reaction mechanisms for the stationary mode were considered and described in detail. A way of obtaining quick CFD results with intermediate combustion components using an EDM model was found, and a way to generate the Flamelet library with CFX-RIF is described. A technique for modeling transient combustion processes in the rocket thruster is proposed based on the Flamelet library. A cyclic irregularity of the temperature field, resembling vortex core precession, was detected in the chamber, and the frequency of the flame precession was obtained with the proposed simulation technique.
Diez, P; Hoskin, P J; Aird, E G A
2007-10-01
This questionnaire forms the basis of the quality assurance (QA) programme for the UK randomized Phase III study of the Stanford V regimen versus ABVD for treatment of advanced Hodgkin's disease, and was designed to assess differences between participating centres in the treatment planning and delivery of involved-field radiotherapy for Hodgkin's lymphoma. The questionnaire, which was circulated amongst 42 participating centres, consisted of seven sections: target volume definition and dose prescription; critical structures; patient positioning and irradiation techniques; planning; dose calculation; verification; and future developments. The results are based on 25 responses. One-third plan using CT alone, one-third use solely the simulator and the rest individualize, depending on disease site. Eleven centres determine a dose distribution for each patient. Technique depends on disease site and whether CT or simulator planning is employed. Most departments apply isocentric techniques and use immobilization and customized shielding. In vivo dosimetry is performed in 7 centres and treatment verification occurs in 24 hospitals. In conclusion, the planning and delivery of treatment for lymphoma patients varies across the country. Conventional planning is still widespread but most centres are moving to CT-based planning and virtual simulation with extended use of immobilization, customized shielding and compensation.
Roemer, R B; Booth, D; Bhavsar, A A; Walter, G H; Terry, L I
2012-12-21
A mathematical model based on conservation of energy has been developed and used to simulate the temperature responses of cones of the Australian cycads Macrozamia lucida and Macrozamia macleayi during their daily thermogenic cycle. These cones generate diel midday thermogenic temperature increases as large as 12 °C above ambient during their approximately two-week pollination period. The cone temperature response model is shown to accurately predict the cones' temperatures over multiple days, based on simulations of experimental results from 28 thermogenic events from 3 different cones, each simulated for either 9 or 10 sequential days. The verified model is then used as the foundation of a new, parameter estimation based technique (termed inverse calorimetry) that estimates the cones' daily metabolic heating rates from temperature measurements alone. The inverse calorimetry technique's predictions of the major features of the cones' thermogenic metabolism compare favorably with the estimates from conventional respirometry (indirect calorimetry). Because the new technique uses only temperature measurements, and does not require measurements of oxygen consumption, it provides a simple, inexpensive and portable complement to conventional respirometry for estimating metabolic heating rates. It thus provides an additional tool to facilitate field and laboratory investigations of the bio-physics of thermogenic plants.
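The inverse-calorimetry idea admits a compact sketch: if the cone is modeled by a lumped energy balance, the metabolic heating rate can be recovered from the temperature record alone by inverting that balance. All parameter values below are invented for illustration and are not those of the paper.

```python
import numpy as np

# assumed lumped energy balance:  m*c * dT/dt = Q_met(t) - h*A * (T - T_amb)
m_c, hA, T_amb = 50.0, 2.0, 20.0      # hypothetical thermal mass (J/K), loss (W/K)
dt = 10.0                             # seconds between temperature samples
t = np.arange(0.0, 24 * 3600.0, dt)

Q_true = 5.0 * np.exp(-((t - 12 * 3600.0) / 7200.0) ** 2)   # midday burst (W)

# forward simulation plays the role of the measured temperature record
T = np.empty_like(t)
T[0] = T_amb
for i in range(len(t) - 1):
    T[i + 1] = T[i] + dt * (Q_true[i] - hA * (T[i] - T_amb)) / m_c

# "inverse calorimetry": recover the heating rate from temperatures alone
dTdt = np.gradient(T, dt)
Q_est = m_c * dTdt + hA * (T - T_amb)
print(np.max(np.abs(Q_est - Q_true)))   # small reconstruction error
```

The real technique estimates the model parameters as well, rather than assuming them known, but the inversion step has this structure.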
NASA Astrophysics Data System (ADS)
Bonechi, L.; D'Alessandro, R.; Mori, N.; Viliani, L.
2015-02-01
Muon absorption radiography is an imaging technique based on the analysis of the attenuation of the cosmic-ray muon flux after traversing an object under examination. While this technique is now reaching maturity in the field of volcanology for the imaging of the innermost parts of volcanic cones, its applicability to other fields of research has not yet been proved. In this paper we present a study concerning the application of the muon absorption radiography technique to the field of archaeology, and we propose a method for the search for underground cavities and structures hidden a few metres deep in the soil (patent [1]). An original geometric treatment of the reconstructed muon tracks, based on the comparison of the measured flux with a reference simulated flux, and the preliminary results of specific simulations are discussed in detail.
Estimation of Soil Moisture with L-band Multi-polarization Radar
NASA Technical Reports Server (NTRS)
Shi, J.; Chen, K. S.; Kim, Chung-Li Y.; Van Zyl, J. J.; Njoku, E.; Sun, G.; O'Neill, P.; Jackson, T.; Entekhabi, D.
2004-01-01
Through analyses of the model-simulated database, we developed a technique to estimate surface soil moisture under the HYDROS radar sensor (L-band multi-polarization and 40° incidence) configuration. This technique includes two steps. First, it decomposes the total backscattering signals into two components: the surface scattering components (the bare-surface backscattering signals attenuated by the overlying vegetation layer) and the sum of the direct volume scattering components and surface-volume interaction components at different polarizations. On the model-simulated database, our decomposition technique works quite well in estimating the surface scattering components, with RMSEs of 0.12, 0.25, and 0.55 dB for VV, HH, and VH polarizations, respectively. Then, we use the decomposed surface backscattering signals to estimate the soil moisture and the combined surface roughness and vegetation attenuation correction factors with all three polarizations.
Maleke, Caroline; Luo, Jianwen; Gamarnik, Viktor; Lu, Xin L; Konofagou, Elisa E
2010-07-01
The objective of this study is to show that Harmonic Motion Imaging (HMI) can be used as a reliable tumor-mapping technique based on the tumor's distinct stiffness at the early onset of disease. HMI is a radiation-force-based imaging method that generates a localized vibration deep inside the tissue to estimate the relative tissue stiffness based on the resulting displacement amplitude. In this paper, a finite-element model (FEM) study is presented, followed by an experimental validation in tissue-mimicking polyacrylamide gels and excised human breast tumors ex vivo. This study compares the resulting tissue motion in simulations and experiments at four different gel stiffnesses and three distinct spherical inclusion diameters. The elastic moduli of the gels were separately measured using mechanical testing. Identical transducer parameters were used in both the FEM and experimental studies, i.e., a 4.5-MHz single-element focused ultrasound (FUS) and a 7.5-MHz diagnostic (pulse-echo) transducer. In the simulation, an acoustic pressure field was used as the input stimulus to generate a localized vibration inside the target. Radiofrequency (rf) signals were then simulated using a 2D convolution model. A one-dimensional cross-correlation technique was performed on the simulated and experimental rf signals to estimate the axial displacement resulting from the harmonic radiation force. In order to measure the reliability of the displacement profiles in estimating the tissue stiffness distribution, the contrast-transfer efficiency (CTE) was calculated. For tumor mapping ex vivo, a harmonic radiation force was applied using a 2D raster-scan technique. The 2D HMI images of the breast tumor ex vivo could detect a malignant tumor (20 x 10 mm2) surrounded by glandular and fat tissues. 
The FEM and experimental results from both gels and breast tumors ex vivo demonstrated that HMI was capable of detecting and mapping the tumor or stiff inclusion with various diameters or stiffnesses. HMI may thus constitute a promising technique in tumor detection (>3 mm in diameter) and mapping based on its distinct stiffness.
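The displacement-estimation step described in the abstract can be sketched with synthetic rf data (a white-noise scatterer signal and an integer-sample shift; the sampling rate and sound speed below are generic assumed values, not those of the study): a 1-D cross-correlation between pre- and post-motion rf lines recovers the axial shift.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, c_sound = 40e6, 1540.0        # assumed rf sampling rate (Hz), sound speed (m/s)
n = 2048
pre = rng.standard_normal(n)      # stand-in for an rf A-line before motion
true_shift = 7                    # axial motion between frames, in samples
post = np.roll(pre, true_shift)   # post-motion rf line

# 1-D cross-correlation over a window of candidate lags
lags = np.arange(-20, 21)
xc = [np.dot(pre, np.roll(post, -lag)) for lag in lags]
est = int(lags[int(np.argmax(xc))])

# convert the lag to axial displacement (pulse-echo round trip -> factor 2)
axial_disp = est * c_sound / (2 * fs)
print(est, axial_disp)
```

Real implementations track sub-sample motion by interpolating around the correlation peak and repeat this over a sliding window to build the displacement map used for stiffness contrast.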
Impact of Simulation Technology on Die and Stamping Business
NASA Astrophysics Data System (ADS)
Stevens, Mark W.
2005-08-01
Over the last ten years, we have seen an explosion in the use of simulation-based techniques to improve the engineering, construction, and operation of GM production tools. The impact has been as profound as the overall switch to CAD/CAM from the old manual design and construction methods. The changeover to N/C machining from duplicating milling machines brought advances in accuracy and speed to our construction activity. It also brought significant reductions in fitting sculptured surfaces. Changing over to CAD design brought similar advances in accuracy, and today's use of solid modeling has enhanced that accuracy gain while finally leading to the reduction in lead time and cost through the development of parametric techniques. Elimination of paper drawings for die design, along with the process of blueprinting and distribution, provided the savings required to install high capacity computer servers, high-speed data transmission lines and integrated networks. These historic changes in the application of CAE technology in manufacturing engineering paved the way for the implementation of simulation to all aspects of our business. The benefits are being realized now, and the future holds even greater promise as the simulation techniques mature and expand. Every new line of dies is verified prior to casting for interference free operation. Sheet metal forming simulation validates the material flow, eliminating the high costs of physical experimentation dependent on trial and error methods of the past. Integrated forming simulation and die structural analysis and optimization has led to a reduction in die size and weight on the order of 30% or more. The latest techniques in factory simulation enable analysis of automated press lines, including all stamping operations with corresponding automation. 
This leads to manufacturing lines capable of running at higher levels of throughput, with actual results providing the capability of two or more additional strokes per minute. As we spread these simulation techniques to the balance of our business, from blank de-stacking to the racking of parts, we anticipate continued reduction in lead-time and engineering expense while improving quality and start-up execution. The author will provide an overview of technology and business evolution of the math-based process that brought an historical transition and revitalization to the die and stamping industry in the past decade. Finally, the author will give an outlook for future business needs and technology development directions.
Adjoint-Based Aerodynamic Design of Complex Aerospace Configurations
NASA Technical Reports Server (NTRS)
Nielsen, Eric J.
2016-01-01
An overview of twenty years of adjoint-based aerodynamic design research at NASA Langley Research Center is presented. Adjoint-based algorithms provide a powerful tool for efficient sensitivity analysis of complex large-scale computational fluid dynamics (CFD) simulations. Unlike alternative approaches for which computational expense generally scales with the number of design parameters, adjoint techniques yield sensitivity derivatives of a simulation output with respect to all input parameters at the cost of a single additional simulation. With modern large-scale CFD applications often requiring millions of compute hours for a single analysis, the efficiency afforded by adjoint methods is critical in realizing a computationally tractable design optimization capability for such applications.
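The cost argument can be made concrete with a toy problem (a small linear system standing in for the discretized flow equations; all matrices are random illustrations): one extra adjoint solve yields the sensitivity of the output to all m parameters, whereas finite differencing needs one extra solve per parameter.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 8, 5                        # state size, number of design parameters
A0 = 3.0 * np.eye(n)
dA = [0.1 * rng.standard_normal((n, n)) for _ in range(m)]   # dA/dp_i
b = rng.standard_normal(n)
c = rng.standard_normal(n)

def output(p):
    """J(p) = c^T u with A(p) u = b; one call plays the role of one simulation."""
    A = A0 + sum(pi * dAi for pi, dAi in zip(p, dA))
    return c @ np.linalg.solve(A, b)

u = np.linalg.solve(A0, b)          # the analysis solve (at p = 0)
lam = np.linalg.solve(A0.T, c)      # ONE extra adjoint solve...
grad = np.array([-lam @ (dAi @ u) for dAi in dA])   # ...gives all m derivatives

# finite differences need m extra solves and only approximate the gradient
eps = 1e-6
fd = np.array([(output(eps * e) - output(np.zeros(m))) / eps for e in np.eye(m)])
print(np.max(np.abs(grad - fd)))
```

The identity used is dJ/dp_i = -λᵀ (∂A/∂p_i) u with Aᵀλ = c; for nonlinear CFD the same structure holds with the residual Jacobian in place of A.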
Comparison of Nonlinear Random Response Using Equivalent Linearization and Numerical Simulation
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.; Muravyov, Alexander A.
2000-01-01
A recently developed finite-element-based equivalent linearization approach for the analysis of random vibrations of geometrically nonlinear multiple degree-of-freedom structures is validated. The validation is based on comparisons with results from a finite element based numerical simulation analysis using a numerical integration technique in physical coordinates. In particular, results for the case of a clamped-clamped beam are considered for an extensive load range to establish the limits of validity of the equivalent linearization approach.
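The comparison can be miniaturized to a single-degree-of-freedom Duffing oscillator under white noise (illustrative coefficients, not the clamped-clamped beam model): equivalent linearization replaces the cubic stiffness by a response-dependent linear one, and the result is checked against a numerical simulation in physical coordinates.

```python
import numpy as np

# Duffing oscillator under white noise:  x'' + c x' + k x + eps x^3 = w(t),
# with E[w(t) w(t+s)] = 2 D delta(s); all coefficients are illustrative.
k, c, eps, D = 1.0, 0.4, 1.0, 0.2

# equivalent linearization: keq = k + 3 eps E[x^2] with E[x^2] = D/(c keq)
# for the linearized system -> a quadratic equation in keq
keq = 0.5 * (k + np.sqrt(k**2 + 12.0 * eps * D / c))
var_eql = D / (c * keq)

# numerical simulation in physical coordinates (Euler-Maruyama)
rng = np.random.default_rng(3)
dt, n_steps, n_paths = 0.005, 40000, 200
x = np.zeros(n_paths)
v = np.zeros(n_paths)
acc = []
for i in range(n_steps):
    dW = np.sqrt(2.0 * D * dt) * rng.standard_normal(n_paths)
    v += (-c * v - k * x - eps * x**3) * dt + dW
    x += v * dt
    if i > n_steps // 4:                 # discard the transient
        acc.append(np.mean(x**2))
var_mc = float(np.mean(acc))
print(var_eql, var_mc)   # the two estimates of E[x^2] are close but not equal
```

The residual gap between the two variances is exactly the kind of discrepancy the validation study quantifies over a range of load levels.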
Space construction base control system
NASA Technical Reports Server (NTRS)
1978-01-01
Aspects of an attitude control system were studied and developed for a large space base that is structurally flexible and whose mass properties change rather dramatically during its orbital lifetime. Topics of discussion include the following: (1) space base orbital pointing and maneuvering; (2) angular momentum sizing of actuators; (3) momentum desaturation selection and sizing; (4) multilevel control technique applied to configuration one; (5) one-dimensional model simulation; (6) N-body discrete coordinate simulation; (7) structural analysis math model formulation; and (8) discussion of control problems and control methods.
A histogram-based technique for rapid vector extraction from PIV photographs
NASA Technical Reports Server (NTRS)
Humphreys, William M., Jr.
1991-01-01
A new analysis technique, performed totally in the image plane, is proposed which rapidly extracts all available vectors from individual interrogation regions on PIV photographs. The technique avoids the need for using Fourier transforms with the associated computational burden. The data acquisition and analysis procedure is described, and results of a preliminary simulation study to evaluate the accuracy of the technique are presented. Recently obtained PIV photographs are analyzed.
NASA Astrophysics Data System (ADS)
Shao, Lin; Gigax, Jonathan; Chen, Di; Kim, Hyosim; Garner, Frank A.; Wang, Jing; Toloczko, Mychailo B.
2017-10-01
Self-ion irradiation is widely used as a method to simulate neutron damage in reactor structural materials. Accelerator-based simulation of void swelling, however, introduces a number of neutron-atypical features which require careful data extraction and, in some cases, introduction of innovative irradiation techniques to alleviate these issues. We briefly summarize three such atypical features: defect imbalance effects, pulsed beam effects, and carbon contamination. The latter issue has just been recently recognized as being relevant to simulation of void swelling and is discussed here in greater detail. It is shown that carbon ions are entrained in the ion beam by Coulomb force drag and accelerated toward the target surface. Beam-contaminant interactions are modeled using molecular dynamics simulation. By applying a multiple beam deflection technique, carbon and other contaminants can be effectively filtered out, as demonstrated in an irradiation of HT-9 alloy by 3.5 MeV Fe ions.
Scalable Methods for Eulerian-Lagrangian Simulation Applied to Compressible Multiphase Flows
NASA Astrophysics Data System (ADS)
Zwick, David; Hackl, Jason; Balachandar, S.
2017-11-01
Multiphase flows can be found in countless areas of physics and engineering. Many of these flows can be classified as dispersed two-phase flows, meaning that there are solid particles dispersed in a continuous fluid phase. A common technique for simulating such flow is the Eulerian-Lagrangian method. While useful, this method can suffer from scaling issues on larger problem sizes that are typical of many realistic geometries. Here we present scalable techniques for Eulerian-Lagrangian simulations and apply it to the simulation of a particle bed subjected to expansion waves in a shock tube. The results show that the methods presented here are viable for simulation of larger problems on modern supercomputers. This material is based upon work supported by the National Science Foundation Graduate Research Fellowship under Grant No. DGE-1315138. This work was supported in part by the U.S. Department of Energy under Contract No. DE-NA0002378.
Testing prediction methods: Earthquake clustering versus the Poisson model
Michael, A.J.
1997-01-01
Testing earthquake prediction methods requires statistical techniques that compare observed success to random chance. One technique is to produce simulated earthquake catalogs and measure the relative success of predicting real and simulated earthquakes. The accuracy of these tests depends on the validity of the statistical model used to simulate the earthquakes. This study tests the effect of clustering in the statistical earthquake model on the results. Three simulation models were used to produce significance levels for a VLF earthquake prediction method. As the degree of simulated clustering increases, the statistical significance drops. Hence, the use of a seismicity model with insufficient clustering can lead to overly optimistic results. A successful method must pass the statistical tests with a model that fully replicates the observed clustering. However, a method can be rejected based on tests with a model that contains insufficient clustering. U.S. copyright. Published in 1997 by the American Geophysical Union.
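The pitfall can be reproduced in a few lines (a toy "alarm-after-every-event" predictor and invented rates, not the VLF method of the paper): a clustered catalog scored against Poisson simulations looks highly significant, while simulations that themselves include clustering set a much higher bar.

```python
import numpy as np

rng = np.random.default_rng(4)
T, w = 1000.0, 1.0           # catalog span and alarm window after each event

def clustered(rate=0.1, p_aft=0.6, tau=0.3):
    """Mainshocks as a Poisson process; some trigger one aftershock."""
    mains = rng.uniform(0.0, T, rng.poisson(rate * T))
    parents = mains[rng.random(mains.size) < p_aft]
    afts = parents + rng.exponential(tau, parents.size)
    return np.sort(np.concatenate([mains, afts]))

def poisson(n_events):
    """Memoryless catalog with the same number of events."""
    return np.sort(rng.uniform(0.0, T, n_events))

def success(cat):
    """Fraction of events 'predicted', i.e. within w of the previous event."""
    return float(np.mean(np.diff(cat) < w))

obs = clustered()                    # stands in for the real (clustered) catalog
s_obs = success(obs)
s_poi = np.array([success(poisson(obs.size)) for _ in range(200)])
s_clu = np.array([success(clustered()) for _ in range(200)])
# against Poisson catalogs the predictor looks significant; against
# catalogs with realistic clustering it does not
print(s_obs, np.mean(s_poi), np.mean(s_clu))
```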
Theoretical Bound of CRLB for Energy Efficient Technique of RSS-Based Factor Graph Geolocation
NASA Astrophysics Data System (ADS)
Kahar Aziz, Muhammad Reza; Heriansyah; Saputra, EfaMaydhona; Musa, Ardiansyah
2018-03-01
To support the growth of wireless geolocation development as a key technology for the future, this paper derives a theoretical bound, i.e., the Cramer-Rao lower bound (CRLB), for the energy-efficient received signal strength (RSS)-based factor graph wireless geolocation technique. The theoretical bound derivation is crucially important to evaluate whether the energy-efficient RSS-based factor graph wireless geolocation technique is effective, as well as to open the opportunity for further innovation on the technique. The CRLB is derived by using the Fisher information matrix (FIM) of the main formula of the RSS-based factor graph geolocation technique, which relies on the Jacobian matrix. The simulation result shows that the derived CRLB is a proper bound, its root mean squared error (RMSE) curve lying below the RMSE curve of the RSS-based factor graph geolocation technique. Hence, the derived CRLB serves as the lower bound for the energy-efficient technique of RSS-based factor graph wireless geolocation.
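For a concrete flavour of such a derivation, the sketch below computes a position CRLB for RSS measurements under a standard log-distance path-loss model with Gaussian shadowing and known transmit power (a generic textbook model and invented geometry, not the paper's factor graph formula): the FIM is assembled from the Jacobian of the mean RSS with respect to position, which is the structure the abstract refers to.

```python
import numpy as np

# log-distance path-loss model: RSS_i = P0 - 10 n log10(d_i) + N(0, sigma^2)
n_pl, sigma = 3.0, 4.0                       # path-loss exponent, shadowing std (dB)
anchors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
target = np.array([30.0, 40.0])              # true position (m)

d = np.linalg.norm(anchors - target, axis=1)
# Jacobian of the mean RSS w.r.t. (x, y): -(10 n / ln 10) (target - anchor) / d^2
J = -(10.0 * n_pl / np.log(10.0)) * (target - anchors) / d[:, None] ** 2
FIM = J.T @ J / sigma**2                     # Fisher information matrix
crlb_rmse = float(np.sqrt(np.trace(np.linalg.inv(FIM))))
print(crlb_rmse)                             # lower bound on position RMSE (m)
```

Any unbiased RSS-based estimator, factor graph or otherwise, has position RMSE at or above this value for the assumed geometry and noise level.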
Kulhánek, Tomáš; Ježek, Filip; Mateják, Marek; Šilar, Jan; Kofránek, Jiří
2015-08-01
This work presents experiences from teaching modeling and simulation to graduate students in the field of biomedical engineering. We emphasize the acausal, object-oriented modeling technique, and we have moved from teaching the block-oriented tool MATLAB Simulink to the acausal, object-oriented Modelica language, which can express the structure of the system rather than a process of computation. However, a block-oriented approach is also possible in the Modelica language, and students have a tendency to express the process of computation. Using exemplar acausal domains and approaches allows students to understand the modeled problems much more deeply, while the causality of the computation is derived automatically by the simulation tool.
Accurate lithography simulation model based on convolutional neural networks
NASA Astrophysics Data System (ADS)
Watanabe, Yuki; Kimura, Taiki; Matsunawa, Tetsuaki; Nojima, Shigeki
2017-07-01
Lithography simulation is an essential technique for today's semiconductor manufacturing process. In order to calculate an entire chip in realistic time, a compact resist model, established for faster calculation, is commonly used. To obtain an accurate compact resist model, it is necessary to fit a complicated non-linear model function. However, it is difficult to decide on an appropriate function manually because there are many options. This paper proposes a new compact resist model using CNNs (convolutional neural networks), one of the deep learning techniques. The CNN model makes it possible to determine an appropriate model function and achieve accurate simulation. Experimental results show the CNN model can reduce CD prediction errors by 70% compared with the conventional model.
Fixed gain and adaptive techniques for rotorcraft vibration control
NASA Technical Reports Server (NTRS)
Roy, R. H.; Saberi, H. A.; Walker, R. A.
1985-01-01
The results of an analysis effort performed to demonstrate the feasibility of employing approximate dynamical models and frequency-shaped cost functional control law design techniques for helicopter vibration suppression are presented. Both fixed gain and adaptive control designs based on linear second order dynamical models were implemented in a detailed Rotor Systems Research Aircraft (RSRA) simulation to validate these active vibration suppression control laws. Approximate models of fuselage flexibility were included in the RSRA simulation in order to more accurately characterize the structural dynamics. The results for both the fixed gain and adaptive approaches are promising and provide a foundation for pursuing further validation in more extensive simulation studies and in wind tunnel and/or flight tests.
Levecke, Bruno; De Wilde, Nathalie; Vandenhoute, Els; Vercruysse, Jozef
2009-01-01
Background Soil-transmitted helminths, such as Trichuris trichiura, are of major concern in public health. Current efforts to control these helminth infections involve periodic mass treatment in endemic areas. Since these large-scale interventions are likely to intensify, monitoring the drug efficacy will become indispensable. However, studies comparing detection techniques based on sensitivity, fecal egg counts (FEC), feasibility for mass diagnosis and drug efficacy estimates are scarce. Methodology/Principal Findings In the present study, the ether-based concentration, the Parasep Solvent Free (SF), the McMaster and the FLOTAC techniques were compared based on both validity and feasibility for the detection of Trichuris eggs in 100 fecal samples of nonhuman primates. In addition, the drug efficacy estimates of the quantitative techniques were examined using a statistical simulation. Trichuris eggs were found in 47% of the samples. FLOTAC was the most sensitive technique (100%), followed by the Parasep SF (83.0% [95% confidence interval (CI): 82.4–83.6%]) and the ether-based concentration technique (76.6% [95% CI: 75.8–77.3%]). McMaster was the least sensitive (61.7% [95% CI: 60.7–62.6%]) and failed to detect low FEC. The quantitative comparison revealed a positive correlation between the four techniques (Rs = 0.85–0.93; p<0.0001). However, the ether-based concentration technique and the Parasep SF detected significantly fewer eggs than both the McMaster and the FLOTAC (p<0.0083). Overall, the McMaster was the most feasible technique (3.9 min/sample for preparing, reading and cleaning of the apparatus), followed by the ether-based concentration technique (7.7 min/sample) and the FLOTAC (9.8 min/sample). Parasep SF was the least feasible (17.7 min/sample). The simulation revealed that sensitivity is less important for monitoring drug efficacy and that both FLOTAC and McMaster were reliable estimators.
Conclusions/Significance The results of this study demonstrated that McMaster is a promising technique when making use of FEC to monitor drug efficacy in Trichuris. PMID:19172171
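The simulation argument can be sketched as follows (invented multiplication factors and egg-count distributions, purely illustrative): each technique is modeled as Poisson counting of a 1/mf aliquot scaled back up, and the spread of the resulting faecal egg count reduction estimates illustrates why sensitivity matters less than counting precision for monitoring drug efficacy.

```python
import numpy as np

rng = np.random.default_rng(5)
true_reduction = 0.90          # assumed true drug efficacy
n_animals = 50

def fecr(mf, trials=1000):
    """Mean and spread of faecal egg count reduction (FECR) estimates when
    the technique counts eggs in a 1/mf aliquot (Poisson counting error)
    and multiplies back by the multiplication factor mf."""
    ests = []
    for _ in range(trials):
        pre = rng.gamma(1.0, 500.0, n_animals)       # true pre-treatment EPG
        post = (1.0 - true_reduction) * pre          # true post-treatment EPG
        pre_count = rng.poisson(pre / mf) * mf       # observed counts
        post_count = rng.poisson(post / mf) * mf
        ests.append(1.0 - post_count.mean() / max(pre_count.mean(), 1e-9))
    return float(np.mean(ests)), float(np.std(ests))

mc = fecr(mf=50)    # McMaster-like multiplication factor (assumed)
fl = fecr(mf=2)     # FLOTAC-like multiplication factor (assumed)
print(mc, fl)       # similar means; the smaller factor gives a tighter spread
```

Both counting schemes give essentially unbiased group-level reduction estimates, which is consistent with the abstract's conclusion that a less sensitive but quantitative technique can still monitor efficacy reliably.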
A review of flight simulation techniques
NASA Astrophysics Data System (ADS)
Baarspul, Max
After a brief historical review of the evolution of flight simulation techniques, this paper first deals with the main areas of flight simulator applications. Next, it describes the main components of a piloted flight simulator. Because of the presence of the pilot-in-the-loop, the digital computer driving the simulator must solve the aircraft equations of motion in ‘real-time’. Solutions to meet the high computer power required by today's modern flight simulators are elaborated. The physical similarity between aircraft and simulator in cockpit layout, flight instruments, flying controls etc., is discussed, based on the equipment and environmental cue fidelity required for training and research simulators. Visual systems play an increasingly important role in piloted flight simulation. The visual systems now available and most widely used are described, where image generators and display devices are distinguished. The characteristics of out-of-the-window visual simulation systems pertaining to the perceptual capabilities of human vision are discussed. Faithful reproduction of aircraft motion requires large travel, velocity and acceleration capabilities of the motion system. Different types and applications of motion systems in, e.g., airline training and research are described. The principles of motion cue generation, based on the characteristics of the non-visual human motion sensors, are described. The complete motion system, consisting of the hardware and the motion drive software, is discussed. The principles of mathematical modelling of the aerodynamic, flight control, propulsion, landing gear and environmental characteristics of the aircraft are reviewed. An example of the identification of an aircraft mathematical model, based on flight and taxi tests, is presented. Finally, the paper deals with the hardware and software integration of the flight simulator components and the testing and acceptance of the complete flight simulator.
Examples of the so-called ‘Computer Generated Checkout’ and ‘Proof of Match’ are presented. The concluding remarks briefly summarize the status of flight simulator technology and consider possibilities for future research.
Opto-electronic characterization of third-generation solar cells
Jenatsch, Sandra
2018-01-01
We present an overview of opto-electronic characterization techniques for solar cells, including light-induced charge extraction by linearly increasing voltage, impedance spectroscopy, transient photovoltage, charge extraction and more. Guidelines for the interpretation of experimental results are derived based on charge drift-diffusion simulations of solar cells with common performance limitations. We investigate how nonidealities such as charge-injection barriers, traps and low mobilities, among others, manifest themselves in each of the studied cell characterization techniques. Moreover, comprehensive parameter extraction for an organic bulk-heterojunction solar cell comprising PCDTBT:PC70BM is demonstrated. The simulations reproduce measured results of 9 different experimental techniques. Parameter correlation is minimized by the combination of various techniques. Thereby a route to comprehensive and accurate parameter extraction is identified. PMID:29707069
A Monte Carlo Simulation of Brownian Motion in the Freshman Laboratory
ERIC Educational Resources Information Center
Anger, C. D.; Prescott, J. R.
1970-01-01
Describes a "dry-lab" experiment for the college freshman laboratory in which the essential features of Brownian motion are simulated using the Monte Carlo technique. Calculations are carried out by a computation scheme based on a computer language. Bibliography. (LC)
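The exercise this record describes can be illustrated with a short Monte Carlo sketch (an illustrative reconstruction, not the authors' original computation scheme): a particle takes fixed-length steps in random directions, and the mean-square displacement grows linearly with the number of steps, as the theory of Brownian motion predicts.

```python
import random
import math

def brownian_walk(n_steps, step_len=1.0, seed=0):
    """2-D random walk: each step has a fixed length and a random direction,
    mimicking the net displacement of a particle under molecular bombardment."""
    rng = random.Random(seed)
    x = y = 0.0
    for _ in range(n_steps):
        theta = rng.uniform(0.0, 2.0 * math.pi)
        x += step_len * math.cos(theta)
        y += step_len * math.sin(theta)
    return x, y

def mean_square_displacement(n_walks, n_steps):
    """Average |r|^2 over many walks; theory predicts <r^2> = n_steps * step_len^2."""
    total = 0.0
    for k in range(n_walks):
        x, y = brownian_walk(n_steps, seed=k)
        total += x * x + y * y
    return total / n_walks
```

With 100 steps of unit length, the ensemble average of the squared displacement should cluster near 100, which a student can verify directly.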
Wilhelm, Jan; Walz, Michael; Stendel, Melanie; Bagrets, Alexei; Evers, Ferdinand
2013-05-14
We present a modification of the standard electron transport methodology based on the (non-equilibrium) Green's function formalism to efficiently simulate STM-images. The novel feature of this method is that it employs an effective embedding technique that allows us to extrapolate properties of metal substrates with adsorbed molecules from quantum-chemical cluster calculations. To illustrate the potential of this approach, we present an application to STM-images of C58-dimers immobilized on Au(111)-surfaces that is motivated by recent experiments.
NASA Astrophysics Data System (ADS)
Bahl, Mayank; Zhou, Gui-Rong; Heller, Evan; Cassarly, William; Jiang, Mingming; Scarmozzino, Rob; Gregory, G. Groot
2014-09-01
Over the last two decades there has been extensive research done to improve the design of Organic Light Emitting Diodes (OLEDs) so as to enhance light extraction efficiency, improve beam shaping, and allow color tuning through techniques such as the use of patterned substrates, photonic crystal (PCs) gratings, back reflectors, surface texture, and phosphor down-conversion. Computational simulation has been an important tool for examining these increasingly complex designs. It has provided insights for improving OLED performance as a result of its ability to explore limitations, predict solutions, and demonstrate theoretical results. Depending upon the focus of the design and scale of the problem, simulations are carried out using rigorous electromagnetic (EM) wave optics based techniques, such as finite-difference time-domain (FDTD) and rigorous coupled wave analysis (RCWA), or through ray optics based techniques such as Monte Carlo ray-tracing. The former are typically used for modeling nanostructures on the OLED die, and the latter for modeling encapsulating structures, die placement, back-reflection, and phosphor down-conversion. This paper presents the use of a mixed-level simulation approach which unifies the use of EM wave-level and ray-level tools. This approach uses rigorous EM wave based tools to characterize the nanostructured die and generate both a Bidirectional Scattering Distribution Function (BSDF) and a far-field angular intensity distribution. These characteristics are then incorporated into the ray-tracing simulator to obtain the overall performance. Such a mixed-level approach allows for comprehensive modeling of the optical characteristics of OLEDs and can potentially lead to more accurate performance predictions than individual modeling tools alone.
NASA Technical Reports Server (NTRS)
Shackelford, John H.; Saugen, John D.; Wurst, Michael J.; Adler, James
1991-01-01
A generic planar three-degree-of-freedom simulation was developed that supports hardware-in-the-loop simulation and guidance and control analysis, and can directly generate flight software. This simulation was developed in a short time using rapid prototyping techniques. The approach taken to develop this simulation tool, the benefits seen using this approach to development, and on-going efforts to improve and extend this capability are described. The simulation is composed of three major elements: (1) a docker dynamics model, (2) a dockee dynamics model, and (3) a docker control system. The docker and dockee models are based on simple planar orbital dynamics equations using a spherical-earth gravity model. The docker control system is based on a phase-plane approach to error correction.
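The abstract names a phase-plane approach but gives no equations; the sketch below is a generic phase-plane controller (switching line plus deadband) for one double-integrator docking axis, with all gains, limits, and step sizes chosen purely for illustration.

```python
def phase_plane_thrust(pos_err, vel_err, deadband=0.1, slope=0.5):
    """Fire a corrective thrust when the state (pos_err, vel_err) lies
    outside a deadband around the switching line vel_err = -slope*pos_err.
    Returns a -1, 0, or +1 thruster command."""
    s = vel_err + slope * pos_err   # switching function
    if s > deadband:
        return -1                   # thrust to reduce s
    if s < -deadband:
        return +1
    return 0                        # inside deadband: coast

def simulate(pos0, vel0, dt=0.1, accel=0.2, steps=400):
    """Double-integrator axis driven by the phase-plane command."""
    pos, vel = pos0, vel0
    for _ in range(steps):
        u = phase_plane_thrust(pos, vel)
        vel += u * accel * dt
        pos += vel * dt
    return pos, vel
```

Starting well off the target, the state is driven onto the switching line and slides toward the origin, ending inside the deadband.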
NASA Astrophysics Data System (ADS)
Ziss, Dorian; Martín-Sánchez, Javier; Lettner, Thomas; Halilovic, Alma; Trevisi, Giovanna; Trotta, Rinaldo; Rastelli, Armando; Stangl, Julian
2017-04-01
In this paper, strain transfer efficiencies from a single crystalline piezoelectric lead magnesium niobate-lead titanate substrate to a GaAs semiconductor membrane bonded on top are investigated using state-of-the-art x-ray diffraction (XRD) techniques and finite-element-method (FEM) simulations. Two different bonding techniques are studied, namely, gold-thermo-compression and polymer-based SU8 bonding. Our results show a much higher strain-transfer for the "soft" SU8 bonding in comparison to the "hard" bonding via gold-thermo-compression. A comparison between the XRD results and FEM simulations allows us to explain this unexpected result with the presence of complex interface structures between the different layers.
Information hiding based on double random-phase encoding and public-key cryptography.
Sheng, Yuan; Xin, Zhou; Alam, Mohammed S; Xi, Lu; Xiao-Feng, Li
2009-03-02
A novel information hiding method based on double random-phase encoding (DRPE) and the Rivest-Shamir-Adleman (RSA) public-key cryptosystem is proposed. In the proposed technique, the inherent diffusion property of DRPE is cleverly utilized to make up for the diffusion insufficiency of RSA public-key cryptography, while the RSA cryptosystem is utilized for simultaneous transmission of the cipher text and the two phase-masks, which is not possible under the DRPE technique alone. This technique combines the complementary advantages of the DRPE and RSA encryption techniques and brings security and convenience to efficient information transmission. Extensive numerical simulation results are presented to verify the performance of the proposed technique.
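The DRPE half of the scheme is standard and easy to sketch. The numpy toy below (the RSA key-transport of the masks is omitted, and all array sizes are illustrative) shows the two random phase modulations and their inversion with conjugate masks:

```python
import numpy as np

def drpe_encrypt(img, mask1, mask2):
    """Double random-phase encoding: multiply by a random phase in the
    spatial domain, then by a second random phase in the Fourier domain."""
    field = img * np.exp(2j * np.pi * mask1)
    spectrum = np.fft.fft2(field) * np.exp(2j * np.pi * mask2)
    return np.fft.ifft2(spectrum)        # white-noise-like complex ciphertext

def drpe_decrypt(cipher, mask1, mask2):
    """Invert the two phase modulations with the conjugate masks."""
    spectrum = np.fft.fft2(cipher) * np.exp(-2j * np.pi * mask2)
    field = np.fft.ifft2(spectrum) * np.exp(-2j * np.pi * mask1)
    return np.abs(field)
```

Decryption with the correct masks recovers the image exactly; without them the ciphertext magnitude is noise-like.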
Radiometrically accurate scene-based nonuniformity correction for array sensors.
Ratliff, Bradley M; Hayat, Majeed M; Tyo, J Scott
2003-10-01
A novel radiometrically accurate scene-based nonuniformity correction (NUC) algorithm is described. The technique combines absolute calibration with a recently reported algebraic scene-based NUC algorithm. The technique is based on the following principle: First, detectors that are along the perimeter of the focal-plane array are absolutely calibrated; then the calibration is transported to the remaining uncalibrated interior detectors through the application of the algebraic scene-based algorithm, which utilizes pairs of image frames exhibiting arbitrary global motion. The key advantage of this technique is that it can obtain radiometric accuracy during NUC without disrupting camera operation. Accurate estimates of the bias nonuniformity can be achieved with relatively few frames, often fewer than ten frame pairs. Advantages of this technique are discussed, and a thorough performance analysis is presented with use of simulated and real infrared imagery.
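The transport idea can be illustrated in one dimension: if the scene shifts by exactly one detector between two frames, differencing the two readings isolates the bias difference between neighboring detectors, which can then be chained outward from one absolutely calibrated edge detector. This is a didactic sketch, not the paper's full two-dimensional algorithm with arbitrary global motion.

```python
import numpy as np

def transport_bias(frame1, frame2, b0):
    """1-D sketch of algebraic scene-based NUC.  frame2 views frame1's
    scene shifted by one detector, so frame2[i] - frame1[i+1] equals the
    bias difference b[i] - b[i+1]; chaining from the calibrated edge
    detector (bias b0) recovers every interior bias."""
    n = len(frame1)
    b = np.empty(n)
    b[0] = b0                      # absolutely calibrated perimeter detector
    for i in range(n - 1):
        b[i + 1] = b[i] - (frame2[i] - frame1[i + 1])
    return b
```

Given noiseless frames with a known edge bias, the chained estimate reproduces the true bias vector exactly.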
NASA Astrophysics Data System (ADS)
Butler, George; Pemberton, Steven
2017-06-01
Modeling and simulation is extremely important in the design and formulation of new explosives and explosive devices due to the high cost of experiment-based development. However, the efficacy of simulations depends on the accuracy of the equations of state (EOS) and reactive burn models used to characterize the energetic materials. We investigate the possibility of using the components of an explosive fill as discrete elements in a simulation, based on the relative amounts of the constituents. This is accomplished by assembling a mosaic, or "checkerboard", in which each cell comprises the relative amounts of the constituents as in the mixture; it is assumed that each constituent has a well-defined set of simulation parameters. We do not consider the underlying microstructure, and recognize there will be limitations to the usefulness of this technique. We are interested in determining whether there are applications for this technique that might prove useful. As a test of the concept, two binary explosives were considered. We considered shapes for a periodic cellular structure and compared results from the checkerboards with those of the baseline explosives; detonation rates, cylinder expansion, and gap test predictions were compared.
NASA Astrophysics Data System (ADS)
Paganini, Michela; de Oliveira, Luke; Nachman, Benjamin
2018-01-01
The precise modeling of subatomic particle interactions and propagation through matter is paramount for the advancement of nuclear and particle physics searches and precision measurements. The most computationally expensive step in the simulation pipeline of a typical experiment at the Large Hadron Collider (LHC) is the detailed modeling of the full complexity of physics processes that govern the motion and evolution of particle showers inside calorimeters. We introduce CaloGAN, a new fast simulation technique based on generative adversarial networks (GANs). We apply these neural networks to the modeling of electromagnetic showers in a longitudinally segmented calorimeter and achieve speedup factors comparable to or better than existing full simulation techniques on CPU (100x-1000x) and even faster on GPU (up to ~10^5x). There are still challenges for achieving precision across the entire phase space, but our solution can reproduce a variety of geometric shower shape properties of photons, positrons, and charged pions. This represents a significant stepping stone toward a full neural network-based detector simulation that could save significant computing time and enable many analyses now and in the future.
Shock and vibration technology with applications to electrical systems
NASA Technical Reports Server (NTRS)
Eshleman, R. L.
1972-01-01
A survey is presented of shock and vibration technology for electrical systems developed by the aerospace programs. The shock environment is surveyed along with new techniques for modeling, computer simulation, damping, and response analysis. Design techniques based on the use of analog computers, shock spectra, optimization, and nonlinear isolation are discussed. Shock mounting of rotors for performance and survival, and vibration isolation techniques are reviewed.
Simplified nonplanar wafer bonding for heterogeneous device integration
NASA Astrophysics Data System (ADS)
Geske, Jon; Bowers, John E.; Riley, Anton
2004-07-01
We demonstrate a simplified nonplanar wafer bonding technique for heterogeneous device integration. The improved technique can be used to laterally integrate dissimilar semiconductor device structures on a lattice-mismatched substrate. Using the technique, two different InP-based vertical-cavity surface-emitting laser active regions have been integrated onto GaAs without compromising the quality of the photoluminescence. Experimental and numerical simulation results are presented.
All-atom Simulation of Amyloid Aggregates
NASA Astrophysics Data System (ADS)
Berhanu, Workalemahu M.; Alred, Erik J.; Bernhardt, Nathan A.; Hansmann, Ulrich H. E.
Molecular simulations are now commonly used to complement experiments in the investigation of amyloid formation and the role of amyloids in human diseases. While various simulations based on enhanced sampling techniques are used in amyloid formation simulations, this article focuses on those using standard atomistic simulations to evaluate the stability of fibril models. Such studies explore the limitations that arise from the choice of force field or from polymorphism, evaluate the stability of in vivo and in vitro forms of Aβ fibril aggregates, and examine the role of heterologous seeding as a link between different amyloid diseases.
Numerical simulation of the SAGD process coupled with geomechanical behavior
NASA Astrophysics Data System (ADS)
Li, Pingke
Canada has vast oil sand resources. While a large portion of this resource can be recovered by surface mining techniques, a majority is located at depths requiring the application of in situ recovery technologies. Although a number of in situ recovery technologies exist, the steam assisted gravity drainage (SAGD) process has emerged as one of the most promising technologies to develop the in situ oil sands resources. During SAGD operations, saturated steam is continuously injected into the oil sands reservoir, which induces pore pressure and stress variations. As a result, reservoir parameters and processes may also vary, particularly when tensile and shear failure occur. This geomechanical effect is pronounced for oil sands material because oil sands have an in situ interlocked fabric. Conventional reservoir simulation generally does not take this coupled mechanism into consideration. The goal of this research is therefore to improve the reservoir simulation techniques of the SAGD process applied in the development of oil sands and heavy oil reservoirs. The analyses of the decoupled reservoir geomechanical simulation results show that the geomechanical behavior in SAGD has an obvious impact on reservoir parameters, such as absolute permeability. The issues with the coupled reservoir geomechanical simulations of the SAGD process have been clarified, and the permeability variations due to geomechanical behavior in the SAGD process have been investigated. A methodology of sequentially coupled reservoir geomechanical simulation was developed based on the reservoir simulator, EXOTHERM, and the geomechanical simulator, FLAC. In addition, a representative geomechanical model of oil sands material was summarized in this research. Finally, this reservoir geomechanical simulation methodology was verified with the UTF Phase A SAGD project and applied in a SAGD operation with gas-over-bitumen geometry.
Based on this methodology, the geomechanical effect on SAGD production performance can be quantified. This research program involves the analysis of laboratory testing results obtained from the literature; no new laboratory testing was conducted as part of this research.
Hybrid modeling method for a DEP based particle manipulation.
Miled, Mohamed Amine; Gagne, Antoine; Sawan, Mohamad
2013-01-30
In this paper, a new modeling approach for dielectrophoresis (DEP) based particle manipulation is presented. The proposed method fills missing links in finite element modeling between the multiphysics simulation and the biological behavior. This technique is among the first steps toward a more complex platform covering several types of manipulation, such as magnetophoresis and optics. The modeling approach is based on a hybrid interface using both ANSYS and MATLAB to link the propagation of the electrical field in the micro-channel to the particle motion. ANSYS is used to simulate the electrical propagation while MATLAB interprets the results to calculate cell displacement and sends the new information to ANSYS for the next iteration. The beta version of the proposed technique takes into account particle shape, weight and electrical properties. The first results obtained are consistent with experimental results.
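The DEP physics that such a simulation loop evaluates is the standard point-dipole approximation; the paper's ANSYS/MATLAB interface is not reproduced here, only the textbook force expression F = 2*pi*r^3*eps_m*Re[K]*grad|E_rms|^2 with the complex Clausius-Mossotti factor K.

```python
import numpy as np

def clausius_mossotti(eps_p, eps_m, sigma_p, sigma_m, omega):
    """Complex Clausius-Mossotti factor for a spherical particle:
    K = (e_p* - e_m*) / (e_p* + 2 e_m*), with e* = eps - j*sigma/omega."""
    ep = eps_p - 1j * sigma_p / omega
    em = eps_m - 1j * sigma_m / omega
    return (ep - em) / (ep + 2 * em)

def dep_force(radius, eps_m, re_cm, grad_E2):
    """Time-averaged DEP force magnitude on a sphere (dipole approximation):
    F = 2*pi*r^3 * eps_m * Re[K] * grad|E_rms|^2."""
    return 2 * np.pi * radius**3 * eps_m * re_cm * grad_E2
```

A highly conductive particle in a weakly conductive medium gives Re[K] near +1 at low frequency, i.e. positive DEP (attraction toward high-field regions).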
Simulations of x-ray speckle-based dark-field and phase-contrast imaging with a polychromatic beam
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zdora, Marie-Christine, E-mail: marie-christine.zdora@diamond.ac.uk; Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE; Department of Physics & Astronomy, University College London, London WC1E 6BT
2015-09-21
Following the first experimental demonstration of x-ray speckle-based multimodal imaging using a polychromatic beam [I. Zanette et al., Phys. Rev. Lett. 112(25), 253903 (2014)], we present a simulation study on the effects of a polychromatic x-ray spectrum on the performance of this technique. We observe that the contrast of the near-field speckles is only mildly influenced by the bandwidth of the energy spectrum. Moreover, using a homogeneous object with simple geometry, we characterize the beam hardening artifacts in the reconstructed transmission and refraction angle images, and we describe how the beam hardening also affects the dark-field signal provided by speckle tracking. This study is particularly important for further implementations and developments of coherent speckle-based techniques at laboratory x-ray sources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shu, Dewu; Xie, Xiaorong; Jiang, Qirong
With the steady increase of power electronic devices and nonlinear dynamic loads in large-scale AC/DC systems, the traditional hybrid simulation method, which incorporates these components into a single EMT subsystem, causes great difficulty for network partitioning and significantly deteriorates simulation efficiency. To resolve these issues, a novel distributed hybrid simulation method is proposed in this paper. The key to realizing this method is a distinct interfacing technique, which includes: i) a new approach based on the two-level Schur complement to update the interfaces by taking full consideration of the couplings between different EMT subsystems; and ii) a combined interaction protocol to further improve the efficiency while guaranteeing the simulation accuracy. The advantages of the proposed method in terms of both efficiency and accuracy have been verified by using it for the simulation study of an AC/DC hybrid system including a two-terminal VSC-HVDC and nonlinear dynamic loads.
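The Schur-complement interfacing can be illustrated on a static linear system (the paper applies it to time-stepped EMT subsystems; this numpy sketch, with hypothetical block names, only shows the eliminate-then-solve-interface structure): two subsystems with internal unknowns x1, x2 couple only through shared interface variables xi.

```python
import numpy as np

def schur_interface_solve(A1, B1, C1, f1, A2, B2, C2, f2, D, fi):
    """Solve [[A1, 0, B1], [0, A2, B2], [C1, C2, D]] [x1; x2; xi] = [f1; f2; fi]
    by eliminating each subsystem's internal unknowns (Schur complement),
    solving only the interface variables, then back-substituting."""
    S = D - C1 @ np.linalg.solve(A1, B1) - C2 @ np.linalg.solve(A2, B2)
    g = fi - C1 @ np.linalg.solve(A1, f1) - C2 @ np.linalg.solve(A2, f2)
    xi = np.linalg.solve(S, g)                 # interface equation
    x1 = np.linalg.solve(A1, f1 - B1 @ xi)     # back-substitution, subsystem 1
    x2 = np.linalg.solve(A2, f2 - B2 @ xi)     # back-substitution, subsystem 2
    return x1, x2, xi
```

Each subsystem solve touches only its own block, which is what makes the approach attractive for distributing subsystems across solvers.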
Aerodynamic force measurement on a large-scale model in a short duration test facility
NASA Astrophysics Data System (ADS)
Tanno, H.; Kodera, M.; Komuro, T.; Sato, K.; Takahasi, M.; Itoh, K.
2005-03-01
A force measurement technique has been developed for large-scale aerodynamic models with a short test time. The technique is based on direct acceleration measurements, with miniature accelerometers mounted on a test model suspended by wires. By measuring acceleration at two different locations, the technique can eliminate oscillations from natural vibration of the model. The technique was used for drag force measurements on a 3 m long supersonic combustor model in the HIEST free-piston driven shock tunnel. A time resolution of 350 μs is guaranteed during measurements, which is sufficient for the millisecond-order test times in HIEST. To evaluate measurement reliability and accuracy, measured values were compared with results from a three-dimensional Navier-Stokes numerical simulation. The difference between measured values and numerical simulation values was less than 5%. We conclude that this measurement technique is sufficiently reliable for measuring aerodynamic force within test durations of 1 ms.
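The two-accelerometer cancellation can be sketched as follows, assuming (for illustration only) that the dominant elastic mode appears with equal magnitude and opposite sign at the two mounting stations, so averaging recovers the rigid-body acceleration and Newton's second law gives the force:

```python
import numpy as np

def drag_from_accels(a_fore, a_aft, mass):
    """Average two accelerometer traces mounted so the first vibration mode
    appears with opposite sign at the two stations; the rigid-body
    acceleration survives the average and gives the force F = m * a."""
    a_rigid = 0.5 * (np.asarray(a_fore) + np.asarray(a_aft))
    return mass * a_rigid
```

With a 12 m/s² rigid-body deceleration buried under a 5 kHz vibration of opposite phase at the two stations, the averaged trace is flat and yields the steady force directly.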
Low Power LDPC Code Decoder Architecture Based on Intermediate Message Compression Technique
NASA Astrophysics Data System (ADS)
Shimizu, Kazunori; Togawa, Nozomu; Ikenaga, Takeshi; Goto, Satoshi
Reducing the power dissipation of LDPC code decoders is a major challenge in applying them to practical digital communication systems. In this paper, we propose a low-power LDPC code decoder architecture based on an intermediate message compression technique with the following features: (i) the intermediate message compression technique enables the decoder to reduce the required memory capacity and write power dissipation; (ii) a clock-gated shift-register-based intermediate message memory architecture enables the decoder to decompress the compressed messages in a single clock cycle while reducing the read power dissipation. The combination of these two techniques enables the decoder to reduce the power dissipation while maintaining the decoding throughput. The simulation results show that the proposed architecture improves the power efficiency by up to 52% and 18% compared to decoders based on the overlapped schedule and the rapid convergence schedule, respectively, without the proposed techniques.
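The abstract does not spell out the compression scheme; one well-known reason check-node messages compress well under min-sum decoding (offered here as an illustrative assumption, not the paper's exact method) is that all outgoing magnitudes take only two values, the smallest and second-smallest incoming magnitude, so a triple (min1, min2, argmin) replaces the full list:

```python
def compress_check_node(mags):
    """Store only (min1, min2, argmin) of the incoming message magnitudes."""
    i0 = min(range(len(mags)), key=lambda i: mags[i])
    min1 = mags[i0]
    min2 = min(m for i, m in enumerate(mags) if i != i0)
    return min1, min2, i0

def expand_outgoing(comp, n):
    """Min-sum outgoing magnitude on edge i is the minimum over the *other*
    edges: min2 on the argmin edge, min1 everywhere else."""
    min1, min2, i0 = comp
    return [min2 if i == i0 else min1 for i in range(n)]
```

For a degree-d check node this stores two values and an index instead of d magnitudes, which is where the memory and write-power savings come from.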
Method for distributed agent-based non-expert simulation of manufacturing process behavior
Ivezic, Nenad; Potok, Thomas E.
2004-11-30
A method for distributed agent-based non-expert simulation of manufacturing process behavior on a single-processor computer comprises the steps of: object modeling a manufacturing technique having a plurality of processes; associating a distributed agent with each said process; and programming each said agent to respond to discrete events corresponding to the manufacturing technique, wherein each discrete event triggers a programmed response. The method can further comprise the step of transmitting the discrete events to each agent in a message loop. In addition, the programming step comprises the step of conditioning each agent to respond to a discrete event selected from the group consisting of a clock tick message, a resources received message, and a request for output production message.
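The claimed message loop can be sketched directly from the claim language; the agent internals (stock counter, cycle time) are hypothetical details added for illustration:

```python
class ProcessAgent:
    """Agent for one manufacturing process: reacts to the three discrete
    events named in the claim (clock tick, resources received, request
    for output production)."""
    def __init__(self, name, cycle_ticks):
        self.name = name
        self.cycle_ticks = cycle_ticks  # ticks needed to finish one unit
        self.stock = 0                  # raw resources on hand
        self.progress = 0
        self.finished = 0               # completed units

    def handle(self, event):
        if event == "clock_tick":
            if self.stock > 0:
                self.progress += 1
                if self.progress >= self.cycle_ticks:
                    self.stock -= 1
                    self.finished += 1
                    self.progress = 0
        elif event == "resources_received":
            self.stock += 1
        elif event == "request_output":
            n, self.finished = self.finished, 0
            return n
        return None

def run_loop(agents, events):
    """Single-processor message loop: deliver each discrete event to
    every agent in turn, collecting any output responses."""
    outputs = []
    for ev in events:
        for agent in agents:
            r = agent.handle(ev)
            if r is not None:
                outputs.append((agent.name, r))
    return outputs
```

One resource plus enough clock ticks yields one finished unit when output is requested.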
NASA Astrophysics Data System (ADS)
Pavlov, Al. A.; Shevchenko, A. M.; Khotyanovsky, D. V.; Pavlov, A. A.; Shmakov, A. S.; Golubev, M. P.
2017-10-01
We present a method for and results of determination of the field of integral density in the structure of flow corresponding to the Mach interaction of shock waves at Mach number M = 3. The optical diagnostics of flow was performed using an interference technique based on self-adjusting Zernike filters (SA-AVT method). Numerical simulations were carried out using the CFS3D program package for solving the Euler and Navier-Stokes equations. Quantitative data on the distribution of integral density on the path of probing radiation in one direction of 3D flow transillumination in the region of Mach interaction of shock waves were obtained for the first time.
A comparison of solute-transport solution techniques based on inverse modelling results
Mehl, S.; Hill, M.C.
2000-01-01
Five common numerical techniques (finite difference, predictor-corrector, total-variation-diminishing, method-of-characteristics, and modified-method-of-characteristics) were tested using simulations of a controlled conservative tracer-test experiment through a heterogeneous, two-dimensional sand tank. The experimental facility was constructed using randomly distributed homogeneous blocks of five sand types. This experimental model provides an outstanding opportunity to compare the solution techniques because of the heterogeneous hydraulic conductivity distribution of known structure, and the availability of detailed measurements with which to compare simulated concentrations. The present work uses this opportunity to investigate how three common types of results (simulated breakthrough curves, sensitivity analysis, and calibrated parameter values) change in this heterogeneous situation, given the different methods of simulating solute transport. The results show that simulated peak concentrations, even at very fine grid spacings, varied because of different amounts of numerical dispersion. Sensitivity analysis results were robust in that they were independent of the solution technique. They revealed extreme correlation between hydraulic conductivity and porosity, and that the breakthrough curve data did not provide enough information about the dispersivities to estimate individual values for the five sands. However, estimated hydraulic conductivity values are significantly influenced by both the large possible variations in model dispersion and the amount of numerical dispersion present in the solution technique.
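The numerical dispersion the study measures is easy to reproduce in one dimension. This sketch (periodic domain, all parameters illustrative) uses an explicit upwind scheme whose truncation error acts like an extra dispersion coefficient of roughly v*dx*(1 - v*dt/dx)/2, smearing sharp fronts while still conserving mass:

```python
import numpy as np

def upwind_transport(c0, v, D, dx, dt, steps):
    """Explicit FD scheme for dc/dt = -v dc/dx + D d2c/dx2 on a periodic
    domain: upwind for advection (introduces numerical dispersion),
    central for physical dispersion."""
    c = np.array(c0, dtype=float)
    for _ in range(steps):
        adv = -v * dt / dx * (c - np.roll(c, 1))                   # upwind
        disp = D * dt / dx**2 * (np.roll(c, 1) - 2 * c + np.roll(c, -1))
        c = c + adv + disp
    return c
```

A rectangular tracer pulse keeps its total mass but loses its peak concentration as it advects, exactly the grid-dependent peak smearing the abstract describes.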
Virtual reality-based simulation training for ventriculostomy: an evidence-based approach.
Schirmer, Clemens M; Elder, J Bradley; Roitberg, Ben; Lobel, Darlene A
2013-10-01
Virtual reality (VR) simulation-based technologies play an important role in neurosurgical resident training. The Congress of Neurological Surgeons (CNS) Simulation Committee developed a simulation-based curriculum incorporating VR simulators to train residents in the management of common neurosurgical disorders. The objective was to enhance neurosurgical resident training for ventriculostomy placement using simulation-based training. A course-based neurosurgical simulation curriculum was introduced at the Neurosurgical Simulation Symposium at the 2011 and 2012 CNS annual meetings. A trauma module was developed to teach ventriculostomy placement as one of the neurosurgical procedures commonly performed in the management of traumatic brain injury. The course offered both didactic and simulator-based instruction, incorporating written and practical pretests and posttests and questionnaires to assess improvement in skill level and to validate the simulators as teaching tools. Fourteen trainees participated in the didactic component of the trauma module. Written scores improved significantly from pretest (75%) to posttest (87.5%; P < .05). Seven participants completed the ventriculostomy simulation. Significant improvements were observed in anatomy (P < .04), burr hole placement (P < .03), final location of the catheter (P = .05), and procedure completion time (P < .004). Senior residents planned a significantly better trajectory (P < .01); junior participants improved most in terms of identifying the relevant anatomy (P < .03) and the time required to complete the procedure (P < .04). VR ventriculostomy placement as part of the CNS simulation trauma module complements standard training techniques for residents in the management of neurosurgical trauma. Improvement in didactic and hands-on knowledge by course participants demonstrates the usefulness of the VR simulator as a training tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Demeure, I.M.
The research presented here is concerned with representation techniques and tools to support the design, prototyping, simulation, and evaluation of message-based parallel, distributed computations. The author describes ParaDiGM (Parallel, Distributed computation Graph Model), a visual representation technique for parallel, message-based distributed computations. ParaDiGM provides several views of a computation depending on the aspect of concern. It is made of two complementary submodels, the DCPG (Distributed Computing Precedence Graph) model and the PAM (Process Architecture Model) model. DCPGs are precedence graphs used to express the functionality of a computation in terms of tasks, message-passing, and data. PAM graphs are used to represent the partitioning of a computation into schedulable units or processes, and the pattern of communication among those units. There is a natural mapping between the two models. He illustrates the utility of ParaDiGM as a representation technique by applying it to various computations (e.g., an adaptive global optimization algorithm, the client-server model). ParaDiGM representations are concise. They can be used in documenting the design and the implementation of parallel, distributed computations, in describing such computations to colleagues, and in comparing and contrasting various implementations of the same computation. He then describes VISA (VISual Assistant), a software tool to support the design, prototyping, and simulation of message-based parallel, distributed computations. VISA is based on the ParaDiGM model. In particular, it supports the editing of ParaDiGM graphs to describe the computations of interest, and the animation of these graphs to provide visual feedback during simulations. The graphs are supplemented with various attributes, simulation parameters, and interpretations, which are procedures that can be executed by VISA.
Simulators IV; Proceedings of the SCS Conference, Orlando, FL, Apr. 6-9, 1987
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fairchild, B.T.
1987-01-01
The conference presents papers on the applicability of AI techniques to simulation models, the simulation of a reentry vehicle on Simstar, Simstar missile simulation, measurement issues associated with simulator sickness, and tracing the etiology of simulator sickness. Consideration is given to a simulator of a steam generator tube bundle response to a blowdown transient, the census of simulators for fossil-fueled boiler and gas turbine plant operation training, and a new approach for flight simulator visual systems. Other topics include past and present simulated aircraft maintenance trainers, an AI-simulation-based approach for aircraft maintenance training, simulator qualification using EPRI methodology, and the role of instinct in organizational dysfunction.
Singh, Karandeep; Ahn, Chang-Won; Paik, Euihyun; Bae, Jang Won; Lee, Chun-Hee
2018-01-01
Artificial life (ALife) examines systems related to natural life, its processes, and its evolution, using simulations with computer models, robotics, and biochemistry. In this article, we focus on the computer modeling, or "soft," aspects of ALife and prepare a framework for scientists and modelers to be able to support such experiments. The framework is designed and built to be a parallel as well as distributed agent-based modeling environment, and does not require end users to have expertise in parallel or distributed computing. Furthermore, we use this framework to implement a hybrid model using microsimulation and agent-based modeling techniques to generate an artificial society. We leverage this artificial society to simulate and analyze population dynamics using Korean population census data. The agents in this model derive their decisional behaviors from real data (microsimulation feature) and interact among themselves (agent-based modeling feature) to proceed in the simulation. The behaviors, interactions, and social scenarios of the agents are varied to perform an analysis of population dynamics. We also estimate the future cost of pension policies based on the future population structure of the artificial society. The proposed framework and model demonstrate how ALife techniques can be used by researchers in relation to social issues and policies.
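The hybrid microsimulation/agent-based idea above can be sketched in a few lines: agents carry data-derived attributes (the microsimulation side) and evolve through individual stochastic decisions (the agent-based side). The rates and attributes below are hypothetical placeholders, not the Korean census parameters used in the article.

```python
import random

class Agent:
    """One person in the artificial society (hypothetical minimal state)."""
    def __init__(self, age):
        self.age = age

def step(population, birth_rate, death_rates, rng):
    """Advance the society by one simulated year: age, die, reproduce."""
    survivors = []
    for a in population:
        a.age += 1
        # Survival decided by (empirical) age-specific death rates.
        if rng.random() >= death_rates.get(a.age, 0.01):
            survivors.append(a)
    # Each survivor produces a newborn with a small probability.
    births = [Agent(0) for _ in survivors if rng.random() < birth_rate]
    return survivors + births

rng = random.Random(42)
population = [Agent(rng.randint(0, 80)) for _ in range(1000)]
for year in range(10):
    population = step(population, birth_rate=0.012, death_rates={}, rng=rng)
print(len(population))
```

In a full model, `death_rates` and `birth_rate` would be estimated from census data, and agents would also interact (marriage, migration) rather than evolve independently.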
Improving Simulated Annealing by Replacing Its Variables with Game-Theoretic Utility Maximizers
NASA Technical Reports Server (NTRS)
Wolpert, David H.; Bandari, Esfandiar; Tumer, Kagan
2001-01-01
The game-theory field of Collective INtelligence (COIN) concerns the design of computer-based players engaged in a non-cooperative game so that as those players pursue their self-interests, a pre-specified global goal for the collective computational system is achieved as a side-effect. Previous implementations of COIN algorithms have outperformed conventional techniques by up to several orders of magnitude, on domains ranging from telecommunications control to optimization in congestion problems. Recent mathematical developments have revealed that these previously developed algorithms were based on only two of the three factors determining performance. Consideration of only the third factor would instead lead to conventional optimization techniques like simulated annealing that have little to do with non-cooperative games. In this paper we present an algorithm based on all three terms at once. This algorithm can be viewed as a way to modify simulated annealing by recasting it as a non-cooperative game, with each variable replaced by a player. This recasting allows us to leverage the intelligent behavior of the individual players to substantially improve the exploration step of the simulated annealing. Experiments are presented demonstrating that this recasting significantly improves simulated annealing for a model of an economic process run over an underlying small-worlds topology. Furthermore, these experiments reveal novel small-worlds phenomena, and highlight the shortcomings of conventional mechanism design in bounded rationality domains.
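To make the recasting concrete, here is a minimal sketch (not the authors' COIN algorithm) of simulated annealing in which each binary variable acts as a player accepting or rejecting its own moves by a Metropolis rule on a local utility, i.e., only the cost terms it touches; the ring-agreement objective is an invented toy problem.

```python
import math
import random

def local_cost(x, i):
    """Cost terms touching variable i (disagreements on a ring)."""
    n = len(x)
    return (x[i] != x[(i - 1) % n]) + (x[i] != x[(i + 1) % n])

def player_annealing(n=30, steps=4000, t0=2.0, seed=1):
    """Each variable is a 'player' that flips itself and accepts the move
    via Metropolis on its *local* cost change, not the global cost."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-3      # linear cooling schedule
        i = rng.randrange(n)
        before = local_cost(x, i)
        x[i] ^= 1                            # propose a flip
        delta = local_cost(x, i) - before
        if delta > 0 and rng.random() >= math.exp(-delta / t):
            x[i] ^= 1                        # reject: undo the flip
    return x

x = player_annealing()
print(sum(x[i] != x[(i + 1) % len(x)] for i in range(len(x))))
```

Because each player's local cost change here equals its effect on the global cost, maximizing private utility also anneals the collective objective; COIN theory concerns designing private utilities so that this alignment holds in general.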
Fine-resolution imaging of solar features using Phase-Diverse Speckle
NASA Technical Reports Server (NTRS)
Paxman, Richard G.
1995-01-01
Phase-diverse speckle (PDS) is a novel imaging technique intended to overcome the degrading effects of atmospheric turbulence on fine-resolution imaging. As its name suggests, PDS is a blend of phase-diversity and speckle-imaging concepts. PDS reconstructions on solar data were validated by simulation, by demonstrating internal consistency of PDS estimates, and by comparing PDS reconstructions with those produced from well accepted speckle-imaging processing. Several sources of error in data collected with the Swedish Vacuum Solar Telescope (SVST) were simulated: CCD noise, quantization error, image misalignment, and defocus error, as well as atmospheric turbulence model error. The simulations demonstrate that fine-resolution information can be reliably recovered out to at least 70% of the diffraction limit without significant introduction of image artifacts. Additional confidence in the SVST restoration is obtained by comparing its spatial power spectrum with previously-published power spectra derived from both space-based images and earth-based images corrected with traditional speckle-imaging techniques; the shape of the spectrum is found to match well the previous measurements. In addition, the imagery is found to be consistent with, but slightly sharper than, imagery reconstructed with accepted speckle-imaging techniques.
Fortuna, A O; Gurd, J R
1999-01-01
During certain medical procedures, it is important to continuously measure the respiratory flow of a patient, as lack of proper ventilation can cause brain damage and ultimately death. The monitoring of the ventilatory condition of a patient is usually performed with the aid of flowmeters. However, water and other secretions present in the expired air can build up and ultimately block a traditional, restriction-based flowmeter; by using an orifice plate flowmeter, such blockages are minimized. This paper describes the design of an orifice plate flowmetering system including, especially, a description of the numerical and computational techniques adopted in order to simulate human respiratory and sinusoidal air flow across various possible designs for the orifice plate flowmeter device. Parallel computation and multigrid techniques were employed in order to reduce execution time. The simulated orifice plate was later built and tested under unsteady sinusoidal flows. Experimental tests show reasonable agreement with the numerical simulation, thereby reinforcing the general hypothesis that computational exploration of the design space is sufficiently accurate to allow designers of such systems to use this in preference to the more traditional, mechanical prototyping techniques.
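The pressure-drop-to-flow conversion that such a metering system performs can be illustrated with the standard incompressible orifice-plate equation, Q = Cd·A·sqrt(2Δp / (ρ(1 − β⁴))) — a textbook relation, not the paper's CFD model. The discharge coefficient, plate and tube diameters, and air density below are assumed values.

```python
import math

def orifice_flow(dp_pa, d_orifice_m, d_pipe_m, rho=1.2, cd=0.62):
    """Volumetric flow (m^3/s) from the measured pressure drop across an
    orifice plate; cd (discharge coefficient) and rho (air density in
    kg/m^3) are assumed illustrative values."""
    beta = d_orifice_m / d_pipe_m              # diameter ratio
    area = math.pi * (d_orifice_m / 2) ** 2    # orifice bore area
    return cd * area * math.sqrt(2 * dp_pa / (rho * (1 - beta ** 4)))

# Example: 50 Pa drop across a 10 mm orifice in a 22 mm tube.
q = orifice_flow(50.0, 0.010, 0.022)
print(f"{q * 6e4:.1f} L/min")
```

In the flowmetering system described, this inversion would run continuously on the sampled differential-pressure signal to track the patient's respiratory flow.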
Reliability based design optimization: Formulations and methodologies
NASA Astrophysics Data System (ADS)
Agarwal, Harish
Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. 
A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
Discrete event simulation: the preferred technique for health economic evaluations?
Caro, Jaime J; Möller, Jörgen; Getsios, Denis
2010-12-01
To argue that discrete event simulation should be preferred to cohort Markov models for economic evaluations in health care. The basis for the modeling techniques is reviewed. For many health-care decisions, existing data are insufficient to fully inform them, necessitating the use of modeling to estimate the consequences that are relevant to decision-makers. These models must reflect what is known about the problem at a level of detail sufficient to inform the questions. Oversimplification will result in estimates that are not only inaccurate, but potentially misleading. Markov cohort models, though currently popular, have so many limitations and inherent assumptions that they are inadequate to inform most health-care decisions. An event-based individual simulation offers an alternative much better suited to the problem. A properly designed discrete event simulation provides more accurate, relevant estimates without being computationally prohibitive. It does require more data and may be a challenge to convey transparently, but these are necessary trade-offs to provide meaningful and valid results. In our opinion, discrete event simulation should be the preferred technique for health economic evaluations today. © 2010, International Society for Pharmacoeconomics and Outcomes Research (ISPOR).
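The contrast with a cohort Markov model can be made concrete with a minimal discrete event simulation: instead of cycle-by-cycle transition probabilities, each patient's next event time is drawn directly from a time-to-event distribution and processed from an event queue. The rate, horizon, and cost figures below are illustrative only.

```python
import heapq
import random

def simulate_patient(rng, horizon=10.0, event_rate=0.3, event_cost=5000.0,
                     visit_interval=1.0, visit_cost=200.0):
    """One patient: exponential times-to-event compete with scheduled
    follow-up visits on a single event queue (no fixed cycles)."""
    cost = 0.0
    queue = [(rng.expovariate(event_rate), "event"), (visit_interval, "visit")]
    heapq.heapify(queue)
    while queue:
        t, kind = heapq.heappop(queue)
        if t > horizon:          # heap pops in time order, so we are done
            break
        if kind == "event":
            cost += event_cost
            heapq.heappush(queue, (t + rng.expovariate(event_rate), "event"))
        else:
            cost += visit_cost
            heapq.heappush(queue, (t + visit_interval, "visit"))
    return cost

rng = random.Random(7)
mean_cost = sum(simulate_patient(rng) for _ in range(2000)) / 2000
print(round(mean_cost))
```

A Markov cohort model would force these events onto a fixed cycle length and a memoryless state structure; the event-queue form extends naturally to competing risks, resource constraints, and history-dependent rates.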
NASA Technical Reports Server (NTRS)
Boyalakuntla, Kishore; Soni, Bharat K.; Thornburg, Hugh J.; Yu, Robert
1996-01-01
During the past decade, computational simulation of fluid flow around complex configurations has progressed significantly and many notable successes have been reported; however, unsteady time-dependent solutions are not easily obtainable. The present effort involves unsteady time-dependent simulation of temporally deforming geometries. Grid generation for a complex configuration can be a time-consuming process, and temporally varying geometries necessitate the regeneration of such grids for every time step. Traditional grid generation techniques have been tried and demonstrated to be inadequate for such simulations. Non-Uniform Rational B-splines (NURBS) based techniques provide a compact and accurate representation of the geometry. This definition can be coupled with a distribution mesh for a user-defined spacing. The present method greatly reduces CPU requirements for time-dependent remeshing, facilitating the simulation of more complex unsteady problems. A thrust-vectoring nozzle has been chosen to demonstrate the capability, as it is of current interest in the aerospace industry for better maneuverability of fighter aircraft in close combat and in post-stall regimes. This current effort is the first step towards multidisciplinary design optimization, which involves coupling the aerodynamic, heat transfer, and structural analysis techniques. Applications include simulation of temporally deforming bodies and aeroelastic problems.
A Measurement and Simulation Based Methodology for Cache Performance Modeling and Tuning
NASA Technical Reports Server (NTRS)
Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
We present a cache performance modeling methodology that facilitates the tuning of uniprocessor cache performance for applications executing on shared memory multiprocessors by accurately predicting the effects of source code level modifications. Measurements on a single processor are initially used for identifying parts of code where cache utilization improvements may significantly impact the overall performance. Cache simulation based on trace-driven techniques can be carried out without gathering detailed address traces. Minimal runtime information for modeling cache performance of a selected code block includes: base virtual addresses of arrays, virtual addresses of variables, and loop bounds for that code block. The rest of the information is obtained from the source code. We show that the cache performance predictions are as reliable as those obtained through trace-driven simulations. This technique is particularly helpful for the exploration of various "what-if" scenarios regarding the cache performance impact of alternative code structures. We explain and validate this methodology using a simple matrix-matrix multiplication program. We then apply this methodology to predict and tune the cache performance of two realistic scientific applications taken from the Computational Fluid Dynamics (CFD) domain.
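The key point, that a cache model can be driven by base addresses and loop bounds alone rather than a full address trace, can be sketched as follows for a direct-mapped cache and a naive matrix-multiplication loop nest (the cache geometry and memory layout are assumptions, not the paper's).

```python
def cache_misses(n, line_bytes=64, lines=512, elem=8):
    """Count misses of a direct-mapped cache for C[i][j] += A[i][k]*B[k][j],
    generating addresses only from base addresses and loop bounds."""
    base_a, base_b, base_c = 0, n * n * elem, 2 * n * n * elem
    tags = [None] * lines
    misses = 0

    def touch(addr):
        nonlocal misses
        block = addr // line_bytes          # memory block of this access
        idx = block % lines                 # direct-mapped cache set
        if tags[idx] != block:
            tags[idx] = block               # fill line, evicting old block
            misses += 1

    for i in range(n):
        for j in range(n):
            for k in range(n):
                touch(base_a + (i * n + k) * elem)  # A[i][k]
                touch(base_b + (k * n + j) * elem)  # B[k][j]
                touch(base_c + (i * n + j) * elem)  # C[i][j]
    return misses

print(cache_misses(32), cache_misses(64))
```

With 32×32 double matrices laid out contiguously, the three arrays map to disjoint cache lines and only cold misses remain (384); growing n to 64 overflows the modeled 32 KB cache and conflict misses appear — exactly the kind of effect the methodology predicts before any source change is made.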
Agent-Based Modeling in Systems Pharmacology.
Cosgrove, J; Butler, J; Alden, K; Read, M; Kumar, V; Cucurull-Sanchez, L; Timmis, J; Coles, M
2015-11-01
Modeling and simulation (M&S) techniques provide a platform for knowledge integration and hypothesis testing to gain insights into biological systems that would not be possible a priori. Agent-based modeling (ABM) is an M&S technique that focuses on describing individual components rather than homogenous populations. This tutorial introduces ABM to systems pharmacologists, using relevant case studies to highlight how ABM-specific strengths have yielded success in the area of preclinical mechanistic modeling.
NASA Technical Reports Server (NTRS)
Hinton, David A.
1989-01-01
Numerous air carrier accidents and incidents result from encounters with the atmospheric wind shear associated with microburst phenomena, in some cases resulting in heavy loss of life. An important issue in current wind shear research is how to best manage aircraft performance during an inadvertent wind shear encounter. The goals of this study were to: (1) develop techniques and guidance for maximizing an aircraft's ability to recover from microburst encounters following takeoff, (2) develop an understanding of how theoretical predictions of wind shear recovery performance might be achieved in actual use, and (3) gain insight into the piloting factors associated with recovery from microburst encounters. Three recovery strategies were implemented and tested in piloted simulation. Results show that a recovery strategy based on flying a flight path angle schedule produces improved performance over constant pitch attitude or acceleration-based recovery techniques. The best recovery technique was initially counterintuitive to the pilots who participated in the study. Evidence was found to indicate that the techniques required for flight through the turbulent vortex of a microburst may differ from the techniques being developed using classical, nonturbulent microburst models.
Accelerated GPU based SPECT Monte Carlo simulations.
Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris
2016-06-07
Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: (99m)Tc, (111)In and (131)I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator, respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE Infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between the GATE and GGEMS platforms derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. In addition, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71.
The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency of SPECT imaging simulations.
Simulation of a Petri net-based model of the terpenoid biosynthesis pathway.
Hawari, Aliah Hazmah; Mohamed-Hussein, Zeti-Azura
2010-02-09
The development and simulation of dynamic models of terpenoid biosynthesis has yielded a systems perspective that provides new insights into how the structure of this biochemical pathway affects compound synthesis. These insights may eventually help identify reactions that could be experimentally manipulated to amplify terpenoid production. In this study, a dynamic model of the terpenoid biosynthesis pathway was constructed based on the Hybrid Functional Petri Net (HFPN) technique. This technique is a fusion of three other extended Petri net techniques, namely Hybrid Petri Net (HPN), Hybrid Dynamic Net (HDN), and Functional Petri Net (FPN). The biological data needed to construct the terpenoid metabolic model were gathered from the literature and from biological databases. These data were used as building blocks to create an HFPNe model and to generate parameters that govern the global behaviour of the model. The dynamic model was simulated and validated against known experimental data obtained from extensive literature searches. The model successfully simulated metabolite concentration changes over time (pt), and the observations correlated with known data. Interactions between the intermediates that affect the production of terpenes could be observed through the introduction of inhibitors that established feedback loops within, and crosstalk between, the pathways. Although this metabolic model is only preliminary, it will provide a platform for analysing various high-throughput data, and it should lead to a more holistic understanding of terpenoid biosynthesis.
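The token-flow idea underlying Petri net models of metabolism can be reduced to a few lines (ordinary discrete Petri net semantics, not the HFPNe formalism, with an abstract A → B → C chain standing in for the terpenoid reactions):

```python
def fire(marking, transitions):
    """Fire each enabled transition once (discrete Petri net semantics).
    A transition is (inputs, outputs): token counts consumed and produced."""
    for inputs, outputs in transitions:
        if all(marking[p] >= n for p, n in inputs.items()):
            for p, n in inputs.items():
                marking[p] -= n
            for p, n in outputs.items():
                marking[p] += n
    return marking

# Toy two-reaction chain: substrate A converts to intermediate B, then C.
marking = {"A": 5, "B": 0, "C": 0}
transitions = [({"A": 1}, {"B": 1}),   # reaction A -> B
               ({"B": 1}, {"C": 1})]   # reaction B -> C
for _ in range(10):
    fire(marking, transitions)
print(marking)
```

HFPNe generalizes this picture by making markings continuous concentrations and firing speeds arbitrary functions, which is what allows kinetic rate laws and inhibitor feedback loops to be encoded.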
Almazyad, Abdulaziz S.; Seddiq, Yasser M.; Alotaibi, Ahmed M.; Al-Nasheri, Ahmed Y.; BenSaleh, Mohammed S.; Obeid, Abdulfattah M.; Qasim, Syed Manzoor
2014-01-01
Anomalies such as leakage and bursts in water pipelines have severe consequences for the environment and the economy. To ensure the reliability of water pipelines, they must be monitored effectively. Wireless Sensor Networks (WSNs) have emerged as an effective technology for monitoring critical infrastructure such as water, oil and gas pipelines. In this paper, we present a scalable design and simulation of a water pipeline leakage monitoring system using Radio Frequency IDentification (RFID) and WSN technology. The proposed design targets long-distance aboveground water pipelines that have special considerations for maintenance, energy consumption and cost. The design is based on deploying a group of mobile wireless sensor nodes inside the pipeline and allowing them to work cooperatively according to a prescheduled order. Under this mechanism, only one node is active at a time, while the other nodes are sleeping. The node whose turn is next wakes up according to one of three wakeup techniques: location-based, time-based and interrupt-driven. In this paper, mathematical models are derived for each technique to estimate the corresponding energy consumption and memory size requirements. The proposed equations are analyzed and the results are validated using simulation. PMID:24561404
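The kind of energy bookkeeping behind such wakeup-technique models can be sketched for the time-based case; the power draws and battery capacity below are assumed figures for illustration, not the paper's derived equations.

```python
def cycle_energy_mj(t_cycle_s, t_active_s, p_active_mw=60.0, p_sleep_mw=0.03):
    """Energy per duty cycle (mJ) for a time-based wakeup schedule in
    which one node is active while the others sleep."""
    t_sleep = t_cycle_s - t_active_s
    return p_active_mw * t_active_s + p_sleep_mw * t_sleep

def node_lifetime_days(battery_mj, t_cycle_s, t_active_s):
    """Battery lifetime if every cycle costs cycle_energy_mj."""
    cycles = battery_mj / cycle_energy_mj(t_cycle_s, t_active_s)
    return cycles * t_cycle_s / 86400.0

# 10-node schedule: each node active 60 s out of every 600 s,
# powered by an assumed 2 Wh (7.2e6 mJ) battery.
print(round(node_lifetime_days(2 * 3600 * 1000, t_cycle_s=600, t_active_s=60), 1))
```

Comparing such per-cycle costs across the location-based, time-based, and interrupt-driven wakeup techniques is what lets the authors trade energy consumption against memory requirements before committing to hardware.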
ANALYSIS OF METHODS FOR DETECTING THE PROXIMITY EFFECT IN QUASAR SPECTRA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dall'Aglio, Aldo; Gnedin, Nickolay Y.
Using numerical simulations of structure formation, we investigate several methods for determining the strength of the proximity effect in the H I Lyα forest. We analyze three high-resolution (≈10 kpc) redshift snapshots (z̄ = 4, 3, and 2.25) of a Hydro-Particle-Mesh simulation to obtain realistic absorption spectra of the H I Lyα forest. We model the proximity effect along the simulated sight lines with a simple analytical prescription based on the assumed quasar luminosity and the intensity of the cosmic UV background (UVB). We begin our analysis investigating the intrinsic biases thought to arise in the widely adopted standard technique of combining multiple lines of sight when searching for the proximity effect. We confirm the existence of these biases, albeit smaller than previously predicted with simple Monte Carlo simulations. We then concentrate on the analysis of the proximity effect along individual lines of sight. After determining its strength with a fiducial value of the UVB intensity, we construct the proximity effect strength distribution (PESD). We confirm that the PESD inferred from the simple averaging technique accurately recovers the input strength of the proximity effect at all redshifts. Moreover, the PESD closely follows the behaviors found in observed samples of quasar spectra. However, the PESD obtained from our new simulated sight lines presents some differences to that of simple Monte Carlo simulations. At all redshifts, we find a smaller dispersion of the strength parameters, the source of the correspondingly smaller biases found when combining multiple lines of sight. After developing three new theoretical methods for recovering the strength of the proximity effect on individual lines of sight, we compare their accuracy to the PESD from the simple averaging technique. All our new approaches are based on the maximization of the likelihood function, albeit invoking some modifications.
The new techniques presented here, in spite of their complexity, fail to recover the input proximity effect in an unbiased way, presumably due to some (unknown) higher-order correlations in the spectrum. Thus, employing complex three-dimensional simulations, we provide strong evidence in favor of the PESD obtained from the simple averaging technique, as a method of estimating the UVB intensity, free of any intrinsic biases.
A serious game for learning ultrasound-guided needle placement skills.
Chan, Wing-Yin; Qin, Jing; Chui, Yim-Pan; Heng, Pheng-Ann
2012-11-01
Ultrasound-guided needle placement is a key step in many radiological intervention procedures, such as biopsy, local anesthesia, and fluid drainage. To help train future intervention radiologists, we develop a serious game to teach the skills involved. We introduce novel techniques for realistic simulation and integrate game elements for active and effective learning. This game is designed in the context of needle placement training, based on some essential characteristics of serious games. Training scenarios are interactively generated via a block-based construction scheme. A novel example-based texture synthesis technique is proposed to simulate corresponding ultrasound images. Game levels are defined based on the difficulty of the generated scenarios. Interactive recommendation of desirable insertion paths is provided during the training as an adaptation mechanism. We also develop a fast physics-based approach to reproduce the shadowing effect of needles in ultrasound images. Game elements such as time-attack tasks, hints, and performance evaluation tools are also integrated in our system. Extensive experiments are performed to validate its feasibility for training.
Chronology of DIC technique based on the fundamental mathematical modeling and dehydration impact.
Alias, Norma; Saipol, Hafizah Farhah Saipan; Ghani, Asnida Che Abd
2014-12-01
A chronology of mathematical models for the heat and mass transfer equations is proposed for the prediction of moisture and temperature behavior during drying using the DIC (Détente Instantanée Contrôlée, or instant controlled pressure drop) technique. The DIC technique has potential as a widely used dehydration method for high-value foods, maintaining nutrition and the best possible quality for food storage. The model is governed by a regression model, followed by 2D Fick's and Fourier's parabolic equations and a 2D elliptic-parabolic equation in a rectangular slice. The models neglect shrinkage and radiation effects. Simulations of the heat and mass transfer equations of parabolic and elliptic-parabolic type using numerical methods based on the finite difference method (FDM) are illustrated. Intel Core 2 Duo processors with the Linux operating system and the C programming language served as the computational platform for the simulation. Qualitative and quantitative differences between the DIC technique and conventional drying methods are presented as a comparison.
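The parabolic heat-transfer part of such a model can be illustrated with an explicit finite-difference step for u_t = α(u_xx + u_yy) on a rectangular slice; the diffusivity, grid, and boundary temperature below are assumed values, and the real DIC model couples this equation with mass transfer.

```python
def heat_step(u, alpha, dt, dx):
    """One explicit finite-difference step of u_t = alpha * (u_xx + u_yy)
    on a rectangular slice with fixed (Dirichlet) boundary temperature."""
    r = alpha * dt / dx ** 2           # stability requires r <= 1/4 in 2D
    ny, nx = len(u), len(u[0])
    new = [row[:] for row in u]
    for i in range(1, ny - 1):
        for j in range(1, nx - 1):
            new[i][j] = u[i][j] + r * (u[i + 1][j] + u[i - 1][j]
                                       + u[i][j + 1] + u[i][j - 1]
                                       - 4 * u[i][j])
    return new

# 20 x 20 slice starting at 20 C with walls held at 100 C.
n = 20
u = [[100.0 if i in (0, n - 1) or j in (0, n - 1) else 20.0
      for j in range(n)] for i in range(n)]
for _ in range(2000):                  # 1000 s of simulated heating
    u = heat_step(u, alpha=1.4e-7, dt=0.5, dx=1e-3)
print(round(u[n // 2][n // 2], 1))
```

Here r = 0.07, comfortably inside the explicit-scheme stability limit; after roughly one diffusion time the slice center has nearly reached the wall temperature.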
Simulation tools for guided wave based structural health monitoring
NASA Astrophysics Data System (ADS)
Mesnil, Olivier; Imperiale, Alexandre; Demaldent, Edouard; Baronian, Vahan; Chapuis, Bastien
2018-04-01
Structural Health Monitoring (SHM) is a field derived from Non Destructive Evaluation (NDE) based on the integration of sensors onto or into a structure in order to monitor its health without disturbing its regular operating cycle. Guided wave based SHM relies on the propagation of guided waves in plate-like or extruded structures. Using piezoelectric transducers to generate and receive guided waves is one of the most widely accepted paradigms due to the low cost and low weight of those sensors. A wide range of techniques for flaw detection based on the aforementioned setup is available in the literature, but very few of these techniques have found industrial applications yet. A major difficulty comes from the sensitivity of guided waves to a substantial number of parameters such as the temperature or geometrical singularities, making guided wave measurements difficult to analyze. In order to apply guided wave based SHM techniques to a wider spectrum of applications and to transfer those techniques to industry, the CEA LIST develops novel numerical methods. These methods facilitate the evaluation of the robustness of SHM techniques for multiple applicative cases and ease the analysis of the influence of various parameters, such as sensor positioning or environmental conditions. The first numerical tool is the guided wave module integrated into the commercial software CIVA, relying on a hybrid modal-finite element formulation to compute the guided wave response of perturbations (cavities, flaws…) in extruded structures of arbitrary cross section such as rails or pipes. The second numerical tool is based on the spectral element method [2] and simulates guided waves in both isotropic (metals) and orthotropic (composites) plate-like structures. This tool is designed to match the widely accepted sparse piezoelectric transducer array SHM configuration in which each embedded sensor acts as both emitter and receiver of guided waves.
This tool is under development and will be adapted to simulate complex real-life structures such as curved composite panels with stiffeners. This communication will present these numerical tools and their main functionalities.
Comparison of Phase-Based 3D Near-Field Source Localization Techniques for UHF RFID.
Parr, Andreas; Miesen, Robert; Vossiek, Martin
2016-06-25
In this paper, we present multiple techniques for phase-based narrowband backscatter tag localization in three-dimensional space with planar antenna arrays or synthetic apertures. Beamformer and MUSIC localization algorithms, known from near-field source localization and direction-of-arrival estimation, are applied to the 3D backscatter scenario, and their performance in terms of localization accuracy is evaluated. We discuss the impact of different transceiver modes known from the literature, which evaluate different send and receive antenna path combinations for a single localization, as in multiple input multiple output (MIMO) systems. Furthermore, we propose a new single-dimensional MIMO (S-MIMO) transceiver mode, which is especially suited for use with mobile robot systems. Monte Carlo simulations based on a realistic multipath error model ensure spatial correlation of the simulated signals and serve to critically appraise the accuracies of the different localization approaches. A synthetic uniform rectangular array created by a robotic arm is used to evaluate selected localization techniques. We use an Ultra High Frequency (UHF) Radio Frequency Identification (RFID) setup to compare measurements with theory and simulation. The results show how a mean localization accuracy of less than 30 cm can be reached in an indoor environment. Further simulations demonstrate how the distance between aperture and tag affects the localization accuracy, and how the size and grid spacing of the rectangular array need to be adapted to improve the localization accuracy down to the centimeter range and to maximize array efficiency in terms of localization accuracy per number of elements.
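The phase-based matched-filter search at the heart of such localization can be sketched as follows: simulate the round-trip phases a monostatic reader would measure from a tag, then grid-search for the point whose steering vector best matches them. This is an idealized, noise-free sketch; the carrier frequency, array size, and search grid are assumptions.

```python
import cmath
import math

C, F = 3e8, 900e6            # propagation speed and UHF carrier (assumed)
K = 2 * math.pi * F / C      # free-space wavenumber

def steer(antennas, point):
    """Round-trip (backscatter) phases a monostatic reader at each
    antenna position would observe from a tag at `point`."""
    return [cmath.exp(-1j * 2 * K * math.dist(a, point)) for a in antennas]

def beamform(antennas, measured, grid):
    """Return the grid point whose steering vector best matches the
    measured phases (delay-and-sum / matched-filter search)."""
    def power(p):
        return abs(sum(m * v.conjugate()
                       for m, v in zip(measured, steer(antennas, p))))
    return max(grid, key=power)

# 4 x 4 planar array with half-wavelength spacing; tag about 1 m in front.
lam = C / F
antennas = [(i * lam / 2, j * lam / 2, 0.0) for i in range(4) for j in range(4)]
tag = (0.2, 0.3, 1.0)
measured = steer(antennas, tag)                  # ideal, noise-free "data"
grid = [(x / 10, y / 10, 1.0) for x in range(6) for y in range(6)]
print(beamform(antennas, measured, grid))
```

Real measurements add multipath and phase noise, which is why the paper evaluates accuracy with a spatially correlated multipath error model rather than this ideal case.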
NASA Astrophysics Data System (ADS)
Akhlaghi, H.; Roohi, E.; Myong, R. S.
2012-11-01
Micro/nano geometries with specified wall heat flux are widely encountered in electronic cooling and micro-/nano-fluidic sensors. We introduce a new technique to impose the desired (positive or negative) wall heat flux boundary condition in DSMC simulations. The technique is based on iterative adjustment of the wall temperature magnitude. It is found that the proposed iterative technique has good numerical performance and can impose both positive and negative wall heat flux rates accurately. Using the present technique, rarefied gas flow through micro-/nanochannels under specified wall heat flux conditions is simulated, and unique behaviors are observed in the case of channels with cooling walls. For example, contrary to the heating process, cooling of micro/nanochannel walls results in only small variations in the density field. Upstream thermal creep effects in the cooling process decrease the velocity slip despite the increase in Knudsen number along the channel. Similarly, the cooling process decreases the curvature of the pressure distribution below the linear incompressible distribution. Our results indicate that flow cooling increases the mass flow rate through the channel, and vice versa.
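The iterative wall-temperature idea can be sketched in isolation (this is an illustrative toy, not the authors' DSMC code: a noisy linear flux model stands in for DSMC surface sampling, and the transfer coefficient and relaxation factor are invented): the wall temperature is repeatedly corrected until the sampled heat flux matches the prescribed target, for both heating and cooling.

```python
import numpy as np

rng = np.random.default_rng(1)
T_gas, h = 300.0, 2.0                  # gas temperature, toy transfer coefficient

def sampled_flux(T_wall):
    """Stand-in for DSMC surface sampling: noisy heat flux from wall to gas."""
    return h * (T_wall - T_gas) + rng.normal(0.0, 0.5)

def impose_flux(q_target, T0=300.0, relax=0.3, iters=200):
    """Iteratively adjust the wall temperature until the sampled flux matches q_target."""
    T = T0
    for _ in range(iters):
        q = sampled_flux(T)
        T += relax * (q_target - q) / h    # under-relaxed correction step
    return T

T_heat = impose_flux(+100.0)   # positive (heating) wall heat flux
T_cool = impose_flux(-100.0)   # negative (cooling) wall heat flux
print(T_heat, T_cool)
```

Under-relaxation keeps the iteration stable against the sampling noise, mirroring why an iterative rather than one-shot temperature assignment is needed when the flux estimate is statistical.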
Trends and Techniques for Space Base Electronics
NASA Technical Reports Server (NTRS)
Trotter, J. D.; Wade, T. E.; Gassaway, J. D.
1979-01-01
Simulations of various phosphorus and boron diffusions in SOS were completed, and a sputtering system, furnaces, and photolithography-related equipment were set up. Double-layer metal experiments initially utilized wet chemistry techniques. By incorporating ultrasonic etching of the vias, premetal cleaning with a modified buffered HF, phosphorus-doped vapox, and extended sintering, yields of 98% were obtained using the standard test pattern. A two-dimensional modeling program was written for simulating short-channel MOSFETs with nonuniform substrate doping. A key simplifying assumption used is that the majority carriers can be represented by a sheet charge at the silicon dioxide-silicon interface. Although the program is incomplete, solution of the two-dimensional Poisson equation for the potential distribution was achieved. The status of other 2-D MOSFET simulation programs is summarized.
NASA Technical Reports Server (NTRS)
Middleton, D. B.; Hurt, G. J., Jr.
1971-01-01
A fixed-base piloted simulator investigation has been made of the feasibility of using any of several manual guidance and control techniques for emergency lunar escape to orbit with very simplified, lightweight vehicle systems. The escape-to-orbit vehicles accommodate two men, but one man performs all of the guidance and control functions. Three basic attitude-control modes and four manually executed trajectory-guidance schemes were used successfully during approximately 125 simulated flights under a variety of conditions. These conditions included thrust misalignment, uneven propellant drain, and a vehicle moment-of-inertia range of 250 to 12,000 slug-ft2. Two types of results are presented: orbit characteristics and pilot ratings of vehicle handling qualities.
NASA Astrophysics Data System (ADS)
Buhari, Abudhahir; Zukarnain, Zuriati Ahmad; Khalid, Roszelinda; Zakir Dato', Wira Jaafar Ahmad
2016-11-01
The applications of quantum information science are moving toward bigger and better heights for next-generation technology. In particular, in the fields of quantum cryptography and quantum computation, the world has already witnessed various ground-breaking tangible products and promising results. Quantum cryptography is one of the more mature fields arising from quantum mechanics, and products are already available in the market. Nevertheless, quantum cryptography remains the subject of active research as it works toward the maturity of digital cryptography. Its complexity is higher because it combines hardware and software. The lack of an effective simulation tool for designing and analyzing quantum cryptography experiments delays progress toward this goal. In this paper, we propose a framework for an effective non-entanglement-based quantum cryptography simulation tool. We apply hybrid simulation techniques, i.e., discrete event, continuous event, and system dynamics. We also highlight the limitations of experiments based on a commercial photonic simulation tool. Finally, we discuss ideas for achieving a one-stop simulation package for quantum-based secure key distribution experiments. All modules of the simulation framework are viewed from the computer science perspective.
NASA Technical Reports Server (NTRS)
Edwards, Jack R.; Mcrae, D. Scott
1991-01-01
An efficient method for computing two-dimensional compressible Navier-Stokes flow fields is presented. The solution algorithm is a fully-implicit approximate factorization technique based on an unsymmetric line Gauss-Seidel splitting of the equation system Jacobian matrix. Convergence characteristics are improved by the addition of acceleration techniques based on Shamanskii's method for nonlinear equations and Broyden's quasi-Newton update. Characteristic-based differencing of the equations is provided by means of Van Leer's flux vector splitting. In this investigation, emphasis is placed on the fast and accurate computation of shock-wave-boundary layer interactions with and without slot suction effects. In the latter context, a set of numerical boundary conditions for simulating the transpiration flow in an open slot is devised. Both laminar and turbulent cases are considered, with turbulent closure provided by a modified Cebeci-Smith algebraic model. Comparisons with computational and experimental data sets are presented for a variety of interactions, and a fully-coupled simulation of a plenum chamber/inlet flowfield with shock interaction and suction is also shown and discussed.
Predicting patchy particle crystals: variable box shape simulations and evolutionary algorithms.
Bianchi, Emanuela; Doppelbauer, Günther; Filion, Laura; Dijkstra, Marjolein; Kahl, Gerhard
2012-06-07
We consider several patchy particle models that have been proposed in the literature, and we investigate their candidate crystal structures in a systematic way. We compare two different algorithms for predicting crystal structures: (i) an approach based on Monte Carlo simulations in the isobaric-isothermal ensemble and (ii) an optimization technique based on ideas of evolutionary algorithms. We show that the two methods are equally successful and provide consistent results on crystalline phases of patchy particle systems.
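The evolutionary-algorithm side of such a comparison can be illustrated with a deliberately tiny sketch (a one-parameter toy lattice energy, not a patchy-particle potential; the population size and mutation scale are arbitrary): candidates are parameter vectors, fitness is the energy, and mutation plus truncation selection drives the population toward the minimum.

```python
import numpy as np

rng = np.random.default_rng(2)

def energy(cell):
    """Toy Lennard-Jones-like energy of a 1D lattice with spacing a = cell[0]; minimum at a = 1."""
    a = abs(cell[0]) + 1e-9
    return a**-12 - 2 * a**-6

pop = rng.uniform(0.5, 2.0, size=(20, 1))            # initial random population
for gen in range(100):
    fit = np.array([energy(c) for c in pop])
    parents = pop[np.argsort(fit)[:10]]              # truncation selection: keep the 10 best
    children = parents + rng.normal(0, 0.05, parents.shape)   # Gaussian mutation
    pop = np.vstack([parents, children])

best = pop[np.argmin([energy(c) for c in pop])]
print(best[0])
```

Real crystal-structure searches evolve full unit cells (lattice vectors plus particle positions and orientations) against a many-body potential, but the select-mutate-replace loop has the same shape.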
Qin, J; Choi, K S; Ho, Simon S M; Heng, P A
2008-01-01
A force prediction algorithm is proposed to facilitate virtual-reality (VR) based collaborative surgical simulation by reducing the effect of network latencies. State regeneration is used to correct the estimated prediction. This algorithm is incorporated into an adaptive transmission protocol in which auxiliary features such as view synchronization and coupling control are provided to ensure system consistency. We implemented this protocol using a multi-threaded technique on a cluster-based network architecture.
The effect of fidelity: how expert behavior changes in a virtual reality environment.
Ioannou, Ioanna; Avery, Alex; Zhou, Yun; Szudek, Jacek; Kennedy, Gregor; O'Leary, Stephen
2014-09-01
We compare the behavior of expert surgeons operating on the "gold standard" of simulation, the cadaveric temporal bone, against a high-fidelity virtual reality (VR) simulation. We aim to determine whether expert behavior changes within the virtual environment and to understand how the fidelity of simulation affects users' behavior. Five expert otologists performed cortical mastoidectomy and cochleostomy on a human cadaveric temporal bone and a VR temporal bone simulator. Hand movement and video recordings were used to derive a range of measures, to facilitate an analysis of surgical technique, and to compare expert behavior between the cadaveric and simulator environments. Drilling time was similar across the two environments. Some measures such as total time and burr change count differed predictably due to the ease of switching burrs within the simulator. Surgical strokes were generally longer in distance and duration in VR, but these measures changed proportionally to cadaveric measures across the stages of the procedure. Stroke shape metrics differed, which was attributed to the modeling of burr behavior within the simulator. This will be corrected in future versions. Slight differences in drill interaction between a virtual environment and the real world can have measurable effects on surgical technique, particularly in terms of stroke length, duration, and curvature. It is important to understand these effects when designing and implementing surgical training programs based on VR simulation, and when improving the fidelity of VR simulators to facilitate use of a similar technique in both real and simulated situations. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.
Karain, Wael I
2017-11-28
Proteins undergo conformational transitions over different time scales. These transitions are closely intertwined with the protein's function. Numerous standard techniques such as principal component analysis are used to detect these transitions in molecular dynamics simulations. In this work, we add a new method that detects transitions in dynamics based on the recurrences in the dynamical system. It combines bootstrapping and recurrence quantification analysis. We start from the assumption that a protein has a "baseline" recurrence structure over a given period of time. Any statistically significant deviation from this recurrence structure, as inferred from complexity measures provided by recurrence quantification analysis, is considered a transition in the dynamics of the protein. We apply this technique to a 132 ns long molecular dynamics simulation of the β-Lactamase Inhibitory Protein BLIP. We are able to detect conformational transitions in the nanosecond range in the recurrence dynamics of the BLIP protein during the simulation. The results compare favorably to those extracted using the principal component analysis technique. The recurrence quantification analysis based bootstrap technique is able to detect transitions between different dynamical states for a protein over different time scales. It is not limited to linear dynamics regimes, and can be generalized to any time scale. It also has the potential to be used to cluster frames in molecular dynamics trajectories according to the nature of their recurrence dynamics. One shortcoming of this method is the need for time windows large enough to ensure good statistical quality of the recurrence complexity measures used to detect the transitions.
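A minimal version of the recurrence-rate-plus-bootstrap idea can be sketched on synthetic data (a scalar series with an abrupt change in dynamics, not an MD trajectory; the window size, recurrence threshold, and percentiles are arbitrary choices for the demo): recurrence rate is computed in sliding windows, and a bootstrap baseline built from contiguous windows of the early, assumed-stationary segment flags windows whose recurrence structure deviates significantly.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(1000)
x = np.sin(0.1 * t) + 0.1 * rng.standard_normal(1000)
x[600:] = np.sin(0.5 * t[600:]) + 1.0 * rng.standard_normal(400)  # dynamical "transition"

def recurrence_rate(seg, eps=0.3):
    """Fraction of point pairs in seg closer than eps (recurrence plot density, 1D)."""
    d = np.abs(seg[:, None] - seg[None, :])
    return float((d < eps).mean())

win = 100
starts = list(range(0, 900, 50))
rr = np.array([recurrence_rate(x[s:s + win]) for s in starts])

# bootstrap baseline: RR of random contiguous windows inside the early segment
boot = [recurrence_rate(x[s:s + win]) for s in rng.integers(0, 400, 200)]
lo, hi = np.percentile(boot, [0.5, 99.5])
flagged = [s for s, v in zip(starts, rr) if v < lo or v > hi]
print(flagged)
```

Using contiguous (rather than shuffled) windows for the bootstrap preserves the temporal correlation of the baseline, which is the point of building the null distribution from the dynamics itself.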
NASA Lunar Regolith Simulant Program
NASA Technical Reports Server (NTRS)
Edmunson, J.; Betts, W.; Rickman, D.; McLemore, C.; Fikes, J.; Stoeser, D.; Wilson, S.; Schrader, C.
2010-01-01
Lunar regolith simulant production is absolutely critical to returning man to the Moon. Regolith simulant is used to test hardware exposed to the lunar surface environment, simulate health risks to astronauts, practice in situ resource utilization (ISRU) techniques, and evaluate dust mitigation strategies. Lunar regolith simulant design, production, and management are a cooperative venture between members of the NASA Marshall Space Flight Center (MSFC) and the U.S. Geological Survey (USGS). The MSFC simulant team is a satellite of the Dust group based at Glenn Research Center. The goals of the cooperative group are to (1) reproduce characteristics of lunar regolith using simulants, (2) produce simulants as cheaply as possible, (3) produce simulants in the amount needed, and (4) produce simulants to meet users' schedules.
ERIC Educational Resources Information Center
Weeber, Marc; Klein, Henny; de Jong-van den Berg, Lolkje T. W.; Vos, Rein
2001-01-01
Proposes a two-step model of discovery in which new scientific hypotheses can be generated and subsequently tested. Applying advanced natural language processing techniques to find biomedical concepts in text, the model is implemented in a versatile interactive discovery support tool. This tool is used to successfully simulate Don R. Swanson's…
Floating-point system quantization errors in digital control systems
NASA Technical Reports Server (NTRS)
Phillips, C. L.; Vallely, D. P.
1978-01-01
This paper considers digital controllers (filters) operating in floating-point arithmetic in either open-loop or closed-loop systems. A quantization error analysis technique is developed, and is implemented by a digital computer program that is based on a digital simulation of the system. The program can be integrated into existing digital simulations of a system.
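The simulation-based error analysis described above can be illustrated with a toy sketch (the filter coefficients and mantissa length are made up, and this is a generic reconstruction, not the paper's program): a first-order digital filter is run once in full double precision and once with every arithmetic result rounded to a short floating-point mantissa; the difference between the two trajectories estimates the quantization error.

```python
import math
import numpy as np

def quantize(x, bits=10):
    """Round x to `bits` mantissa bits (toy floating-point quantizer)."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)               # x = m * 2**e with 0.5 <= |m| < 1
    scale = 2.0 ** bits
    return math.ldexp(round(m * scale) / scale, e)

a, b = 0.95, 0.05                      # filter: y[n] = a*y[n-1] + b*u[n]
u = np.sin(0.01 * np.arange(2000))     # slow sinusoidal input

y_ref, y_q = 0.0, 0.0
errs = []
for un in u:
    y_ref = a * y_ref + b * un                                # double precision
    y_q = quantize(quantize(a * y_q) + quantize(b * un))      # quantize after each op
    errs.append(y_q - y_ref)

rms_err = float(np.sqrt(np.mean(np.square(errs))))
print(rms_err)
```

Because the quantization errors recirculate through the feedback path, the accumulated output error is amplified by the filter's low-frequency gain, which is exactly the kind of effect a simulation-based analysis exposes that a per-operation bound does not.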
Actinides in metallic waste from electrometallurgical treatment of spent nuclear fuel
NASA Astrophysics Data System (ADS)
Janney, D. E.; Keiser, D. D.
2003-09-01
Argonne National Laboratory has developed a pyroprocessing-based technique for conditioning spent sodium-bonded nuclear-reactor fuel in preparation for long-term disposal. The technique produces a metallic waste form whose nominal composition is stainless steel with 15 wt.% Zr (SS-15Zr), up to ~11 wt.% actinide elements (primarily uranium), and a few percent metallic fission products. Actual and simulated waste forms show similar eutectic microstructures with approximately equal proportions of iron solid solution phases and Fe-Zr intermetallics. This article reports on an analysis of simulated waste forms containing uranium, neptunium, and plutonium.
On verifying a high-level design. [cost and error analysis
NASA Technical Reports Server (NTRS)
Mathew, Ben; Wehbeh, Jalal A.; Saab, Daniel G.
1993-01-01
An overview of design verification techniques is presented, and some of the current research in high-level design verification is described. Formal hardware description languages that are capable of adequately expressing the design specifications have been developed, but some time will be required before they can have the expressive power needed to be used in real applications. Simulation-based approaches are more useful in finding errors in designs than they are in proving the correctness of a certain design. Hybrid approaches that combine simulation with other formal design verification techniques are argued to be the most promising over the short term.
A Systems Approach to Scalable Transportation Network Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perumalla, Kalyan S
2006-01-01
Emerging needs in transportation network modeling and simulation are raising new challenges with respect to scalability of network size and vehicular traffic intensity, speed of simulation for simulation-based optimization, and fidelity of vehicular behavior for accurate capture of event phenomena. Parallel execution is warranted to sustain the required detail, size and speed. However, few parallel simulators exist for such applications, partly due to the challenges underlying their development. Moreover, many simulators are based on time-stepped models, which can be computationally inefficient for the purposes of modeling evacuation traffic. Here an approach is presented to designing a simulator with memory and speed efficiency as the goals from the outset, and, specifically, scalability via parallel execution. The design makes use of discrete event modeling techniques as well as parallel simulation methods. Our simulator, called SCATTER, is being developed, incorporating such design considerations. Preliminary performance results are presented on benchmark road networks, showing scalability to one million vehicles simulated on one processor.
NASA Astrophysics Data System (ADS)
Neuer, Marcus J.
2013-11-01
A technique for the spectral identification of strontium-90 is shown, utilising a Maximum-Likelihood deconvolution. Different deconvolution approaches are discussed and summarised. Based on the intensity distribution of the beta emission and Geant4 simulations, a combined response matrix is derived, tailored to the β- detection process in sodium iodide detectors. It includes scattering effects and attenuation by applying a base material decomposition extracted from Geant4 simulations with a CAD model for a realistic detector system. Inversion results of measurements show the agreement between deconvolution and reconstruction. A detailed investigation with additional masking sources like 40K, 226Ra and 131I shows that a contamination of strontium can be found in the presence of these nuisance sources. Identification algorithms for strontium are presented based on the derived technique. For the implementation of blind identification, an exemplary masking ratio is calculated.
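A generic Maximum-Likelihood (ML-EM, Richardson-Lucy-type) deconvolution of a measured spectrum through a response matrix, the family of algorithm named above, can be sketched as follows (the Gaussian response and line positions are invented; the paper's matrix is derived from Geant4 and a beta-emission model):

```python
import numpy as np

n = 40
# toy detector response: each true energy bin smears into neighbouring measured bins
R = np.zeros((n, n))
for j in range(n):
    for i in range(n):
        R[i, j] = np.exp(-0.5 * ((i - j) / 2.0) ** 2)
R /= R.sum(axis=0, keepdims=True)       # normalize each column (unit efficiency)

truth = np.zeros(n)
truth[10], truth[25] = 100.0, 60.0      # two spectral "lines"
measured = R @ truth                    # forward model: measured = R @ true

est = np.ones(n)                        # flat, strictly positive initial estimate
for _ in range(500):
    # multiplicative ML-EM update; with column-normalized R this is the
    # standard expectation-maximization step for Poisson unfolding
    est *= R.T @ (measured / (R @ est + 1e-12))

print(int(np.argmax(est)))
```

The multiplicative update keeps the estimate nonnegative, which is why ML-EM is favoured over naive matrix inversion for spectra; with noisy data the iteration count acts as an implicit regularizer.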
Web-based Visual Analytics for Extreme Scale Climate Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad A; Evans, Katherine J; Harney, John F
In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.
Agent-based modeling: a new approach for theory building in social psychology.
Smith, Eliot R; Conrey, Frederica R
2007-02-01
Most social and psychological phenomena occur not as the result of isolated decisions by individuals but rather as the result of repeated interactions between multiple individuals over time. Yet the theory-building and modeling techniques most commonly used in social psychology are less than ideal for understanding such dynamic and interactive processes. This article describes an alternative approach to theory building, agent-based modeling (ABM), which involves simulation of large numbers of autonomous agents that interact with each other and with a simulated environment and the observation of emergent patterns from their interactions. The authors believe that the ABM approach is better able than prevailing approaches in the field, variable-based modeling (VBM) techniques such as causal modeling, to capture types of complex, dynamic, interactive processes so important in the social world. The article elaborates several important contrasts between ABM and VBM and offers specific recommendations for learning more and applying the ABM approach.
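A classic toy ABM (a Schelling-style relocation model, chosen here for brevity and not drawn from the article) shows the kind of emergence the authors describe: agents on a grid relocate when too few neighbours share their type, and global clustering arises from purely local decisions.

```python
import random

random.seed(0)
N, empty_frac, threshold = 20, 0.2, 0.5
cells = [0] * int(N * N * empty_frac) + [1, 2] * int(N * N * (1 - empty_frac) / 2)
cells += [0] * (N * N - len(cells))
random.shuffle(cells)
grid = [cells[i * N:(i + 1) * N] for i in range(N)]   # 0 = empty, 1/2 = agent types

def neighbours(r, c):
    """Eight surrounding cells on a toroidal grid."""
    return [grid[(r + dr) % N][(c + dc) % N]
            for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]

def similarity():
    """Average fraction of like-typed neighbours over all occupied cells."""
    fracs = []
    for r in range(N):
        for c in range(N):
            if grid[r][c]:
                occ = [v for v in neighbours(r, c) if v]
                if occ:
                    fracs.append(sum(v == grid[r][c] for v in occ) / len(occ))
    return sum(fracs) / len(fracs)

before = similarity()
for _ in range(20000):                 # unhappy agents move to random empty cells
    r, c = random.randrange(N), random.randrange(N)
    if grid[r][c]:
        occ = [v for v in neighbours(r, c) if v]
        if occ and sum(v == grid[r][c] for v in occ) / len(occ) < threshold:
            er, ec = random.randrange(N), random.randrange(N)
            if grid[er][ec] == 0:
                grid[er][ec], grid[r][c] = grid[r][c], 0

after = similarity()
print(round(before, 2), round(after, 2))
```

No agent seeks segregation, yet neighbourhood similarity rises sharply: the emergent macro-pattern is not readable off any individual rule, which is the ABM-versus-VBM contrast the article elaborates.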
Web-Based Computational Chemistry Education with CHARMMing I: Lessons and Tutorial
Miller, Benjamin T.; Singh, Rishi P.; Schalk, Vinushka; Pevzner, Yuri; Sun, Jingjun; Miller, Carrie S.; Boresch, Stefan; Ichiye, Toshiko; Brooks, Bernard R.; Woodcock, H. Lee
2014-01-01
This article describes the development, implementation, and use of web-based “lessons” to introduce students and other newcomers to computer simulations of biological macromolecules. These lessons, i.e., interactive step-by-step instructions for performing common molecular simulation tasks, are integrated into the collaboratively developed CHARMM INterface and Graphics (CHARMMing) web user interface (http://www.charmming.org). Several lessons have already been developed with new ones easily added via a provided Python script. In addition to CHARMMing's new lessons functionality, web-based graphical capabilities have been overhauled and are fully compatible with modern mobile web browsers (e.g., phones and tablets), allowing easy integration of these advanced simulation techniques into coursework. Finally, one of the primary objections to web-based systems like CHARMMing has been that “point and click” simulation set-up does little to teach the user about the underlying physics, biology, and computational methods being applied. In response to this criticism, we have developed a freely available tutorial to bridge the gap between graphical simulation setup and the technical knowledge necessary to perform simulations without user interface assistance. PMID:25057988
Modeling and simulation of dust behaviors behind a moving vehicle
NASA Astrophysics Data System (ADS)
Wang, Jingfang
Simulation of physically realistic complex dust behaviors is a difficult and attractive problem in computer graphics. A fast, interactive and visually convincing model of dust behaviors behind moving vehicles is very useful in computer simulation, training, education, art, advertising, and entertainment. In my dissertation, an experimental interactive system has been implemented for the simulation of dust behaviors behind moving vehicles. The system includes physically-based models, particle systems, rendering engines and a graphical user interface (GUI). I have employed several vehicle models including tanks, cars, and jeeps to test and simulate in different scenarios and conditions. Calm weather, wind conditions, vehicle turning left or right, and vehicle simulation controlled by users from the GUI are all included. I have also tested the factors which play against the physical behaviors and graphics appearances of the dust particles through the GUI or off-line scripts. The simulations are done on a Silicon Graphics Octane station. The animation of dust behaviors is achieved by physically-based modeling and simulation. The flow around a moving vehicle is modeled using computational fluid dynamics (CFD) techniques. I implement a primitive variable and pressure-correction approach to solve the three dimensional incompressible Navier Stokes equations in a volume covering the moving vehicle. An alternating-direction implicit (ADI) method is used for the solution of the momentum equations, with a successive-over-relaxation (SOR) method for the solution of the Poisson pressure equation. Boundary conditions are defined and simplified according to their dynamic properties. The dust particle dynamics is modeled using particle systems, statistics, and procedural modeling techniques.
Graphics and real-time simulation techniques, such as dynamics synchronization, motion blur, blending, and clipping have been employed in the rendering to achieve realistic appearing dust behaviors. In addition, I introduce a temporal smoothing technique to eliminate the jagged effect caused by large simulation time steps. Several algorithms are used to speed up the simulation. For example, pre-calculated tables and display lists are created to replace some of the most commonly used functions, scripts and processes. The performance study shows that both time and space costs of the algorithms are linear in the number of particles in the system. On a Silicon Graphics Octane, three vehicles with 20,000 particles run at 6-8 frames per second on average. This speed does not include the extra calculations of convergence of the numerical integration for fluid dynamics, which usually takes about 4-5 minutes to achieve steady state.
Transcranial phase aberration correction using beam simulations and MR-ARFI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vyas, Urvi, E-mail: urvi.vyas@gmail.com; Kaye, Elena; Pauly, Kim Butts
2014-03-15
Purpose: Transcranial magnetic resonance-guided focused ultrasound surgery is a noninvasive technique for causing selective tissue necrosis. Variations in density, thickness, and shape of the skull cause aberrations in the location and shape of the focal zone. In this paper, the authors propose a hybrid simulation-MR-ARFI technique to achieve aberration correction for transcranial MR-guided focused ultrasound surgery. The technique uses ultrasound beam propagation simulations with MR Acoustic Radiation Force Imaging (MR-ARFI) to correct skull-caused phase aberrations. Methods: Skull-based numerical aberrations were obtained from a MR-guided focused ultrasound patient treatment and were added to all elements of the InSightec conformal bone focused ultrasound surgery transducer during transmission. In the first experiment, the 1024 aberrations derived from a human skull were condensed into 16 aberrations by averaging over the transducer area of 64 elements. In the second experiment, all 1024 aberrations were applied to the transducer. The aberrated MR-ARFI images were used in the hybrid simulation-MR-ARFI technique to find 16 estimated aberrations. These estimated aberrations were subtracted from the original aberrations to result in the corrected images. Each aberration experiment (16-aberration and 1024-aberration) was repeated three times. Results: The corrected MR-ARFI image was compared to the aberrated image and the ideal image (image with zero aberrations) for each experiment. The hybrid simulation-MR-ARFI technique resulted in an average increase in focal MR-ARFI phase of 44% for the 16-aberration case and 52% for the 1024-aberration case, and recovered 83% and 39% of the ideal MR-ARFI phase for the 16-aberration and 1024-aberration case, respectively. Conclusions: Using one MR-ARFI image and no a priori information about the applied phase aberrations, the hybrid simulation-MR-ARFI technique improved the maximum MR-ARFI phase of the beam's focus.
Investigation of advanced phase-shifting projected fringe profilometry techniques
NASA Astrophysics Data System (ADS)
Liu, Hongyu
1999-11-01
The phase-shifting projected fringe profilometry (PSPFP) technique is a powerful tool in the profile measurement of rough engineering surfaces. Compared with other competing techniques, this technique is notable for its full-field measurement capacity, system simplicity, high measurement speed, and low environmental vulnerability. The main purpose of this dissertation is to tackle three important problems, which severely limit the capability and the accuracy of the PSPFP technique, with some new approaches. Chapter 1 briefly introduces background information on the PSPFP technique, including the measurement principles, basic features, and related techniques; the objectives and organization of the thesis are also outlined. Chapter 2 gives a theoretical treatment of absolute PSPFP measurement. The mathematical formulations and basic requirements of absolute PSPFP measurement and its supporting techniques are discussed in detail. Chapter 3 introduces the experimental verification of the proposed absolute PSPFP technique. Some design details of a prototype system are discussed as supplements to the previous theoretical analysis. Various fundamental experiments performed for concept verification and accuracy evaluation are introduced together with some brief comments. Chapter 4 presents the theoretical study of speckle-induced phase measurement errors. In this analysis, the expression for speckle-induced phase errors is first derived based on the multiplicative noise model of image-plane speckles. The statistics and the system dependence of speckle-induced phase errors are then thoroughly studied through numerical simulations and analytical derivations. Based on the analysis, some suggestions on the system design are given to improve measurement accuracy. Chapter 5 discusses a new technique combating surface reflectivity variations. The formula used for error compensation is first derived based on a simplified model of the detection process.
The techniques for coping with two major effects of surface reflectivity variations are then introduced. Some fundamental problems in the proposed technique are studied through simulations. Chapter 6 briefly summarizes the major contributions of the current work and provides some suggestions for future research.
Mapping and DOWNFLOW simulation of recent lava flow fields at Mount Etna
NASA Astrophysics Data System (ADS)
Tarquini, Simone; Favalli, Massimiliano
2011-07-01
In recent years, progress in geographic information systems (GIS) and remote sensing techniques has allowed the mapping and studying of lava flows in unprecedented detail. A composite GIS technique is introduced to obtain high resolution boundaries of lava flow fields. This technique is mainly based on the processing of LIDAR-derived maps and digital elevation models (DEMs). The probabilistic code DOWNFLOW is then used to simulate eight large flow fields formed at Mount Etna in the last 25 years. Thanks to the collection of 6 DEMs representing Mount Etna at different times from 1986 to 2007, simulated outputs are obtained by running the DOWNFLOW code over pre-emplacement topographies. Simulation outputs are compared with the boundaries of the actual flow fields obtained here or derived from the existing literature. Although the selected fields formed in accordance with different emplacement mechanisms, flowed on different zones of the volcano over different topographies and were fed by different lava supplies of different durations, DOWNFLOW yields results close to the actual flow fields in all the cases considered. This outcome is noteworthy because DOWNFLOW has been applied by adopting a default calibration, without any specific tuning for the new cases considered here. This extensive testing proves that, if the pre-emplacement topography is available, DOWNFLOW yields a realistic simulation of a future lava flow based solely on a knowledge of the vent position. In comparison with deterministic codes, which require accurate knowledge of a large number of input parameters, DOWNFLOW turns out to be simple, fast and undemanding, proving to be ideal for systematic hazard and risk analyses.
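The probabilistic core of DOWNFLOW-like codes can be sketched on a toy terrain (grid size, noise amplitude, and run count here are illustrative, not the code's default calibration): the DEM is randomly perturbed many times, a steepest-descent path is traced from the vent on each perturbed surface, and the union of paths approximates the probable inundation area.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 60
y, x = np.mgrid[0:n, 0:n]
dem = 0.5 * y + np.sin(x / 4.0)        # toy sloping terrain with ridges and valleys

def steepest_path(z, start, max_steps=200):
    """Follow the locally steepest descent on surface z until a local minimum."""
    r, c = start
    path = [(r, c)]
    for _ in range(max_steps):
        window = [(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                  if 0 <= r + dr < n and 0 <= c + dc < n]
        nr, nc = min(window, key=lambda p: z[p])
        if z[nr, nc] >= z[r, c]:       # local minimum reached: flow stops
            break
        r, c = nr, nc
        path.append((r, c))
    return path

hit = np.zeros((n, n))
vent = (n - 1, n // 2)                 # "vent" at the top of the slope
for _ in range(500):                   # Monte Carlo DEM perturbations
    noise = rng.uniform(-1.0, 1.0, dem.shape)
    for cell in steepest_path(dem + noise, vent):
        hit[cell] += 1

inundated = hit > 0
print(int(inundated.sum()))
```

The hit-count map doubles as a relative probability of invasion per cell, which is why a single calibrated noise amplitude plus a vent position suffices for hazard mapping.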
Validation techniques of agent based modelling for geospatial simulations
NASA Astrophysics Data System (ADS)
Darvishi, M.; Ahmadi, G.
2014-10-01
One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and exhibit dynamic, complex behaviour. Studying such phenomena in the laboratory is costly and in most cases impossible; miniaturizing world phenomena within the framework of a model in order to simulate them is therefore a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in many areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools) for geospatial modelling indicates growing user interest in the special capabilities of ABMS. Because ABMS resembles human cognition, models can be built easily and applied to a wider range of applications than traditional simulation. A key challenge in ABMS, however, is the difficulty of validation and verification. Because of frequently emergent patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify such models with conventional methods, so finding appropriate validation techniques for ABM is necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.
Scott, J; Botsis, T; Ball, R
2014-01-01
Spontaneous Reporting Systems [SRS] are critical tools in the post-licensure evaluation of medical product safety. Regulatory authorities use a variety of data mining techniques to detect potential safety signals in SRS databases. Assessing the performance of such signal detection procedures requires simulated SRS databases, but simulation strategies proposed to date each have limitations. We sought to develop a novel SRS simulation strategy based on plausible mechanisms for the growth of databases over time. We developed a simulation strategy based on the network principle of preferential attachment. We demonstrated how this strategy can be used to create simulations based on specific databases of interest, and provided an example of using such simulations to compare signal detection thresholds for a popular data mining algorithm. The preferential attachment simulations were generally structurally similar to our targeted SRS database, although they had fewer nodes of very high degree. The approach was able to generate signal-free SRS simulations, as well as mimicking specific known true signals. Explorations of different reporting thresholds for the FDA Vaccine Adverse Event Reporting System suggested that using proportional reporting ratio [PRR] > 3.0 may yield better signal detection operating characteristics than the more commonly used PRR > 2.0 threshold. The network analytic approach to SRS simulation based on the principle of preferential attachment provides an attractive framework for exploring the performance of safety signal detection algorithms. This approach is potentially more principled and versatile than existing simulation approaches. The utility of network-based SRS simulations needs to be further explored by evaluating other types of simulated signals with a broader range of data mining approaches, and comparing network-based simulations with other simulation strategies where applicable.
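The signal-detection threshold comparison above rests on the proportional reporting ratio, whose standard definition over a 2x2 table of report counts can be written down directly (the counts below are illustrative, not from the study):

```python
def proportional_reporting_ratio(a, b, c, d):
    """PRR from a 2x2 report table:
    a: reports with the product and the event,  b: product, other events,
    c: other products with the event,           d: other products, other events."""
    if (a + b) == 0 or (c + d) == 0 or c == 0:
        raise ValueError("PRR undefined for empty strata")
    return (a / (a + b)) / (c / (c + d))

def is_signal(a, b, c, d, threshold=3.0):
    """Flag a product-event pair when its PRR exceeds the chosen threshold."""
    return proportional_reporting_ratio(a, b, c, d) > threshold
```

With `threshold=3.0` as the abstract suggests, a pair reported at 20% of a product's reports versus about 1% elsewhere is flagged, while a pair at background rates is not.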
Hu, Fei; Cheng, Yayun; Gui, Liangqi; Wu, Liang; Zhang, Xinyi; Peng, Xiaohui; Su, Jinlong
2016-11-01
The polarization properties of thermal millimeter-wave emission capture inherent information about objects, e.g., material composition, shape, and surface features. In this paper, a polarization-based material-classification technique using passive millimeter-wave polarimetric imagery is presented. The linear polarization ratio (LPR) is introduced as a new feature discriminator that is sensitive to material type and removes the effect of reflected ambient radiation. The LPR characteristics of several common natural and artificial materials are investigated through theoretical and experimental analysis. Based on a priori information about LPR characteristics, the optimal range of incident angles and the classification criterion are discussed. Simulation and measurement results indicate that the presented classification technique is effective for distinguishing between metals and dielectrics, suggesting possible applications to outdoor metal-target detection in open scenes.
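The metal/dielectric contrast can be illustrated with smooth-surface Fresnel emissivities. Here LPR is taken simply as the ratio of vertically to horizontally polarised emissivity, which is a simplification of the paper's definition (the paper's LPR also accounts for reflected ambient radiation); the refractive indices are illustrative values only.

```python
import numpy as np

def emissivities(n, theta_deg):
    """Fresnel emissivities (1 - |r|^2) for H (s) and V (p) polarisation
    of a smooth surface with (possibly complex) refractive index n."""
    th = np.deg2rad(theta_deg)
    cos_t, sin2 = np.cos(th), np.sin(th) ** 2
    root = np.sqrt(n ** 2 - sin2 + 0j)
    r_s = (cos_t - root) / (cos_t + root)
    r_p = (n ** 2 * cos_t - root) / (n ** 2 * cos_t + root)
    return 1 - abs(r_s) ** 2, 1 - abs(r_p) ** 2

def lpr(n, theta_deg):
    """V/H emissivity ratio -- a simplified stand-in for the paper's LPR."""
    e_h, e_v = emissivities(n, theta_deg)
    return e_v / e_h
```

At oblique incidence a highly conductive surface (large complex index) yields a markedly larger ratio than a typical dielectric, which is the separation the classifier exploits.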
NanoDesign: Concepts and Software for a Nanotechnology Based on Functionalized Fullerenes
NASA Technical Reports Server (NTRS)
Globus, Al; Jaffe, Richard; Chancellor, Marisa K. (Technical Monitor)
1996-01-01
Eric Drexler has proposed a hypothetical nanotechnology based on diamond and investigated the properties of such molecular systems. While attractive, diamondoid nanotechnology is not physically accessible with straightforward extensions of current laboratory techniques. We propose a nanotechnology based on functionalized fullerenes and investigate carbon nanotube based gears with teeth added via a benzyne reaction known to occur with C60. The gears are single-walled carbon nanotubes with appended coenzyme groups for teeth. Fullerenes are in widespread laboratory use and can be functionalized in many ways. Companion papers computationally demonstrate the properties of these gears (they appear to work) and the accessibility of the benzyne/nanotube reaction. This paper describes the molecular design techniques and rationale as well as the software that implements these design techniques. The software is a set of persistent C++ objects controlled by TCL command scripts. The C++/TCL interface is automatically generated by a software system called tcl_c++ developed by the author and described here. The objects keep track of different portions of the molecular machinery to allow different simulation techniques and boundary conditions to be applied as appropriate. This capability has been required to demonstrate (computationally) our gear's feasibility. A new distributed software architecture featuring a WWW universal client, CORBA distributed objects, and agent software is under consideration. The software architecture is intended to eventually enable a widely dispersed group to develop complex simulated molecular machines.
NASA Technical Reports Server (NTRS)
Kavi, K. M.
1984-01-01
There have been a number of simulation packages developed for designing, testing and validating computer systems, digital systems and software systems. Complex analytical tools based on Markov and semi-Markov processes have been designed to estimate the reliability and performance of simulated systems. Petri nets have received wide acceptance for modeling complex and highly parallel computers. In this research, data flow models for computer systems are investigated. Data flow models can be used to simulate both software and hardware in a uniform manner. Data flow simulation techniques provide the computer systems designer with a CAD environment in which highly parallel complex systems can be defined, evaluated at all levels and finally implemented in either hardware or software. Inherent in the data flow concept is the hierarchical handling of complex systems. In this paper we describe how data flow can be used to model computer systems.
A data-driven dynamics simulation framework for railway vehicles
NASA Astrophysics Data System (ADS)
Nie, Yinyu; Tang, Zhao; Liu, Fengjia; Chang, Jian; Zhang, Jianjun
2018-03-01
The finite element (FE) method is essential for simulating vehicle dynamics with fine details, especially for train crash simulations. However, factors such as the complexity of meshes and the distortion involved in large deformations undermine its calculation efficiency. An alternative method, multi-body (MB) dynamics simulation, provides satisfying time efficiency but limited accuracy when highly nonlinear dynamic processes are involved. To retain the advantages of both methods, this paper proposes a data-driven framework for dynamics simulation of railway vehicles. The framework uses machine learning techniques to extract nonlinear features from training data generated by FE simulations, so that specific mesh structures can be represented by one or more surrogate elements that replace the original mechanical elements, and the dynamics simulation can be implemented by co-simulation with the surrogate element(s) embedded in an MB model. The framework comprises a series of techniques including data collection, feature extraction, training data sampling, surrogate element building, and model evaluation and selection. To verify its feasibility, we present two case studies, a vertical dynamics simulation and a longitudinal dynamics simulation, based on co-simulation with MATLAB/Simulink and Simpack, and provide a comparison with a popular data-driven model (the Kriging model). The simulation results show that using the Legendre polynomial regression model to build surrogate elements can largely cut down simulation time without sacrificing accuracy.
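The surrogate-element idea can be sketched in a few lines: fit a Legendre polynomial expansion to response samples generated offline, then evaluate the cheap fitted model inside the co-simulation loop. The "FE training data" below is a synthetic nonlinear response invented for illustration; it stands in for the paper's actual FE outputs.

```python
import numpy as np
from numpy.polynomial import legendre

# Hypothetical training data: a nonlinear force-deflection response sampled
# on a normalised input range [-1, 1] (a stand-in for FE simulation outputs).
x_train = np.linspace(-1.0, 1.0, 60)
y_train = np.sinh(2.0 * x_train)

# Fit a degree-7 Legendre expansion to the sampled response.
coeffs = legendre.legfit(x_train, y_train, deg=7)

def surrogate(q):
    """Surrogate element: evaluate the fitted Legendre expansion at q."""
    return legendre.legval(q, coeffs)
```

The fitted expansion reproduces the sampled response to well within a percent here, which is the property that lets it replace the expensive element during co-simulation.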
Inductive System Health Monitoring
NASA Technical Reports Server (NTRS)
Iverson, David L.
2004-01-01
The Inductive Monitoring System (IMS) software was developed to provide a technique to automatically produce health monitoring knowledge bases for systems that are either difficult to model (simulate) with a computer or which require computer models that are too complex to use for real time monitoring. IMS uses nominal data sets collected either directly from the system or from simulations to build a knowledge base that can be used to detect anomalous behavior in the system. Machine learning and data mining techniques are used to characterize typical system behavior by extracting general classes of nominal data from archived data sets. IMS monitors the system by comparing real time operational data with these classes. We describe the learning and monitoring methods used by IMS and summarize some recent IMS results.
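The class-extraction-and-comparison idea can be sketched as follows: nominal vectors are greedily grouped into per-dimension range boxes ("classes"), and a live vector's deviation is its distance to the nearest box (zero when it falls inside one). This is a minimal sketch of the approach, not the actual IMS implementation; the tolerance and data are arbitrary.

```python
import numpy as np

def train_classes(nominal, tol=0.5):
    """Greedy one-pass clustering: each class is a per-dimension [lo, hi] box.
    A point within `tol` of an existing box extends it; otherwise a new box."""
    boxes = []
    for p in nominal:
        for box in boxes:
            lo, hi = box
            if np.all(p >= lo - tol) and np.all(p <= hi + tol):
                box[0] = np.minimum(lo, p)
                box[1] = np.maximum(hi, p)
                break
        else:
            boxes.append([p.copy(), p.copy()])
    return boxes

def deviation(boxes, p):
    """0 inside some nominal class; otherwise distance to the nearest box."""
    dists = []
    for lo, hi in boxes:
        gap = np.maximum(np.maximum(lo - p, p - hi), 0.0)
        dists.append(np.linalg.norm(gap))
    return min(dists)
```

A monitored vector inside a learned class scores zero; a vector far from all classes scores high and would raise an anomaly flag.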
Advanced sensor-simulation capability
NASA Astrophysics Data System (ADS)
Cota, Stephen A.; Kalman, Linda S.; Keller, Robert A.
1990-09-01
This paper provides an overview of an advanced simulation capability currently in use for analyzing visible and infrared sensor systems. The software system, called VISTAS (VISIBLE/INFRARED SENSOR TRADES, ANALYSES, AND SIMULATIONS) combines classical image processing techniques with detailed sensor models to produce static and time dependent simulations of a variety of sensor systems including imaging, tracking, and point target detection systems. Systems modelled to date include space-based scanning line-array sensors as well as staring 2-dimensional array sensors which can be used for either imaging or point source detection.
Verification of Eulerian-Eulerian and Eulerian-Lagrangian simulations for fluid-particle flows
NASA Astrophysics Data System (ADS)
Kong, Bo; Patel, Ravi G.; Capecelatro, Jesse; Desjardins, Olivier; Fox, Rodney O.
2017-11-01
In this work, we study the performance of three simulation techniques for fluid-particle flows: (1) a volume-filtered Euler-Lagrange approach (EL), (2) a quadrature-based moment method using the anisotropic Gaussian closure (AG), and (3) a traditional two-fluid model (TFM). By simulating two problems, particles in frozen homogeneous isotropic turbulence (HIT) and cluster-induced turbulence (CIT), we find that convergence under grid refinement depends on both the simulation method and the specific problem, with CIT simulations facing fewer difficulties than HIT. Although EL converges under refinement for both HIT and CIT, its statistical results depend on the techniques used to extract statistics for the particle phase. For HIT, converging both EE methods (TFM and AG) poses challenges, while for CIT, AG and EL produce similar results. Overall, all three methods face challenges in extracting converged, parameter-independent statistics due to the presence of shocks in the particle phase. This work was supported by the National Science Foundation and the National Energy Technology Laboratory.
Shao, Lin; Gigax, Jonathan; Chen, Di; ...
2017-06-12
Self-ion irradiation is widely used as a method to simulate neutron damage in reactor structural materials. Accelerator-based simulation of void swelling, however, introduces a number of neutron-atypical features which require careful data extraction and, in some cases, introduction of innovative irradiation techniques to alleviate these issues. In this paper, we briefly summarize three such atypical features: defect imbalance effects, pulsed beam effects, and carbon contamination. The latter issue has just been recently recognized as being relevant to simulation of void swelling and is discussed here in greater detail. It is shown that carbon ions are entrained in the ion beam by Coulomb force drag and accelerated toward the target surface. Beam-contaminant interactions are modeled using molecular dynamics simulation. Finally, by applying a multiple beam deflection technique, carbon and other contaminants can be effectively filtered out, as demonstrated in an irradiation of HT-9 alloy by 3.5 MeV Fe ions.
Knapp, B; Frantal, S; Cibena, M; Schreiner, W; Bauer, P
2011-08-01
Molecular dynamics is a commonly used technique in computational biology. One key question for every molecular dynamics simulation is: when does the simulation reach equilibrium? A widely used way to determine this is visual, intuitive inspection of root mean square deviation (RMSD) plots of the simulation. Although this technique has been criticized several times, it is still often used. We therefore present a study showing that this method is not reliable. We conducted a survey in which we showed different RMSD plots to scientists in the field of molecular dynamics. The plots were randomized and repeated, using a statistical model and different variants of the plots. We show that there is no mutual consent about the point of equilibrium and that the decisions are severely biased by different parameters. We conclude that scientists should not judge the equilibration of a molecular dynamics simulation on the basis of an RMSD plot alone.
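The RMSD series the survey participants were shown is the standard per-frame quantity below. This sketch omits the rigid-body superposition (e.g. Kabsch alignment) that a production analysis would apply before computing distances.

```python
import numpy as np

def rmsd_series(traj, ref=None):
    """Per-frame RMSD to a reference frame.
    traj: array of shape (n_frames, n_atoms, 3); ref defaults to frame 0.
    Note: no superposition step -- a real analysis would align frames first."""
    ref = traj[0] if ref is None else ref
    diff = traj - ref                                   # per-atom displacement
    return np.sqrt((diff ** 2).sum(axis=2).mean(axis=1))
```

Plotting this series against time is exactly the visual-inspection procedure the study argues is an unreliable equilibration criterion.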
NASA Astrophysics Data System (ADS)
Kaloop, Mosbeh R.; Yigit, Cemal O.; Hu, Jong W.
2018-03-01
Recently, the high-rate global navigation satellite system precise point positioning (GNSS-PPP) technique has been used to detect the dynamic behavior of structures. This study aimed to increase the accuracy of extracting the oscillation properties of structural movements based on the high-rate (10 Hz) GNSS-PPP monitoring technique. A model combining wavelet packet transformation (WPT) de-noising and neural network (NN) prediction was proposed to improve estimates of the dynamic behavior of structures for the GNSS-PPP method. A complicated numerical simulation involving highly noisy data and 13 experimental cases with different loads were used to confirm the efficiency of the proposed model design and of the monitoring technique in detecting the dynamic behavior of structures. The results revealed that, when combined with the proposed model, the GNSS-PPP method can accurately detect the dynamic behavior of engineering structures as an alternative to the relative GNSS method.
Continuous welding of unidirectional fiber reinforced thermoplastic tape material
NASA Astrophysics Data System (ADS)
Schledjewski, Ralf
2017-10-01
Continuous welding techniques such as thermoplastic tape placement with in situ consolidation offer several advantages over traditional manufacturing processes such as autoclave consolidation and thermoforming. However, several important processing issues still need to be solved before it becomes an economically viable process. Intensive process analysis and optimization have been carried out in the past through experimental investigation, model definition and simulation development. Today, process simulation is capable of predicting the resulting consolidation quality, and the effects of material imperfections and process-parameter variations are well known. But using this knowledge to control the process, based on online process monitoring and corresponding adaptation of the process parameters, is still challenging: it requires solving inverse problems and using methods for automated code generation that allow fast implementation of algorithms on target hardware. The paper explains the placement technique in general. Process-material-property relationships and typical material imperfections are described. Furthermore, online monitoring techniques, and how to use them in a model-based process control system, are presented.
Delchini, Marc O.; Ragusa, Jean C.; Ferguson, Jim
2017-02-17
A viscous regularization technique based on the local entropy residual was proposed by Delchini et al. (2015) to stabilize the nonequilibrium-diffusion Grey Radiation-Hydrodynamic equations using an artificial viscosity technique. This viscous regularization is modulated by the local entropy production and is consistent with the entropy minimum principle. However, Delchini et al. (2015) based their work only on the hyperbolic parts of the Grey Radiation-Hydrodynamic equations and thus omitted the relaxation and diffusion terms present in the material energy and radiation energy equations. In this paper, we extend the theoretical grounds for the method and derive an entropy minimum principle for the full set of nonequilibrium-diffusion Grey Radiation-Hydrodynamic equations. This further strengthens the applicability of the entropy viscosity method as a stabilization technique for radiation-hydrodynamic shock simulations. Radiative shock calculations using constant and temperature-dependent opacities are compared against semi-analytical reference solutions, and we present a procedure for performing spatial convergence studies of such simulations.
Failure Diagnosis for the Holdup Tank System via ISFA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Huijuan; Bragg-Sitton, Shannon; Smidts, Carol
This paper discusses the use of the integrated system failure analysis (ISFA) technique for fault diagnosis for the holdup tank system. ISFA is a simulation-based, qualitative and integrated approach used to study fault propagation in systems containing both hardware and software subsystems. The holdup tank system consists of a tank containing a fluid whose level is controlled by an inlet valve and an outlet valve. We introduce the component and functional models of the system, quantify the main parameters and simulate possible failure-propagation paths based on the fault propagation approach, ISFA. The results show that most component failures in the holdup tank system can be identified clearly and that ISFA is viable as a technique for fault diagnosis. Since ISFA is a qualitative technique that can be used in the very early stages of system design, this case study provides indications that it can be used early to study design aspects that relate to robustness and fault tolerance.
A novel coupling of noise reduction algorithms for particle flow simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zimoń, M.J., E-mail: malgorzata.zimon@stfc.ac.uk; James Weir Fluids Lab, Mechanical and Aerospace Engineering Department, The University of Strathclyde, Glasgow G1 1XJ; Reese, J.M.
2016-09-15
Proper orthogonal decomposition (POD) and its extension based on time-windows have been shown to greatly improve the effectiveness of recovering smooth ensemble solutions from noisy particle data. However, to successfully de-noise any molecular system, a large number of measurements still need to be provided. In order to achieve better efficiency in processing time-dependent fields, we have combined POD with a well-established signal processing technique, wavelet-based thresholding. In this novel hybrid procedure, the wavelet filtering is applied within the POD domain and referred to as WAVinPOD. The algorithm exhibits promising results when applied to both synthetically generated signals and particle data. In this work, the simulations compare the performance of our new approach with standard POD or wavelet analysis in extracting smooth profiles from noisy velocity and density fields. Numerical examples include molecular dynamics and dissipative particle dynamics simulations of unsteady force- and shear-driven liquid flows, as well as a phase separation phenomenon. Simulation results confirm that WAVinPOD preserves the dimensionality reduction obtained using POD, while improving its filtering properties through the sparse representation of data in a wavelet basis. This paper shows that WAVinPOD outperforms the other estimators for both synthetically generated signals and particle-based measurements, achieving a higher signal-to-noise ratio from a smaller number of samples. The new filtering methodology offers significant computational savings, particularly for multi-scale applications seeking to couple continuum information with atomistic models. It is the first time that a rigorous analysis has compared de-noising techniques for particle-based fluid simulations.
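The filter-within-POD idea can be sketched without the wavelet machinery: decompose the snapshot matrix with an SVD, truncate to a few modes, and shrink the temporal coefficients before reconstructing. The hard threshold here is a crude surrogate for the wavelet shrinkage WAVinPOD applies inside the POD domain; it is an illustration of the structure, not the published algorithm.

```python
import numpy as np

def pod_threshold_denoise(snapshots, rank, thresh=0.0):
    """POD via SVD, keep `rank` modes, hard-threshold temporal coefficients.
    snapshots: (n_points, n_times) matrix of noisy field samples."""
    U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
    U, s, Vt = U[:, :rank], s[:rank], Vt[:rank, :]
    coeffs = s[:, None] * Vt                       # temporal coefficients
    coeffs = np.where(np.abs(coeffs) >= thresh, coeffs, 0.0)
    return U @ coeffs                              # reconstructed field
```

For a rank-one smooth field buried in noise, a rank-one reconstruction recovers the field far better than the raw snapshots.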
Digital video timing analyzer for the evaluation of PC-based real-time simulation systems
NASA Astrophysics Data System (ADS)
Jones, Shawn R.; Crosby, Jay L.; Terry, John E., Jr.
2009-05-01
Due to the rapid acceleration in technology and the drop in costs, the use of commercial off-the-shelf (COTS) PC-based hardware and software components for digital and hardware-in-the-loop (HWIL) simulations has increased. However, the increase in PC-based components creates new challenges for HWIL test facilities such as cost-effective hardware and software selection, system configuration and integration, performance testing, and simulation verification/validation. This paper will discuss how the Digital Video Timing Analyzer (DiViTA) installed in the Aviation and Missile Research, Development and Engineering Center (AMRDEC) provides quantitative characterization data for PC-based real-time scene generation systems. An overview of the DiViTA is provided followed by details on measurement techniques, applications, and real-world examples of system benefits.
NASA Astrophysics Data System (ADS)
Wilson, Robert H.; Vishwanath, Karthik; Mycek, Mary-Ann
2009-02-01
Monte Carlo (MC) simulations are considered the "gold standard" for mathematical description of photon transport in tissue, but they can require large computation times. Therefore, it is important to develop simple and efficient methods for accelerating MC simulations, especially when a large "library" of related simulations is needed. A semi-analytical method involving MC simulations and a path-integral (PI) based scaling technique generated time-resolved reflectance curves from layered tissue models. First, a zero-absorption MC simulation was run for a tissue model with fixed scattering properties in each layer. Then, a closed-form expression for the average classical path of a photon in tissue was used to determine the percentage of time that the photon spent in each layer, to create a weighted Beer-Lambert factor to scale the time-resolved reflectance of the simulated zero-absorption tissue model. This method is a unique alternative to other scaling techniques in that it does not require the path length or number of collisions of each photon to be stored during the initial simulation. Effects of various layer thicknesses and absorption and scattering coefficients on the accuracy of the method will be discussed.
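The scaling step described above amounts to multiplying the zero-absorption reflectance by a weighted Beer-Lambert factor, with the weights given by the fraction of the mean classical path spent in each layer. The sketch below assumes those fractions are already known from the closed-form expression; the numerical values in the usage are illustrative only.

```python
import numpy as np

def scale_reflectance(t, r0, mua, frac, n_tissue=1.4, c_vacuum=3e10):
    """Scale zero-absorption time-resolved reflectance r0(t) by a weighted
    Beer-Lambert factor.
    mua[i]:  absorption coefficient of layer i (1/cm)
    frac[i]: fraction of the mean photon path spent in layer i
    t: time in seconds; c_vacuum in cm/s."""
    v = c_vacuum / n_tissue                    # photon speed in tissue (cm/s)
    mua_weighted = float(np.dot(mua, frac))    # path-weighted absorption
    return np.asarray(r0) * np.exp(-mua_weighted * v * np.asarray(t))
```

Because the factor depends only on elapsed time and the weighted absorption, one stored zero-absorption simulation can generate a whole library of absorbing-tissue curves, which is the speed-up the method targets.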
Employment of adaptive learning techniques for the discrimination of acoustic emissions
NASA Astrophysics Data System (ADS)
Erkes, J. W.; McDonald, J. F.; Scarton, H. A.; Tam, K. C.; Kraft, R. P.
1983-11-01
The following aspects of this study on the discrimination of acoustic emissions (AE) were examined: (1) The analytical development and assessment of digital signal processing techniques for AE signal dereverberation, noise reduction, and source characterization; (2) The modeling and verification of some aspects of key selected techniques through a computer-based simulation; and (3) The study of signal propagation physics and their effect on received signal characteristics for relevant physical situations.
Fostering Learning Through Interprofessional Virtual Reality Simulation Development.
Nicely, Stephanie; Farra, Sharon
2015-01-01
This article presents a unique strategy for improving didactic learning and clinical skill while simultaneously fostering interprofessional collaboration and communication. Senior-level nursing students collaborated with students enrolled in the Department of Interactive Media Studies to design a virtual reality simulation based upon disaster management and triage techniques. Collaborative creation of the simulation proved to be a strategy for enhancing students' knowledge of and skill in disaster management and triage while impacting attitudes about interprofessional communication and teamwork.
Particle identification using the time-over-threshold measurements in straw tube detectors
NASA Astrophysics Data System (ADS)
Jowzaee, S.; Fioravanti, E.; Gianotti, P.; Idzik, M.; Korcyl, G.; Palka, M.; Przyborowski, D.; Pysz, K.; Ritman, J.; Salabura, P.; Savrie, M.; Smyrski, J.; Strzempek, P.; Wintz, P.
2013-08-01
The identification of charged particles based on energy losses in straw tube detectors has been simulated. The response of a new front-end chip developed for the PANDA straw tube tracker was implemented in the simulations and corrections for track distance to sense wire were included. Separation power for p - K, p - π and K - π pairs obtained using the time-over-threshold technique was compared with the one based on the measurement of collected charge.
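A common way to quantify the particle-pair separation quoted in such studies is the mean difference of the two measured distributions (here, e.g., time-over-threshold or collected charge) in units of their average spread. This is one standard definition, offered as an illustration; the paper may use a different convention.

```python
import numpy as np

def separation_power(sample_a, sample_b):
    """|mean difference| divided by the average standard deviation of the
    two distributions -- a common separation-power definition."""
    sample_a, sample_b = np.asarray(sample_a), np.asarray(sample_b)
    return abs(sample_a.mean() - sample_b.mean()) / (
        0.5 * (sample_a.std() + sample_b.std()))
```

Two unit-width Gaussian responses whose means differ by four units give a separation power of about four, i.e. a clean particle-species split.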
Simulation of a compact analyzer-based imaging system with a regular x-ray source
NASA Astrophysics Data System (ADS)
Caudevilla, Oriol; Zhou, Wei; Stoupin, Stanislav; Verman, Boris; Brankov, J. G.
2017-03-01
Analyzer-based imaging (ABI) belongs to a broader family of phase-contrast (PC) X-ray techniques. PC techniques measure the deflection of X-rays as they interact with a sample, and are known to provide higher-contrast images of soft tissue than other X-ray methods. This is of high interest in the medical field, in particular for mammography. This paper presents a simulation tool for table-top ABI systems using a conventional polychromatic X-ray source.
FPGA in-the-loop simulations of cardiac excitation model under voltage clamp conditions
NASA Astrophysics Data System (ADS)
Othman, Norliza; Adon, Nur Atiqah; Mahmud, Farhanahani
2017-01-01
The voltage clamp technique allows the detection of single-channel currents in biological membranes, identifying a variety of electrophysiological problems at the cellular level. In this paper, a simulation study of the voltage clamp technique is presented to analyse current-voltage (I-V) characteristics of ionic currents based on the Luo-Rudy Phase-I (LR-I) cardiac model using a field programmable gate array (FPGA). Cardiac models are becoming increasingly complex, which can make simulations extremely time consuming. A real-time hardware implementation using an FPGA is therefore one of the best solutions for high-performance real-time systems, as it provides high configurability and performance and can execute operations in parallel. For shorter development time while retaining high confidence in the results, FPGA-based rapid prototyping through HDL Coder from MATLAB was used to construct the algorithm for the simulation system. HDL Coder converts the designed MATLAB Simulink blocks into a hardware description language (HDL) for FPGA implementation. As a result, the voltage-clamp fixed-point design of the LR-I model was successfully implemented in MATLAB Simulink, and the simulation of the I-V characteristics of the ionic currents was verified on a Xilinx Virtex-6 XC6VLX240T development board through an FPGA-in-the-loop (FIL) simulation.
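Under clamp, the membrane potential is held fixed and the steady-state channel current is recorded at each holding level, tracing out the I-V curve. The sketch below uses a generic voltage-gated channel with Boltzmann activation; it is illustrative only and is not the actual LR-I formulation, and all parameter values are hypothetical.

```python
import numpy as np

def clamp_iv(v_hold, g_max=1.0, e_rev=-85.0, v_half=-40.0, slope=5.0):
    """Steady-state current of a generic voltage-gated channel under clamp:
    I = g_max * m_inf(V) * (V - E_rev), with Boltzmann activation m_inf.
    v_hold in mV; returns current in arbitrary units."""
    m_inf = 1.0 / (1.0 + np.exp(-(v_hold - v_half) / slope))
    return g_max * m_inf * (v_hold - e_rev)
```

Sweeping `v_hold` over an array of holding potentials yields the I-V characteristic: near-zero current at strongly hyperpolarised potentials (channels closed) and large outward current once activation saturates.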
Computer Science Techniques Applied to Parallel Atomistic Simulation
NASA Astrophysics Data System (ADS)
Nakano, Aiichiro
1998-03-01
Recent developments in parallel processing technology and multiresolution numerical algorithms have established large-scale molecular dynamics (MD) simulations as a new research mode for studying materials phenomena such as fracture. However, this requires large system sizes and long simulated times. We have developed: i) Space-time multiresolution schemes; ii) fuzzy-clustering approach to hierarchical dynamics; iii) wavelet-based adaptive curvilinear-coordinate load balancing; iv) multilevel preconditioned conjugate gradient method; and v) spacefilling-curve-based data compression for parallel I/O. Using these techniques, million-atom parallel MD simulations are performed for the oxidation dynamics of nanocrystalline Al. The simulations take into account the effect of dynamic charge transfer between Al and O using the electronegativity equalization scheme. The resulting long-range Coulomb interaction is calculated efficiently with the fast multipole method. Results for temperature and charge distributions, residual stresses, bond lengths and bond angles, and diffusivities of Al and O will be presented. The oxidation of nanocrystalline Al is elucidated through immersive visualization in virtual environments. A unique dual-degree education program at Louisiana State University will also be discussed in which students can obtain a Ph.D. in Physics & Astronomy and a M.S. from the Department of Computer Science in five years. This program fosters interdisciplinary research activities for interfacing High Performance Computing and Communications with large-scale atomistic simulations of advanced materials. This work was supported by NSF (CAREER Program), ARO, PRF, and Louisiana LEQSF.
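The curve-based load balancing mentioned above can be illustrated with the simplest space-filling curve: sort particles by their Morton (Z-order) key and split the sorted list into equal chunks per processor. This uses a plain Morton curve rather than the wavelet-based adaptive curvilinear scheme the abstract describes, so it is a sketch of the general idea only.

```python
def morton3d(x, y, z, bits=10):
    """Interleave the bits of non-negative integer coords into a Z-order key."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (3 * i)
        key |= ((y >> i) & 1) << (3 * i + 1)
        key |= ((z >> i) & 1) << (3 * i + 2)
    return key

def partition(atoms, n_ranks, bits=10):
    """Sort atoms along the space-filling curve, then split into equal chunks.
    Nearby atoms land on the same rank, keeping communication local."""
    order = sorted(atoms, key=lambda a: morton3d(*a, bits=bits))
    size = -(-len(order) // n_ranks)   # ceiling division
    return [order[i * size:(i + 1) * size] for i in range(n_ranks)]
```

Because the curve preserves spatial locality, each chunk is a compact spatial region, which is what makes curve-ordered partitions effective for short-range MD force evaluation.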
NASA Technical Reports Server (NTRS)
Meyers, James F.
2004-01-01
The historical development of techniques for measuring three velocity components using laser velocimetry is presented. The techniques are described and their relative merits presented. Many of the approaches currently in use based on the fringe laser velocimeter have yielded inaccurate measurements of turbulence intensity in the on-axis component. A possible explanation for these inaccuracies is presented along with simulation results.
Peter R. Robichaud
1997-01-01
Geostatistics provides a method to describe the spatial continuity of many natural phenomena. Spatial models are based upon the concept of scaling, kriging and conditional simulation. These techniques were used to describe the spatially-varied surface conditions on timber harvest and burned hillslopes. Geostatistical techniques provided estimates of the ground cover (...
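The spatial-continuity measure underlying the kriging and conditional-simulation techniques mentioned here is the semivariogram. A minimal empirical estimator is sketched below (plain Python; the point coordinates and values are hypothetical inputs):

```python
def empirical_semivariogram(points, values, lag, tol):
    """gamma(h): mean of 0.5 * (z_i - z_j)^2 over all pairs whose
    separation distance falls within lag +/- tol. Returns None if
    no pairs fall in the bin."""
    sq = []
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            dx = points[i][0] - points[j][0]
            dy = points[i][1] - points[j][1]
            d = (dx * dx + dy * dy) ** 0.5
            if abs(d - lag) <= tol:
                sq.append(0.5 * (values[i] - values[j]) ** 2)
    return sum(sq) / len(sq) if sq else None
```

Plotting gamma against lag reveals the range and sill used to fit the variogram model that kriging and conditional simulation then rely on.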
Integrating Text-to-Speech Software into Pedagogically Sound Teaching and Learning Scenarios
ERIC Educational Resources Information Center
Rughooputh, S. D. D. V.; Santally, M. I.
2009-01-01
This paper presents a new technique for the delivery of classes--an instructional technique which will no doubt revolutionize teaching and learning, whether for on-campus, blended or online modules. It is based on the simple task of instructionally incorporating text-to-speech software embedded in the lecture slides that will simulate exactly the…
Ahmad, Zaki Uddin; Chao, Bing; Konggidinata, Mas Iwan; Lian, Qiyu; Zappi, Mark E; Gang, Daniel Dianchen
2018-04-27
Numerous research efforts have been devoted to the adsorption area using experimental approaches. All these approaches are based on a trial-and-error process and are extremely time consuming. Molecular simulation is a new tool that can be used to design and predict the performance of an adsorbent. This research proposed a simulation technique that can greatly reduce the time needed to design an adsorbent. In this study, a new rhombic ordered mesoporous carbon (OMC) model is proposed and constructed with various pore sizes and oxygen contents using the Materials Visualizer module to optimize the structure of OMC for resorcinol adsorption. The specific surface area, pore volume, small-angle X-ray diffraction pattern, and resorcinol adsorption capacity were calculated with the Forcite and Sorption modules in the Materials Studio package. The simulation results were validated experimentally by synthesizing OMC with different pore sizes and oxygen contents prepared via a hard-template method employing an SBA-15 silica scaffold. Boric acid was used as the pore-expanding reagent to synthesize OMC with different pore sizes (from 4.6 to 11.3 nm) and varying oxygen contents (from 11.9% to 17.8%). Based on the simulation and experimental validation, the optimal pore size was found to be 6 nm for maximum adsorption of resorcinol. Copyright © 2018 Elsevier B.V. All rights reserved.
Integration of Continuous-Time Dynamics in a Spiking Neural Network Simulator.
Hahne, Jan; Dahmen, David; Schuecker, Jannis; Frommer, Andreas; Bolten, Matthias; Helias, Moritz; Diesmann, Markus
2017-01-01
Contemporary modeling approaches to the dynamics of neural networks include two important classes of models: biologically grounded spiking neuron models and functionally inspired rate-based units. We present a unified simulation framework that supports the combination of the two for multi-scale modeling, enables the quantitative validation of mean-field approaches by spiking network simulations, and provides an increase in reliability by usage of the same simulation code and the same network model specifications for both model classes. While most spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. In addition to the standard implementation we present an iterative approach based on waveform-relaxation techniques to reduce communication and increase performance for large-scale simulations of rate-based models with instantaneous interactions. Finally we demonstrate the broad applicability of the framework by considering various examples from the literature, ranging from random networks to neural-field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation. PMID:28596730
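A minimal sketch of the time-continuous rate dynamics the framework integrates: an explicit-Euler step of tau * dr/dt = -r + W * phi(r) with phi = tanh. This is a generic rate-unit update for illustration only; the paper's implementation embeds such dynamics in a spiking-network simulator with exact propagators and waveform relaxation, and the weights and time constant here are hypothetical.

```python
import math

def rate_step(r, W, dt, tau=10.0):
    """One explicit-Euler step of tau * dr/dt = -r + W * tanh(r):
    time-continuous rate units with instantaneous interactions."""
    n = len(r)
    drive = [sum(W[i][j] * math.tanh(r[j]) for j in range(n))
             for i in range(n)]
    return [r[i] + dt / tau * (-r[i] + drive[i]) for i in range(n)]
```

With no recurrent input the rate simply relaxes exponentially to zero, which is a convenient sanity check on the integrator.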
NASA Astrophysics Data System (ADS)
Bhuiyan, M. A. E.; Nikolopoulos, E. I.; Anagnostou, E. N.
2017-12-01
Quantifying the uncertainty of global precipitation datasets is beneficial when using these products in hydrological applications, because precipitation uncertainty propagating through hydrologic modeling can significantly affect the accuracy of the simulated hydrologic variables. In this research the Iberian Peninsula is used as the study area, with a study period spanning eleven years (2000-2010). The study evaluates the performance of multiple hydrologic models forced with combined global rainfall estimates derived with a Quantile Regression Forests (QRF) technique. The QRF technique utilizes three satellite precipitation products (CMORPH, PERSIANN, and 3B42 V7); an atmospheric reanalysis precipitation and air-temperature dataset; satellite-derived near-surface daily soil moisture data; and a terrain elevation dataset. A high-resolution precipitation dataset driven by ground-based observations (SAFRAN), available at 5 km/1 h resolution, is used as reference. Through the QRF blending framework, the stochastic error model produces error-adjusted ensemble precipitation realizations, which are used to force four global hydrological models (JULES (Joint UK Land Environment Simulator), WaterGAP3 (Water-Global Assessment and Prognosis), ORCHIDEE (Organizing Carbon and Hydrology in Dynamic Ecosystems) and SURFEX (Surface Externalisée)) to simulate three hydrologic variables (surface runoff, subsurface runoff and evapotranspiration). The models are also forced with the reference precipitation to generate reference-based hydrologic simulations. The study presents a comparative analysis of the hydrologic simulations for the different variables and the impact of the blending algorithm on them. Results show how precipitation uncertainty propagates through the different hydrologic model structures, and how the blending algorithm reduces error in the simulated hydrologic variables.
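The QRF technique above delivers predictive quantiles rather than a single estimate. The sketch below shows only that quantile-extraction idea on a pooled set of ensemble predictions (hypothetical numbers); a real quantile regression forest weights training observations by leaf co-occurrence rather than pooling per-tree point predictions, so this is a deliberate simplification.

```python
def ensemble_quantile(tree_predictions, q):
    """Nearest-rank q-quantile of pooled ensemble predictions.
    (A real QRF weights *training responses* by leaf co-occurrence;
    pooling per-tree point predictions is a simplification.)"""
    s = sorted(tree_predictions)
    idx = max(0, min(len(s) - 1, int(round(q * (len(s) - 1)))))
    return s[idx]

def prediction_interval(tree_predictions, lo=0.05, hi=0.95):
    """Central prediction interval read off the pooled distribution."""
    return (ensemble_quantile(tree_predictions, lo),
            ensemble_quantile(tree_predictions, hi))
```

Sampling from the interval between these quantiles is one simple way to generate the error-adjusted ensemble realizations that drive the hydrologic models.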
Ground Vibration Test Planning and Pre-Test Analysis for the X-33 Vehicle
NASA Technical Reports Server (NTRS)
Bedrossian, Herand; Tinker, Michael L.; Hidalgo, Homero
2000-01-01
This paper describes the results of the modal test planning and pre-test analysis for the X-33 vehicle. The pre-test analysis included the selection of the target modes, selection of the sensor and shaker locations, and the development of an accurate Test Analysis Model (TAM). For target mode selection, four techniques were considered: one based on the Modal Cost technique, one based on the Balanced Singular Value technique, the Root Sum Squared (RSS) method, and a Modal Kinetic Energy (MKE) approach. For selecting sensor locations, four techniques were also considered: one based on Weighted Average Kinetic Energy (WAKE), one based on Guyan Reduction (GR), one emphasizing engineering judgment, and an optimum sensor selection technique using a Genetic Algorithm (GA) search combined with a criterion based on Hankel Singular Values (HSVs). For selecting shaker locations, four techniques were likewise considered: one based on Weighted Average Driving Point Residue (WADPR), one based on engineering judgment and accessibility considerations, a frequency response method, and an optimum shaker location selection based on a GA search combined with an HSV criterion. To evaluate the effectiveness of the proposed sensor and shaker locations for exciting the target modes, extensive numerical simulations were performed. The Multivariate Mode Indicator Function (MMIF) was used to evaluate the effectiveness of each sensor and shaker set with respect to modal parameter identification. Several TAM reduction techniques were considered, including Guyan, IRS, Modal, and Hybrid. Based on pre-test cross-orthogonality checks using the various reduction techniques, a Hybrid TAM reduction technique was selected and used for all three vehicle fuel level configurations.
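Of the reduction techniques named above, Guyan reduction is the simplest to state: slave degrees of freedom are condensed out statically, giving K_red = Kmm - Kms * Kss^{-1} * Ksm. The sketch below implements that formula for a small symmetric stiffness matrix (pure Python, toy-sized matrices; a full TAM would also reduce the mass matrix with the same transformation):

```python
def guyan_reduce(K, masters):
    """Guyan (static) reduction: condense slave DOFs out of a symmetric
    stiffness matrix K (list of lists), keeping the listed master DOFs.
    K_red = Kmm - Kms * Kss^{-1} * Ksm  (exact for static loading)."""
    n = len(K)
    slaves = [i for i in range(n) if i not in masters]
    Kmm = [[K[i][j] for j in masters] for i in masters]
    Kms = [[K[i][j] for j in slaves] for i in masters]
    Kss = [[K[i][j] for j in slaves] for i in slaves]
    Ksm = [[K[i][j] for j in masters] for i in slaves]
    X = _solve(Kss, Ksm)  # X = Kss^{-1} * Ksm
    return [[Kmm[a][b]
             - sum(Kms[a][k] * X[k][b] for k in range(len(slaves)))
             for b in range(len(masters))] for a in range(len(masters))]

def _solve(A, B):
    """Solve A X = B by Gauss-Jordan elimination (A small, nonsingular)."""
    n = len(A)
    M = [row_a[:] + row_b[:] for row_a, row_b in zip(A, B)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        p = M[col][col]
        M[col] = [v / p for v in M[col]]
        for r in range(n):
            if r != col:
                f = M[r][col]
                M[r] = [v - f * w for v, w in zip(M[r], M[col])]
    return [row[n:] for row in M]
```

For a three-spring chain with the middle DOF condensed out, the reduced 2x2 matrix reproduces the exact static stiffness seen at the two retained DOFs.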
Numerical Simulation of Non-Thermal Food Preservation
NASA Astrophysics Data System (ADS)
Rauh, C.; Krauss, J.; Ertunc, Ö.; Delgado, a.
2010-09-01
Food preservation is an important process step in food technology with regard to product safety and product quality. Novel preservation techniques are currently being developed that aim at improved sensory and nutritional value with safety comparable to conventional thermal preservation. These novel non-thermal food preservation techniques are based, for example, on high pressures up to one GPa or on pulsed electric fields. Literature studies show the high potential of high pressure (HP) and pulsed electric field (PEF) processing, due to high retention of valuable food components such as vitamins and flavour and selective inactivation of spoiling enzymes and microorganisms. For the design of preservation processes based on the non-thermal techniques, it is crucial to predict the effect of high pressure and pulsed electric fields on the food components and on the spoiling enzymes and microorganisms, locally and time-dependently, in the treated product. Homogeneous process conditions (especially of temperature fields in HP and PEF processing, and of electric fields in PEF) are sought, to avoid over-processing with its associated quality loss and to minimize safety risks due to under-processing. The present contribution presents numerical simulations of thermofluiddynamical phenomena inside high pressure autoclaves and pulsed electric field treatment chambers; in PEF processing the electric fields are additionally considered. Implementing the kinetics of the relevant (bio-)chemical reactions in the numerical simulations of the temperature, flow and electric fields enables the evaluation of process homogeneity and efficiency for different process parameters of the preservation techniques. Suggestions for achieving safe and high-quality products are drawn from the numerical results.
NASA Astrophysics Data System (ADS)
Gerszewski, Daniel James
Physical simulation has become an essential tool in computer animation. As the use of visual effects increases, the need for simulating real-world materials increases. In this dissertation, we consider three problems in physics-based animation: large-scale splashing liquids, elastoplastic material simulation, and dimensionality reduction techniques for fluid simulation. Fluid simulation has been one of the greatest successes of physics-based animation, generating hundreds of research papers and a great many special effects over the last fifteen years. However, the animation of large-scale, splashing liquids remains challenging. We show that a novel combination of unilateral incompressibility, mass-full FLIP, and blurred boundaries is extremely well-suited to the animation of large-scale, violent, splashing liquids. Materials that incorporate both plastic and elastic deformations, also referred to as elastoplastic materials, are frequently encountered in everyday life. Methods for animating such common real-world materials are useful for effects practitioners and have been successfully employed in films. We describe a point-based method for animating elastoplastic materials. Our primary contribution is a simple method for computing the deformation gradient for each particle in the simulation. Given the deformation gradient, we can apply arbitrary constitutive models and compute the resulting elastic forces. Our method has two primary advantages: we do not store or compare to an initial rest configuration and we work directly with the deformation gradient. The first advantage avoids poor numerical conditioning and the second naturally leads to a multiplicative model of deformation appropriate for finite deformations. One of the most significant drawbacks of physics-based animation is that ever-higher fidelity leads to an explosion in the number of degrees of freedom. This problem leads us to the consideration of dimensionality reduction techniques.
We present several enhancements to model-reduced fluid simulation that allow improved simulation bases and two-way solid-fluid coupling. Specifically, we present a basis enrichment scheme that allows us to combine data-driven or artistically derived bases with more general analytic bases derived from Laplacian Eigenfunctions. Additionally, we handle two-way solid-fluid coupling in a time-splitting fashion---we alternately timestep the fluid and rigid body simulators, while taking into account the effects of the fluid on the rigid bodies and vice versa. We employ the vortex panel method to handle solid-fluid coupling and use dynamic pressure to compute the effect of the fluid on rigid bodies. Taken together, these contributions have advanced the state of the art in physics-based animation and are practical enough to be used in production pipelines.
Molecular-level simulations of turbulence and its decay
Gallis, M. A.; Bitter, N. P.; Koehler, T. P.; ...
2017-02-08
Here, we provide the first demonstration that molecular-level methods based on gas kinetic theory and molecular chaos can simulate turbulence and its decay. The direct simulation Monte Carlo (DSMC) method, a molecular-level technique for simulating gas flows that resolves phenomena from molecular to hydrodynamic (continuum) length scales, is applied to simulate the Taylor-Green vortex flow. The DSMC simulations reproduce the Kolmogorov –5/3 law and agree well with the turbulent kinetic energy and energy dissipation rate obtained from direct numerical simulation of the Navier-Stokes equations using a spectral method. This agreement provides strong evidence that molecular-level methods for gases can be used to investigate turbulent flows quantitatively.
An ionospheric occultation inversion technique based on epoch difference
NASA Astrophysics Data System (ADS)
Lin, Jian; Xiong, Jing; Zhu, Fuying; Yang, Jian; Qiao, Xuejun
2013-09-01
Of the ionospheric radio occultation (IRO) electron density profile (EDP) retrievals, the Abel-based calibrated TEC inversion (CTI) is the most widely used technique. In order to eliminate the contribution from altitudes above the RO satellite, it is necessary to utilize the calibrated TEC to retrieve the EDP, which introduces error due to the coplanar assumption. In this paper, a new technique based on epoch difference inversion (EDI) is proposed to eliminate this error. Comparisons between CTI and EDI have been carried out using both simulated and real COSMIC data. The following conclusions can be drawn: the EDI technique can successfully retrieve EDPs without non-occultation-side measurements and shows better performance than the CTI method, especially for lower-orbit missions; and, no matter which technique is used, the inversion results at higher altitudes are better than those at lower altitudes, which can be explained theoretically.
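Both CTI and EDI ultimately turn integrated TEC measurements into shell electron densities. The classic discretization of that inversion, under spherical symmetry and straight-ray assumptions, is onion peeling: solve for the top shell first, then peel downward. The sketch below shows that step with hypothetical radii and densities; it is the generic Abel-type discretization, not the epoch-difference calibration itself.

```python
import math

def onion_peel(radii, tec):
    """Recover shell electron densities from tangent-point TEC values,
    assuming straight rays, spherical symmetry, and no ionization above
    the top shell. radii: shell boundaries r_0 < ... < r_n;
    tec[i]: integrated density of the ray with tangent radius r_i."""
    n = len(tec)

    def chord(i, j):
        # path length of ray i inside shell [r_j, r_{j+1}] (both legs)
        return 2.0 * (math.sqrt(radii[j + 1] ** 2 - radii[i] ** 2)
                      - math.sqrt(radii[j] ** 2 - radii[i] ** 2))

    ne = [0.0] * n
    for i in reversed(range(n)):  # peel from the top shell down
        partial = sum(chord(i, j) * ne[j] for j in range(i + 1, n))
        ne[i] = (tec[i] - partial) / chord(i, i)
    return ne
```

Because the system is triangular, the inversion is exact for data generated by the same straight-ray forward model.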
Stabilization techniques for reactive aggregate in soil-cement base course : technical summary.
DOT National Transportation Integrated Search
2003-01-01
The objectives of this research are 1) to identify the mineralogical properties of soil-cement bases which have heaved or can potentially heave, 2) to simulate expansion of cement-stabilized soil in the laboratory, 3) to correlate expansion with the ...
Simulation and experimental study of 802.11 based networking for vehicular management and safety.
DOT National Transportation Integrated Search
2009-03-01
This work focuses on the use of wireless networking techniques for their potential impact in providing information for traffic management, control and public safety goals. The premise of this work is based on the reasonable expectation that vehic...
Matlashov, Andrei N.; Schultz, Larry J.; Espy, Michelle A.; Kraus, Robert H.; Savukov, Igor M.; Volegov, Petr L.; Wurden, Caroline J.
2011-01-01
Nuclear magnetic resonance (NMR) is widely used in medicine, chemistry and industry. One application area is magnetic resonance imaging (MRI). Recently it has become possible to perform NMR and MRI in the ultra-low field (ULF) regime, requiring measurement field strengths of the order of only 1 Gauss. This technique exploits the advantages offered by superconducting quantum interference devices, or SQUIDs. Our group has built SQUID-based MRI systems for brain imaging and for liquid explosives detection at airport security checkpoints. The requirement for liquid helium cooling limits potential applications of ULF MRI for liquid identification and security purposes. Our comparative experimental investigation shows that room-temperature inductive magnetometers may provide enough sensitivity in the 3–10 kHz range and can be used for fast liquid explosives detection based on the ULF NMR technique. We describe experimental and computer-simulation results comparing multichannel SQUID-based and induction-coil-based instruments that are capable of performing ULF MRI for liquid identification. PMID:21747638
Khan, F I; Abbasi, S A
2000-07-10
Fault tree analysis (FTA) is based on constructing a hypothetical tree of basic events (initiating events) branching into numerous other sub-events, propagating the fault and eventually leading to the top event (accident). It has traditionally been a powerful technique for identifying hazards in nuclear installations and power industries. As the systematic articulation of the fault tree involves assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also very cumbersome and costly, limiting its area of application. We have developed a new algorithm based on analytical simulation (named AS-II), which makes the application of FTA simpler, quicker, and cheaper, thus opening up the possibility of its wider use in risk assessment in the chemical process industries. Based on this methodology we have developed a computer-automated tool. The details are presented in this paper.
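The probability propagation FTA relies on is simple when basic events are independent: AND gates multiply event probabilities, OR gates combine complements. The sketch below is a minimal illustration on a hypothetical two-valve tree, not the AS-II algorithm of the abstract:

```python
def gate_or(probs):
    """P(output) for an OR gate over independent basic events."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def gate_and(probs):
    """P(output) for an AND gate over independent basic events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def top_event(p_pump, p_valve_a, p_valve_b):
    """Hypothetical tree: top = OR(pump fails, AND(valve A, valve B))."""
    return gate_or([p_pump, gate_and([p_valve_a, p_valve_b])])
```

Evaluating the tree bottom-up like this is the core of probabilistic risk assessment; real tools add minimal cut sets and common-cause treatment on top.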
NASA Astrophysics Data System (ADS)
Ding, Huanjun; Gao, Hao; Zhao, Bo; Cho, Hyo-Min; Molloi, Sabee
2014-10-01
Both computer simulations and experimental phantom studies were carried out to investigate radiation dose reduction with tensor-framelet-based iterative image reconstruction (TFIR) for a dedicated high-resolution spectral breast computed tomography (CT) system based on a silicon strip photon-counting detector. The simulation was performed with a 10 cm-diameter water phantom including three contrast materials (polyethylene, 8 mg ml-1 iodine and B-100 bone-equivalent plastic). In the experimental study, data were acquired with a 1.3 cm-diameter polymethylmethacrylate (PMMA) phantom containing iodine in three concentrations (8, 16 and 32 mg ml-1) at various radiation doses (1.2, 2.4 and 3.6 mGy), and CT images were then reconstructed using the filtered-back-projection (FBP) technique and the TFIR technique, respectively. Image quality for the two techniques was evaluated by quantitative analysis of the contrast-to-noise ratio (CNR) and of the spatial resolution, the latter assessed using the task-based modulation transfer function (MTF). Both the simulation and experimental results indicated that the task-based MTF obtained from TFIR reconstruction with one-third of the radiation dose was comparable to that from FBP reconstruction for a low-contrast target. For a high-contrast target, TFIR was substantially superior to FBP reconstruction in terms of spatial resolution. In addition, TFIR was able to achieve a factor of 1.6-1.8 increase in CNR, depending on the target contrast level. This study demonstrates that TFIR can reduce the required radiation dose by two-thirds compared to the FBP technique, achieving much better CNR and spatial resolution for a high-contrast target while retaining similar spatial resolution for a low-contrast target.
This TFIR technique has been implemented with a graphic processing unit system and it takes approximately 10 s to reconstruct a single-slice CT image, which can potentially be used in a future multi-slit multi-slice spiral CT system.
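As a hedged illustration of iterative reconstruction in general (not the tensor-framelet method itself, which adds a sparsity penalty in a framelet domain), the sketch below runs plain Landweber iterations, x <- x + alpha * A^T (b - A x), on a tiny linear system standing in for the projection model:

```python
def landweber(A, b, iters=200, step=0.1):
    """Minimal data-fidelity iteration used by many iterative CT
    reconstructions: gradient descent on ||A x - b||^2. Real TFIR
    alternates such updates with a framelet sparsity step."""
    n = len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # residual r = b - A x
        r = [bi - sum(ai * xi for ai, xi in zip(row, x))
             for row, bi in zip(A, b)]
        # gradient direction g = A^T r
        g = [sum(A[i][j] * r[i] for i in range(len(A))) for j in range(n)]
        x = [xj + step * gj for xj, gj in zip(x, g)]
    return x
```

The step size must stay below 2 divided by the largest eigenvalue of A^T A for convergence, which is why practical implementations estimate that norm first.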
NASA Astrophysics Data System (ADS)
Trinci, G.; Massari, R.; Scandellari, M.; Boccalini, S.; Costantini, S.; Di Sero, R.; Basso, A.; Sala, R.; Scopinaro, F.; Soluri, A.
2010-09-01
The aim of this work is to present a new scintigraphic device able to change the length of its collimator automatically, in order to adapt the spatial resolution to the gamma source distance. This patented technique replaces the collimator changes that standard gamma cameras still require. Monte Carlo simulations represent the best tool in the search for new technological solutions for such an innovative collimation structure. They also provide a valid analysis of the response of gamma camera performance as well as of the advantages and limits of this new solution. Specifically, the Monte Carlo simulations are realized within the GEANT4 (GEometry ANd Tracking) framework, and the specific simulation object is a collimation method based on separate blocks that can be brought closer together and farther apart, in order to reach and maintain specific spatial resolution values at all source-detector distances. To verify the accuracy and faithfulness of these simulations, we performed experimental measurements with an identical setup and conditions. This confirms the power of simulation as an extremely useful tool, especially where new technological solutions need to be studied, tested and analyzed before their practical realization. The final aim of this new collimation system is the improvement of SPECT techniques, with real control of the spatial resolution value during tomographic acquisitions. This principle allowed us to simulate a tomographic acquisition of two capillaries of radioactive solution, in order to verify the possibility of clearly distinguishing them.
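The geometric resolution of a parallel-hole collimator degrades linearly with source distance, R = d(L + z)/L, which is exactly the relationship a variable-length collimator exploits: solve for the hole length L that holds R at a target value as z changes. A minimal sketch of that textbook relation, with hypothetical dimensions and septal penetration ignored:

```python
def geometric_resolution(d, L, z):
    """Parallel-hole collimator geometric resolution (FWHM):
    R = d * (L + z) / L, with hole diameter d, hole length L,
    and source-to-collimator distance z (all in the same units)."""
    return d * (L + z) / L

def length_for_resolution(d, z, r_target):
    """Hole length needed to hold resolution r_target at distance z --
    the knob a variable-length collimator turns as the source moves."""
    assert r_target > d, "cannot resolve below the hole diameter"
    return d * z / (r_target - d)
```

The two functions are inverses of each other at fixed d and z, which is the round-trip a controller would run each time the source distance changes.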
NASA Technical Reports Server (NTRS)
Feather, J. B.; Joshi, D. S.
1981-01-01
Handling qualities of the unaugmented advanced supersonic transport (AST) are deficient in the low-speed, landing approach regime. Consequently, improvement in handling with active control augmentation systems has been achieved using implicit model-following techniques. Extensive fixed-base simulator evaluations were used to validate these systems prior to tests with full motion and visual capabilities on a six-axis motion-base simulator (MBS). These tests compared the handling qualities of the unaugmented AST with several augmented configurations to ascertain the effectiveness of these systems. Cooper-Harper ratings, tracking errors, and control activity data from the MBS tests have been analyzed statistically. The results show the handling qualities of the fully augmented AST have been improved to an acceptable level.
The transesophageal echocardiography simulator based on computed tomography images.
Piórkowski, Adam; Kempny, Aleksander
2013-02-01
Simulators are a new tool in education in many fields, including medicine, where they greatly improve familiarity with medical procedures, reduce costs and, importantly, cause no harm to patients. This is so in the case of transesophageal echocardiography (TEE), in which the use of a simulator facilitates spatial orientation and helps in case studies. The aim of the project described in this paper is to simulate an examination by TEE. The research makes use of available computed tomography data to simulate the corresponding echocardiographic view. This paper describes the essential characteristics that distinguish these two modalities and the key principles of the wave phenomena that should be considered in the simulation process, taking into account the conditions specific to echocardiography. The construction of CT2TEE, a Web-based TEE simulator, is also presented. The considerations include ray-tracing and ray-casting techniques in the context of ultrasound-beam and artifact simulation. Important aspects of interaction with the user are also addressed.
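A hedged sketch of the ray-casting core mentioned above: march a ray through a CT-like scalar volume, accumulating sampled values until the ray leaves the grid. Nearest-neighbour sampling and unit grid spacing are simplifications; a real TEE simulator adds interpolation, reflection, shadowing and other ultrasound artifact physics.

```python
def cast_ray(volume, origin, direction, step=1.0, n_steps=64):
    """Accumulate samples of a 3D scalar volume (nested lists) along a
    ray, nearest-neighbour sampling, stopping at the volume boundary."""
    x, y, z = origin
    dx, dy, dz = direction
    total = 0.0
    for _ in range(n_steps):
        i, j, k = int(round(x)), int(round(y)), int(round(z))
        if not (0 <= i < len(volume) and 0 <= j < len(volume[0])
                and 0 <= k < len(volume[0][0])):
            break
        total += volume[i][j][k] * step
        x, y, z = x + dx * step, y + dy * step, z + dz * step
    return total
```

Casting one such ray per image pixel, fanned out from the virtual transducer position, yields the simulated sector image.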
3D Modeling of Ultrasonic Wave Interaction with Disbonds and Weak Bonds
NASA Technical Reports Server (NTRS)
Leckey, C.; Hinders, M.
2011-01-01
Ultrasonic techniques, such as the use of guided waves, can be ideal for finding damage in the plate and pipe-like structures used in aerospace applications. However, the interaction of waves with real flaw types and geometries can lead to experimental signals that are difficult to interpret. 3-dimensional (3D) elastic wave simulations can be a powerful tool in understanding the complicated wave scattering involved in flaw detection and for optimizing experimental techniques. We have developed and implemented parallel 3D elastodynamic finite integration technique (3D EFIT) code to investigate Lamb wave scattering from realistic flaws. This paper discusses simulation results for an aluminum-aluminum diffusion disbond and an aluminum-epoxy disbond and compares results from the disbond case to the common artificial flaw type of a flat-bottom hole. The paper also discusses the potential for extending the 3D EFIT equations to incorporate physics-based weak bond models for simulating wave scattering from weak adhesive bonds.
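As a toy analogue of the staggered-grid updates EFIT applies to 3D elastodynamics, the sketch below advances the 1D scalar wave equation u_tt = c^2 u_xx with a leapfrog scheme (fixed ends; parameters hypothetical). At Courant number c*dt/dx = 1 the discrete scheme transports grid pulses exactly.

```python
def wave_step(u_prev, u, c, dx, dt):
    """One leapfrog step of the 1D wave equation u_tt = c^2 * u_xx
    with fixed ends -- a 1D caricature of the explicit staggered-grid
    updates a 3D EFIT code applies to velocity and stress fields."""
    r2 = (c * dt / dx) ** 2
    nxt = [0.0] * len(u)
    for i in range(1, len(u) - 1):
        nxt[i] = 2 * u[i] - u_prev[i] + r2 * (u[i + 1] - 2 * u[i] + u[i - 1])
    return nxt
```

The scheme is stable for c*dt/dx <= 1; 3D elastodynamic codes obey the analogous Courant limit over all three grid spacings and both wave speeds.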
A quantitative investigation of the fracture pump-in/flowback test
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plahn, S.V.; Nolte, K.G.; Thompson, L.G.
1997-02-01
Fracture-closure pressure is an important parameter for fracture treatment design and evaluation. The pump-in/flowback (PIFB) test is frequently used to estimate its magnitude. The test is attractive because bottomhole pressures (BHPs) during flowback develop a distinct and repeatable signature. This is in contrast to the pump-in/shut-in test, where strong indications of fracture closure are rarely seen. Various techniques are used to extract closure pressure from the flowback-pressure response. Unfortunately, these techniques give different estimates for closure pressure, and their theoretical bases are not well established. The authors present results that place the PIFB test on a firmer foundation. A numerical model is used to simulate the PIFB test and glean the physical mechanisms contributing to the response. On the basis of their simulation results, they propose interpretation techniques that give better estimates of closure pressure than existing techniques.
Simulation of wind turbine wakes using the actuator line technique
Sørensen, Jens N.; Mikkelsen, Robert F.; Henningson, Dan S.; Ivanell, Stefan; Sarmast, Sasan; Andersen, Søren J.
2015-01-01
The actuator line technique was introduced as a numerical tool to be employed in combination with large eddy simulations to enable the study of wakes and wake interaction in wind farms. The technique is today largely used for studying basic features of wakes as well as for making performance predictions of wind farms. In this paper, we give a short introduction to the wake problem and the actuator line methodology and present a study in which the technique is employed to determine the near-wake properties of wind turbines. The presented results include a comparison of experimental results of the wake characteristics of the flow around a three-bladed model wind turbine, the development of a simple analytical formula for determining the near-wake length behind a wind turbine and a detailed investigation of wake structures based on proper orthogonal decomposition analysis of numerically generated snapshots of the wake. PMID:25583862
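In the actuator line technique, the blade is not resolved geometrically; its loads are projected onto the flow grid as a body force via a regularization kernel, commonly a 3D Gaussian eta(d) = exp(-(d/eps)^2) / (eps^3 * pi^{3/2}). A minimal sketch of that projection, with a hypothetical smearing width and force vector:

```python
import math

def smearing_kernel(distance, eps):
    """3D Gaussian regularization kernel that distributes an actuator-
    line point force over nearby grid cells; integrates to one."""
    return math.exp(-(distance / eps) ** 2) / (eps ** 3 * math.pi ** 1.5)

def smeared_force(point, actuator_point, force, eps):
    """Body-force density contribution at a grid point from one
    actuator element carrying the given force vector."""
    d = math.dist(point, actuator_point)
    return [f * smearing_kernel(d, eps) for f in force]
```

The width eps is typically tied to the grid spacing; too small a value concentrates the force and produces numerical oscillations, too large a value smears the wake.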
Trapping hydrogen atoms from a neon-gas matrix: a theoretical simulation.
Bovino, S; Zhang, P; Kharchenko, V; Dalgarno, A
2009-08-07
Hydrogen is of critical importance in atomic and molecular physics and the development of a simple and efficient technique for trapping cold and ultracold hydrogen atoms would be a significant advance. In this study we simulate a recently proposed trap-loading mechanism for trapping hydrogen atoms released from a neon matrix. Accurate ab initio quantum calculations are reported of the neon-hydrogen interaction potential and the energy- and angular-dependent elastic scattering cross sections that control the energy transfer of initially cold atoms are obtained. They are then used to construct the Boltzmann kinetic equation, describing the energy relaxation process. Numerical solutions of the Boltzmann equation predict the time evolution of the hydrogen energy distribution function. Based on the simulations we discuss the prospects of the technique.
NASA Astrophysics Data System (ADS)
Miyagawa, Chihiro; Kobayashi, Takumi; Taishi, Toshinori; Hoshikawa, Keigo
2014-09-01
Based on the growth of 3-inch diameter c-axis sapphire using the vertical Bridgman (VB) technique, numerical simulations were made and used to guide the growth of a 6-inch diameter sapphire. A 2D model of the VB hot-zone was constructed, the seeding interface shape of the 3-inch diameter sapphire as revealed by green laser scattering was estimated numerically, and the temperature distributions of two VB hot-zone models designed for 6-inch diameter sapphire growth were numerically simulated to achieve the optimal growth of large crystals. The hot-zone model with one heater was selected and prepared, and 6-inch diameter c-axis sapphire boules were actually grown, as predicted by the numerical results.
Boda, Dezső; Gillespie, Dirk
2012-03-13
We propose a procedure to compute the steady-state transport of charged particles based on the Nernst-Planck (NP) equation of electrodiffusion. To close the NP equation and to establish a relation between the concentration and electrochemical potential profiles, we introduce the Local Equilibrium Monte Carlo (LEMC) method. In this method, Grand Canonical Monte Carlo simulations are performed using the electrochemical potential specified for the distinct volume elements. An iteration procedure that self-consistently solves the NP and flux continuity equations with LEMC is shown to converge quickly. This NP+LEMC technique can be used in systems with diffusion of charged or uncharged particles in complex three-dimensional geometries, including systems with low concentrations and small applied voltages that are difficult for other particle simulation techniques.
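The alternating structure of the NP+LEMC loop can be sketched in 1-D if the Monte Carlo closure is replaced by its ideal-solution limit, mu = ln c + z*phi (in kT units); the grid, potential profile, and boundary concentrations below are all illustrative assumptions:

```python
import numpy as np

# Sketch of the NP + closure fixed-point loop for one ion species in 1-D.
# The LEMC step is replaced by the ideal-solution electrochemical
# potential, so only the iteration structure is illustrated, not the
# Monte Carlo part.
N = 101
x = np.linspace(0.0, 1.0, N)
dx = x[1] - x[0]
D, z = 1.0, 1.0
phi = 1.0 - x                         # fixed applied potential (kT/e units)
c = np.linspace(2.0, 1.0, N)          # boundary concentrations cL=2, cR=1

dt = 4e-5                             # pseudo-time step (explicit, stable)
for _ in range(50000):
    mu = np.log(c) + z * phi                  # "LEMC" closure step
    c_face = 0.5 * (c[1:] + c[:-1])
    J = -D * c_face * np.diff(mu) / dx        # NP flux on cell faces
    c[1:-1] -= dt * np.diff(J) / dx           # relax toward dJ/dx = 0

flux_variation = np.std(J) / abs(np.mean(J))  # ~0 at steady state
```

At convergence the flux J is spatially constant, which is the discrete statement of the flux continuity equation in the abstract.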
NASA Technical Reports Server (NTRS)
Kreifeldt, J. G.; Parkin, L.; Wempe, T. E.; Huff, E. F.
1975-01-01
Perceived orderliness in the ground tracks of five aircraft during their simulated flights was studied. Dynamically developing ground tracks for five aircraft from 21 separate runs were reproduced from computer storage and displayed on CRTs to professional pilots and controllers for their evaluations and preferences under several criteria. The ground tracks were developed in 20 seconds, as opposed to the 5 minutes of simulated flight, using speedup techniques for display. Metric and nonmetric multidimensional scaling techniques are being used to analyze the subjective responses in an effort to: (1) determine the meaningfulness of basing decisions on such complex subjective criteria; (2) compare pilot/controller perceptual spaces; (3) determine the dimensionality of the subjects' perceptual spaces; and thereby (4) determine objective measures suitable for comparing alternative traffic management simulations.
Kim, Mi Jeong; Maeng, Sung Joon; Cho, Yong Soo
2015-01-01
In this paper, a distributed synchronization technique based on a bio-inspired algorithm is proposed for an orthogonal frequency division multiple access (OFDMA)-based wireless mesh network (WMN) with a time difference of arrival. The proposed time- and frequency-synchronization technique uses only the signals received from the neighbor nodes, by considering the effect of the propagation delay between the nodes. It achieves a fast synchronization with a relatively low computational complexity because it is operated in a distributed manner, not requiring any feedback channel for the compensation of the propagation delays. In addition, a self-organization scheme that can be effectively used to construct 1-hop neighbor nodes is proposed for an OFDMA-based WMN with a large number of nodes. The performance of the proposed technique is evaluated with regard to the convergence property and synchronization success probability using a computer simulation. PMID:26225974
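As a rough illustration of distributed phase synchronization from neighbor signals only (omitting the OFDMA specifics and propagation delays the paper actually handles), a Kuramoto-style update converges as follows:

```python
import numpy as np

# Distributed clock-phase synchronization sketch: each node nudges its
# phase using only the phases it observes from the other nodes. This
# Kuramoto-style update is a stand-in for the paper's bio-inspired
# algorithm; N, K, and dt are arbitrary illustrative choices.
rng = np.random.default_rng(0)
N = 10
theta = rng.uniform(0.0, 2 * np.pi, N)   # initially unsynchronized clocks
K, dt = 1.5, 0.01

for _ in range(3000):
    # pairwise phase differences averaged per node: a purely local update
    coupling = np.sin(theta[None, :] - theta[:, None]).mean(axis=1)
    theta = theta + dt * K * coupling

r = np.abs(np.mean(np.exp(1j * theta)))  # order parameter: 1 = in sync
```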
NASA Astrophysics Data System (ADS)
Shrivastava, Akash; Mohanty, A. R.
2018-03-01
This paper proposes a model-based method to estimate single plane unbalance parameters (amplitude and phase angle) in a rotor using Kalman filter and recursive least square based input force estimation technique. Kalman filter based input force estimation technique requires state-space model and response measurements. A modified system equivalent reduction expansion process (SEREP) technique is employed to obtain a reduced-order model of the rotor system so that limited response measurements can be used. The method is demonstrated using numerical simulations on a rotor-disk-bearing system. Results are presented for different measurement sets including displacement, velocity, and rotational response. Effects of measurement noise level, filter parameters (process noise covariance and forgetting factor), and modeling error are also presented and it is observed that the unbalance parameter estimation is robust with respect to measurement noise.
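The input-estimation idea can be sketched with an augmented-state Kalman filter on a 1-DOF oscillator, appending an unknown constant force to the state as a random walk; this is a simplified stand-in for the paper's Kalman-filter/recursive-least-squares scheme, and every parameter value below is an assumption:

```python
import numpy as np

# Augmented-state Kalman filter sketch: a mass-spring-damper with an
# unknown constant force F appended to the state vector [x, v, F].
m, c, k = 1.0, 0.4, 20.0
dt = 0.005
F_true = 2.0

# forward-Euler discretization of x' = v, v' = (-k x - c v + F)/m,
# with F modeled as a random walk
A = np.array([[1.0,        dt,            0.0],
              [-k/m*dt,    1.0 - c/m*dt,  dt/m],
              [0.0,        0.0,           1.0]])
H = np.array([[1.0, 0.0, 0.0]])          # measure displacement only
Q = np.diag([1e-8, 1e-8, 1e-4])          # process noise (F random walk)
R = np.array([[1e-4]])                   # measurement noise covariance

rng = np.random.default_rng(1)
x_true = np.array([0.0, 0.0])
x_est = np.zeros(3)
P = np.eye(3)
for _ in range(4000):
    # propagate the true system and take a noisy displacement measurement
    ax = (-k * x_true[0] - c * x_true[1] + F_true) / m
    x_true = x_true + dt * np.array([x_true[1], ax])
    y = x_true[0] + rng.normal(0.0, 1e-2)
    # Kalman predict
    x_est = A @ x_est
    P = A @ P @ A.T + Q
    # Kalman update
    S = H @ P @ H.T + R
    K_gain = P @ H.T @ np.linalg.inv(S)
    x_est = x_est + K_gain @ (np.array([y]) - H @ x_est)
    P = (np.eye(3) - K_gain @ H) @ P
```

The third state converges to the unknown force, which is the same mechanism the paper exploits to recover the unbalance amplitude and phase.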
NASA Astrophysics Data System (ADS)
Makahinda, T.
2018-02-01
The purpose of this research is to determine the effect of a technology-based learning model and assessment technique on thermodynamics achievement while controlling for student intelligence. The study is experimental, with a sample of 80 students drawn by cluster random sampling. The results show that, after controlling for student intelligence, students taught with the environmental-utilization learning model achieve higher thermodynamics scores than students taught with animated simulations. There is also an interaction effect between the technology-based learning model and the assessment technique on students' thermodynamics achievement, again after controlling for student intelligence. Based on these findings, thermodynamics lectures should use the learning-environment model combined with the project assessment technique.
AN OVERVIEW OF REDUCED ORDER MODELING TECHNIQUES FOR SAFETY APPLICATIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mandelli, D.; Alfonsi, A.; Talbot, P.
2016-10-01
The RISMC project is developing new advanced simulation-based tools to perform Computational Risk Analysis (CRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermal-hydraulic behavior of the reactor's primary and secondary systems, but also external event temporal evolution and component/system ageing. Thus, this is not only a multi-physics problem, but also a multi-scale one (both spatial, µm-mm-m, and temporal, seconds-hours-years). As part of the RISMC CRA approach, a large number of computationally expensive simulation runs may be required. An important aspect is that even though computational power is growing, the overall computational cost of a RISMC analysis using brute-force methods may not be viable for certain cases. A solution being evaluated to address the computational issue is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs; for this analysis improvement we used surrogate models instead of the actual simulation codes. This article focuses on the use of reduced order modeling techniques that can be applied to RISMC analyses in order to generate, analyze, and visualize data. In particular, we focus on surrogate models that approximate the simulation results but in a much faster time (microseconds instead of hours/days).
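The surrogate-model idea, a cheap emulator fitted to a handful of expensive runs, can be sketched as follows; the "expensive" code is faked by an analytic response surface, and the quartic polynomial is only one of many possible reduced order models:

```python
import numpy as np

# Surrogate-model sketch: replace an expensive simulation code with a
# cheap emulator fit to a few training runs. The expensive code here is
# a placeholder analytic function, not an actual RISMC physics code.
def expensive_sim(x):
    return np.sin(3.0 * x) + 0.5 * x**2

x_train = np.linspace(0.0, 1.0, 8)        # only 8 "runs" affordable
y_train = expensive_sim(x_train)

coeffs = np.polyfit(x_train, y_train, 4)  # quartic least-squares surrogate
surrogate = np.poly1d(coeffs)

# the surrogate is then queried thousands of times at negligible cost
x_test = np.linspace(0.0, 1.0, 200)
err = np.max(np.abs(surrogate(x_test) - expensive_sim(x_test)))
```

Once fitted, sampling the surrogate costs microseconds per evaluation, which is the speedup the article describes.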
Convolutional coding results for the MVM '73 X-band telemetry experiment
NASA Technical Reports Server (NTRS)
Layland, J. W.
1978-01-01
Results of simulation of several short-constraint-length convolutional codes using a noisy symbol stream obtained via the turnaround ranging channels of the MVM'73 spacecraft are presented. First operational use of this coding technique is on the Voyager mission. The relative performance of these codes in this environment is as previously predicted from computer-based simulations.
Estimation of discontinuous coefficients in parabolic systems: Applications to reservoir simulation
NASA Technical Reports Server (NTRS)
Lamm, P. D.
1984-01-01
Spline based techniques for estimating spatially varying parameters that appear in parabolic distributed systems (typical of those found in reservoir simulation problems) are presented. The problem of determining discontinuous coefficients, estimating both the functional shape and points of discontinuity for such parameters is discussed. Convergence results and a summary of numerical performance of the resulting algorithms are given.
NDE and SHM Simulation for CFRP Composites
NASA Technical Reports Server (NTRS)
Leckey, Cara A. C.; Parker, F. Raymond
2014-01-01
Ultrasound-based nondestructive evaluation (NDE) is a common technique for damage detection in composite materials. There is a need for advanced NDE that goes beyond damage detection to damage quantification and characterization in order to enable data-driven prognostics. The damage types that exist in carbon fiber-reinforced polymer (CFRP) composites include microcracking and delaminations, and can be initiated and grown via impact forces (due to ground vehicles, tool drops, bird strikes, etc.), fatigue, and extreme environmental changes. X-ray microfocus computed tomography data, among other methods, have shown that these damage types often result in voids/discontinuities of a complex volumetric shape. The specific damage geometry and location within ply layers affect damage growth. Realistic three-dimensional NDE and structural health monitoring (SHM) simulations can aid in the development and optimization of damage quantification and characterization techniques. This paper is an overview of ongoing work towards realistic NDE and SHM simulation tools for composites, and also discusses NASA's need for such simulation tools in aeronautics and spaceflight. The paper describes the development and implementation of a custom ultrasound simulation tool that is used to model ultrasonic wave interaction with realistic 3-dimensional damage in CFRP composites. The custom code uses the elastodynamic finite integration technique and is parallelized to run efficiently on computing clusters or multicore machines.
Technique Developed for Optimizing Traveling-Wave Tubes
NASA Technical Reports Server (NTRS)
Wilson, Jeffrey D.
1999-01-01
A traveling-wave tube (TWT) is an electron beam device that is used to amplify electromagnetic communication waves at radio and microwave frequencies. TWTs are critical components in deep-space probes, geosynchronous communication satellites, and high-power radar systems. Power efficiency is of paramount importance for TWTs employed in deep-space probes and communications satellites. Consequently, increasing the power efficiency of TWTs has been the primary goal of the TWT group at the NASA Lewis Research Center over the last 25 years. An in-house effort produced a technique (ref. 1) to design TWTs for optimized power efficiency. This technique is based on simulated annealing, which has an advantage over conventional optimization techniques in that it enables the best possible solution to be obtained (ref. 2). A simulated annealing algorithm was created and integrated into the NASA TWT computer model (ref. 3). The new technique almost doubled the computed conversion power efficiency of a TWT, from 7.1 to 13.5 percent (ref. 1).
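A minimal version of the simulated annealing loop credited above, minimizing a toy multimodal objective in place of the (non-public) TWT efficiency model:

```python
import math, random

# Simulated-annealing sketch: accept downhill moves always and uphill
# moves with Boltzmann probability exp(-delta/T), cooling geometrically.
# The objective is a toy function with several local minima.
def objective(x):
    return (x - 2.0)**2 + 3.0 * math.sin(5.0 * x)

random.seed(42)
x = 0.0                     # initial design point
T = 5.0                     # initial temperature
best_x, best_f = x, objective(x)
while T > 1e-3:
    x_new = x + random.uniform(-0.5, 0.5)
    delta = objective(x_new) - objective(x)
    if delta < 0 or random.random() < math.exp(-delta / T):
        x = x_new
        if objective(x) < best_f:
            best_x, best_f = x, objective(x)
    T *= 0.999              # geometric cooling schedule
```

The uphill-acceptance step is what lets the method escape local optima, the property the abstract cites as its advantage over conventional optimizers.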
C-arm technique using distance driven method for nephrolithiasis and kidney stones detection
NASA Astrophysics Data System (ADS)
Malalla, Nuhad; Sun, Pengfei; Chen, Ying; Lipkin, Michael E.; Preminger, Glenn M.; Qin, Jun
2016-04-01
Distance-driven projection is a state-of-the-art method used for reconstruction in x-ray imaging techniques. C-arm tomography is an x-ray imaging technique that provides three-dimensional information about the object by moving the C-shaped gantry around the patient. With a limited view angle, the C-arm system was investigated to generate volumetric data of the object with low radiation dosage and examination time. This paper is a new simulation study with two reconstruction methods based on distance-driven projection: the simultaneous algebraic reconstruction technique (SART) and maximum likelihood expectation maximization (MLEM). The distance-driven method is efficient, with low computational cost and fewer artifacts compared with other methods such as ray-driven and pixel-driven approaches. Projection images of spherical objects were simulated with a virtual C-arm system with a total view angle of 40 degrees. Results show the ability of the limited-angle C-arm technique to generate three-dimensional images with distance-driven reconstruction.
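The MLEM update named above can be sketched on a tiny synthetic system, with a random matrix standing in for the distance-driven C-arm projector:

```python
import numpy as np

# MLEM sketch: recover a nonnegative image x from projections p = A x
# using the multiplicative EM update  x <- x * A^T(p / Ax) / (A^T 1).
# A random positive matrix stands in for the distance-driven projector.
rng = np.random.default_rng(3)
n_pix, n_rays = 16, 64
A = rng.uniform(0.0, 1.0, (n_rays, n_pix))
x_true = rng.uniform(0.5, 2.0, n_pix)
p = A @ x_true                      # noise-free projections

x = np.ones(n_pix)                  # uniform initial image
sens = A.T @ np.ones(n_rays)        # sensitivity image A^T 1
for _ in range(10000):
    x *= (A.T @ (p / (A @ x))) / sens

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
```

The multiplicative form keeps every pixel nonnegative, which is why MLEM is attractive for emission/transmission reconstruction.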
Active Vibration damping of Smart composite beams based on system identification technique
NASA Astrophysics Data System (ADS)
Bendine, Kouider; Satla, Zouaoui; Boukhoulda, Farouk Benallel; Nouari, Mohammed
2018-03-01
In the present paper, the active vibration control of a composite beam using a piezoelectric actuator is investigated. The state-space equation is determined using a system identification technique based on the structure's input-output response provided by the ANSYS APDL finite element package. The Linear Quadratic Gaussian (LQG) control law is designed and integrated into ANSYS APDL to perform closed-loop simulations. Numerical examples for different types of excitation loads are presented to test the efficiency and the accuracy of the proposed model.
HEAVY AND THERMAL OIL RECOVERY PRODUCTION MECHANISMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anthony R. Kovscek
2003-04-01
This technical progress report describes work performed from January 1 through March 31, 2003 for the project ''Heavy and Thermal Oil Recovery Production Mechanisms,'' DE-FC26-00BC15311. In this project, a broad spectrum of research is undertaken related to thermal and heavy-oil recovery. The research tools and techniques span from pore-level imaging of multiphase fluid flow to definition of reservoir-scale features through streamline-based history matching techniques. During this period, previous analysis of experimental data regarding multidimensional imbibition to obtain shape factors appropriate for dual-porosity simulation was verified by comparison among analytic, dual-porosity simulation, and fine-grid simulation. We continued to study the mechanisms by which oil is produced from fractured porous media at high pressure and high temperature. Temperature has a beneficial effect on recovery and reduces residual oil saturation. A new experiment was conducted on diatomite core. Significantly, we show that elevated temperature induces fines release in sandstone cores and this behavior may be linked to wettability. Our work in the area of primary production of heavy oil continues with field cores and crude oil. On the topic of reservoir definition, work continued on developing techniques that integrate production history into reservoir models using streamline-based properties.
Stabilized finite element methods to simulate the conductances of ion channels
NASA Astrophysics Data System (ADS)
Tu, Bin; Xie, Yan; Zhang, Linbo; Lu, Benzhuo
2015-03-01
We have previously developed a finite element simulator, ichannel, to simulate ion transport through three-dimensional ion channel systems via solving the Poisson-Nernst-Planck equations (PNP) and Size-modified Poisson-Nernst-Planck equations (SMPNP), and succeeded in simulating some ion channel systems. However, the iterative solution between the coupled Poisson equation and the Nernst-Planck equations has difficulty converging for some large systems. One reason we found is that the NP equations are advection-dominated diffusion equations, which causes difficulties for the usual FE solution. Stabilized schemes have been applied to compute fluid flow in various research fields. However, they have not been studied in the simulation of ion transport through three-dimensional models based on experimentally determined ion channel structures. In this paper, two stabilized techniques, the SUPG and the Pseudo Residual-Free Bubble function (PRFB), are introduced to enhance the numerical robustness and convergence performance of the finite element algorithm in ichannel. The conductances of the voltage dependent anion channel (VDAC) and the anthrax toxin protective antigen pore (PA) are simulated to validate the stabilization techniques. The two stabilized schemes give reasonable results for both proteins, with decent agreement with both experimental data and Brownian dynamics (BD) simulations. For a variety of numerical tests, it is found that the simulator effectively avoids the previous numerical instability after introducing the stabilization methods. Comparison based on our test data set between the two stabilized schemes indicates that SUPG and PRFB have similar performance (the latter is slightly more accurate and stable), while SUPG is relatively more convenient to implement.
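The effect of SUPG stabilization on an advection-dominated equation can be sketched in 1-D, where adding the classic streamline diffusion u²·tau turns an oscillating central scheme into a monotone one; this toy problem only illustrates the mechanism, not the paper's 3-D PNP solver:

```python
import numpy as np

# SUPG sketch for the 1-D model problem  u c' - D c'' = 0, c(0)=0, c(1)=1.
# At high element Peclet number the plain central/Galerkin scheme
# oscillates; SUPG is equivalent here to adding streamline diffusion
# u^2 * tau with the classic optimal tau.
def solve(D_eff, u=1.0, N=41):
    h = 1.0 / (N - 1)
    A = np.zeros((N, N)); b = np.zeros(N)
    for i in range(1, N - 1):
        A[i, i-1] = -D_eff/h**2 - u/(2*h)
        A[i, i]   =  2*D_eff/h**2
        A[i, i+1] = -D_eff/h**2 + u/(2*h)
    A[0, 0] = A[-1, -1] = 1.0          # Dirichlet boundary rows
    b[-1] = 1.0
    return np.linalg.solve(A, b)

u, D, N = 1.0, 1e-3, 41
h = 1.0 / (N - 1)
Pe = u * h / (2 * D)                           # element Peclet number >> 1
tau = h / (2*u) * (1.0/np.tanh(Pe) - 1.0/Pe)   # classic SUPG parameter
c_galerkin = solve(D)                          # oscillatory
c_supg = solve(D + u*u*tau)                    # monotone
```

The stabilized solution stays within the physical bounds [0, 1], which is exactly the robustness the paper needs for channel concentrations.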
A three-dimensional wide-angle BPM for optical waveguide structures.
Ma, Changbao; Van Keuren, Edward
2007-01-22
Algorithms for effective modeling of optical propagation in three- dimensional waveguide structures are critical for the design of photonic devices. We present a three-dimensional (3-D) wide-angle beam propagation method (WA-BPM) using Hoekstra's scheme. A sparse matrix algebraic equation is formed and solved using iterative methods. The applicability, accuracy and effectiveness of our method are demonstrated by applying it to simulations of wide-angle beam propagation, along with a technique for shifting the simulation window to reduce the dimension of the numerical equation and a threshold technique to further ensure its convergence. These techniques can ensure the implementation of iterative methods for waveguide structures by relaxing the convergence problem, which will further enable us to develop higher-order 3-D WA-BPMs based on Padé approximant operators.
Simulation of tunneling construction methods of the Cisumdawu toll road
NASA Astrophysics Data System (ADS)
Abduh, Muhamad; Sukardi, Sapto Nugroho; Ola, Muhammad Rusdian La; Ariesty, Anita; Wirahadikusumah, Reini D.
2017-11-01
Simulation can be used as a tool for planning and analysis of a construction method. Using simulation techniques, a contractor can optimally design the resources associated with a construction method and compare it to other methods based on several criteria, such as productivity, waste, and cost. This paper discusses the use of simulation of the Norwegian Method of Tunneling (NMT) for a 472-meter tunneling work in the Cisumdawu Toll Road project. Primary and secondary data were collected to provide useful information for the simulation as well as problems that may be faced by the contractor. The method was modelled using CYCLONE and then simulated using WebCYCLONE. The simulation could show the duration of the project from the duration model of each work task, based on a literature review, machine productivity, and several assumptions. The results of the simulation could also show the total cost of the project, modeled from construction and building unit-cost journals and the online websites of local and international suppliers. The advantages and disadvantages of the method were analyzed based on its productivity, waste, and cost. The simulation concluded that the total cost of this operation is about Rp. 900,437,004,599 and the total duration of the tunneling operation is 653 days. The results of the simulation will be used as a recommendation to the contractor before the implementation of the already selected tunneling operation.
GPS Based Spacecraft Attitude Determination
1993-09-30
AD-A271 734. GPS Based Spacecraft Attitude Determination: Final Report for October 1992-September 1993, prepared for the Naval Research Laboratory. The report addresses the application of GPS-based attitude determination techniques to near-Earth spacecraft; the areas addressed include solution algorithms, spacecraft attitude and orbit determination, and simulation of the spacecraft.
NASA Astrophysics Data System (ADS)
Demir, I.
2014-12-01
Recent developments in internet technologies make it possible to manage and visualize large data on the web. Novel visualization techniques and interactive user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The hydrological simulation system is a web-based 3D interactive learning environment for teaching hydrological processes and concepts. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create or load predefined scenarios, control environmental parameters, and evaluate environmental mitigation alternatives. The web-based simulation system provides an environment for students to learn about hydrological processes (e.g., flooding and flood damage) and the effects of development and human activity in the floodplain. The system utilizes the latest web technologies and the graphics processing unit (GPU) for water simulation and object collisions on the terrain. Users can access the system in three visualization modes: virtual reality, augmented reality, and immersive reality using a heads-up display. The system provides various scenarios customized to fit the age and education level of various users. This presentation provides an overview of the web-based flood simulation system and demonstrates the capabilities of the system for various visualization and interaction modes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aziz, H. M. Abdul; Ukkusuri, Satish V.
2017-06-29
EPA-MOVES (Motor Vehicle Emission Simulator) is often integrated with traffic simulators to assess emission levels of large-scale urban networks with signalized intersections. High variations in speed profiles exist in the context of congested urban networks with signalized intersections. The traditional average-speed-based emission estimation technique with EPA-MOVES executes quickly but underestimates emissions in most cases because it ignores the speed variation in congested networks with signalized intersections. In contrast, the atomic second-by-second speed profile (i.e., the trajectory of each vehicle)-based technique provides accurate emissions at the cost of excessive computational power and time. We addressed this issue by developing a novel method to determine the link-driving-schedules (LDSs) for the EPA-MOVES tool. Our research developed a hierarchical clustering technique with dynamic time warping similarity measures (HC-DTW) to find the LDS for EPA-MOVES, capable of producing emission estimates better than the average-speed-based technique with execution time faster than the atomic speed profile approach. We applied HC-DTW to sample data from a signalized corridor and found that it can significantly reduce computational time without compromising accuracy. The developed technique can substantially contribute to the EPA-MOVES-based emission estimation process for large-scale urban transportation networks by reducing the computational time with reasonably accurate estimates. The method is particularly appropriate for networks with high speed variation, such as signalized intersections. Experimental results show error differences ranging from 2% to 8% for most pollutants except PM10.
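The dynamic-time-warping distance at the core of HC-DTW can be sketched in a few lines on synthetic speed profiles (the hierarchical clustering step is reduced here to simply comparing distances):

```python
import numpy as np

# DTW sketch: the classic dynamic-programming recurrence on the
# cumulative cost matrix. Synthetic speed profiles stand in for the
# vehicle trajectories clustered by HC-DTW.
def dtw(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i-1] - b[j-1])
            D[i, j] = cost + min(D[i-1, j], D[i, j-1], D[i-1, j-1])
    return D[n, m]

t = np.linspace(0.0, 1.0, 50)
cruise  = 15.0 + 0.0 * t                               # steady cruise (m/s)
cruise2 = 15.0 + 0.3 * np.sin(6.0 * t)                 # near-identical cruise
stop_go = 15.0 * np.clip(np.sin(3*np.pi*t), 0, None)   # stop-and-go at a signal

d_same = dtw(cruise, cruise2)   # small: same driving pattern
d_diff = dtw(cruise, stop_go)   # large: different pattern
```

Profiles with the same driving pattern end up close under DTW even when shifted in time, which is what lets HC-DTW group trajectories into a few representative link driving schedules.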
Crowdsourcing: A Primer and Its implications for Systems Engineering
2012-08-01
detailing areas to be improved within current crowdsourcing frameworks. Finally, an agent-based simulation using machine learning techniques is defined, preliminary results are presented, and future research directions are described.
Study on the variable cycle engine modeling techniques based on the component method
NASA Astrophysics Data System (ADS)
Zhang, Lihua; Xue, Hui; Bao, Yuhai; Li, Jijun; Yan, Lan
2016-01-01
Based on the structural platform of the gas turbine engine, the components of a variable cycle engine were simulated using the component method. The mathematical model of nonlinear equations corresponding to each component of the gas turbine engine was established. Based on Matlab programming, the nonlinear equations were solved using a Newton-Raphson steady-state algorithm, and the performance of the engine components was calculated. The numerical simulation results showed that the model built can describe the basic performance of the gas turbine engine, which verified the validity of the model.
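The Newton-Raphson component-matching step can be sketched on a two-equation toy system, with a finite-difference Jacobian standing in for the engine component models:

```python
import numpy as np

# Newton-Raphson sketch: drive two nonlinear residuals to zero with a
# finite-difference Jacobian. The residuals are a toy stand-in for the
# flow/work matching conditions between engine components.
def residuals(v):
    x, y = v
    return np.array([x**2 + y - 3.0,     # e.g. flow continuity residual
                     x + y**2 - 5.0])    # e.g. work balance residual

v = np.array([1.0, 1.0])                 # initial operating-point guess
eps = 1e-7
for _ in range(50):
    r = residuals(v)
    J = np.empty((2, 2))
    for j in range(2):                   # finite-difference Jacobian
        dv = v.copy(); dv[j] += eps
        J[:, j] = (residuals(dv) - r) / eps
    v = v - np.linalg.solve(J, r)        # Newton step

res_norm = np.linalg.norm(residuals(v))
```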
An improved simulation based biomechanical model to estimate static muscle loadings
NASA Technical Reports Server (NTRS)
Rajulu, Sudhakar L.; Marras, William S.; Woolford, Barbara
1991-01-01
The objectives of this study are to show that the characteristics of an intact muscle are different from those of an isolated muscle and to describe a simulation based model. This model, unlike the optimization based models, accounts for the redundancy in the musculoskeletal system in predicting the amount of forces generated within a muscle. The results of this study show that the loading of the primary muscle is increased by the presence of other muscle activities. Hence, the previous models based on optimization techniques may underestimate the severity of the muscle and joint loadings which occur during manual material handling tasks.
NASA Technical Reports Server (NTRS)
Woodward, W. A.; Gray, H. L.
1983-01-01
Efforts in support of the development of multicrop production monitoring capability are reported. In particular, segment level proportion estimation techniques based upon a mixture model were investigated. Efforts have dealt primarily with evaluation of current techniques and development of alternative ones. A comparison of techniques is provided on both simulated and LANDSAT data along with an analysis of the quality of profile variables obtained from LANDSAT data.
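Mixture-model proportion estimation can be sketched with an EM iteration; the component densities are assumed known here, a simplification of the segment-level crop-proportion problem in the report:

```python
import numpy as np

# EM sketch for mixture proportion estimation: pixel values come from
# two known class densities; only the mixing proportion is estimated.
# Class means (0 and 3, unit variance) are illustrative assumptions.
rng = np.random.default_rng(7)
true_p = 0.3
n = 5000
z = rng.random(n) < true_p
x = np.where(z, rng.normal(0.0, 1.0, n), rng.normal(3.0, 1.0, n))

def gauss(x, mu):
    return np.exp(-0.5 * (x - mu)**2) / np.sqrt(2 * np.pi)

g0, g1 = gauss(x, 0.0), gauss(x, 3.0)    # fixed class densities
p_est = 0.5                              # initial proportion guess
for _ in range(200):
    w = p_est * g0 / (p_est * g0 + (1 - p_est) * g1)   # E-step
    p_est = w.mean()                                   # M-step
```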
Optimal Sensor Management and Signal Processing for New EMI Systems
2010-09-01
adaptive techniques that would improve the speed of data collection and increase the mobility of a TEMTADS system. Although an active learning technique...data, SIG has simulated the active selection based on the data already collected at Camp SLO. In this setup, the active learning approach was constrained...to work only on a 5x5 grid (corresponding to twenty five transmitters and co-located receivers). The first technique assumes that active learning will
Multiscale simulation of molecular processes in cellular environments.
Chiricotto, Mara; Sterpone, Fabio; Derreumaux, Philippe; Melchionna, Simone
2016-11-13
We describe the recent advances in studying biological systems via multiscale simulations. Our scheme is based on a coarse-grained representation of the macromolecules and a mesoscopic description of the solvent. The dual technique handles particles, the aqueous solvent and their mutual exchange of forces, resulting in a stable and accurate methodology allowing biosystems of unprecedented size to be simulated. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2016 The Author(s).
Angeler, David G; Viedma, Olga; Moreno, José M
2009-11-01
Time lag analysis (TLA) is a distance-based approach used to study temporal dynamics of ecological communities by measuring community dissimilarity over increasing time lags. Despite its increased use in recent years, its performance in comparison with other more direct methods (i.e., canonical ordination) has not been evaluated. This study fills this gap using extensive simulations and real data sets from experimental temporary ponds (true zooplankton communities) and landscape studies (landscape categories as pseudo-communities) that differ in community structure and anthropogenic stress history. Modeling time with a principal coordinate of neighborhood matrices (PCNM) approach, the canonical ordination technique (redundancy analysis; RDA) consistently outperformed the other statistical tests (i.e., TLAs, Mantel test, and RDA based on linear time trends) using all real data. In addition, the RDA-PCNM revealed different patterns of temporal change, and the strength of each individual time pattern, in terms of adjusted variance explained, could be evaluated. It also identified species contributions to these patterns of temporal change. This additional information is not provided by distance-based methods. The simulation study revealed better Type I error properties of the canonical ordination techniques compared with the distance-based approaches when no deterministic component of change was imposed on the communities. The simulation also revealed that strong emphasis on uniform deterministic change and low variability at other temporal scales is needed to result in decreased statistical power of the RDA-PCNM approach relative to the other methods. Based on the statistical performance of and information content provided by RDA-PCNM models, this technique serves ecologists as a powerful tool for modeling temporal change of ecological (pseudo-) communities.
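The time-lag-analysis idea, regressing community dissimilarity on the square root of the time lag, can be sketched on synthetic data with an imposed directional trend (the species counts, trend strength, and Euclidean dissimilarity are all illustrative choices):

```python
import numpy as np

# TLA sketch: compute dissimilarity between all pairs of sampling dates,
# regress it on sqrt(lag); a positive slope indicates directional change.
rng = np.random.default_rng(11)
T, S = 30, 5                                  # 30 dates, 5 "species"
trend = np.outer(np.arange(T), rng.uniform(0.1, 0.3, S))
comm = trend + rng.normal(0.0, 0.5, (T, S))   # directional change + noise

lags, dissim = [], []
for lag in range(1, T):
    for t in range(T - lag):
        lags.append(np.sqrt(lag))
        dissim.append(np.linalg.norm(comm[t + lag] - comm[t]))

slope = np.polyfit(lags, dissim, 1)[0]        # > 0 for directional change
```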
High order discretization techniques for real-space ab initio simulations
NASA Astrophysics Data System (ADS)
Anderson, Christopher R.
2018-03-01
In this paper, we present discretization techniques to address numerical problems that arise when constructing ab initio approximations that use real-space computational grids. We present techniques to accommodate the singular nature of idealized nuclear and idealized electronic potentials, and we demonstrate the utility of using high-order accurate grid-based approximations to Poisson's equation in unbounded domains. To demonstrate the accuracy of these techniques, we present results for a Full Configuration Interaction computation of the dissociation of H2 using a computed, configuration-dependent, orbital basis set.
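As an illustration of the kind of high-order grid-based approximation discussed (not the paper's actual discretization), a fourth-order finite-difference second derivative can be checked against a function with a known Laplacian:

```python
import numpy as np

def d2_4th(f, h):
    """Fourth-order central-difference second derivative:
    f'' ~ (-f[i-2] + 16 f[i-1] - 30 f[i] + 16 f[i+1] - f[i+2]) / (12 h^2),
    with truncation error O(h^4)."""
    out = np.zeros_like(f)
    out[2:-2] = (-f[:-4] + 16 * f[1:-3] - 30 * f[2:-2]
                 + 16 * f[3:-1] - f[4:]) / (12 * h * h)
    return out

# Check against f = sin(2*pi*x), whose second derivative is -(2*pi)^2 f
x = np.linspace(0.0, 1.0, 201)
h = x[1] - x[0]
f = np.sin(2 * np.pi * x)
err = np.max(np.abs(d2_4th(f, h)[2:-2] + (2 * np.pi) ** 2 * f[2:-2]))
```

With 201 grid points the maximum interior error is below 1e-5, far smaller than a second-order stencil would achieve at the same resolution.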
DEPEND: A simulation-based environment for system level dependability analysis
NASA Technical Reports Server (NTRS)
Goswami, Kumar; Iyer, Ravishankar K.
1992-01-01
The design and evaluation of highly reliable computer systems is a complex issue. Designers mostly develop such systems based on prior knowledge and experience, and occasionally from analytical evaluations of simplified designs. A simulation-based environment called DEPEND, which is especially geared for the design and evaluation of fault-tolerant architectures, is presented. DEPEND is unique in that it exploits the properties of object-oriented programming to provide a flexible framework with which a user can rapidly model and evaluate various fault-tolerant systems. The key features of the DEPEND environment are described, and its capabilities are illustrated with a detailed analysis of a real design. In particular, DEPEND is used to simulate the Unix-based Tandem Integrity fault-tolerant system and to evaluate how well it handles near-coincident errors caused by correlated and latent faults. Issues such as memory scrubbing, re-integration policies, and workload-dependent repair times, which affect how the system handles near-coincident errors, are also evaluated. The method used by DEPEND to simulate error latency and the time-acceleration technique that provides enormous simulation speed-up are also discussed. Unlike other simulation-based dependability studies, the use of these approaches and the accuracy of the simulation model are validated by comparing the simulation results with measurements obtained from fault-injection experiments conducted on a production Tandem Integrity machine.
NASA Technical Reports Server (NTRS)
Salmasi, A. B. (Editor); Springett, J. C.; Sumida, J. T.; Richter, P. H.
1984-01-01
The design and implementation of the Land Mobile Satellite Service (LMSS) channel simulator as a facility for an end-to-end hardware simulation of the LMSS communications links, primarily with the mobile terminal, is described. A number of studies are reported which show the applications of the channel simulator as a facility for validating and assessing the LMSS design requirements and capabilities by performing quantitative measurements and qualitative audio evaluations for various link design parameters and channel impairments under simulated LMSS operating conditions. As a first application, the LMSS channel simulator was used in the evaluation of a system based on the voice processing and modulation (e.g., NBFM with 30 kHz of channel spacing and a 2 kHz rms frequency deviation for average talkers) selected for the Bell System's Advanced Mobile Phone Service (AMPS). The various details of the hardware design, the qualitative audio evaluation techniques, the signal-to-channel-impairment measurement techniques, the justification of the criteria for selecting the voice processing and modulation parameters, and the results of a number of parametric studies are further described.
Plasma Processing of Lunar Regolith Simulant for Diverse Applications
NASA Technical Reports Server (NTRS)
Schofield, Elizabeth C.; Sen, Subhayu; O'Dell, J. Scott
2008-01-01
Versatile manufacturing technologies for extracting resources from the moon are needed to support future space missions. Of particular interest is the production of gases and metals from lunar resources for life support, propulsion, and in-space fabrication. Deposits made from lunar regolith could yield highly emissive coatings and near-net-shaped parts for replacement or repair of critical components. Equally important is the development of high-fidelity lunar simulants for ground-based validation of potential lunar surface operations. Described herein is an innovative plasma processing technique for in-situ production of gases, metals, coatings, and deposits from lunar regolith, and for synthesis of a high-fidelity lunar simulant from the NASA-issued lunar simulant JSC-1. Initial plasma reduction trials of JSC-1 lunar simulant have indicated production of metallic iron and magnesium. Evolution of carbon monoxide has been detected subsequent to reduction of the simulant using the plasma process. Plasma processing of the simulant has also resulted in glassy phases resembling the volcanic glass and agglutinates found in lunar regolith. Complete and partial glassy-phase deposits have been obtained by varying the plasma process variables. Experimental techniques, product characterization, and process gas analysis will be discussed.
Physically Based Modeling and Simulation with Dynamic Spherical Volumetric Simplex Splines
Tan, Yunhao; Hua, Jing; Qin, Hong
2009-01-01
In this paper, we present a novel computational modeling and simulation framework based on dynamic spherical volumetric simplex splines. The framework can handle the modeling and simulation of genus-zero objects with real physical properties. In this framework, we first develop an accurate and efficient algorithm to reconstruct the high-fidelity digital model of a real-world object with spherical volumetric simplex splines, which can accurately represent the geometric, material, and other properties of the object simultaneously. With the tight coupling of Lagrangian mechanics, the dynamic volumetric simplex splines representing the object can accurately simulate its physical behavior because they unify the geometric and material properties in the simulation. The visualization can be directly computed from the object's geometric or physical representation based on the dynamic spherical volumetric simplex splines during simulation, without interpolation or resampling. We have applied the framework to biomechanical simulation of brain deformations, such as brain shift during surgery and brain injury under blunt impact. We have compared our simulation results with the ground truth obtained through intra-operative magnetic resonance imaging and with real biomechanical experiments. The evaluations demonstrate the excellent performance of our new technique. PMID:20161636
Mobility based key management technique for multicast security in mobile ad hoc networks.
Madhusudhanan, B; Chitra, S; Rajan, C
2015-01-01
In MANET multicasting, maintaining forward and backward secrecy results in an increased packet drop rate owing to mobility. Frequent rekeying causes large message overhead, which increases energy consumption and end-to-end delay. In particular, the prevailing group key management techniques cope poorly with frequent mobility and disconnections, so there is a need to design a multicast key management technique that overcomes these problems. In this paper, we propose a mobility-based key management technique for multicast security in MANETs. Initially, the nodes are categorized according to their stability index, which is estimated from link availability and mobility. A multicast tree is constructed such that every weak node has a strong parent node. A session-key-based encryption technique is used to transmit multicast data. The rekeying process is performed periodically by the initiator node, with the rekeying interval fixed according to the node category, so that the technique greatly minimizes rekeying overhead. Simulation results show that our proposed approach reduces the packet drop rate and improves data confidentiality.
ERIC Educational Resources Information Center
Davis, Laurie Laughlin; Pastor, Dena A.; Dodd, Barbara G.; Chiang, Claire; Fitzpatrick, Steven J.
2003-01-01
Examined the effectiveness of the Sympson-Hetter technique and rotated content balancing relative to no exposure control and no content rotation conditions in a computerized adaptive testing system based on the partial credit model. Simulation results show the Sympson-Hetter technique can be used with minimal impact on measurement precision,…
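A sketch of the Sympson-Hetter exposure-control step in item selection (the exposure parameters `k` below are hypothetical; in operational adaptive testing they are calibrated through iterative simulations so that no item exceeds a target exposure rate):

```python
import random

def sympson_hetter_select(ranked_items, k, rng):
    """Administer the most informative item that passes its exposure
    test; otherwise fall through to the next-best candidate.

    ranked_items: item ids ordered by information (best first).
    k: item id -> exposure-control probability in (0, 1].
    """
    for item in ranked_items:
        if rng.random() <= k.get(item, 1.0):
            return item
    return ranked_items[-1]  # all candidates failed: give the last one

# Hypothetical parameters: item "A" is heavily suppressed.
k = {"A": 0.2, "B": 0.9, "C": 1.0}
picks = [sympson_hetter_select(["A", "B", "C"], k, random.Random(s))
         for s in range(1000)]
```

Over many simulated selections, item "A" is administered far less often than its information ranking alone would dictate, which is the mechanism that trades a little measurement precision for controlled exposure.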
A technique for evaluating the application of the pin-level stuck-at fault model to VLSI circuits
NASA Technical Reports Server (NTRS)
Palumbo, Daniel L.; Finelli, George B.
1987-01-01
Accurate fault models are required to conduct the experiments defined in validation methodologies for highly reliable fault-tolerant computers (e.g., computers with a probability of failure of 10^-9 for a 10-hour mission). Described is a technique by which a researcher can evaluate the capability of the pin-level stuck-at fault model to simulate true error behavior symptoms in very large scale integrated (VLSI) digital circuits. The technique is based on a statistical comparison of the error behavior resulting from faults applied at the pins of, and internal to, a VLSI circuit. As an example application of the technique, the error behavior of a microprocessor simulation subjected to internal stuck-at faults is compared with the error behavior resulting from pin-level stuck-at faults. The error behavior is characterized by the time between errors and the duration of errors. Based on this example data, the pin-level stuck-at fault model is found to deliver less than ideal performance. However, with respect to the class of faults which cause a system crash, the pin-level stuck-at fault model is found to provide a good modeling capability.
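A sketch of the kind of statistical comparison involved, using a two-sample Kolmogorov-Smirnov statistic on invented time-between-errors samples (the paper's actual data came from fault-injection experiments, and its test procedure is not specified here):

```python
import numpy as np

def ks_statistic(a, b):
    """Two-sample KS statistic: maximum gap between empirical CDFs.
    A large value suggests the two error-behavior distributions differ."""
    data = np.concatenate([a, b])
    cdf_a = np.searchsorted(np.sort(a), data, side="right") / len(a)
    cdf_b = np.searchsorted(np.sort(b), data, side="right") / len(b)
    return np.max(np.abs(cdf_a - cdf_b))

rng = np.random.default_rng(1)
# Invented time-between-errors samples (in cycles): pin-level faults
# vs. faults injected internal to the circuit.
tbe_pin = rng.exponential(120.0, 500)
tbe_internal = rng.exponential(100.0, 500)
same = ks_statistic(tbe_pin, tbe_pin)      # identical samples -> 0
gap = ks_statistic(tbe_pin, tbe_internal)  # nonzero gap between models
```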
Gupta, Jasmine; Nunes, Cletus; Vyas, Shyam; Jonnalagadda, Sriramakamal
2011-03-10
The objectives of this study were (i) to develop a computational model based on the molecular dynamics technique to predict the miscibility of indomethacin in carriers (polyethylene oxide, glucose, and sucrose) and (ii) to experimentally verify the in silico predictions by characterizing the drug-carrier mixtures using thermoanalytical techniques. Molecular dynamics (MD) simulations were performed using the COMPASS force field, and the cohesive energy density and the solubility parameters were determined for the model compounds. The magnitude of the difference in the solubility parameters of drug and carrier is indicative of their miscibility. The MD simulations predicted indomethacin to be miscible with polyethylene oxide, borderline miscible with sucrose, and immiscible with glucose. The solubility parameter values obtained from the MD simulations were in reasonable agreement with those calculated using group contribution methods. Differential scanning calorimetry showed melting point depression of polyethylene oxide with increasing levels of indomethacin, accompanied by peak broadening, confirming miscibility. In contrast, thermal analysis of blends of indomethacin with sucrose and glucose verified general immiscibility. The findings demonstrate that molecular modeling is a powerful technique for determining solubility parameters and predicting the miscibility of pharmaceutical compounds. © 2011 American Chemical Society
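A sketch of the solubility-parameter comparison underlying this kind of miscibility prediction. The cohesive energy density (CED) values are hypothetical stand-ins for force-field outputs, and the 7/10 MPa^0.5 cut-offs are a common rule of thumb, not necessarily the thresholds used in this study:

```python
import math

def solubility_parameter(ced_mpa):
    """Hildebrand solubility parameter (MPa^0.5) from cohesive energy
    density (MPa): delta = sqrt(CED)."""
    return math.sqrt(ced_mpa)

def miscibility(delta_drug, delta_carrier):
    """Rule of thumb: |d1 - d2| < 7 MPa^0.5 suggests miscibility,
    > 10 MPa^0.5 suggests immiscibility; in between is borderline."""
    gap = abs(delta_drug - delta_carrier)
    if gap < 7.0:
        return "likely miscible"
    if gap > 10.0:
        return "likely immiscible"
    return "borderline"

# Hypothetical CED values (MPa), not the study's COMPASS results.
d_indo = solubility_parameter(484.0)  # drug
d_peo = solubility_parameter(412.0)   # carrier
verdict = miscibility(d_indo, d_peo)
```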
Kwon, Ohin; Woo, Eung Je; Yoon, Jeong-Rock; Seo, Jin Keun
2002-02-01
We developed a new image reconstruction algorithm for magnetic resonance electrical impedance tomography (MREIT). MREIT is a new EIT imaging technique integrated into a magnetic resonance imaging (MRI) system. Based on the assumption that the internal current density distribution is obtained using an MRI technique, the new image reconstruction algorithm, called the J-substitution algorithm, produces cross-sectional static images of resistivity (or conductivity) distributions. Computer simulations show that the spatial resolution of the resistivity image is comparable to that of MRI. MREIT provides accurate high-resolution cross-sectional resistivity images, making resistivity values of various human tissues available for many biomedical applications.
Compressive self-interference Fresnel digital holography with faithful reconstruction
NASA Astrophysics Data System (ADS)
Wan, Yuhong; Man, Tianlong; Han, Ying; Zhou, Hongqiang; Wang, Dayong
2017-05-01
We developed a compressive self-interference digital holographic approach that allows retrieving three-dimensional information of spatially incoherent objects from a single-shot captured hologram. Fresnel incoherent correlation holography is combined with a parallel phase-shifting technique to instantaneously obtain spatially multiplexed phase-shifting holograms. The recording scheme is regarded as a compressive forward sensing model; thus, a compressive-sensing-based reconstruction algorithm is implemented to reconstruct the original object from the undersampled demultiplexed sub-holograms. The concept was verified by simulations and by experiments with a simulated polarizer array. The proposed technique has great potential to be applied in 3D tracking of spatially incoherent samples.
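The abstract does not name the reconstruction algorithm; as a generic stand-in, recovery of a sparse signal from undersampled linear measurements is often performed by iterative soft-thresholding (ISTA), sketched here on a synthetic problem rather than on actual hologram data:

```python
import numpy as np

def ista(A, y, lam=0.01, steps=500):
    """Iterative soft-thresholding for the sparse recovery problem
    min_x 0.5*||A x - y||^2 + lam*||x||_1, a generic compressive-
    sensing reconstruction (not the paper's specific algorithm)."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        g = x - (A.T @ (A @ x - y)) / L        # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # shrink
    return x

rng = np.random.default_rng(5)
A = rng.normal(size=(40, 100)) / np.sqrt(40)   # 40 measurements, 100 unknowns
x_true = np.zeros(100)
x_true[[7, 30, 77]] = [1.5, -2.0, 1.0]         # 3-sparse ground truth
x_hat = ista(A, A @ x_true)
```

Despite observing only 40 of 100 coefficients, the sparse structure lets the iteration locate the dominant components.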
Recommender engine for continuous-time quantum Monte Carlo methods
NASA Astrophysics Data System (ADS)
Huang, Li; Yang, Yi-feng; Wang, Lei
2017-03-01
Recommender systems play an essential role in the modern business world. They recommend favorable items such as books, movies, and search queries to users based on their past preferences. Applying similar ideas and techniques to Monte Carlo simulations of physical systems boosts their efficiency without sacrificing accuracy. Exploiting the quantum to classical mapping inherent in the continuous-time quantum Monte Carlo methods, we construct a classical molecular gas model to reproduce the quantum distributions. We then utilize powerful molecular simulation techniques to propose efficient quantum Monte Carlo updates. The recommender engine approach provides a general way to speed up the quantum impurity solvers.
NASA Astrophysics Data System (ADS)
Calderisi, Marco; Ulrici, Alessandro; Pigani, Laura; Secchi, Alberto; Seeber, Renato
2012-09-01
The EU FP7 project CUSTOM (Drugs and Precursor Sensing by Complementing Low Cost Multiple Techniques) aims at developing a new sensing system for the detection of drug precursors in gaseous samples, which includes an External Cavity-Quantum Cascade Laser Photo-Acoustic Sensor (EC-QCLPAS) that is in the final step of realisation. Thus, a simulation based on FT-IR literature spectra has been carried out, where the development of a proper strategy for designing the composition of the environment, as realistic and representative of different scenarios as possible, is of key importance. To this aim, an approach based on the combination of signal processing and experimental design techniques has been developed. The gaseous mixtures were built by adding the four considered drug precursor (target) species to the gases typically found in the atmosphere, taking also into account possible interfering species. These last chemicals were selected considering customs environments (20 interfering chemical species), whose concentrations were inferred from literature data. The spectra were first denoised by means of a Fast Wavelet Transform-based algorithm; then, a procedure based on a sigmoidal transfer function was developed to multiply the pure-component spectra by the respective concentration values, in a way that correctly preserves background intensity and shape and operates only on the absorption bands. The noise structure of the EC-QCLPAS was studied using sample spectra measured with a prototype instrument, and added to the simulated mixtures. Finally, a matrix containing 5000 simulated spectra of gaseous mixtures was built up.
Conditional Random Field-Based Offline Map Matching for Indoor Environments
Bataineh, Safaa; Bahillo, Alfonso; Díez, Luis Enrique; Onieva, Enrique; Bataineh, Ikram
2016-01-01
In this paper, we present an offline map matching technique designed for indoor localization systems based on conditional random fields (CRF). The proposed algorithm can refine the results of existing indoor localization systems and match them with the map, using loose coupling between the existing localization system and the proposed map matching technique. The purpose of this research is to investigate the efficiency of using the CRF technique in offline map matching problems for different scenarios and parameters. The algorithm was applied to several real and simulated trajectories of different lengths. The results were then refined and matched with the map using the CRF algorithm. PMID:27537892
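For a linear-chain CRF, the offline map-matching decoding step reduces to Viterbi search over segment log-potentials; a minimal sketch follows (the unary and transition potentials below are invented toy values, not the paper's feature functions):

```python
import numpy as np

def viterbi_match(unary, transition):
    """Most likely map-segment sequence given log-potentials.

    unary: (T, S) log-scores of each of S map segments per time step
           (e.g., negative distance from the position fix to the segment).
    transition: (S, S) log-scores of moving between segments.
    Returns the decoded segment index at each time step.
    """
    T, S = unary.shape
    score = unary[0].copy()
    back = np.zeros((T, S), dtype=int)
    for t in range(1, T):
        cand = score[:, None] + transition        # cand[i, j]: via i to j
        back[t] = np.argmax(cand, axis=0)         # best predecessor of j
        score = cand[back[t], np.arange(S)] + unary[t]
    path = [int(np.argmax(score))]
    for t in range(T - 1, 0, -1):                 # backtrack
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Toy trajectory over two segments: fixes favor 0, 0, 1, 1 and
# transitions favor staying on the same segment.
unary = np.log(np.array([[0.9, 0.1], [0.8, 0.2], [0.3, 0.7], [0.2, 0.8]]))
transition = np.log(np.array([[0.8, 0.2], [0.2, 0.8]]))
path = viterbi_match(unary, transition)
```

The transition term is what lets the matcher smooth over noisy individual fixes, which is the loose-coupling refinement role described above.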
Advanced Navigation Strategies For Asteroid Sample Return Missions
NASA Technical Reports Server (NTRS)
Getzandanner, K.; Bauman, J.; Williams, B.; Carpenter, J.
2010-01-01
Flyby and rendezvous missions to asteroids have been accomplished using navigation techniques derived from experience gained in planetary exploration. This paper presents analysis of advanced navigation techniques required to meet unique challenges for precision navigation to acquire a sample from an asteroid and return it to Earth. These techniques rely on tracking data types such as spacecraft-based laser ranging and optical landmark tracking in addition to the traditional Earth-based Deep Space Network radio metric tracking. A systematic study of navigation strategy, including the navigation event timeline and reduction in spacecraft-asteroid relative errors, has been performed using simulation and covariance analysis on a representative mission.
NASA Astrophysics Data System (ADS)
Ghosh, B.; Hazra, S.; Haldar, N.; Roy, D.; Patra, S. N.; Swarnakar, J.; Sarkar, P. P.; Mukhopadhyay, S.
2018-03-01
Over the last few decades, optics has proved its strong potential for conducting parallel logic, arithmetic, and algebraic operations, owing to its super-fast speed in communication and computation. Many different logical and sequential operations using the all-optical frequency encoding technique have been proposed by several authors. Here, we have adopted the all-optical dibit representation technique, which has the advantages of high-speed operation as well as reducing the bit-error problem. Exploiting this approach, we propose all-optical frequency-encoded dibit-based XOR and XNOR logic gates using optical switches such as the add/drop multiplexer (ADM) and the reflective semiconductor optical amplifier (RSOA). The operation of these gates has been verified through simulation using MATLAB (R2008a).
A New Quantum Watermarking Based on Quantum Wavelet Transforms
NASA Astrophysics Data System (ADS)
Heidari, Shahrokh; Naseri, Mosayeb; Gheibi, Reza; Baghfalaki, Masoud; Rasoul Pourarian, Mohammad; Farouk, Ahmed
2017-06-01
Quantum watermarking is a technique to embed specific information, usually the owner's identification, into quantum cover data, typically for copyright protection purposes. In this paper, a new scheme for quantum watermarking based on quantum wavelet transforms is proposed, which includes scrambling, embedding, and extracting procedures. The invisibility and robustness of the proposed watermarking method are confirmed by simulation. The invisibility of the scheme is examined by the peak signal-to-noise ratio (PSNR) and the histogram calculation. Furthermore, the robustness of the scheme is analyzed by the Bit Error Rate (BER) and the Two-Dimensional Correlation (Corr 2-D) calculation. The simulation results indicate that the proposed watermarking scheme offers not only acceptable visual quality but also good resistance against different types of attack. Supported by Kermanshah Branch, Islamic Azad University, Kermanshah, Iran
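The two evaluation metrics named above are standard in (classical) watermarking as well; minimal classical versions, shown here purely to make the metrics concrete rather than to reproduce the quantum scheme:

```python
import numpy as np

def psnr(cover, watermarked, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means the embedded
    watermark is less visible."""
    mse = np.mean((np.asarray(cover, float) - np.asarray(watermarked, float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def bit_error_rate(sent_bits, recovered_bits):
    """Fraction of watermark bits flipped after an attack; lower is
    more robust."""
    return float(np.mean(np.asarray(sent_bits) != np.asarray(recovered_bits)))

# Toy example: every pixel perturbed by 1 gray level -> MSE = 1.
cover = np.zeros((8, 8))
score = psnr(cover, cover + 1.0)
ber = bit_error_rate([1, 0, 1, 0], [1, 0, 0, 0])
```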
Image-Based Reconstruction and Analysis of Dynamic Scenes in a Landslide Simulation Facility
NASA Astrophysics Data System (ADS)
Scaioni, M.; Crippa, J.; Longoni, L.; Papini, M.; Zanzi, L.
2017-12-01
The application of image processing and photogrammetric techniques to the dynamic reconstruction of landslide simulations in a scaled-down facility is described. The simulations are also used for active-learning purposes: they help students understand how physical processes happen and which kinds of observations may be obtained from a sensor network. In particular, the use of digital images to obtain multi-temporal information is presented. On one side, using a multi-view sensor setup based on four synchronized GoPro 4 Black® cameras, a 4D (3D spatial position plus time) reconstruction of the dynamic scene is obtained through the composition of several 3D models derived from dense image matching. The final textured 4D model allows one to revisit a completed experiment at any time in a dynamic and interactive mode. On the other side, a digital image correlation (DIC) technique has been used to track surface point displacements in the image sequence obtained from the camera in front of the simulation facility. While the 4D model provides a qualitative description and documentation of the running experiment, the DIC analysis outputs quantitative information, such as local point displacements and velocities, to be related to physical processes and to other observations. All the hardware and software adopted for the photogrammetric reconstruction has been based on low-cost and open-source solutions.
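The core of the DIC step can be sketched as tracking a small subset between frames by maximising zero-normalised cross-correlation over candidate shifts (integer-pixel only here; real DIC adds sub-pixel interpolation, and the data below are synthetic):

```python
import numpy as np

def track_subset(ref, cur, top, left, size, search=5):
    """Track one square subset between two frames by maximising
    zero-normalised cross-correlation over integer shifts."""
    tpl = ref[top:top + size, left:left + size].astype(float)
    tpl = tpl - tpl.mean()
    best, best_shift = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            win = cur[top + dy:top + dy + size,
                      left + dx:left + dx + size].astype(float)
            win = win - win.mean()
            denom = np.sqrt((tpl ** 2).sum() * (win ** 2).sum())
            if denom == 0:
                continue
            score = (tpl * win).sum() / denom
            if score > best:
                best, best_shift = score, (dy, dx)
    return best_shift  # (row, col) displacement in pixels

# Synthetic check: shift a random texture by (2, 1) pixels.
rng = np.random.default_rng(3)
ref = rng.random((40, 40))
cur = np.roll(np.roll(ref, 2, axis=0), 1, axis=1)
shift = track_subset(ref, cur, top=10, left=10, size=12)
```

Dividing the recovered displacement by the frame interval gives the local surface velocity referred to in the abstract.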
NASA Astrophysics Data System (ADS)
Zarifi, Keyvan; Gershman, Alex B.
2006-12-01
We analyze the performance of two popular blind subspace-based signature waveform estimation techniques proposed by Wang and Poor and Buzzi and Poor for direct-sequence code division multiple-access (DS-CDMA) systems with unknown correlated noise. Using the first-order perturbation theory, analytical expressions for the mean-square error (MSE) of these algorithms are derived. We also obtain simple high SNR approximations of the MSE expressions which explicitly clarify how the performance of these techniques depends on the environmental parameters and how it is related to that of the conventional techniques that are based on the standard white noise assumption. Numerical examples further verify the consistency of the obtained analytical results with simulation results.
NASA Astrophysics Data System (ADS)
Gorthi, Sai Siva; Rajshekhar, Gannavarpu; Rastogi, Pramod
2010-06-01
Recently, a high-order instantaneous moments (HIM)-operator-based method was proposed for accurate phase estimation in digital holographic interferometry. The method relies on piece-wise polynomial approximation of phase and subsequent evaluation of the polynomial coefficients from the HIM operator using single-tone frequency estimation. The work presents a comparative analysis of the performance of different single-tone frequency estimation techniques, like Fourier transform followed by optimization, estimation of signal parameters by rotational invariance technique (ESPRIT), multiple signal classification (MUSIC), and iterative frequency estimation by interpolation on Fourier coefficients (IFEIF) in HIM-operator-based methods for phase estimation. Simulation and experimental results demonstrate the potential of the IFEIF technique with respect to computational efficiency and estimation accuracy.
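A sketch of single-tone frequency estimation by interpolation on the Fourier coefficients around the spectral peak, in the spirit of (though not identical to) the IFEIF-style estimators compared above; this uses the three-coefficient Jacobsen correction on a synthetic tone:

```python
import numpy as np

def estimate_tone(signal, fs):
    """Estimate the frequency of a single tone: coarse FFT peak plus a
    fractional-bin correction from the neighbouring DFT coefficients
    (Jacobsen's estimator)."""
    n = len(signal)
    spec = np.fft.fft(signal)
    half = spec[: n // 2]
    k = int(np.argmax(np.abs(half[1:]))) + 1   # coarse peak bin, skip DC
    # Fractional offset from the three coefficients around the peak:
    num = spec[k - 1] - spec[k + 1]
    den = 2.0 * spec[k] - spec[k - 1] - spec[k + 1]
    delta = (num / den).real
    return (k + delta) * fs / n

fs, f0 = 1000.0, 123.4   # Hz; f0 deliberately off the FFT grid
t = np.arange(1024) / fs
f_hat = estimate_tone(np.cos(2 * np.pi * f0 * t), fs)
```

The FFT grid alone would quantise the estimate to about 0.98 Hz; the interpolation recovers the off-grid frequency to a small fraction of a bin, which is why such refinements matter for the phase-coefficient estimation discussed above.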
NASA Astrophysics Data System (ADS)
Khaki, M.; Hoteit, I.; Kuhn, M.; Awange, J.; Forootan, E.; van Dijk, A. I. J. M.; Schumacher, M.; Pattiaratchi, C.
2017-09-01
The time-variable terrestrial water storage (TWS) products from the Gravity Recovery And Climate Experiment (GRACE) have been increasingly used in recent years to improve the simulation of hydrological models by applying data assimilation techniques. In this study, for the first time, we assess the performance of the most popular sequential data assimilation techniques for integrating GRACE TWS into the World-Wide Water Resources Assessment (W3RA) model. We implement and test stochastic and deterministic ensemble-based Kalman filters (EnKF), as well as Particle filters (PF) using two different resampling approaches, Multinomial Resampling and Systematic Resampling. These choices provide various opportunities for weighting observations and model simulations during the assimilation and for accounting for error distributions. In particular, the deterministic EnKF is tested to avoid perturbing observations before assimilation (as is the case in an ordinary EnKF). Gaussian-based random updates in the EnKF approaches likely do not fully represent the statistical properties of the model simulations and TWS observations; therefore, the fully non-Gaussian PF is also applied to estimate more realistic updates. Monthly GRACE TWS are assimilated into W3RA covering the whole of Australia. To evaluate the filters' performance and analyze their impact on model simulations, their estimates are validated against independent in-situ measurements. Our results indicate that all implemented filters improve the estimation of water storage simulations of W3RA. The best results are obtained using two versions of the deterministic EnKF, the Square Root Analysis (SQRA) scheme and the Ensemble Square Root Filter (EnSRF), which improve the model's groundwater estimation errors by 34% and 31%, respectively, compared to a model run without assimilation. Applying the PF with Systematic Resampling decreases the model estimation error by 23%.
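A sketch of the stochastic EnKF analysis step mentioned above, on an invented two-variable storage state (the W3RA state, operators, and error statistics are far richer; everything numeric here is illustrative):

```python
import numpy as np

def enkf_update(ensemble, obs, H, obs_var, rng):
    """Stochastic EnKF analysis: perturb the observation for each
    member and apply the Kalman gain built from ensemble covariances."""
    anomalies = ensemble - ensemble.mean(axis=1, keepdims=True)
    n_ens = ensemble.shape[1]
    Hx = H @ ensemble
    Hanom = Hx - Hx.mean(axis=1, keepdims=True)
    P_hh = Hanom @ Hanom.T / (n_ens - 1) + obs_var * np.eye(len(obs))
    P_xh = anomalies @ Hanom.T / (n_ens - 1)
    K = P_xh @ np.linalg.inv(P_hh)              # Kalman gain
    perturbed = obs[:, None] + rng.normal(0.0, np.sqrt(obs_var),
                                          (len(obs), n_ens))
    return ensemble + K @ (perturbed - Hx)

rng = np.random.default_rng(7)
# Hypothetical 2-variable storage state, 50 members, observing variable 0.
ens = rng.normal([[10.0], [5.0]], 2.0, size=(2, 50))
H = np.array([[1.0, 0.0]])
analysis = enkf_update(ens, np.array([12.0]), H, obs_var=0.25, rng=rng)
```

The deterministic (square-root) variants reported as best in the study achieve the same mean update without the observation perturbation, avoiding the sampling noise this version introduces.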
NASA Astrophysics Data System (ADS)
Hajnayeb, Ali; Nikpour, Masood; Moradi, Shapour; Rossi, Gianluca
2018-02-01
The blade tip-timing (BTT) measurement technique is at present the most promising technique for monitoring the blades of axial turbines and aircraft engines in operating conditions. It is generally used as an alternative to strain gauges in turbine testing. Compared with standard methods such as those based on strain gauges, the technique is not intrusive and does not require a complicated installation process. Despite its superiority to other methods, the experimental performance analysis of a new BTT method needs a test stand that includes a reference measurement system (e.g. strain gauges equipped with telemetry, or other complex optical measurement systems such as rotating laser Doppler vibrometers). In this article, a new reliable, low-cost BTT test setup is proposed for simulating and analyzing blade vibrations based on kinematic inversion. In the proposed test bench, instead of the blades vibrating, it is the BTT sensor that vibrates. The vibration of the sensor is generated by a shaker and can therefore be easily controlled in terms of frequency, amplitude, and waveform shape. The amplitude of the vibration excitation is measured by a simple accelerometer. After introducing the components of the simulator, the proposed test bench is used in practice to simulate both synchronous and asynchronous vibration scenarios. Two BTT methods are then used to evaluate the quality of the acquired data. The results demonstrate that the proposed setup is able to generate simulated pulse sequences that are almost the same as those generated by conventional BTT systems installed around a bladed disk. Moreover, the test setup enables its users to evaluate BTT methods using a limited number of sensors, which significantly reduces the total cost of the experiments.
A New Computational Technique for the Generation of Optimised Aircraft Trajectories
NASA Astrophysics Data System (ADS)
Chircop, Kenneth; Gardi, Alessandro; Zammit-Mangion, David; Sabatini, Roberto
2017-12-01
A new computational technique based on Pseudospectral Discretisation (PSD) and adaptive bisection ɛ-constraint methods is proposed to solve multi-objective aircraft trajectory optimisation problems formulated as nonlinear optimal control problems. This technique is applicable to a variety of next-generation avionics and Air Traffic Management (ATM) Decision Support Systems (DSS) for strategic and tactical replanning operations. These include the future Flight Management Systems (FMS) and the 4-Dimensional Trajectory (4DT) planning and intent negotiation/validation tools envisaged by SESAR and NextGen for a global implementation. In particular, after describing the PSD method, the adaptive bisection ɛ-constraint method is presented to allow an efficient solution of problems in which two or multiple performance indices are to be minimized simultaneously. Initial simulation case studies were performed adopting suitable aircraft dynamics models and addressing a classical vertical trajectory optimisation problem with two objectives simultaneously. Subsequently, a more advanced 4DT simulation case study is presented with a focus on representative ATM optimisation objectives in the Terminal Manoeuvring Area (TMA). The simulation results are analysed in-depth and corroborated by flight performance analysis, supporting the validity of the proposed computational techniques.
NASA Technical Reports Server (NTRS)
Taylor, Brian R.; Ratnayake, Nalin A.
2010-01-01
As part of an effort to improve emissions, noise, and performance of next generation aircraft, it is expected that future aircraft will make use of distributed, multi-objective control effectors in a closed-loop flight control system. Correlation challenges associated with parameter estimation will arise with this expected aircraft configuration. Research presented in this paper focuses on addressing the correlation problem with an appropriate input design technique and validating this technique through simulation and flight test of the X-48B aircraft. The X-48B aircraft is an 8.5 percent-scale hybrid wing body aircraft demonstrator designed by The Boeing Company (Chicago, Illinois, USA), built by Cranfield Aerospace Limited (Cranfield, Bedford, United Kingdom) and flight tested at the National Aeronautics and Space Administration Dryden Flight Research Center (Edwards, California, USA). Based on data from flight test maneuvers performed at Dryden Flight Research Center, aerodynamic parameter estimation was performed using linear regression and output error techniques. An input design technique that uses temporal separation for de-correlation of control surfaces is proposed, and simulation and flight test results are compared with the aerodynamic database. This paper will present a method to determine individual control surface aerodynamic derivatives.
Detection and Tracking Based on a Dynamical Hierarchical Occupancy Map in Agent-Based Simulations
2008-09-01
Describes various techniques for targeting with probability reasoning. Advantages and disadvantages of the different methods are discussed. Such an algorithm could decrease the performance of the prototype itself and therefore was not considered.
Climate change streamflow scenarios designed for critical period water resources planning studies
NASA Astrophysics Data System (ADS)
Hamlet, A. F.; Snover, A. K.; Lettenmaier, D. P.
2003-04-01
Long-range water planning in the United States is usually conducted by individual water management agencies using a critical period planning exercise based on a particular period of the observed streamflow record and a suite of internally developed simulation tools representing the water system. In the context of planning for climate change, such an approach is flawed in that it assumes that the future climate will be like the historic record. Although more sophisticated planning methods will probably be required as time goes on, a short-term strategy for incorporating climate uncertainty into long-range water planning as soon as possible is to create alternate inputs to existing planning methods that account for climate uncertainty as it affects both supply and demand. We describe a straightforward technique for constructing streamflow scenarios based on the historic record that include the broad-based effects of changed regional climate simulated by several global climate models (GCMs). The streamflow scenarios are based on hydrologic simulations driven by historic climate data perturbed according to regional climate signals from four GCMs using the simple "delta" method. Further data processing then removes systematic hydrologic model bias using a quantile-based bias correction scheme, and lastly, the effects of random errors in the raw hydrologic simulations are removed. These techniques produce streamflow scenarios that are consistent in time and space with the historic streamflow record while incorporating fundamental changes in temperature and precipitation from the GCM scenarios. Planning model simulations based on these climate change streamflow scenarios can therefore be compared directly to planning model simulations based on the historic record of streamflows to help planners understand the potential impacts of climate uncertainty.
The methods are currently being tested and refined in two large-scale planning exercises currently being conducted in the Pacific Northwest (PNW) region of the US, and the resulting streamflow scenarios will be made freely available on the internet for a large number of sites in the PNW to help defray the costs of including climate change information in other studies.
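The quantile-based bias correction described above can be sketched compactly: each simulated value is replaced by the observed value at the same non-exceedance probability. For brevity this sketch applies the "delta" scaling directly to flows, whereas the paper applies it to the temperature and precipitation inputs of a hydrologic model; the distributions and the 10% change signal are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical monthly flows: an "observed" record and a biased simulation.
obs = rng.gamma(shape=2.0, scale=50.0, size=2000)
sim = 0.7 * rng.gamma(shape=2.0, scale=50.0, size=2000) + 10.0  # biased model

def quantile_map(x, sim_ref, obs_ref, n_q=101):
    """Quantile-based bias correction: map each simulated value to the
    observed value at the same non-exceedance probability."""
    qs = np.linspace(0.0, 1.0, n_q)
    sim_q = np.quantile(sim_ref, qs)   # simulated quantile function
    obs_q = np.quantile(obs_ref, qs)   # observed quantile function
    return np.interp(x, sim_q, obs_q)

# Bias-corrected baseline: the corrected flows inherit the observed distribution.
corrected = quantile_map(sim, sim, obs)

# "Delta"-style scenario: scale flows by an assumed GCM change signal,
# then pass them through the same bias-correction mapping.
delta = 0.9  # assumed 10% mean decrease
scenario = quantile_map(delta * sim, sim_ref=sim, obs_ref=obs)
```

Because the mapping is monotone, the scenario flows remain consistent with the observed record's distribution while carrying the imposed climate-change signal, which is the property the planning-model comparison relies on.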
Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics
NASA Technical Reports Server (NTRS)
Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)
2002-01-01
Numerical simulation has now become an integral part of the engineering design process. Critical design decisions are routinely made based on simulation results and conclusions. Verification and validation of the reliability of numerical simulation are therefore vitally important in the engineering design process. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of a numerical simulation by estimating the numerical approximation error, computational-model-induced errors, and the uncertainties contained in the mathematical models, so that the reliability of the numerical simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that its reliability can be improved.
NASA Astrophysics Data System (ADS)
Sembiring, L.; Van Ormondt, M.; Van Dongeren, A. R.; Roelvink, J. A.
2017-07-01
Rip currents are one of the most dangerous coastal hazards for swimmers. To minimize the risk, an operational, process-based coastal model system can be utilized to provide forecasts of the nearshore waves and currents that may endanger beach goers. In this paper, an operational model for rip current prediction utilizing nearshore bathymetry obtained from video imagery is demonstrated. For the nearshore-scale model, XBeach is used, with which tidal currents and wave-induced currents (including the effect of wave groups) can be simulated simultaneously. Up-to-date bathymetry will be obtained using the video-based cBathy technique. The system will be tested for the Egmond aan Zee beach, located in the northern part of the Dutch coastline. This paper will test the applicability of bathymetry obtained from the video technique as input for the numerical modelling system by comparing simulation results using surveyed bathymetry with model results using video bathymetry. Results show that the video technique is able to produce bathymetry converging towards the ground-truth observations. This bathymetry validation will be followed by an example of an operational forecasting type of simulation for predicting rip currents. Rip-current flow fields simulated over measured and modeled bathymetries are compared in order to assess the performance of the proposed forecast system.
NASA Astrophysics Data System (ADS)
Lin, Pei-Chun; Yu, Chun-Chang; Chen, Charlie Chung-Ping
2015-01-01
As one of the critical stages of a very large scale integration fabrication process, postexposure bake (PEB) plays a crucial role in determining the final three-dimensional (3-D) profiles and lessening the standing wave effects. However, full 3-D chemically amplified resist simulation is not widely adopted during postlayout optimization due to the long run-time and huge memory usage. An efficient method is proposed to simulate the PEB while considering standing wave effects and resolution enhancement techniques, such as source mask optimization and subresolution assist features, based on the Sylvester equation and the Abbe-principal component analysis method. Simulation results show that our algorithm is 20× faster than the conventional Gaussian convolution method.
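For context, the conventional baseline the paper benchmarks against treats acid diffusion during PEB as a Gaussian convolution of the acid concentration. The sketch below is only that baseline, applied separably (one 1-D pass per axis, the same structural property that matrix-equation formulations exploit); the grid size and diffusion length are arbitrary:

```python
import numpy as np

def gauss_kernel(sigma):
    """Normalised 1-D Gaussian kernel with 4-sigma support."""
    r = int(4 * sigma)
    x = np.arange(-r, r + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def peb_blur(acid, sigma):
    """Baseline PEB diffusion model: separable Gaussian convolution of the
    acid concentration, one pass along rows and one along columns."""
    k = gauss_kernel(sigma)
    rows = np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 1, acid)
    return np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 0, rows)

# Point source of acid in the middle of a small resist grid.
acid = np.zeros((65, 65))
acid[32, 32] = 1.0
blurred = peb_blur(acid, sigma=3.0)
```

The two 1-D passes conserve the total acid and produce the expected symmetric diffusion profile; a full resist simulator would iterate this diffusion step together with the amplification reaction.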
Pilot estimates of glidepath and aim point during simulated landing approaches
NASA Technical Reports Server (NTRS)
Acree, C. W., Jr.
1981-01-01
Pilot perceptions of glidepath angle and aim point were measured during simulated landings. A fixed-base cockpit simulator was used with video recordings of simulated landing approaches shown on a video projector. Pilots estimated the magnitudes of approach errors during observation without attempting to make corrections. Pilots estimated glidepath angular errors well, but had difficulty estimating aim-point errors. The data make plausible the hypothesis that pilots are little concerned with aim point during most of an approach, concentrating instead on keeping close to the nominal glidepath and trusting this technique to guide them to the proper touchdown point.
Evaporation kinetics of Mg2SiO4 crystals and melts from molecular dynamics simulations
NASA Technical Reports Server (NTRS)
Kubicki, J. D.; Stolper, E. M.
1993-01-01
Computer simulations based on the molecular dynamics (MD) technique were used to study the mechanisms and kinetics of free evaporation from crystalline and molten forsterite (i.e., Mg2SiO4) on an atomic level. The interatomic potential employed for these simulations reproduces the energetics of bonding in forsterite and in gas-phase MgO and SiO2 reasonably accurately. Results of the simulation include predicted evaporation rates, diffusion rates, and reaction mechanisms for Mg2SiO4(s or l) yields 2Mg(g) + 2O(g) + SiO2(g).
Machine intelligence and autonomy for aerospace systems
NASA Technical Reports Server (NTRS)
Heer, Ewald (Editor); Lum, Henry (Editor)
1988-01-01
The present volume discusses progress toward intelligent robot systems in aerospace applications, NASA Space Program automation and robotics efforts, the supervisory control of telerobotics in space, machine intelligence and crew/vehicle interfaces, expert-system terms and building tools, and knowledge-acquisition for autonomous systems. Also discussed are methods for validation of knowledge-based systems, a design methodology for knowledge-based management systems, knowledge-based simulation for aerospace systems, knowledge-based diagnosis, planning and scheduling methods in AI, the treatment of uncertainty in AI, vision-sensing techniques in aerospace applications, image-understanding techniques, tactile sensing for robots, distributed sensor integration, and the control of articulated and deformable space structures.
Innovative real CSF leak simulation model for rhinology training: human cadaveric design.
AlQahtani, Abdulaziz A; Albathi, Abeer A; Alhammad, Othman M; Alrabie, Abdulkarim S
2018-04-01
To study the feasibility of designing a human cadaveric simulation model of real CSF leak for rhinology training. The laboratory investigation took place at the surgical academic center of Prince Sultan Military Medical City between 2016 and 2017. Five human cadaveric head specimens were cannulated into the intradural space through two frontal bone holes. Fluorescein-dyed fluid was injected intracranially, then an endoscopic endonasal iatrogenic skull base defect was created with observation of fluid leak, followed by skull base reconstruction. The outcome measures included subjective assessment of the integrity of the design, the ability to create real CSF leak at multiple sites of the skull base, and the possibility of watertight closure by various surgical techniques. The fluid filled the intradural space in all specimens without spontaneous leak from the skull base or extra-sinus areas. We successfully demonstrated fluid leak from all areas after iatrogenic defects in the cribriform plate, fovea ethmoidalis, planum sphenoidale, sellar, and clival regions. Watertight closure was achieved in all defects using different reconstruction techniques (overlay, underlay, and gasket-seal closure). The design simulates the real patient with CSF leak. It has potential in the learning process of acquiring and maintaining the surgical skills of skull base reconstruction before direct involvement of the patient. This model needs further evaluation and competence measurement as a training tool in rhinology training.
egs_brachy: a versatile and fast Monte Carlo code for brachytherapy
NASA Astrophysics Data System (ADS)
Chamberland, Marc J. P.; Taylor, Randle E. P.; Rogers, D. W. O.; Thomson, Rowan M.
2016-12-01
egs_brachy is a versatile and fast Monte Carlo (MC) code for brachytherapy applications. It is based on the EGSnrc code system, enabling simulation of photons and electrons. Complex geometries are modelled using the EGSnrc C++ class library and egs_brachy includes a library of geometry models for many brachytherapy sources, in addition to eye plaques and applicators. Several simulation efficiency enhancing features are implemented in the code. egs_brachy is benchmarked by comparing TG-43 source parameters of three source models to previously published values. 3D dose distributions calculated with egs_brachy are also compared to ones obtained with the BrachyDose code. Well-defined simulations are used to characterize the effectiveness of many efficiency improving techniques, both as an indication of the usefulness of each technique and to find optimal strategies. Efficiencies and calculation times are characterized through single source simulations and simulations of idealized and typical treatments using various efficiency improving techniques. In general, egs_brachy shows agreement within uncertainties with previously published TG-43 source parameter values. 3D dose distributions from egs_brachy and BrachyDose agree at the sub-percent level. Efficiencies vary with radionuclide and source type, number of sources, phantom media, and voxel size. The combined effects of efficiency-improving techniques in egs_brachy lead to short calculation times: simulations approximating prostate and breast permanent implant (both with (2 mm)^3 voxels) and eye plaque (with (1 mm)^3 voxels) treatments take between 13 and 39 s, on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core, to achieve 2% average statistical uncertainty on doses within the PTV. egs_brachy will be released as free and open source software to the research community.
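The "average statistical uncertainty" quoted above follows from standard history-by-history MC estimators: the standard error of a tally shrinks as 1/sqrt(N) with the number of histories. The sketch below uses an invented per-history scoring distribution, not egs_brachy output, to show how such an uncertainty is computed and scaled:

```python
import numpy as np

rng = np.random.default_rng(2)

def mc_tally(n_hist):
    """Toy per-history scores (e.g. dose deposited in one voxel).
    Returns the history-by-history mean and its standard error, the
    usual estimators behind a quoted MC statistical uncertainty."""
    scores = rng.exponential(scale=1.0, size=n_hist)  # hypothetical distribution
    mean = scores.mean()
    sem = scores.std(ddof=1) / np.sqrt(n_hist)
    return mean, sem

m1, s1 = mc_tally(10_000)
m2, s2 = mc_tally(1_000_000)

# 1/sqrt(N) scaling lets one predict the histories needed for a target
# relative uncertainty, e.g. the 2% level quoted in the abstract.
rel = s1 / m1
n_for_2pct = 10_000 * (rel / 0.02) ** 2
```

Variance-reduction techniques of the kind egs_brachy implements improve the efficiency 1/(s^2 * T), i.e. they reduce the computing time T needed to reach a given standard error s.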
A frequency domain radar interferometric imaging (FII) technique based on high-resolution methods
NASA Astrophysics Data System (ADS)
Luce, H.; Yamamoto, M.; Fukao, S.; Helal, D.; Crochet, M.
2001-01-01
In the present work, we propose a frequency-domain interferometric imaging (FII) technique for a better knowledge of the vertical distribution of the atmospheric scatterers detected by MST radars. This is an extension of the dual frequency-domain interferometry (FDI) technique to multiple frequencies. Its objective is to reduce the ambiguity (resulting from the use of only two adjacent frequencies) inherent in the FDI technique. Different methods, commonly used in antenna array processing, are first described within the context of application to the FII technique. These methods are Fourier-based imaging, the Capon method, and the singular value decomposition method used with the MUSIC algorithm. Some preliminary simulations and tests performed on data collected with the middle and upper atmosphere (MU) radar (Shigaraki, Japan) are also presented. This work is a first step in the development of the FII technique, which seems very promising.
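The Capon estimator named above can be sketched for the FII geometry: signals received at several closely spaced carriers acquire a two-way phase 2kz from a scattering layer at range z, and the Capon spectrum 1/(aᴴR⁻¹a) over candidate ranges sharpens the layer estimate beyond Fourier resolution. The carrier set, layer ranges, and SNR below are assumed for illustration, not MU radar parameters:

```python
import numpy as np

rng = np.random.default_rng(3)
c = 3.0e8
freqs = 46.0e6 + 0.25e6 * np.arange(8)        # assumed closely spaced carriers
k = 2.0 * np.pi * freqs / c

z_true = (3050.0, 3220.0)                     # two scattering layers (m), assumed
n_snap = 200
# Receiver noise plus two uncorrelated layer echoes across snapshots.
X = 0.1 * (rng.standard_normal((8, n_snap)) + 1j * rng.standard_normal((8, n_snap)))
for z in z_true:
    amp = rng.standard_normal(n_snap) + 1j * rng.standard_normal(n_snap)
    X += np.exp(-2j * k * z)[:, None] * amp   # two-way phase 2kz per carrier

R = X @ X.conj().T / n_snap                   # frequency-domain covariance
Rl = R + 1e-4 * (np.trace(R).real / 8) * np.eye(8)  # diagonal loading
Rinv = np.linalg.inv(Rl)

z_grid = np.linspace(3000.0, 3300.0, 301)
A = np.exp(-2j * np.outer(k, z_grid))         # steering vectors per range gate
capon = 1.0 / np.einsum("ij,ik,kj->j", A.conj(), Rinv, A).real

def power_at(z):
    """Capon power at the grid point nearest to range z."""
    return capon[np.argmin(np.abs(z_grid - z))]
```

With this carrier spacing the unambiguous range window is c/(2·0.25 MHz) = 600 m, so both layers sit inside one window and the Capon spectrum shows two peaks separated by a dip.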
Development of an Aerothermoelastic-Acoustics Simulation Capability for Flight Vehicles
NASA Technical Reports Server (NTRS)
Gupta, K. K.; Choi, S. B.; Ibrahim, A.
2010-01-01
A novel numerical, finite element based analysis methodology suitable for accurate and efficient simulation of practical, complex flight vehicles is presented in this paper. An associated computer code, developed in this connection, is also described in some detail. Thermal effects of high-speed flow obtained from a heat conduction analysis are incorporated in the modal analysis, which in turn affects the unsteady flow arising from the interaction of elastic structures with the air. Numerical examples pertaining to representative problems are given in detail, testifying to the efficacy of the advocated techniques. This is a unique implementation of temperature effects in a finite element, CFD-based multidisciplinary simulation analysis capability involving large-scale computations.
Feasibility study for wax deposition imaging in oil pipelines by PGNAA technique.
Cheng, Can; Jia, Wenbao; Hei, Daqian; Wei, Zhiyong; Wang, Hongtao
2017-10-01
Wax deposition in pipelines is a crucial problem in the oil industry. A method based on the prompt gamma-ray neutron activation analysis technique was applied to reconstruct the image of wax deposition in oil pipelines. The 2.223 MeV hydrogen capture gamma rays were used to reconstruct the wax deposition image. To validate the method, both MCNP simulations and experiments were performed for wax deposited with a maximum thickness of 20 cm. The performance of the method was simulated using the MCNP code. The experiment was conducted with a 252Cf neutron source and a LaBr3:Ce detector. Good correspondence between the simulations and the experiments was observed. The results obtained indicate that the present approach is efficient for wax deposition imaging in oil pipelines.
NASA Astrophysics Data System (ADS)
Bi, L.
2016-12-01
Atmospheric remote sensing based on the Lidar technique fundamentally relies on knowledge of the backscattering of light by particulate matter in the atmosphere. This talk starts with a review of the current capabilities of electromagnetic wave scattering simulations to determine the backscattering optical properties of irregular particles, such as the backscatter coefficient and depolarization ratio. This will be followed by a discussion of possible pitfalls in the relevant simulations. The talk will then be concluded with reports on the latest advancements in computational techniques. In addition, we summarize the laws governing the backscattering optical properties of aerosols with respect to particle geometries, particle sizes, and mixing rules. These advancements will be applied to the analysis of Lidar observation data to reveal the state and possible microphysical processes of various aerosols.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lundstrom, B.; Shirazi, M.; Coddington, M.
2013-02-01
This poster describes a Grid Interconnection System Evaluator (GISE) that leverages hardware-in-the-loop (HIL) simulation techniques to rapidly evaluate the grid interconnection standard conformance of an ICS according to the procedures in IEEE Std 1547.1. The architecture and test sequencing of this evaluation tool, along with a set of representative ICS test results from three different photovoltaic (PV) inverters, are presented. The GISE adds to the National Renewable Energy Laboratory's (NREL) evaluation platform that now allows for rapid development of ICS control algorithms using controller HIL (CHIL) techniques, the ability to test the dc input characteristics of PV-based ICSs through the use of a PV simulator capable of simulating real-world dynamics using power HIL (PHIL), and evaluation of ICS grid interconnection conformance.
Coarse Grid CFD for underresolved simulation
NASA Astrophysics Data System (ADS)
Class, Andreas G.; Viellieber, Mathias O.; Himmel, Steffen R.
2010-11-01
CFD simulation of the complete reactor core of a nuclear power plant requires exceedingly large computational resources, so this brute-force approach has not been pursued yet. The traditional approach is 1D subchannel analysis employing calibrated transport models. Coarse grid CFD is an attractive alternative technique based on strongly under-resolved CFD and the inviscid Euler equations. Obviously, using inviscid equations and coarse grids does not resolve all the physics, requiring additional volumetric source terms that model viscosity and other sub-grid effects. The source terms are implemented via correlations derived from fully resolved representative simulations, which can be tabulated or computed on the fly. The technique is demonstrated for a Carnot diffusor and a wire-wrap fuel assembly [1]. [1] Himmel, S.R., PhD thesis, Stuttgart University, Germany, 2009, http://bibliothek.fzk.de/zb/berichte/FZKA7468.pdf
Simulated annealing in orbital flight planning
NASA Technical Reports Server (NTRS)
Soller, Jeffrey
1990-01-01
Simulated annealing is used to solve a minimum-fuel trajectory problem in the space station environment. The environment is unique because the space station will define the first true multivehicle environment in space. The optimization yields surfaces which are potentially complex, with multiple local minima. Because of the likelihood of these local minima, descent techniques are unable to offer robust solutions. Other deterministic optimization techniques were explored without success. The simulated annealing optimization is capable of identifying a minimum-fuel, two-burn trajectory subject to four constraints. Furthermore, the computational efforts involved in the optimization are such that missions could be planned on board the space station. Potential applications could include the on-site planning of rendezvous with a target craft or the emergency rescue of an astronaut. Future research will include multiwaypoint maneuvers, using a knowledge base to guide the optimization.
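The escape-from-local-minima behaviour that motivates simulated annealing here can be shown on a toy cost surface. The double-well function below is a stand-in for the multimodal fuel cost, not the actual two-burn trajectory problem; descent from the starting point would get trapped in the wrong well, while annealing's probabilistic uphill moves find the global minimum near x = -1:

```python
import math
import random

random.seed(7)

def fuel(x):
    """Toy multimodal cost (stand-in for a trajectory fuel cost):
    a double well whose global minimum sits near x = -1."""
    return (x * x - 1.0) ** 2 + 0.3 * x

def simulated_annealing(f, x0=2.0, t0=2.0, cooling=0.995, steps=4000):
    """Metropolis-style annealing: always accept downhill moves, accept
    uphill moves with Boltzmann probability exp(-dE/T), and cool T."""
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(steps):
        cand = x + random.gauss(0.0, 0.5)
        fc = f(cand)
        if fc < fx or random.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling
    return best_x, best_f

# A few restarts, keeping the best solution found.
results = [simulated_annealing(fuel, x0=2.0) for _ in range(5)]
x_star, f_star = min(results, key=lambda r: r[1])
```

The local minimum near x = +1 has a positive cost (about 0.29), so a negative final cost is direct evidence the annealer crossed the barrier into the global well.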
Ogawara, R; Ishikawa, M
2016-07-01
The anode pulse of a photomultiplier tube (PMT) coupled with a scintillator is used for pulse shape discrimination (PSD) analysis. We have developed a novel emulation technique for the PMT anode pulse based on optical photon transport and a PMT response function. The photon transport was calculated using the Geant4 Monte Carlo code, and the response function was obtained with a BC408 organic scintillator. The percentage RMS differences between the measured and simulated pulses, using suitable scintillation properties for GSO:Ce (0.4, 1.0, and 1.5 mol%), LaBr3:Ce, and BGO scintillators, were 2.41%, 2.58%, 2.16%, 2.01%, and 3.32%, respectively. The proposed technique demonstrates high reproducibility of the measured pulse and can be applied to simulation studies of various radiation measurements.
Simulation of white light generation and near light bullets using a novel numerical technique
NASA Astrophysics Data System (ADS)
Zia, Haider
2018-01-01
An accurate and efficient simulation has been devised, employing a new numerical technique to simulate the derivative generalised non-linear Schrödinger equation in all three spatial dimensions and time. The simulation models all pertinent effects, such as self-steepening and plasma, for the non-linear propagation of ultrafast optical radiation in bulk material. Simulation results are compared to published experimental spectral data of an example yttrium aluminium garnet (YAG) system at 3.1 μm radiation and fit to within a factor of 5. The simulation shows that there is a stability point near the end of the 2 mm crystal where a quasi-light bullet (spatial temporal soliton) is present. Within this region, the pulse is collimated at a reduced diameter (factor of ∼2) and there exists a near temporal soliton at the spatial center. The temporal intensity within this stable region is compressed by a factor of ∼4 compared to the input. This study shows that the simulation highlights new physical phenomena, based on the interplay of various linear, non-linear and plasma effects, that go beyond the experiment and is thus integral to achieving accurate designs of white light generation systems for optical applications. An adaptive error reduction algorithm tailor-made for this simulation will also be presented in the appendix.
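For orientation, the family of solvers this paper improves upon is typically built on operator splitting: the linear (dispersive) part is advanced in Fourier space and the non-linear part in the time domain. The sketch below is only the textbook split-step Fourier baseline for the 1-D cubic NLSE, i uz + (1/2) utt + |u|²u = 0, whose fundamental soliton u(0, t) = sech(t) keeps its shape; it does not include the paper's derivative, self-steepening, or plasma terms:

```python
import numpy as np

n, L = 256, 40.0
t = np.linspace(-L / 2, L / 2, n, endpoint=False)
w = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)   # angular frequency grid

u = 1.0 / np.cosh(t)                     # fundamental soliton initial condition
dz, steps = 1e-3, 2000                   # propagate to z = 2
half_disp = np.exp(-0.25j * w ** 2 * dz) # exp(-i w^2 dz/4): half dispersion step

for _ in range(steps):
    u = np.fft.ifft(half_disp * np.fft.fft(u))   # half linear (dispersion) step
    u = u * np.exp(1j * np.abs(u) ** 2 * dz)     # full nonlinear (Kerr) step
    u = np.fft.ifft(half_disp * np.fft.fft(u))   # half linear step

peak = np.abs(u).max()
power = np.sum(np.abs(u) ** 2)
```

Both sub-steps are unitary, so the pulse energy is conserved to machine precision and the soliton peak stays at 1; the accuracy and stability monitoring in full white-light solvers generalises this check to the 3+1-D, multi-effect setting.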
Learner-Adaptive Educational Technology for Simulation in Healthcare: Foundations and Opportunities.
Lineberry, Matthew; Dev, Parvati; Lane, H Chad; Talbot, Thomas B
2018-06-01
Despite evidence that learners vary greatly in their learning needs, practical constraints tend to favor "one-size-fits-all" educational approaches, in simulation-based education as elsewhere. Adaptive educational technologies - devices and/or software applications that capture and analyze relevant data about learners to select and present individually tailored learning stimuli - are a promising aid in learners' and educators' efforts to provide learning experiences that meet individual needs. In this article, we summarize and build upon the 2017 Society for Simulation in Healthcare Research Summit panel discussion on adaptive learning. First, we consider the role of adaptivity in learning broadly. We then outline the basic functions that adaptive learning technologies must implement and the unique affordances and challenges of technology-based approaches for those functions, sharing an illustrative example from healthcare simulation. Finally, we consider future directions for accelerating research, development, and deployment of effective adaptive educational technology and techniques in healthcare simulation.
Wu, Tianmin; Yang, Lijiang; Zhang, Ruiting; Shao, Qiang; Zhuang, Wei
2013-07-25
We simulated the equilibrium isotope-edited FTIR and 2DIR spectra of a β-hairpin peptide, trpzip2, at a series of temperatures. The simulation was based on the configuration distributions generated using the GB(OBC) implicit solvent model and the integrated tempering sampling (ITS) technique. A soaking procedure was adopted to generate the peptide-in-explicit-solvent configurations for the spectroscopy calculations. The nonlinear exciton propagation (NEP) method was then used to calculate the spectra. In agreement with the experiments, the intensities and ellipticities of the isotope-shifted peaks in our simulated signals show site-specific temperature dependences, which suggest inhomogeneous local thermal stabilities along the peptide chain. Our simulation thus proposes a cost-effective means to understand a peptide's conformational change and related IR spectra across its thermal unfolding transition.
Quantified, Interactive Simulation of AMCW ToF Camera Including Multipath Effects
Lambers, Martin; Kolb, Andreas
2017-01-01
In the last decade, Time-of-Flight (ToF) range cameras have gained increasing popularity in robotics, automotive industry, and home entertainment. Despite technological developments, ToF cameras still suffer from error sources such as multipath interference or motion artifacts. Thus, simulation of ToF cameras, including these artifacts, is important to improve camera and algorithm development. This paper presents a physically-based, interactive simulation technique for amplitude modulated continuous wave (AMCW) ToF cameras, which, among other error sources, includes single bounce indirect multipath interference based on an enhanced image-space approach. The simulation accounts for physical units down to the charge level accumulated in sensor pixels. Furthermore, we present the first quantified comparison for ToF camera simulators. We present bidirectional reference distribution function (BRDF) measurements for selected, purchasable materials in the near-infrared (NIR) range, craft real and synthetic scenes out of these materials and quantitatively compare the range sensor data. PMID:29271888
Methods for simulation-based analysis of fluid-structure interaction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barone, Matthew Franklin; Payne, Jeffrey L.
2005-10-01
Methods for analysis of fluid-structure interaction using high fidelity simulations are critically reviewed. First, a literature review of modern numerical techniques for simulation of aeroelastic phenomena is presented. The review focuses on methods contained within the arbitrary Lagrangian-Eulerian (ALE) framework for coupling computational fluid dynamics codes to computational structural mechanics codes. The review treats mesh movement algorithms, the role of the geometric conservation law, time advancement schemes, wetted surface interface strategies, and some representative applications. The complexity and computational expense of coupled Navier-Stokes/structural dynamics simulations points to the need for reduced order modeling to facilitate parametric analysis. The proper orthogonal decomposition (POD)/Galerkin projection approach for building a reduced order model (ROM) is presented, along with ideas for extension of the methodology to allow construction of ROMs based on data generated from ALE simulations.
Two-way coupled SPH and particle level set fluid simulation.
Losasso, Frank; Talton, Jerry; Kwatra, Nipun; Fedkiw, Ronald
2008-01-01
Grid-based methods have difficulty resolving features on or below the scale of the underlying grid. Although adaptive methods (e.g., RLE, octrees) can alleviate this to some degree, separate techniques are still required for simulating small-scale phenomena such as spray and foam, especially since these more diffuse materials typically behave quite differently than their denser counterparts. In this paper, we propose a two-way coupled simulation framework that uses the particle level set method to efficiently model dense liquid volumes and a smoothed particle hydrodynamics (SPH) method to simulate diffuse regions such as sprays. Our novel SPH method allows us to simulate both dense and diffuse water volumes, fully incorporates the particles that are automatically generated by the particle level set method in under-resolved regions, and allows for two-way mixing between dense SPH volumes and grid-based liquid representations.
An Event-Based Approach to Design a Teamwork Training Scenario and Assessment Tool in Surgery.
Nguyen, Ngan; Watson, William D; Dominguez, Edward
2016-01-01
Simulation is a technique recommended for teaching and measuring teamwork, but few published methodologies are available on how best to design simulation for teamwork training in surgery and health care in general. The purpose of this article is to describe a general methodology, called the event-based approach to training (EBAT), to guide the design of simulation for teamwork training and discuss its application to surgery. The EBAT methodology draws on the science of training by systematically introducing training exercise events that are linked to training requirements (i.e., competencies being trained and learning objectives) and performance assessment. The EBAT process links these critical events to the targeted competencies and to performance assessment. Of the 4 teamwork competencies endorsed by the Agency for Healthcare Research and Quality and the Department of Defense, "communication" was chosen to be the focus of our training efforts. A total of 5 learning objectives were defined based on 5 validated teamwork and communication techniques. Diagnostic laparoscopy was chosen as the clinical context to frame the training scenario, and 29 KSAs were defined based on a review of the published literature on patient safety and input from subject matter experts. Critical events included those that correspond to a specific phase in the normal flow of a surgical procedure as well as clinical events that may occur when performing the operation. Similar to the targeted KSAs, targeted responses to the critical events were developed based on existing literature and input from content experts. Finally, a 29-item EBAT-derived checklist was created to assess communication performance. Like any instructional tool, simulation is only effective if it is designed and implemented appropriately.
It is recognized that the effectiveness of simulation depends on whether (1) it is built upon a theoretical framework, (2) it uses preplanned structured exercises or events to allow learners the opportunity to exhibit the targeted KSAs, (3) it assesses performance, and (4) it provides formative and constructive feedback to bridge the gap between the learners' KSAs and the targeted KSAs. The EBAT methodology guides the design of simulation that incorporates these 4 features and, thus, enhances training effectiveness with simulation. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Schweier, C.; Markus, M.; Steinle, E.
2004-04-01
Catastrophic events like strong earthquakes can cause big losses in life and economic values. An increase in the efficiency of reconnaissance techniques could help to reduce the losses in life as many victims die after and not during the event. A basic prerequisite to improve the rescue teams' work is an improved planning of the measures. This can only be done on the basis of reliable and detailed information about the actual situation in the affected regions. Therefore, a bundle of projects at Karlsruhe university aim at the development of a tool for fast information retrieval after strong earthquakes. The focus is on urban areas as the most losses occur there. In this paper the approach for a damage analysis of buildings will be presented. It consists of an automatic methodology to model buildings in three dimensions, a comparison of pre- and post-event models to detect changes and a subsequent classification of the changes into damage types. The process is based on information extraction from airborne laserscanning data, i.e. digital surface models (DSM) acquired through scanning of an area with pulsed laser light. To date, there are no laserscanning derived DSMs available to the authors that were taken of areas that suffered damages from earthquakes. Therefore, it was necessary to simulate such data for the development of the damage detection methodology. In this paper two different methodologies used for simulating the data will be presented. The first method is to create CAD models of undamaged buildings based on their construction plans and alter them artificially in such a way as if they had suffered serious damage. Then, a laserscanning data set is simulated based on these models which can be compared with real laserscanning data acquired of the buildings (in intact state). The other approach is to use measurements of actual damaged buildings and simulate their intact state. 
It is possible to model the geometrical structure of these damaged buildings based on digital photography taken after the event by evaluating the images with photogrammetrical methods. The intact state of the buildings is simulated based on on-site investigations, and finally laserscanning data are simulated for both states.
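The pre-/post-event comparison described above amounts to per-cell height differencing of two digital surface models followed by classification of the change. A minimal sketch (the grid, damage classes, and metre thresholds are invented for the example, not taken from the project):

```python
import numpy as np

def classify_damage(dsm_pre, dsm_post, drop_thresh=1.0, collapse_thresh=3.0):
    """Classify per-cell change between pre- and post-event surface models.
    0 = intact, 1 = partial damage, 2 = heavy damage/collapse.
    Thresholds are in metres and purely illustrative."""
    drop = dsm_pre - dsm_post            # positive where the surface got lower
    damage = np.zeros(dsm_pre.shape, dtype=int)
    damage[drop >= drop_thresh] = 1
    damage[drop >= collapse_thresh] = 2
    return damage

# toy 5x5 building footprint: flat 10 m roof, one collapsed corner
pre = np.full((5, 5), 10.0)
post = pre.copy()
post[0:2, 0:2] = 6.0      # 4 m drop: collapse class
post[4, 4] = 8.5          # 1.5 m drop: partial damage
dmg = classify_damage(pre, post)
```

Real laserscanning DSMs would of course need co-registration and noise filtering before such a differencing step is meaningful.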
Scenario management and automated scenario generation
NASA Astrophysics Data System (ADS)
McKeever, William; Gilmour, Duane; Lehman, Lynn; Stirtzinger, Anthony; Krause, Lee
2006-05-01
The military planning process utilizes simulation to determine the appropriate course of action (COA) that will achieve a campaign end state. However, due to the difficulty in developing and generating simulation-level COAs, only a few COAs are simulated. This may have been appropriate for traditional conflicts, but the evolution of warfare from attrition-based to effects-based strategies, as well as the complexities of 4th generation warfare and asymmetric adversaries, has placed additional demands on military planners and simulation. To keep pace with this dynamic, changing environment, planners must be able to perform continuous, multiple, "what-if" COA analysis. Scenario management and generation are critical elements to achieving this goal. An effects-based scenario generation research project demonstrated the feasibility of automated scenario generation techniques which support multiple stove-pipe and emerging broad-scope simulations. This paper will discuss a case study in which the scenario generation capability was employed to support COA simulations to identify plan effectiveness. The study demonstrated the value of using multiple simulation runs to evaluate the effectiveness of alternate COAs in achieving the overall campaign (metrics-based) objectives. The paper will discuss how scenario generation technology can be employed to allow military commanders and mission planning staff to understand the impact of command decisions on the battlespace of tomorrow.
Li, Hongyu; Walker, David; Yu, Guoyu; Sayle, Andrew; Messelink, Wilhelmus; Evans, Rob; Beaucamp, Anthony
2013-01-14
Edge mis-figure is regarded as one of the most difficult technical issues for manufacturing the segments of extremely large telescopes, which can dominate key aspects of performance. A novel edge-control technique has been developed, based on 'Precessions' polishing technique and for which accurate and stable edge tool influence functions (TIFs) are crucial. In the first paper in this series [D. Walker Opt. Express 20, 19787-19798 (2012)], multiple parameters were experimentally optimized using an extended set of experiments. The first purpose of this new work is to 'short circuit' this procedure through modeling. This also gives the prospect of optimizing local (as distinct from global) polishing for edge mis-figure, now under separate development. This paper presents a model that can predict edge TIFs based on surface-speed profiles and pressure distributions over the polishing spot at the edge of the part, the latter calculated by finite element analysis and verified by direct force measurement. This paper also presents a hybrid-measurement method for edge TIFs to verify the simulation results. Experimental and simulation results show good agreement.
Design of high-fidelity haptic display for one-dimensional force reflection applications
NASA Astrophysics Data System (ADS)
Gillespie, Brent; Rosenberg, Louis B.
1995-12-01
This paper discusses the development of a virtual reality platform for the simulation of medical procedures which involve needle insertion into human tissue. The paper's focus is the hardware and software requirements for haptic display of a particular medical procedure known as epidural analgesia. To perform this delicate manual procedure, an anesthesiologist must carefully guide a needle through various layers of tissue using only haptic cues for guidance. As a simplifying aspect for the simulator design, all motions and forces involved in the task occur along a fixed line once insertion begins. To create a haptic representation of this procedure, we have explored both physical modeling and perceptual modeling techniques. A preliminary physical model was built based on CT-scan data of the operative site. A preliminary perceptual model was built based on current training techniques for the procedure provided by a skilled instructor. We compare and contrast these two modeling methods and discuss the implications of each. We select and defend the perceptual model as a superior approach for the epidural analgesia simulator.
NASA Astrophysics Data System (ADS)
Koziel, Slawomir; Bekasiewicz, Adrian
2016-10-01
Multi-objective optimization of antenna structures is a challenging task owing to the high computational cost of evaluating the design objectives as well as the large number of adjustable parameters. Design speed-up can be achieved by means of surrogate-based optimization techniques. In particular, a combination of variable-fidelity electromagnetic (EM) simulations, design space reduction techniques, response surface approximation models and design refinement methods permits identification of the Pareto-optimal set of designs within a reasonable timeframe. Here, a study concerning the scalability of surrogate-assisted multi-objective antenna design is carried out based on a set of benchmark problems, with the dimensionality of the design space ranging from six to 24 and a CPU cost of the EM antenna model from 10 to 20 min per simulation. Numerical results indicate that the computational overhead of the design process increases more or less quadratically with the number of adjustable geometric parameters of the antenna structure at hand, which is a promising result from the point of view of handling even more complex problems.
User's guide to the Reliability Estimation System Testbed (REST)
NASA Technical Reports Server (NTRS)
Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam
1992-01-01
The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
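The modularization idea above — computing module reliabilities separately and combining them into a total system reliability — can be sketched with the standard series/parallel composition rules (the example system and its reliability numbers are hypothetical, not from REST):

```python
from functools import reduce

def series(*mods):
    """All modules must work (series structure): R = product of R_i."""
    return reduce(lambda acc, r: acc * r, mods, 1.0)

def parallel(*mods):
    """At least one module must work (redundant structure): R = 1 - product of (1 - R_i)."""
    return 1.0 - reduce(lambda acc, r: acc * (1.0 - r), mods, 1.0)

# hypothetical system: two redundant processors in series with a single bus
r_sys = series(parallel(0.95, 0.95), 0.999)
```

Because the composition is itself modular, each sub-expression can be validated independently before being combined — the same property REST exploits at the model level.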
A Parameter Tuning Scheme of Sea-ice Model Based on Automatic Differentiation Technique
NASA Astrophysics Data System (ADS)
Kim, J. G.; Hovland, P. D.
2001-05-01
An automatic differentiation (AD) technique was used to illustrate a new approach to parameter tuning of an uncoupled sea-ice model. The 1992 atmospheric forcing field obtained from NCEP data was used as the forcing variables in the study. The simulation results were compared with the observed ice movement provided by the International Arctic Buoy Programme (IABP). All of the numerical experiments were based on a widely used dynamic and thermodynamic model for simulating the seasonal sea-ice change of the main Arctic ocean. We selected five dynamic and thermodynamic parameters for the tuning process, in which the cost function defined by the norm of the difference between observed and simulated ice drift locations was minimized. The selected parameters are the air and ocean drag coefficients, the ice strength constant, the turning angle at the ice-air/ocean interface, and the bulk sensible heat transfer coefficient. The drag coefficients were the major parameters controlling sea-ice movement and extent. The results of the study show that more realistic simulations of the ice thickness distribution were produced by tuning the simulated ice drift trajectories. In the tuning process, the L-BFGS-B minimization algorithm of a quasi-Newton method was used. The derivative information required in the minimization iterations was provided by the AD-processed Fortran code. Compared with a conventional approach, AD-generated derivative code provided fast and robust computation of derivative information.
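The tuning loop described above — minimize a drift-misfit cost using derivative information — can be sketched on a toy drift model. Plain gradient descent with a hand-derived analytic gradient stands in for the paper's L-BFGS-B algorithm and AD-generated Fortran derivatives; the forcings and "true" drag coefficients are invented:

```python
import numpy as np

t = np.linspace(0.0, 10.0, 50)
wind = 1.5 + np.sin(t)                 # toy atmospheric forcing
current = np.cos(t)                    # toy ocean forcing
true = np.array([0.8, 0.5])            # "true" air and ocean drag coefficients
obs = (true[0] * wind + true[1] * current) * t   # synthetic observed drift positions

def cost_and_grad(p):
    """Drift misfit J = 0.5 * sum(r^2) and its analytic gradient
    (a hand-derived stand-in for AD-generated derivative code)."""
    r = (p[0] * wind + p[1] * current) * t - obs
    J = 0.5 * np.sum(r * r)
    g = np.array([np.sum(r * wind * t), np.sum(r * current * t)])
    return J, g

p = np.array([0.1, 0.1])               # initial parameter guess
J0, _ = cost_and_grad(p)
for _ in range(3000):                  # plain gradient descent as a stand-in for L-BFGS-B
    J, g = cost_and_grad(p)
    p -= 1e-4 * g
```

With real model code the gradient would come from AD rather than hand derivation, which is exactly what makes the approach scale to many parameters.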
Virtual reality neurosurgery: a simulator blueprint.
Spicer, Mark A; van Velsen, Martin; Caffrey, John P; Apuzzo, Michael L J
2004-04-01
This article details preliminary studies undertaken to integrate the most relevant advancements across multiple disciplines in an effort to construct a highly realistic neurosurgical simulator based on a distributed computer architecture. Techniques based on modified computational modeling paradigms incorporating finite element analysis are presented, as are current and projected efforts directed toward the implementation of a novel bidirectional haptic device. Patient-specific data derived from noninvasive magnetic resonance imaging sequences are used to construct a computational model of the surgical region of interest. Magnetic resonance images of the brain may be coregistered with those obtained from magnetic resonance angiography, magnetic resonance venography, and diffusion tensor imaging to formulate models of varying anatomic complexity. The majority of the computational burden is encountered in the presimulation reduction of the computational model and allows realization of the required threshold rates for the accurate and realistic representation of real-time visual animations. Intracranial neurosurgical procedures offer an ideal testing site for the development of a totally immersive virtual reality surgical simulator when compared with the simulations required in other surgical subspecialties. The material properties of the brain as well as the typically small volumes of tissue exposed in the surgical field, coupled with techniques and strategies to minimize computational demands, provide unique opportunities for the development of such a simulator. Incorporation of real-time haptic and visual feedback is approached here and likely will be accomplished soon.
A constrained modulus reconstruction technique for breast cancer assessment.
Samani, A; Bishop, J; Plewes, D B
2001-09-01
A reconstruction technique for the breast tissue elasticity modulus is described. This technique assumes that the geometry of normal and suspicious tissues is available from a contrast-enhanced magnetic resonance image. Furthermore, it is assumed that the modulus is constant throughout each tissue volume. The technique, which uses quasi-static strain data, is iterative, with each iteration involving modulus updating followed by stress calculation. Breast mechanical stimulation is assumed to be done by two compressional rigid plates. As a result, stress is calculated using the finite element method based on the well-controlled boundary conditions of the compression plates. Using the calculated stress and the measured strain, modulus updating is done element-by-element based on Hooke's law. Breast tissue modulus reconstruction using simulated data and phantom modulus reconstruction using experimental data indicate that the technique is robust.
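The stress-then-update iteration can be illustrated on a 1D toy: a series bar compressed between rigid plates, with stress from a trivial finite element solution and per-element Hooke's-law updates. Note that displacement-only data fixes the modulus distribution only up to a global scale, hence the normalization at the end (all numbers and the 1D geometry are illustrative, not from the paper):

```python
import numpy as np

# "true" per-element moduli (kPa): background tissue with one stiff inclusion
E_true = np.array([20.0, 20.0, 80.0, 20.0, 20.0])
L, A, U = 1.0, 1.0, 0.05          # element length, area, prescribed plate displacement

def series_bar_force(E):
    """FE solution of a 1D series bar compressed by U: uniform internal force."""
    compliance = np.sum(L / (E * A))
    return U / compliance

# "measured" strains are generated from the true model
F_true = series_bar_force(E_true)
eps_meas = F_true / (E_true * A)

E = np.full_like(E_true, 30.0)    # initial homogeneous guess
for _ in range(10):
    sigma = series_bar_force(E) / A          # stress from current modulus iterate
    E = sigma / eps_meas                     # Hooke's-law update: E = sigma / eps

E_rel = E / E[0]                  # displacement-only data fixes E up to a scale
```

The recovered relative distribution cleanly picks out the stiff inclusion; absolute calibration would require a measured plate force or a tissue of known modulus.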
Realistic natural atmospheric phenomena and weather effects for interactive virtual environments
NASA Astrophysics Data System (ADS)
McLoughlin, Leigh
Clouds and the weather are important aspects of any natural outdoor scene, but existing dynamic techniques within computer graphics only offer the simplest of cloud representations. The problem that this work looks to address is how to provide a means of simulating clouds and weather features such as precipitation that are suitable for virtual environments. Techniques for cloud simulation are available within the area of meteorology, but numerical weather prediction systems are computationally expensive, give more numerical accuracy than we require for graphics, and are restricted to the laws of physics. Within computer graphics, we often need to direct and adjust physical features or to bend reality to meet artistic goals, which is a key difference between the subjects of computer graphics and physical science. Pure physically-based simulations, however, evolve their solutions according to pre-set rules and are notoriously difficult to control. The challenge then is for the solution to be computationally lightweight and able to be directed in some measure while at the same time producing believable results. This work presents a lightweight physically-based cloud simulation scheme that simulates the dynamic properties of cloud formation and weather effects. The system simulates water vapour, cloud water, cloud ice, rain, snow and hail. The water model incorporates control parameters and the cloud model uses an arbitrary vertical temperature profile, with a tool described to allow the user to define this. The result of this work is that clouds can now be simulated in near real-time complete with precipitation. The temperature profile and tool then provide a means of directing the resulting formation.
A Global System for Transportation Simulation and Visualization in Emergency Evacuation Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Wei; Liu, Cheng; Thomas, Neil
2015-01-01
Simulation-based studies are frequently used for evacuation planning and decision making processes. Given the complexity of transportation systems and data availability, most evacuation simulation models focus on certain geographic areas. With routine improvement of OpenStreetMap road networks and LandScanTM global population distribution data, we present WWEE, a uniform system for world-wide emergency evacuation simulations. WWEE uses a unified data structure for simulation inputs. It also integrates a super-node trip distribution model as the default simulation parameter to improve the system's computational performance. Two levels of visualization tools are implemented for evacuation performance analysis, including link-based macroscopic visualization and vehicle-based microscopic visualization. For left-hand and right-hand traffic patterns in different countries, the authors propose a mirror technique to experiment with both scenarios without significantly changing traffic simulation models. Ten cities in the US, Europe, the Middle East, and Asia are modeled for demonstration. With default traffic simulation models for fast and easy-to-use evacuation estimation and visualization, WWEE also retains the capability of interactive operation for users to adopt customized traffic simulation models. For the first time, WWEE provides a unified platform for global evacuation researchers to estimate and visualize the performance of transportation systems under evacuation scenarios.
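The mirror technique mentioned above can be sketched as a reflection of the network geometry, so that a model built under one driving-side convention behaves like the other without touching the traffic simulation logic (the tiny network here is hypothetical, not WWEE's data structure):

```python
import numpy as np

def mirror_network(node_xy, links):
    """Mirror a road network about the y-axis (x -> -x).
    Reflecting the geometry swaps left/right relative to travel direction,
    so a right-hand-traffic model drives like a left-hand-traffic one;
    link topology and direction are left untouched."""
    mirrored = node_xy.copy()
    mirrored[:, 0] *= -1.0
    return mirrored, links

nodes = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 2.0]])
links = [(0, 1), (1, 2)]
m_nodes, m_links = mirror_network(nodes, links)
```

Mirroring is an involution, so applying it twice must return the original network — a cheap sanity check for the transformation.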
Space Simulation, 7th. [facilities and testing techniques
NASA Technical Reports Server (NTRS)
1973-01-01
Space simulation facilities and techniques are outlined that encompass thermal scale modeling, computerized simulations, reentry materials, spacecraft contamination, solar simulation, vacuum tests, and heat transfer studies.
Structure refinement of membrane proteins via molecular dynamics simulations.
Dutagaci, Bercem; Heo, Lim; Feig, Michael
2018-07-01
A refinement protocol based on physics-based techniques established for water-soluble proteins is tested for membrane protein structures. Initial structures were generated by homology modeling and sampled via molecular dynamics simulations in explicit lipid bilayer and aqueous solvent systems. Snapshots from the simulations were selected based on scoring with either knowledge-based or implicit membrane-based scoring functions and averaged to obtain refined models. The protocol resulted in consistent and significant refinement of the membrane protein structures, similar to the performance of refinement methods for soluble proteins. Refinement success was similar between sampling in the presence of lipid bilayers and aqueous solvent, but the presence of lipid bilayers may benefit the improvement of lipid-facing residues. Scoring with knowledge-based functions (DFIRE and RWplus) was found to be as good as scoring using implicit membrane-based scoring functions, suggesting that differences in internal packing are more important than orientation relative to the membrane during the refinement of membrane protein homology models. © 2018 Wiley Periodicals, Inc.
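The select-and-average step of the protocol can be sketched as follows; the mock scoring function and plain Cartesian averaging of pre-aligned coordinates are deliberate simplifications of the knowledge-based scoring and structure averaging actually used:

```python
import numpy as np

def refine(snapshots, scores, keep_frac=0.2):
    """Average the best-scoring MD snapshots (lower score = better here).
    Assumes coordinates are already superposed on a common frame."""
    n_keep = max(1, int(len(scores) * keep_frac))
    best = np.argsort(scores)[:n_keep]
    return snapshots[best].mean(axis=0)

rng = np.random.default_rng(2)
native = rng.normal(size=(50, 3))                     # toy "native" structure (50 atoms)
snaps = native + rng.normal(scale=0.5, size=(100, 50, 3))  # noisy MD snapshots
scores = np.linalg.norm(snaps - native, axis=(1, 2))  # stand-in scoring function
model = refine(snaps, scores)
```

Averaging the selected ensemble cancels much of the per-snapshot noise, which is why the refined model ends up closer to the target than a typical individual snapshot.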
Reduced Order Model Implementation in the Risk-Informed Safety Margin Characterization Toolkit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mandelli, Diego; Smith, Curtis L.; Alfonsi, Andrea
2015-09-01
The RISMC project aims to develop new advanced simulation-based tools to perform Probabilistic Risk Analysis (PRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermo-hydraulic behavior of the reactor primary and secondary systems but also the temporal evolution of external events and component/system ageing. Thus, this is not only a multi-physics problem but also a multi-scale problem (both spatial, µm-mm-m, and temporal, ms-s-minutes-years). As part of the RISMC PRA approach, a large number of computationally expensive simulation runs are required. An important aspect is that even though computational power is regularly growing, the overall computational cost of a RISMC analysis may not be viable for certain cases. A solution that is being evaluated is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs to perform and employing surrogate models instead of the actual simulation codes. This report focuses on the use of reduced order modeling techniques that can be applied to any RISMC analysis to generate, analyze and visualize data. In particular, we focus on surrogate models that approximate the simulation results but in a much faster time (µs instead of hours/days). We apply reduced order and surrogate modeling techniques to several RISMC types of analyses using RAVEN and RELAP-7 and show the advantages that can be gained.
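The core surrogate idea — replace an expensive code by a cheap approximation fitted to a handful of runs — can be sketched in one dimension. The stand-in "simulation" and the polynomial surrogate below are purely illustrative; RISMC builds far richer surrogates through RAVEN:

```python
import numpy as np

def expensive_code(x):
    """Stand-in for a long-running thermo-hydraulic simulation."""
    return np.sin(3.0 * x) + 0.5 * x

# run the expensive code at a small number of training points...
x_train = np.linspace(0.0, 2.0, 12)
y_train = expensive_code(x_train)

# ...fit a cheap polynomial surrogate to those runs...
surrogate = np.poly1d(np.polyfit(x_train, y_train, deg=8))

# ...and query the surrogate instead of the code itself
x_query = np.linspace(0.0, 2.0, 200)
err = np.max(np.abs(surrogate(x_query) - expensive_code(x_query)))
```

Once trained, the surrogate answers in microseconds, so the thousands of samples a PRA needs become affordable; the cost moves to validating that `err` stays acceptable over the input domain.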
ERIC Educational Resources Information Center
CRAWFORD, MEREDITH P.
OPEN AND CLOSED LOOP SIMULATION IS DISCUSSED FROM THE VIEWPOINT OF RESEARCH AND DEVELOPMENT IN TRAINING TECHNIQUES. AREAS DISCUSSED INCLUDE--(1) OPEN-LOOP ENVIRONMENTAL SIMULATION, (2) SIMULATION NOT INVOLVING PEOPLE, (3) ANALYSIS OF OCCUPATIONS, (4) SIMULATION FOR TRAINING, (5) REAL-SIZE SYSTEM SIMULATION, (6) TECHNIQUES OF MINIATURIZATION, AND…
A cavitation model based on Eulerian stochastic fields
NASA Astrophysics Data System (ADS)
Magagnato, F.; Dumond, J.
2013-12-01
Non-linear phenomena can often be described using probability density functions (pdf) and pdf transport models. Traditionally the simulation of pdf transport requires Monte-Carlo codes based on Lagrangian "particles" or prescribed pdf assumptions including binning techniques. Recently, in the field of combustion, a novel formulation called the stochastic-field method solving pdf transport based on Eulerian fields has been proposed which eliminates the necessity to mix Eulerian and Lagrangian techniques or prescribed pdf assumptions. In the present work, for the first time the stochastic-field method is applied to multi-phase flow and in particular to cavitating flow. To validate the proposed stochastic-field cavitation model, two applications are considered. Firstly, sheet cavitation is simulated in a Venturi-type nozzle. The second application is an innovative fluidic diode which exhibits coolant flashing. Agreement with experimental results is obtained for both applications with a fixed set of model constants. The stochastic-field cavitation model captures the wide range of pdf shapes present at different locations.
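A minimal sketch of the Eulerian stochastic-field idea: several Eulerian fields evolve on the same grid, each carrying its own Wiener increment on the convective term, and the ensemble across fields approximates the pdf moments. The 1D periodic setup, parameters, and simple finite differences below are our own illustrative choices, not the cavitation model itself:

```python
import numpy as np

rng = np.random.default_rng(3)
nx, nf = 64, 16                        # grid cells, number of stochastic fields
dx, dt, u, gamma = 1.0 / 64, 0.005, 0.5, 0.005
x = np.arange(nx) * dx
fields = np.tile(np.sin(2 * np.pi * x), (nf, 1))   # identical initial fields

def ddx(f):
    return (np.roll(f, -1, axis=1) - np.roll(f, 1, axis=1)) / (2 * dx)

def d2dx2(f):
    return (np.roll(f, -1, axis=1) - 2 * f + np.roll(f, 1, axis=1)) / dx**2

for _ in range(200):
    dW = rng.normal(size=(nf, 1)) * np.sqrt(dt)    # one Wiener increment per field
    grad = ddx(fields)
    fields = (fields
              - u * dt * grad                      # mean convection
              + gamma * dt * d2dx2(fields)         # diffusion
              + np.sqrt(2 * gamma) * dW * grad)    # stochastic convection (Ito form)

mean_phi = fields.mean(axis=0)                     # ensemble mean ~ first pdf moment
var_phi = fields.var(axis=0)                       # ensemble variance ~ second moment
```

Because the fields are Eulerian, no Lagrangian particle bookkeeping is needed — the point the abstract makes; a production scheme would add micromixing and use a stabler spatial discretization.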
Yue, Dan; Nie, Haitao; Li, Ye; Ying, Changsheng
2018-03-01
Wavefront sensorless (WFSless) adaptive optics (AO) systems have been widely studied in recent years. To reach optimum results, such systems require an efficient correction method. This paper presents a fast wavefront correction approach for a WFSless AO system based mainly on the linear phase diversity (PD) technique. The fast closed-loop control algorithm is set up based on the linear relationship between the drive voltage of the deformable mirror (DM) and the far-field images of the system, which is obtained through the linear PD algorithm combined with the influence function of the DM. A large number of phase screens under different turbulence strengths are simulated to test the performance of the proposed method. The numerical simulation results show that the method has a fast convergence rate and strong correction ability: a few correction iterations achieve good results and effectively improve the imaging quality of the system while requiring fewer CCD measurements.
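The closed-loop idea — a calibrated linear relationship between DM voltages and image measurements, inverted at each iteration — can be sketched as follows. The random sensitivity matrix, unknown optimum, and loop gain are illustrative stand-ins, not the paper's PD calibration:

```python
import numpy as np

rng = np.random.default_rng(4)
n_act, n_meas = 12, 24
M = rng.normal(size=(n_meas, n_act))   # calibrated measurement-vs-voltage sensitivity
v_opt = rng.normal(size=n_act)         # voltages that would flatten the wavefront

def measure(v):
    """Linearized image metric: zero when the residual wavefront is flat."""
    return M @ (v - v_opt)

C = np.linalg.pinv(M)                  # reconstructor from the calibrated sensitivity
v = np.zeros(n_act)
gain = 0.5
residuals = []
for _ in range(20):                    # closed loop: a few iterations suffice
    y = measure(v)
    residuals.append(np.linalg.norm(y))
    v = v - gain * (C @ y)             # voltage update from the linear model
```

With an accurate linear model the residual contracts by (1 - gain) per iteration, which is why only a handful of CCD measurements are needed.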
NASA Astrophysics Data System (ADS)
Ahmadian, A.; Ismail, F.; Salahshour, S.; Baleanu, D.; Ghaemi, F.
2017-12-01
The analysis of the behaviors of physical phenomena is important to discover significant features of the character and the structure of mathematical models. Frequently the unknown parameters involved in the models are assumed to be unvarying over time. In reality, some of them are uncertain and implicitly depend on several factors. In this study, to consider such uncertainty in the variables of the models, they are characterized based on the fuzzy notion. We propose here a new model based on fractional calculus to deal with the Kelvin-Voigt (KV) equation and a non-Newtonian fluid behavior model with fuzzy parameters. A new and accurate numerical algorithm using a spectral tau technique based on the generalized fractional Legendre polynomials (GFLPs) is developed to solve those problems under uncertainty. Numerical simulations are carried out and the analysis of the results highlights the significant features of the new technique in comparison with previous findings. A detailed error analysis is also carried out and discussed.
NASA Astrophysics Data System (ADS)
Bashirzadeh, Milad
This study examines the microstructure-based mechanical properties of an Al-Cu composite deposited by cold spraying and a wire arc sprayed nickel-based alloy 625 coating using numerical modeling and experimental techniques. The microhardness and elastic modulus of the samples were determined using the Knoop hardness technique. Hardness in both the transverse and longitudinal directions on the sample cross-sections was measured. An image-based finite element simulation algorithm was employed to determine the mechanical properties through an inverse analysis. In addition, mechanical tests including tensile, bending, and nano-indentation tests were performed on alloy 625 wire arc sprayed samples. Overall, results from the experimental tests are in relatively good agreement for the deposited Al-Cu composites and alloy 625 coating. However, results obtained from numerical simulation are significantly higher in value than the experimentally obtained results. Examination and comparison of the results strongly indicate the influence of microstructure characteristics on the mechanical properties of thermally sprayed coatings.
On the usage of ultrasound computational models for decision making under ambiguity
NASA Astrophysics Data System (ADS)
Dib, Gerges; Sexton, Samuel; Prowant, Matthew; Crawford, Susan; Diaz, Aaron
2018-04-01
Computer modeling and simulation is becoming pervasive within the non-destructive evaluation (NDE) industry as a convenient tool for designing and assessing inspection techniques. This raises a pressing need for developing quantitative techniques for demonstrating the validity and applicability of the computational models. Computational models provide deterministic results based on deterministic and well-defined input, or stochastic results based on inputs defined by probability distributions. However, computational models cannot account for the effects of personnel, procedures, and equipment, resulting in ambiguity about the efficacy of inspections based on guidance from computational models only. In addition, ambiguity arises when model inputs, such as the representation of realistic cracks, cannot be defined deterministically, probabilistically, or by intervals. In this work, Pacific Northwest National Laboratory demonstrates the ability of computational models to represent field measurements under known variabilities, and quantify the differences using maximum amplitude and power spectrum density metrics. Sensitivity studies are also conducted to quantify the effects of different input parameters on the simulation results.
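The maximum-amplitude and power-spectral-density comparison metrics mentioned above can be sketched as below. The toy signals stand in for simulated and field-measured waveforms, and the simple relative-difference metric definitions are our own illustrative choices:

```python
import numpy as np

def compare_signals(a, b, fs):
    """Two simple comparison metrics: relative difference in maximum amplitude,
    and relative difference in total power from a periodogram PSD estimate."""
    amp_diff = abs(a.max() - b.max()) / abs(a.max())
    psd = lambda s: (np.abs(np.fft.rfft(s)) ** 2) / (fs * len(s))
    p_a, p_b = psd(a).sum(), psd(b).sum()
    return amp_diff, abs(p_a - p_b) / p_a

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
model = np.sin(2 * np.pi * 50 * t)     # toy "simulated" waveform
field = (0.95 * np.sin(2 * np.pi * 50 * t)
         + 0.02 * np.random.default_rng(5).normal(size=t.size))  # toy "measured" waveform
amp_diff, pow_diff = compare_signals(model, field, fs)
```

Keeping the metrics relative makes them comparable across inspection setups with different gains, which matters when quantifying model-versus-measurement agreement under known variabilities.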
Plazinska, Anita; Plazinski, Wojciech; Jozwiak, Krzysztof
2014-04-30
The computational approach applicable for the molecular dynamics (MD)-based techniques is proposed to predict the ligand-protein binding affinities dependent on the ligand stereochemistry. All possible stereoconfigurations are expressed in terms of one set of force-field parameters [stereoconfiguration-independent potential (SIP)], which allows for calculating all relative free energies by only single simulation. SIP can be used for studying diverse, stereoconfiguration-dependent phenomena by means of various computational techniques of enhanced sampling. The method has been successfully tested on the β2-adrenergic receptor (β2-AR) binding the four fenoterol stereoisomers by both metadynamics simulations and replica-exchange MD. Both the methods gave very similar results, fully confirming the presence of stereoselective effects in the fenoterol-β2-AR interactions. However, the metadynamics-based approach offered much better efficiency of sampling which allows for significant reduction of the unphysical region in SIP. Copyright © 2014 Wiley Periodicals, Inc.
A hybrid experimental-numerical technique for determining 3D velocity fields from planar 2D PIV data
NASA Astrophysics Data System (ADS)
Eden, A.; Sigurdson, M.; Mezić, I.; Meinhart, C. D.
2016-09-01
Knowledge of 3D, three component velocity fields is central to the understanding and development of effective microfluidic devices for lab-on-chip mixing applications. In this paper we present a hybrid experimental-numerical method for the generation of 3D flow information from 2D particle image velocimetry (PIV) experimental data and finite element simulations of an alternating current electrothermal (ACET) micromixer. A numerical least-squares optimization algorithm is applied to a theory-based 3D multiphysics simulation in conjunction with 2D PIV data to generate an improved estimation of the steady state velocity field. This 3D velocity field can be used to assess mixing phenomena more accurately than would be possible through simulation alone. Our technique can also be used to estimate uncertain quantities in experimental situations by fitting the gathered field data to a simulated physical model. The optimization algorithm reduced the root-mean-squared difference between the experimental and simulated velocity fields in the target region by more than a factor of 4, resulting in an average error less than 12% of the average velocity magnitude.
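The hybrid idea — fit a physics-constrained 3D model to planar 2D data, then read the field off the measurement plane — can be sketched with a linear basis-function model. The basis fields, sampling, and noise level below are invented for the example; the paper's model comes from a multiphysics ACET simulation:

```python
import numpy as np

rng = np.random.default_rng(6)

def basis(x, y, z):
    """Illustrative physics-based basis fields: the model fixes the z-structure,
    so planar data at z = 0 can pin down the 3D coefficients."""
    return np.stack([np.sin(np.pi * x) * np.cos(np.pi * z),
                     y * (1 - y) * np.ones_like(z),
                     np.cos(np.pi * y) * np.exp(-z)], axis=-1)

c_true = np.array([1.0, -0.5, 0.3])

# noisy 2D PIV samples on the measurement plane z = 0
xy = rng.uniform(0, 1, size=(300, 2))
A = basis(xy[:, 0], xy[:, 1], np.zeros(300))
u_meas = A @ c_true + rng.normal(scale=0.01, size=300)

# least-squares optimization of the model coefficients against the planar data
c_fit, *_ = np.linalg.lstsq(A, u_meas, rcond=None)

# reconstruct the velocity off the measurement plane from the fitted model
u_off_plane = basis(0.3, 0.4, 0.5) @ c_fit
```

The out-of-plane estimate is only as trustworthy as the assumed z-structure — the same caveat that applies when a simulated ACET field supplies the third dimension.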
Gain in computational efficiency by vectorization in the dynamic simulation of multi-body systems
NASA Technical Reports Server (NTRS)
Amirouche, F. M. L.; Shareef, N. H.
1991-01-01
An improved technique for the identification and extraction of the exact quantities associated with the degrees of freedom at the element as well as the flexible body level is presented. It is implemented in the dynamic equations of motion based on the recursive formulation of Kane et al. (1987) and presented in matrix form, integrating the concepts of strain energy, the finite-element approach, modal analysis, and reduction of equations. This technique eliminates the CPU-intensive matrix multiplication operations in the code's hot spots for the dynamic simulation of interconnected rigid and flexible bodies. A study of a simple robot with flexible links is presented by comparing the execution times on a scalar machine and a vector processor with and without vector options. Performance figures demonstrating the substantial gains achieved by the technique are plotted.
Development of a technique for inflight jet noise simulation. I, II
NASA Technical Reports Server (NTRS)
Clapper, W. S.; Stringas, E. J.; Mani, R.; Banerian, G.
1976-01-01
Several possible noise simulation techniques were evaluated, including closed circuit wind tunnels, free jets, rocket sleds and high speed trains. The free jet technique was selected for demonstration and verification. The first paper describes the selection and development of the technique and presents results for simulation and in-flight tests of the Learjet, F106, and Bertin Aerotrain. The second presents a theoretical study relating the two sets of noise signatures. It is concluded that the free jet simulation technique provides a satisfactory assessment of in-flight noise.
1985-05-30
A technique is described for determining the optical constants of coatings consisting of quarter-wave layers by detecting the extrema of transmission or reflectance at a particular wavelength. This method is extremely stable. The technique is based on an envelope method, and some experimental results are given. To verify the optical-constants determination technique by computer simulation, the method was applied to various layers of titanium dioxide.
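The garbled abstract above names only "an envelope method" without giving equations. A standard envelope-method relation (Swanepoel's) for a weakly absorbing film on a transparent substrate is sketched below as an assumption, not as this document's specific technique.

```python
# Swanepoel-style envelope estimate of a film's refractive index n from the
# transmission-envelope extrema T_M and T_m at one wavelength, given the
# substrate index s. The specific formula is an assumption; the abstract
# does not state which envelope relation was used.

def envelope_index(t_max, t_min, s):
    N = 2.0 * s * (t_max - t_min) / (t_max * t_min) + (s * s + 1.0) / 2.0
    return (N + (N * N - s * s) ** 0.5) ** 0.5

# Hypothetical TiO2-like film on glass (s = 1.52).
n = envelope_index(0.85, 0.55, 1.52)
print(round(n, 3))
```

The estimate falls in the 2.3-2.7 range typical of titanium dioxide films, which is why the method suits that material.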
NASA Astrophysics Data System (ADS)
Gholizadeh Doonechaly, N.; Rahman, S. S.
2012-05-01
Simulation of naturally fractured reservoirs offers significant challenges due to the lack of a methodology that can fully utilize field data. To date, several methods have been proposed to characterize naturally fractured reservoirs. Among them is the unfolding/folding method, which offers some degree of accuracy in estimating the probability of the existence of fractures in a reservoir. There are also statistical approaches that integrate all levels of field data to simulate the fracture network. Such approaches, however, depend on the availability of data sources, such as seismic attributes, core descriptions and well logs, which are often difficult to obtain field-wide. In this study a hybrid tectono-stochastic simulation is proposed to characterize a naturally fractured reservoir. A finite-element-based model is used to simulate the tectonic event of folding and unfolding of a geological structure. A nested neuro-stochastic technique is used to develop the inter-relationships among the data while utilizing the sequential Gaussian approach to analyze field data along with fracture probability data. This approach is able to overcome the commonly experienced discontinuity of data in both horizontal and vertical directions. The hybrid technique is used to generate a discrete fracture network of a specific Australian gas reservoir, Palm Valley in the Northern Territory. The results of this study are of significant benefit for accurate fluid flow simulation and well placement for maximal hydrocarbon recovery.
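A hedged one-dimensional sketch of sequential Gaussian simulation (SGS), the stochastic ingredient named above. For brevity each node is conditioned only on its single nearest already-known value via simple kriging with an exponential covariance; the paper's nested neuro-stochastic coupling is not reproduced.

```python
import math
import random

# Toy 1D sequential Gaussian simulation on a zero-mean, unit-variance field.
# Conditioning on one nearest datum is a deliberate simplification.

def sgs_1d(n, known, corr_len=5.0, seed=1):
    """known: dict {index: value} of conditioning data."""
    rng = random.Random(seed)
    z = dict(known)
    path = [i for i in range(n) if i not in z]
    rng.shuffle(path)                          # random visiting order
    for i in path:
        j = min(z, key=lambda k: abs(k - i))   # nearest conditioned node
        rho = math.exp(-abs(i - j) / corr_len) # exponential covariance
        mean = rho * z[j]                      # simple-kriging mean (1 datum)
        var = 1.0 - rho * rho                  # simple-kriging variance
        z[i] = rng.gauss(mean, var ** 0.5)     # draw and condition on it
    return [z[i] for i in range(n)]

field = sgs_1d(20, known={0: 1.0, 19: -0.5})
print(field)
```

Because each simulated value is added to the conditioning set, later draws honor both the hard data and earlier draws, which is what gives SGS its spatial continuity.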
NASA Astrophysics Data System (ADS)
Yeh, Chi-Yuan; Tung, Chuan-Jung; Chao, Tsi-Chain; Lin, Mu-Han; Lee, Chung-Chi
2014-11-01
The purpose of this study was to examine the dose distribution of a skull base tumor and surrounding critical structures in response to high-dose intensity-modulated radiosurgery (IMRS) with Monte Carlo (MC) simulation using a dual-resolution sandwich phantom. The measurement-based Monte Carlo (MBMC) method (Lin et al., 2009) was adopted for the study. The major components of the MBMC technique involve (1) the BEAMnrc code for beam transport through the treatment head of a Varian 21EX linear accelerator, (2) the DOSXYZnrc code for patient dose simulation and (3) an EPID-measured efficiency map which describes the non-uniform fluence distribution of the IMRS treatment beam. For the simulated case, five isocentric 6 MV photon beams were designed to deliver a total dose of 1200 cGy in two fractions to the skull base tumor. A sandwich phantom for the MBMC simulation was created based on the patient's CT scan of a skull base tumor [gross tumor volume (GTV) = 8.4 cm³] near the right 8th cranial nerve. The phantom, consisting of a 1.2-cm-thick skull base region, had a voxel resolution of 0.05×0.05×0.1 cm³ and was sandwiched between 0.05×0.05×0.3 cm³ slices of a head phantom. A coarser 0.2×0.2×0.3 cm³ single-resolution (SR) phantom was also created for comparison with the sandwich phantom. A particle history of 3×10⁸ for each beam was used for simulations of both the SR and the sandwich phantoms to achieve a statistical uncertainty of <2%. Our study showed that the planning target volume (PTV) receiving at least 95% of the prescribed dose (VPTV95) was 96.9%, 96.7% and 99.9% for the TPS, SR, and sandwich phantom, respectively. The maximum and mean doses to large structures such as the PTV, brain stem, and parotid gland for the TPS, SR and sandwich MC simulations did not show any significant difference; however, significant dose differences were observed for very small structures such as the right 8th cranial nerve, right cochlea, right malleus and right semicircular canal.
Dose volume histogram (DVH) analyses revealed much smoother DVH curves for the dual resolution sandwich phantom when compared to the SR phantom. In conclusion, MBMC simulations using a dual resolution sandwich phantom improved simulation spatial resolution for skull base IMRS therapy. More detailed dose analyses for small critical structures can be made available to help in clinical judgment.
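A minimal sketch of the dose-volume metrics quoted in the two paragraphs above: given per-voxel doses in a structure, compute V_D (the fraction of volume receiving at least dose D) and a cumulative DVH. The voxel doses below are toy values, not the paper's data.

```python
# Cumulative DVH metrics from a flat list of per-voxel doses (cGy).

def v_at_dose(doses, d):
    """Fraction of voxels receiving >= d (e.g. VPTV95 with d = 0.95 * Rx)."""
    return sum(1 for x in doses if x >= d) / len(doses)

def cumulative_dvh(doses, bins):
    """Cumulative DVH sampled at the given dose levels."""
    return [(d, v_at_dose(doses, d)) for d in bins]

rx = 1200.0                                              # prescribed dose
ptv = [1150, 1180, 1210, 1225, 1190, 1160, 1240, 1100]   # toy voxel doses
v95 = v_at_dose(ptv, 0.95 * rx)                          # threshold 1140 cGy
print(v95)
print(cumulative_dvh(ptv, [1100, 1150, 1200]))
```

Finer voxels (as in the sandwich phantom) give more dose samples per structure, which is why the resulting DVH curves are smoother for small organs.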
Computer simulation of surface and film processes
NASA Technical Reports Server (NTRS)
Tiller, W. A.; Halicioglu, M. T.
1984-01-01
All the investigations performed employed, in one way or another, a computer simulation technique based on atomistic-level considerations. In general, three types of simulation methods were used for modeling systems of discrete particles that interact via well-defined potential functions: molecular dynamics (a general method for solving the classical equations of motion of a model system); Monte Carlo (the use of a Markov-chain ensemble-averaging technique to model equilibrium properties of a system); and molecular statics (which provides properties of a system at T = 0 K). The effects of three-body forces on the vibrational frequencies of triatomic clusters were investigated. The multilayer relaxation phenomena for low-index planes of an fcc crystal were also analyzed as a function of the three-body interactions. Various surface properties of the Si and SiC systems were calculated. Results obtained from static simulation calculations of slip formation were presented. The more elaborate molecular dynamics calculations on the propagation of cracks in two-dimensional systems were outlined.
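A hedged sketch of the Monte Carlo ingredient named above: Metropolis sampling of a small 2D Lennard-Jones cluster. The potentials used in the work above include three-body terms, which are omitted here; the pairwise form is an illustrative assumption.

```python
import math
import random

# Metropolis Monte Carlo for a three-atom 2D Lennard-Jones cluster
# (epsilon = sigma = 1). Pair potential only; three-body terms omitted.

def lj(r2):
    """Lennard-Jones pair energy from a squared distance."""
    inv6 = 1.0 / r2 ** 3
    return 4.0 * (inv6 * inv6 - inv6)

def energy(pos):
    e = 0.0
    for i in range(len(pos)):
        for j in range(i + 1, len(pos)):
            dx, dy = pos[i][0] - pos[j][0], pos[i][1] - pos[j][1]
            e += lj(dx * dx + dy * dy)
    return e

def metropolis(pos, beta=2.0, step=0.1, nsweeps=200, seed=0):
    rng = random.Random(seed)
    e = energy(pos)
    for _ in range(nsweeps * len(pos)):
        i = rng.randrange(len(pos))
        old = pos[i]
        pos[i] = (old[0] + rng.uniform(-step, step),
                  old[1] + rng.uniform(-step, step))
        e_new = energy(pos)
        if e_new > e and rng.random() >= math.exp(-beta * (e_new - e)):
            pos[i] = old      # reject uphill move
        else:
            e = e_new         # accept move
    return e

pos = [(0.0, 0.0), (1.1, 0.0), (0.55, 0.95)]  # near-equilateral trimer
e_final = metropolis(pos)
print(e_final)
```

The Markov-chain average of such configurations is what yields the equilibrium properties the abstract refers to; molecular statics would instead minimize `energy` directly at T = 0 K.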
NASA Technical Reports Server (NTRS)
Dermanis, A.
1977-01-01
The possibility of recovering earth rotation and network geometry (baseline) parameters is emphasized. The simulated numerical experiments are set up in an environment where station coordinates vary with respect to inertial space according to a simulated earth rotation model similar to the actual but unknown rotation of the earth. The basic technique of VLBI and its mathematical model are presented. The chosen parametrization of earth rotation is described and the resulting model is linearized. A simple analysis of the geometry of the observations leads to some useful hints on achieving maximum sensitivity of the observations with respect to the parameters considered. The basic philosophy for the simulation of data and their analysis through standard least squares adjustment techniques is presented. A number of characteristic network designs based on present and candidate station locations are chosen. The results of the simulations for each design are presented together with a summary of the conclusions.
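A hedged sketch of the "standard least squares adjustment" step: solve the linearized observation equations A·x = b for parameter corrections through the normal equations. The 2-parameter toy system below stands in for the real VLBI design matrix.

```python
# Least-squares adjustment via normal equations, x = (A^T A)^{-1} A^T b,
# written out for a 2-parameter problem. A and b are toy values.

def solve_normal_equations(A, b):
    n00 = sum(r[0] * r[0] for r in A)
    n01 = sum(r[0] * r[1] for r in A)
    n11 = sum(r[1] * r[1] for r in A)
    t0 = sum(r[0] * y for r, y in zip(A, b))
    t1 = sum(r[1] * y for r, y in zip(A, b))
    det = n00 * n11 - n01 * n01            # invert the 2x2 normal matrix
    return ((n11 * t0 - n01 * t1) / det,
            (n00 * t1 - n01 * t0) / det)

# Overdetermined toy system whose exact solution is x = (2, 3).
A = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (2.0, 1.0)]
b = [2.0, 3.0, 5.0, 7.0]
x = solve_normal_equations(A, b)
print(x)
```

The geometric "sensitivity" analysis in the abstract corresponds to keeping the normal matrix well-conditioned: near-singular geometry makes `det` small and the recovered parameters unstable.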
NetCoDer: A Retransmission Mechanism for WSNs Based on Cooperative Relays and Network Coding
Valle, Odilson T.; Montez, Carlos; Medeiros de Araujo, Gustavo; Vasques, Francisco; Moraes, Ricardo
2016-01-01
Some of the most difficult problems to deal with when using Wireless Sensor Networks (WSNs) are related to the unreliable nature of communication channels. In this context, the use of cooperative diversity techniques and the application of network coding concepts may be promising solutions to improve the communication reliability. In this paper, we propose the NetCoDer scheme to address this problem. Its design is based on merging cooperative diversity techniques and network coding concepts. We evaluate the effectiveness of the NetCoDer scheme through both an experimental setup with real WSN nodes and a simulation assessment, comparing NetCoDer performance against state-of-the-art TDMA-based (Time Division Multiple Access) retransmission techniques: BlockACK, Master/Slave and Redundant TDMA. The obtained results highlight that the proposed NetCoDer scheme clearly improves the network performance when compared with other retransmission techniques. PMID:27258280
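A hedged illustration of the network-coding ingredient mentioned above: a relay that XORs two overheard packets lets the sink recover either original from the coded packet plus the other one. NetCoDer's actual scheme combines this with cooperative relays and TDMA scheduling, which are not reproduced here.

```python
# Core XOR network-coding idea: one coded transmission can repair either of
# two lost packets, halving the retransmission cost in the two-packet case.

def xor_packets(a, b):
    """Bitwise XOR of two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

p1 = b"sensor-reading-1"
p2 = b"sensor-reading-2"

coded = xor_packets(p1, p2)          # relay transmits p1 XOR p2
recovered = xor_packets(coded, p2)   # sink already holds p2, recovers p1
print(recovered)
```

The same coded packet would recover p2 for a sink that only held p1, which is why one relay transmission can serve two different receivers.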
NASA Astrophysics Data System (ADS)
Diaz, J.; Egaña, J. M.; Viñolas, J.
2006-11-01
Low-frequency broadband noise generated on a railway vehicle by wheel-rail interaction can be a significant annoyance for passengers in sleeping cars, and low-frequency acoustic radiation is extremely difficult to attenuate with passive devices. In this article, an active noise control (ANC) technique is proposed for this purpose. A three-dimensional cabin was built in the laboratory to carry out the experiments. The proposed scheme is based on a Filtered-X Least Mean Square (FXLMS) control algorithm, particularised for a virtual-microphone technique. The control algorithms were designed with the Matlab-Simulink tool, and the Real Time Windows Target toolbox of Matlab was used to run the ANC system in real time. Simulations and experimental performances were analysed with the aim of enlarging the zone of silence around the passenger's ear and along the bed headboard. Attenuations of up to 20 and 15 dB(A) (re: 20 μPa) at the passenger's ear were achieved in simulations and in experiments, respectively.
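A hedged single-channel sketch of the filtered-x LMS update at the heart of the ANC scheme above. For brevity the secondary path is modelled as a unit gain, so the "filtered" reference equals the reference; real systems filter the reference through an estimate of the loudspeaker-to-microphone path, and the virtual-microphone extension is not reproduced.

```python
import math

# Single-channel FXLMS cancelling a tonal disturbance. With a unit-gain
# secondary path the update reduces to plain LMS: w += mu * e * x.

def fxlms_tone(n=2000, mu=0.01, taps=4):
    w = [0.0] * taps                       # adaptive FIR weights
    xbuf = [0.0] * taps                    # reference signal history
    errs = []
    for k in range(n):
        x = math.sin(0.2 * k)                  # reference (tonal noise)
        d = 0.8 * math.sin(0.2 * k - 0.5)      # noise at the error microphone
        xbuf = [x] + xbuf[:-1]
        y = sum(wi * xi for wi, xi in zip(w, xbuf))  # anti-noise output
        e = d - y                              # residual at the microphone
        for i in range(taps):                  # weight update
            w[i] += mu * e * xbuf[i]
        errs.append(abs(e))
    return sum(errs[:100]) / 100, sum(errs[-100:]) / 100

early, late = fxlms_tone()
print(early, late)
```

A 4-tap FIR can match any amplitude and phase at a single frequency, so the residual at the error microphone decays toward zero as the weights converge.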
Danielson, Thomas; Sutton, Jonathan E.; Hin, Céline; ...
2017-06-09
Lattice-based Kinetic Monte Carlo (KMC) simulations offer a powerful simulation technique for investigating large reaction networks while retaining spatial configuration information, unlike ordinary differential equations. However, large chemical reaction networks can contain reaction processes with rates spanning multiple orders of magnitude. This can lead to the problem of "KMC stiffness" (similar to stiffness in differential equations), where the computational expense has the potential to be overwhelmed by very short time-steps during KMC simulations, with the simulation spending an inordinate number of KMC steps/CPU time simulating fast frivolous processes (FFPs) without progressing the system (reaction network). In order to achieve simulation times that are experimentally relevant or desired for predictions, a dynamic throttling algorithm involving separation of the processes into speed ranks based on event frequencies has been designed and implemented, with the intent of decreasing the probability of FFP events and increasing the probability of slow process events, allowing rate-limiting events to become more likely to be observed in KMC simulations. This Staggered Quasi-Equilibrium Rank-based Throttling for Steady-state (SQERTSS) algorithm is designed for use in achieving and simulating steady-state conditions in KMC simulations. Lastly, as shown in this work, the SQERTSS algorithm also works for transient conditions: the correct configuration space and final state will still be achieved if the required assumptions are not violated, with the caveat that the sizes of the time-steps may be distorted during the transient period.
NASA Astrophysics Data System (ADS)
Danielson, Thomas; Sutton, Jonathan E.; Hin, Céline; Savara, Aditya
2017-10-01
Lattice-based Kinetic Monte Carlo (KMC) simulations offer a powerful simulation technique for investigating large reaction networks while retaining spatial configuration information, unlike ordinary differential equations. However, large chemical reaction networks can contain reaction processes with rates spanning multiple orders of magnitude. This can lead to the problem of "KMC stiffness" (similar to stiffness in differential equations), where the computational expense has the potential to be overwhelmed by very short time-steps during KMC simulations, with the simulation spending an inordinate number of KMC steps/CPU time simulating fast frivolous processes (FFPs) without progressing the system (reaction network). In order to achieve simulation times that are experimentally relevant or desired for predictions, a dynamic throttling algorithm involving separation of the processes into speed ranks based on event frequencies has been designed and implemented, with the intent of decreasing the probability of FFP events and increasing the probability of slow process events, allowing rate-limiting events to become more likely to be observed in KMC simulations. This Staggered Quasi-Equilibrium Rank-based Throttling for Steady-state (SQERTSS) algorithm is designed for use in achieving and simulating steady-state conditions in KMC simulations. As shown in this work, the SQERTSS algorithm also works for transient conditions: the correct configuration space and final state will still be achieved if the required assumptions are not violated, with the caveat that the sizes of the time-steps may be distorted during the transient period.
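A hedged sketch of rank-based throttling in a toy (lattice-free) KMC loop: processes are binned into speed ranks by how many orders of magnitude their rates exceed the slowest, and fast ranks are scaled down so slow, rate-limiting events are actually observed. The full SQERTSS bookkeeping (staggered quasi-equilibrium checks, unthrottling) is not reproduced.

```python
import math
import random

# Standard KMC event selection plus a crude rank-based throttle.

def kmc_step(rates, rng):
    """Pick a process with probability ~ rate; advance the clock."""
    total = sum(rates)
    r = rng.random() * total
    acc = 0.0
    for i, k in enumerate(rates):
        acc += k
        if r <= acc:
            dt = -math.log(rng.random()) / total   # exponential waiting time
            return i, dt

def throttle(rates, base=10.0):
    """Scale each rate down by base**rank, rank = decades above the slowest."""
    kmin = min(rates)
    return [k / base ** round(math.log10(k / kmin)) for k in rates]

rng = random.Random(42)
rates = [1e6, 1e3, 1.0]          # fast, medium and slow processes
counts = [0, 0, 0]
for _ in range(3000):
    i, _dt = kmc_step(throttle(rates), rng)
    counts[i] += 1
print(counts)
```

Without throttling, essentially all 3000 events would be the 10^6-rate process; with it, the slow process is sampled regularly, at the cost of the distorted time-steps the abstract warns about.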