Sample records for simulation techniques including

  1. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  2. Dimensions of Simulation.

    ERIC Educational Resources Information Center

    Crawford, Meredith P.

    Open and closed loop simulation is discussed from the viewpoint of research and development in training techniques. Areas discussed include--(1) open-loop environmental simulation, (2) simulation not involving people, (3) analysis of occupations, (4) simulation for training, (5) real-size system simulation, (6) techniques of miniaturization, and…

  3. Simulation Modelling in Healthcare: An Umbrella Review of Systematic Literature Reviews.

    PubMed

    Salleh, Syed; Thokala, Praveen; Brennan, Alan; Hughes, Ruby; Booth, Andrew

    2017-09-01

    Numerous studies examine simulation modelling in healthcare. These studies present a bewildering array of simulation techniques and applications, making it challenging to characterise the literature. The aim of this paper is to provide an overview of the level of activity of simulation modelling in healthcare and the key themes. We performed an umbrella review of systematic literature reviews of simulation modelling in healthcare. Searches were conducted of academic databases (JSTOR, Scopus, PubMed, IEEE, SAGE, ACM, Wiley Online Library, ScienceDirect) and grey literature sources, enhanced by citation searches. The articles were included if they performed a systematic review of simulation modelling techniques in healthcare. After quality assessment of all included articles, data were extracted on numbers of studies included in each review, types of applications, techniques used for simulation modelling, data sources and simulation software. The search strategy yielded a total of 117 potential articles. Following sifting, 37 heterogeneous reviews were included. Most reviews achieved a moderate quality rating on a modified AMSTAR (A Measurement Tool used to Assess systematic Reviews) checklist. All the review articles described the types of applications used for simulation modelling; 15 reviews described techniques used for simulation modelling; three reviews described data sources used for simulation modelling; and six reviews described software used for simulation modelling. The remaining reviews either did not report or did not provide enough detail for the data to be extracted. Simulation modelling techniques have been used for a wide range of applications in healthcare, with a variety of software tools and data sources. The number of reviews published in recent years suggests an increased interest in simulation modelling in healthcare.

  4. Cost considerations in using simulations for medical training.

    PubMed

    Fletcher, J D; Wind, Alexander P

    2013-10-01

    This article reviews simulation used for medical training, techniques for assessing simulation-based training, and cost analyses that can be included in such assessments. Simulation in medical training appears to take four general forms: human actors who are taught to simulate illnesses and ailments in standardized ways; virtual patients who are generally presented via computer-controlled, multimedia displays; full-body manikins that simulate patients using electronic sensors, responders, and controls; and part-task anatomical simulations of various body parts and systems. Techniques for assessing costs include benefit-cost analysis, return on investment, and cost-effectiveness analysis. Techniques for assessing the effectiveness of simulation-based medical training include the use of transfer effectiveness ratios and incremental transfer effectiveness ratios to measure transfer of knowledge and skill provided by simulation to the performance of medical procedures. Assessment of costs and simulation effectiveness can be combined with measures of transfer using techniques such as isoperformance analysis to identify ways of minimizing costs without reducing performance effectiveness or maximizing performance without increasing costs. In sum, economic analysis must be considered in training assessments if training budgets are to compete successfully with other requirements for funding. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
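The transfer effectiveness ratios mentioned above have a standard arithmetic form (Roscoe's classic TER); a minimal sketch in Python, with illustrative function and argument names of our own choosing:

```python
def transfer_effectiveness_ratio(control_time, simulator_trained_time, simulator_time):
    """Roscoe-style transfer effectiveness ratio (TER): real-task training
    time saved per unit of simulator time invested."""
    return (control_time - simulator_trained_time) / simulator_time

def incremental_ter(time_before, time_after, extra_simulator_time):
    """Incremental TER: marginal time-to-criterion saving from one
    additional block of simulator time."""
    return (time_before - time_after) / extra_simulator_time

# A group needing 10 h of real-task time without simulation, but only 6 h
# after 8 h in the simulator, saves 0.5 h of task time per simulator hour.
ter = transfer_effectiveness_ratio(10.0, 6.0, 8.0)
```

A positive TER indicates simulator time substituting for real-task time; combining it with per-hour costs is the basis of the cost-effectiveness comparisons the abstract describes.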

  5. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  6. Simulation verification techniques study: Simulation performance validation techniques document. [for the space shuttle system

    NASA Technical Reports Server (NTRS)

    Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.

    1975-01-01

    Techniques and support software for the efficient performance of simulation validation are discussed. Overall validation software structure, the performance of validation at various levels of simulation integration, guidelines for check case formulation, methods for real time acquisition and formatting of data from an all up operational simulator, and methods and criteria for comparison and evaluation of simulation data are included. Vehicle subsystems modules, module integration, special test requirements, and reference data formats are also described.

  7. Development of a technique for inflight jet noise simulation. I, II

    NASA Technical Reports Server (NTRS)

    Clapper, W. S.; Stringas, E. J.; Mani, R.; Banerian, G.

    1976-01-01

    Several possible noise simulation techniques were evaluated, including closed circuit wind tunnels, free jets, rocket sleds and high speed trains. The free jet technique was selected for demonstration and verification. The first paper describes the selection and development of the technique and presents results for simulation and in-flight tests of the Learjet, F106, and Bertin Aerotrain. The second presents a theoretical study relating the two sets of noise signatures. It is concluded that the free jet simulation technique provides a satisfactory assessment of in-flight noise.

  8. Simulation verification techniques study. Subsystem simulation validation techniques

    NASA Technical Reports Server (NTRS)

    Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.

    1974-01-01

    Techniques for validation of software modules which simulate spacecraft onboard systems are discussed. An overview of the simulation software hierarchy for a shuttle mission simulator is provided. A set of guidelines for the identification of subsystem/module performance parameters and critical performance parameters are presented. Various sources of reference data to serve as standards of performance for simulation validation are identified. Environment, crew station, vehicle configuration, and vehicle dynamics simulation software are briefly discussed from the point of view of their interfaces with subsystem simulation modules. A detailed presentation of results in the area of vehicle subsystems simulation modules is included. A list of references, conclusions and recommendations are also given.

  9. Computerized Clinical Simulations.

    ERIC Educational Resources Information Center

    Reinecker, Lynn

    1985-01-01

    Describes technique involved in designing a clinical simulation problem for the allied health field of respiratory therapy; discusses the structure, content, and scoring categories of the simulation; and provides a sample program which illustrates a programming technique in BASIC, including a program listing and a sample flowchart. (MBR)

  10. Agent-based modeling: Methods and techniques for simulating human systems

    PubMed Central

    Bonabeau, Eric

    2002-01-01

    Agent-based modeling is a powerful simulation modeling technique that has seen a number of applications in the last few years, including applications to real-world business problems. After the basic principles of agent-based simulation are briefly introduced, its four areas of application are discussed by using real-world applications: flow simulation, organizational simulation, market simulation, and diffusion simulation. For each category, one or several business applications are described and analyzed. PMID:12011407
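As a toy illustration of the diffusion-simulation category mentioned above, here is a minimal agent-based adoption model (the Bass-style adoption rule and all parameters are illustrative assumptions, not Bonabeau's actual models):

```python
import random

def simulate_diffusion(n_agents=100, p_innovate=0.02, q_imitate=0.3,
                       steps=50, seed=1):
    """Minimal agent-based diffusion-of-innovation model: each agent adopts
    either spontaneously (innovation) or with a probability that grows with
    the fraction of adopters it observes (imitation). Returns the adopter
    count after each step."""
    rng = random.Random(seed)
    adopted = [False] * n_agents
    history = []
    for _ in range(steps):
        fraction = sum(adopted) / n_agents
        for i in range(n_agents):
            if not adopted[i] and rng.random() < p_innovate + q_imitate * fraction:
                adopted[i] = True
        history.append(sum(adopted))
    return history
```

Because each agent carries its own state, heterogeneity (e.g., per-agent thresholds or network neighbourhoods) can be added without changing the overall structure, which is the usual argument for the agent-based approach.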

  11. Review of sonic-boom simulation devices and techniques.

    NASA Technical Reports Server (NTRS)

    Edge, P. M., Jr.; Hubbard, H. H.

    1972-01-01

    Research on aircraft-generated sonic booms has led to the development of special techniques to generate controlled sonic-boom-type disturbances without the complications and expense of supersonic flight operations. This paper contains brief descriptions of several of these techniques along with the significant hardware items involved and indicates the advantages and disadvantages of each in research applications. Included are wind tunnels, ballistic ranges, spark discharges, piston phones, shock tubes, high-speed valve systems, and shaped explosive charges. Specialized applications include sonic-boom generation and propagation studies and the responses of structures, terrain, people, and animals. Situations for which simulators are applicable are shown to include both small-scale and large-scale laboratory tests and full-scale field tests. Although no one approach to simulation is ideal, the various techniques available generally complement each other to provide desired capability for a broad range of sonic-boom studies.

  12. Spatial interpolation of forest conditions using co-conditional geostatistical simulation

    Treesearch

    H. Todd Mowrer

    2000-01-01

    In recent work the author used the geostatistical Monte Carlo technique of sequential Gaussian simulation (s.G.s.) to investigate uncertainty in a GIS analysis of potential old-growth forest areas. The current study compares this earlier technique to that of co-conditional simulation, wherein the spatial cross-correlations between variables are included. As in the...

  13. E Pluribus Analysis: Applying a Superforecasting Methodology to the Detection of Homegrown Violence

    DTIC Science & Technology

    2018-03-01

    …actor violence and a set of predefined decision-making protocols. This research included running four simulations using the Monte Carlo technique…

  14. Acceleration techniques for dependability simulation. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Barnette, James David

    1995-01-01

    As computer systems increase in complexity, the need to project system performance from the earliest design and development stages increases. We have to employ simulation for detailed dependability studies of large systems. However, as the complexity of the simulation model increases, the time required to obtain statistically significant results also increases. This paper discusses an approach that is application independent and can be readily applied to any process-based simulation model. Topics include background on classical discrete event simulation and techniques for random variate generation and statistics gathering to support simulation.
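The classical discrete event simulation and random variate generation the abstract mentions can be sketched in a few lines; this toy M/M/1 queue (our own illustrative example, not the thesis code) generates exponential variates by inverse transform and gathers a simple statistic:

```python
import math
import random

def exponential_variate(rng, rate):
    """Inverse-transform generation of an Exp(rate) random variate."""
    return -math.log(1.0 - rng.random()) / rate

def mm1_mean_time_in_system(arrival_rate=0.8, service_rate=1.0,
                            n_customers=1000, seed=42):
    """Discrete event simulation of an M/M/1 queue (single FIFO server):
    returns the average time a customer spends in the system."""
    rng = random.Random(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_customers):
        t += exponential_variate(rng, arrival_rate)   # next arrival event
        arrivals.append(t)
    server_free_at = 0.0
    total_time = 0.0
    for arrival in arrivals:
        start = max(arrival, server_free_at)          # wait if server is busy
        server_free_at = start + exponential_variate(rng, service_rate)
        total_time += server_free_at - arrival
    return total_time / n_customers
```

The acceleration problem the thesis addresses is visible even here: tight confidence intervals on such statistics require many more than a thousand customers, which motivates variance-reduction and speed-up techniques.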

  15. Description of a computer program and numerical techniques for developing linear perturbation models from nonlinear systems simulations

    NASA Technical Reports Server (NTRS)

    Dieudonne, J. E.

    1978-01-01

    A numerical technique was developed which generates linear perturbation models from nonlinear aircraft vehicle simulations. The technique is very general and can be applied to simulations of any system that is described by nonlinear differential equations. The computer program used to generate these models is discussed, with emphasis placed on generation of the Jacobian matrices, calculation of the coefficients needed for solving the perturbation model, and generation of the solution of the linear differential equations. An example application of the technique to a nonlinear model of the NASA terminal configured vehicle is included.
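Generating a linear perturbation model numerically, as the abstract describes, usually amounts to finite-difference evaluation of the Jacobian about a trim point; a minimal sketch (central differences applied to a toy pendulum, our own example rather than the NASA program):

```python
import math

def numerical_jacobian(f, x, eps=1e-6):
    """Central-difference Jacobian of a vector function f at the point x:
    the standard way to linearise a nonlinear simulation numerically."""
    fx = f(x)
    jac = [[0.0] * len(x) for _ in fx]
    for j in range(len(x)):
        x_plus, x_minus = list(x), list(x)
        x_plus[j] += eps
        x_minus[j] -= eps
        f_plus, f_minus = f(x_plus), f(x_minus)
        for i in range(len(fx)):
            jac[i][j] = (f_plus[i] - f_minus[i]) / (2.0 * eps)
    return jac

def pendulum(state):
    """Toy nonlinear system: theta' = omega, omega' = -sin(theta)."""
    theta, omega = state
    return [omega, -math.sin(theta)]

# Linear perturbation model x' = A x about the hanging equilibrium (0, 0):
A = numerical_jacobian(pendulum, [0.0, 0.0])
```

The resulting matrix A defines the linear differential equations whose solution the perturbation model propagates.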

  16. Developing integrated patient pathways using hybrid simulation

    NASA Astrophysics Data System (ADS)

    Zulkepli, Jafri; Eldabi, Tillal

    2016-10-01

    Integrated patient pathways span several departments: healthcare, which includes emergency care and the inpatient ward; intermediate care, where patients stay for a maximum of two weeks while an assessment team identifies the most suitable care; and social care. Intermediate care was introduced in Western countries to reduce the time patients stay in hospital, especially elderly patients, and similar settings are now being considered in other countries, including Malaysia. To assess the advantages of introducing this type of integrated healthcare setting, we propose developing the model using simulation. We argue that a single simulation technique cannot adequately represent this type of patient pathway, and therefore develop the model using hybrid techniques, i.e., System Dynamics (SD) and Discrete Event Simulation (DES). Based on the hybrid model results, we argue that the outputs are a viable reference for the decision-making process.
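The System Dynamics half of such a hybrid model is typically a stock-and-flow system integrated over time; a minimal illustrative sketch (Euler integration of ward occupancy with invented parameters, not the authors' model):

```python
def ward_occupancy(admission_rate=5.0, mean_stay_days=10.0, days=60.0, dt=0.25):
    """Stock-and-flow (System Dynamics) sketch, Euler-integrated: occupancy
    is a stock fed by a constant admission inflow and drained by a discharge
    outflow proportional to occupancy (rate = 1 / mean stay)."""
    occupancy = 0.0
    for _ in range(int(days / dt)):
        inflow = admission_rate
        outflow = occupancy / mean_stay_days
        occupancy += (inflow - outflow) * dt
    return occupancy
```

The stock approaches its steady state of admission_rate × mean_stay_days (here 50 beds); in a hybrid model, a DES component would replace the aggregate outflow with individually simulated patients where queueing detail matters.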

  17. Advanced particle-in-cell simulation techniques for modeling the Lockheed Martin Compact Fusion Reactor

    NASA Astrophysics Data System (ADS)

    Welch, Dale; Font, Gabriel; Mitchell, Robert; Rose, David

    2017-10-01

    We report on particle-in-cell developments in the study of the Compact Fusion Reactor. Millisecond, two- and three-dimensional simulations (cubic meter volume) of confinement and neutral beam heating of the magnetic confinement device require accurate representation of the complex orbits, near-perfect energy conservation, and significant computational power. In order to determine initial plasma fill and neutral beam heating, these simulations include ionization, elastic, and charge exchange hydrogen reactions. To this end, we are pursuing fast electromagnetic kinetic modeling algorithms, including two implicit techniques and a hybrid quasi-neutral algorithm with kinetic ions. The kinetic modeling includes use of the Poisson-corrected direct implicit, magnetic implicit, and second-order cloud-in-cell techniques. The hybrid algorithm, which ignores electron inertial effects, is two orders of magnitude faster than the kinetic techniques but not as accurate with respect to confinement. The advantages and disadvantages of these techniques will be presented. Funded by Lockheed Martin.

  18. Digital multishaker modal testing

    NASA Technical Reports Server (NTRS)

    Blair, M.; Craig, R. R., Jr.

    1983-01-01

    A review of several modal testing techniques is made, along with brief discussions of their advantages and limitations. A new technique is presented which overcomes many of the previous limitations. Several simulated experiments are included to verify the validity and accuracy of the new method. Conclusions are drawn from the simulation studies and recommendations for further work are presented. The complete computer code configured for the simulation study is presented.

  19. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
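Importance sampling, mentioned above as a technique to accelerate Monte Carlo simulation, can be illustrated with a rare-event probability estimate (a generic textbook example, not drawn from the reviewed studies):

```python
import math
import random

def estimate_tail_probability(threshold=10.0, n=20000, seed=7):
    """Importance sampling for the rare event P(X > threshold), X ~ Exp(1):
    draw from a heavier-tailed Exp(0.2) proposal so the event is hit often,
    then unbias each hit with the likelihood ratio f(x) / g(x)."""
    rng = random.Random(seed)
    proposal_rate = 0.2
    total = 0.0
    for _ in range(n):
        x = -math.log(1.0 - rng.random()) / proposal_rate   # Exp(0.2) draw
        if x > threshold:
            weight = math.exp(-x) / (proposal_rate * math.exp(-proposal_rate * x))
            total += weight
    return total / n
```

Naive sampling from Exp(1) would see roughly one event per 22,000 draws (the true probability is e^-10 ≈ 4.5e-5); the tilted proposal hits the event in about 14% of draws, cutting the variance of the estimate by orders of magnitude.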

  20. Retinal Image Simulation of Subjective Refraction Techniques.

    PubMed

    Perches, Sara; Collados, M Victoria; Ares, Jorge

    2016-01-01

    Refraction techniques make it possible to determine the most appropriate sphero-cylindrical lens prescription to achieve the best possible visual quality. Among these techniques, subjective refraction (i.e., patient's response-guided refraction) is the most commonly used approach. In this context, this paper's main goal is to present a simulation software that implements in a virtual manner various subjective-refraction techniques--including Jackson's Cross-Cylinder test (JCC)--all relying on the observation of computer-generated retinal images. This software has also been used to evaluate visual quality when the JCC test is performed in multifocal-contact-lens wearers. The results reveal this software's usefulness to simulate the retinal image quality that a particular visual compensation provides. Moreover, it can help to gain a deeper insight into and to improve existing refraction techniques, and it can be used for simulated training.

  1. New simulation model of multicomponent crystal growth and inhibition.

    PubMed

    Wathen, Brent; Kuiper, Michael; Walker, Virginia; Jia, Zongchao

    2004-04-02

    We review a novel computational model for the study of crystal structures both on their own and in conjunction with inhibitor molecules. The model advances existing Monte Carlo (MC) simulation techniques by extending them from modeling 3D crystal surface patches to modeling entire 3D crystals, and by including the use of "complex" multicomponent molecules within the simulations. These advances make it possible to incorporate the 3D shape and non-uniform surface properties of inhibitors into simulations, and to study what effect these inhibitor properties have on the growth of whole crystals containing up to tens of millions of molecules. The application of this extended MC model to the study of antifreeze proteins (AFPs) and their effects on ice formation is reported, including the success of the technique in achieving AFP-induced ice-growth inhibition with concurrent changes to ice morphology that mimic experimental results. Simulations of ice-growth inhibition suggest that the degree of inhibition afforded by an AFP is a function of its ice-binding position relative to the underlying anisotropic growth pattern of ice. This extended MC technique is applicable to other crystal and crystal-inhibitor systems, including more complex crystal systems such as clathrates.
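A drastically simplified sketch of the general idea of lattice Monte Carlo crystal growth with inhibitor-blocked sites (a 2D toy of our own devising, far simpler than the multicomponent 3D model the abstract describes):

```python
import random

def grow_crystal(size=41, attempts=5000, inhibitor_fraction=0.0, seed=3):
    """Toy lattice Monte Carlo crystal growth: starting from a single seed
    molecule, repeatedly pick a random crystal molecule and a random
    neighbouring site, and attach a new molecule there unless the site is
    already occupied or blocked by a randomly placed 'inhibitor'.
    Returns the final number of molecules in the crystal."""
    rng = random.Random(seed)
    blocked = {(x, y) for x in range(size) for y in range(size)
               if rng.random() < inhibitor_fraction}
    centre = (size // 2, size // 2)
    blocked.discard(centre)          # the seed site is never blocked
    crystal = {centre}
    for _ in range(attempts):
        x, y = rng.choice(sorted(crystal))
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        site = (x + dx, y + dy)
        if (0 <= site[0] < size and 0 <= site[1] < size
                and site not in crystal and site not in blocked):
            crystal.add(site)
    return len(crystal)
```

Raising inhibitor_fraction suppresses growth and distorts the crystal shape, which is the qualitative effect the full model studies with realistically shaped inhibitor molecules.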

  2. An interactive driving simulation for driver control and decision-making research

    NASA Technical Reports Server (NTRS)

    Allen, R. W.; Hogge, J. R.; Schwartz, S. H.

    1975-01-01

    Display techniques and equations of motion for a relatively simple fixed base car simulation are described. The vehicle dynamics include simplified lateral (steering) and longitudinal (speed) degrees of freedom. Several simulator tasks are described which require a combination of operator control and decision making, including response to wind gust inputs, curved roads, traffic signal lights, and obstacles. Logic circuits are used to detect speeding, running red lights, and crashes. A variety of visual and auditory cues are used to give the driver appropriate performance feedback. The simulated equations of motion are reviewed and the technique for generating the line drawing CRT roadway display is discussed. On-line measurement capabilities and experimenter control features are presented, along with previous and current research results demonstrating simulation capabilities and applications.

  3. A Simulation Study of Missing Data with Multiple Missing X's

    ERIC Educational Resources Information Center

    Rubright, Jonathan D.; Nandakumar, Ratna; Glutting, Joseph J.

    2014-01-01

    When exploring missing data techniques in a realistic scenario, the current literature is limited: most studies only consider consequences with data missing on a single variable. This simulation study compares the relative bias of two commonly used missing data techniques when data are missing on more than one variable. Factors varied include type…

  4. Distributed Simulation as a modelling tool for the development of a simulation-based training programme for cardiovascular specialties.

    PubMed

    Kelay, Tanika; Chan, Kah Leong; Ako, Emmanuel; Yasin, Mohammad; Costopoulos, Charis; Gold, Matthew; Kneebone, Roger K; Malik, Iqbal S; Bello, Fernando

    2017-01-01

    Distributed Simulation is the concept of portable, high-fidelity immersive simulation. Here, it is used for the development of a simulation-based training programme for cardiovascular specialities. We present an evidence base for how accessible, portable and self-contained simulated environments can be effectively utilised for the modelling, development and testing of a complex training framework and assessment methodology. Iterative user feedback through mixed-methods evaluation techniques resulted in the implementation of the training programme. Four phases were involved in the development of our immersive simulation-based training programme: (1) initial conceptual stage for mapping structural criteria and parameters of the simulation training framework and scenario development (n = 16); (2) training facility design using Distributed Simulation; (3) test cases with clinicians (n = 8) and collaborative design, where evaluation and user feedback involved a mixed-methods approach featuring (a) quantitative surveys to evaluate the realism and perceived educational relevance of the simulation format and framework for training and (b) qualitative semi-structured interviews to capture detailed feedback including changes and scope for development. Refinements were made iteratively to the simulation framework based on user feedback, resulting in (4) transition towards implementation of the simulation training framework, involving consistent quantitative evaluation techniques for clinicians (n = 62). For comparative purposes, clinicians' initial quantitative mean evaluation scores for realism of the simulation training framework, realism of the training facility and relevance for training (n = 8) are presented longitudinally, alongside feedback throughout the development stages from concept to delivery, including the implementation stage (n = 62). Initially, mean evaluation scores fluctuated from low to average, rising incrementally. This corresponded with the qualitative component, which augmented the quantitative findings; trainees' user feedback was used to perform iterative refinements to the simulation design and components (collaborative design), resulting in higher mean evaluation scores leading up to the implementation phase. Through application of innovative Distributed Simulation techniques, collaborative design, and consistent evaluation techniques from conceptual, development, and implementation stages, fully immersive simulation techniques for cardiovascular specialities are achievable and have the potential to be implemented more broadly.

  5. Computer animation challenges for computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Vines, Mauricio; Lee, Won-Sook; Mavriplis, Catherine

    2012-07-01

    Computer animation requirements differ from those of traditional computational fluid dynamics (CFD) investigations in that visual plausibility and rapid frame update rates trump physical accuracy. We present an overview of the main techniques for fluid simulation in computer animation, starting with Eulerian grid approaches, the Lattice Boltzmann method, Fourier transform techniques and Lagrangian particle introduction. Adaptive grid methods, precomputation of results for model reduction, parallelisation and computation on graphical processing units (GPUs) are reviewed in the context of accelerating simulation computations for animation. A survey of current specific approaches for the application of these techniques to the simulation of smoke, fire, water, bubbles, mixing, phase change and solid-fluid coupling is also included. Adding plausibility to results through particle introduction, turbulence detail and concentration on regions of interest by level set techniques has elevated the degree of accuracy and realism of recent animations. Basic approaches are described here. Techniques to control the simulation to produce a desired visual effect are also discussed. Finally, some references to rendering techniques and haptic applications are mentioned to provide the reader with a complete picture of the challenges of simulating fluids in computer animation.
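Semi-Lagrangian advection, a staple of the grid-based fluid solvers used in animation (popularised by Stam's "stable fluids"), illustrates the trade-off the abstract describes: it is unconditionally stable at large time steps but numerically diffusive. A minimal 1D sketch (our own example):

```python
def semi_lagrangian_advect(field, velocity, dt, dx=1.0):
    """One 1D semi-Lagrangian advection step: trace each grid point
    backwards along the local velocity and linearly interpolate the field
    at the back-traced position."""
    n = len(field)
    out = [0.0] * n
    for i in range(n):
        src = i - velocity[i] * dt / dx        # back-traced position (grid units)
        src = max(0.0, min(n - 1.0, src))      # clamp at the domain boundary
        j = int(src)
        frac = src - j
        j_next = min(j + 1, n - 1)
        out[i] = (1.0 - frac) * field[j] + frac * field[j_next]
    return out
```

Because the interpolation can only average existing values, the scheme never blows up regardless of dt, which is exactly the visual-plausibility-over-accuracy bargain animation solvers make; production systems add turbulence detail back in, as the survey notes.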

  6. Orbiter/payload proximity operations SES Postsim report. Lateral approach and other techniques

    NASA Technical Reports Server (NTRS)

    Olszewski, O.

    1978-01-01

    Various approach and stationkeeping simulations (proximity operations) were conducted in the Shuttle engineering simulator (SES). This simulator is the first to dynamically include the Orbiter reaction control system (RCS) plume effects on a payload being recovered after rendezvous operations. A procedure for braking, using the simultaneous firing of both jets, was evaluated and found very useful for proximity operations. However, this procedure is very inefficient in RCS usage and requires modifications to the digital autopilot (DAP) software. A new final approach, the lateral approach technique (LAT), or the momentum vector proximity approach, was also evaluated in the simulations. The LAT, which included a tailfirst approach for braking, was evaluated successfully with both inertial and gravity stabilized payloads.

  7. Assessment of simulation fidelity using measurements of piloting technique in flight

    NASA Technical Reports Server (NTRS)

    Clement, W. F.; Cleveland, W. B.; Key, D. L.

    1984-01-01

    The U.S. Army and NASA joined together on a project to conduct a systematic investigation and validation of a ground based piloted simulation of the Army/Sikorsky UH-60A helicopter. Flight testing was an integral part of the validation effort. Nap-of-the-Earth (NOE) piloting tasks which were investigated included the bob-up, the hover turn, the dash/quickstop, the sidestep, the dolphin, and the slalom. Results from the simulation indicate that the pilot's NOE task performance in the simulator is noticeably and quantifiably degraded when compared with the task performance results generated in flight test. The results of the flight test and ground based simulation experiments support a unique rationale for the assessment of simulation fidelity: flight simulation fidelity should be judged quantitatively by measuring pilot's control strategy and technique as induced by the simulator. A quantitative comparison is offered between the piloting technique observed in a flight simulator and that observed in flight test for the same tasks performed by the same pilots.

  8. Accurately modeling Gaussian beam propagation in the context of Monte Carlo techniques

    NASA Astrophysics Data System (ADS)

    Hokr, Brett H.; Winblad, Aidan; Bixler, Joel N.; Elpers, Gabriel; Zollars, Byron; Scully, Marlan O.; Yakovlev, Vladislav V.; Thomas, Robert J.

    2016-03-01

    Monte Carlo simulations are widely considered to be the gold standard for studying the propagation of light in turbid media. However, traditional Monte Carlo methods fail to account for diffraction because they treat light as a particle. This results in converging beams focusing to a point instead of a diffraction limited spot, greatly affecting the accuracy of Monte Carlo simulations near the focal plane. Here, we present a technique capable of simulating a focusing beam in accordance with the rules of Gaussian optics, resulting in a diffraction limited focal spot. This technique can be easily implemented into any traditional Monte Carlo simulation, allowing existing models to be converted to include accurate focusing geometries with minimal effort. We will present results for a focusing beam in a layered tissue model, demonstrating that for different scenarios the region of highest intensity, and thus the greatest heating, can change from the surface to the focus. The ability to simulate accurate focusing geometries will greatly enhance the usefulness of Monte Carlo for countless applications, including studying laser tissue interactions in medical applications and light propagation through turbid media.
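The Gaussian-optics bookkeeping such a technique relies on can be sketched as follows. The beam-radius formula is standard Gaussian optics; the launch-sampling scheme is an illustrative assumption of ours, not necessarily the authors' implementation:

```python
import math
import random

def beam_radius(z, w0, wavelength):
    """Gaussian-optics 1/e^2 beam radius at axial distance z from the waist w0."""
    rayleigh_range = math.pi * w0 ** 2 / wavelength
    return w0 * math.sqrt(1.0 + (z / rayleigh_range) ** 2)

def sample_launch_point(z, w0, wavelength, rng):
    """Sample a transverse photon launch position from the Gaussian intensity
    profile I(r) ~ exp(-2 r^2 / w^2) at plane z, so the photon bundle
    reproduces a diffraction-limited waist rather than a point focus.
    The radius is Rayleigh-distributed (inverse-transform sampling)."""
    w = beam_radius(z, w0, wavelength)
    r = w * math.sqrt(-math.log(1.0 - rng.random()) / 2.0)
    theta = 2.0 * math.pi * rng.random()
    return r * math.cos(theta), r * math.sin(theta)
```

Launching photons this way, and aiming each one at a correspondingly sampled point in the focal plane, keeps the simulated beam envelope consistent with w(z) everywhere along the axis.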

  9. Modeling the performance of direct-detection Doppler lidar systems including cloud and solar background variability.

    PubMed

    McGill, M J; Hart, W D; McKay, J A; Spinhirne, J D

    1999-10-20

    Previous modeling of the performance of spaceborne direct-detection Doppler lidar systems assumed extremely idealized atmospheric models. Here we develop a technique for modeling the performance of these systems in a more realistic atmosphere, based on actual airborne lidar observations. The resulting atmospheric model contains cloud and aerosol variability that is absent in other simulations of spaceborne Doppler lidar instruments. To produce a realistic simulation of daytime performance, we include solar radiance values that are based on actual measurements and are allowed to vary as the viewing scene changes. Simulations are performed for two types of direct-detection Doppler lidar system: the double-edge and the multichannel techniques. Both systems were optimized to measure winds from Rayleigh backscatter at 355 nm. Simulations show that the measurement uncertainty during daytime is degraded by only approximately 10-20% compared with nighttime performance, provided that a proper solar filter is included in the instrument design.

  10. Rocket nozzle thermal shock tests in an arc heater facility

    NASA Technical Reports Server (NTRS)

    Painter, James H.; Williamson, Ronald A.

    1986-01-01

    A rocket motor nozzle thermal structural test technique that utilizes arc heated nitrogen to simulate a motor burn was developed. The technique was used to test four heavily instrumented full-scale Star 48 rocket motor 2D carbon/carbon segments at conditions simulating the predicted thermal-structural environment. All four nozzles survived the tests without catastrophic or other structural failures. The test technique demonstrated promise as a low cost, controllable alternative to rocket motor firing. The technique includes the capability of rapid termination in the event of failure, allowing post-test analysis.

  11. A real time Pegasus propulsion system model for VSTOL piloted simulation evaluation

    NASA Technical Reports Server (NTRS)

    Mihaloew, J. R.; Roth, S. P.; Creekmore, R.

    1981-01-01

    A real time propulsion system modeling technique suitable for use in man-in-the-loop simulator studies was developed. This technique provides the system accuracy, stability, and transient response required for integrated aircraft and propulsion control system studies. A Pegasus-Harrier propulsion system was selected as a baseline for developing mathematical modeling and simulation techniques for VSTOL. Initially, static and dynamic propulsion system characteristics were modeled in detail to form a nonlinear aerothermodynamic digital computer simulation of a Pegasus engine. From this high fidelity simulation, a real time propulsion model was formulated by applying a piece-wise linear state variable methodology. A hydromechanical and water injection control system was also simulated. The real time dynamic model includes the detail and flexibility required for the evaluation of critical control parameters and propulsion component limits over a limited flight envelope. The model was programmed for interfacing with a Harrier aircraft simulation. Typical propulsion system simulation results are presented.
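The linear state-variable elements that such a piecewise-linear real-time model is assembled from can be illustrated with a single first-order lag, a common building block in real-time engine dynamics (a generic sketch, not the Pegasus model itself; names and values are hypothetical):

```python
def simulate_lag(tau, u, dt, steps):
    """Forward-Euler update of a first-order lag x_dot = (u - x) / tau,
    the kind of linear state-variable element a piecewise-linear
    real-time propulsion model is built from."""
    x, history = 0.0, []
    for _ in range(steps):
        x += dt * (u - x) / tau     # one fixed-step real-time update
        history.append(x)
    return history
```

A real-time model chains many such states and switches the (A, B) coefficients between linearization points as the operating condition moves.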

  12. Generating Inviscid and Viscous Fluid Flow Simulations over a Surface Using a Quasi-simultaneous Technique

    NASA Technical Reports Server (NTRS)

    Sturdza, Peter (Inventor); Martins-Rivas, Herve (Inventor); Suzuki, Yoshifumi (Inventor)

    2014-01-01

    A fluid-flow simulation over a computer-generated surface is generated using a quasi-simultaneous technique. The simulation includes a fluid-flow mesh of inviscid and boundary-layer fluid cells. An initial fluid property for an inviscid fluid cell is determined using an inviscid fluid simulation that does not simulate fluid viscous effects. An initial boundary-layer fluid property for a boundary-layer fluid cell is determined using the initial fluid property and a viscous fluid simulation that simulates fluid viscous effects. An updated boundary-layer fluid property is determined for the boundary-layer fluid cell using the initial fluid property, initial boundary-layer fluid property, and an interaction law. The interaction law approximates the inviscid fluid simulation using a matrix of aerodynamic influence coefficients computed using a two-dimensional surface panel technique and a fluid-property vector. An updated fluid property is determined for the inviscid fluid cell using the updated boundary-layer fluid property.

  13. Enhanced sampling techniques in biomolecular simulations.

    PubMed

    Spiwok, Vojtech; Sucur, Zoran; Hosek, Petr

    2015-11-01

    Biomolecular simulations are routinely used in biochemistry and molecular biology research; however, they often fail to match expectations of their impact on the pharmaceutical and biotech industries. This is caused by the fact that a vast amount of computer time is required to simulate even short episodes from the life of biomolecules. Several approaches have been developed to overcome this obstacle, including the application of massively parallel and special-purpose computers or non-conventional hardware. Methodological approaches are represented by coarse-grained models and enhanced sampling techniques. These techniques can show how the studied system behaves on long time scales on the basis of relatively short simulations. This review presents an overview of new simulation approaches, the theory behind enhanced sampling methods and success stories of their applications with a direct impact on biotechnology or drug design. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Retinal Image Simulation of Subjective Refraction Techniques

    PubMed Central

    Perches, Sara; Collados, M. Victoria; Ares, Jorge

    2016-01-01

    Refraction techniques make it possible to determine the most appropriate sphero-cylindrical lens prescription to achieve the best possible visual quality. Among these techniques, subjective refraction (i.e., patient’s response-guided refraction) is the most commonly used approach. In this context, this paper’s main goal is to present a simulation software that implements in a virtual manner various subjective-refraction techniques—including Jackson’s Cross-Cylinder test (JCC)—relying all on the observation of computer-generated retinal images. This software has also been used to evaluate visual quality when the JCC test is performed in multifocal-contact-lens wearers. The results reveal this software’s usefulness to simulate the retinal image quality that a particular visual compensation provides. Moreover, it can help to gain a deeper insight and to improve existing refraction techniques and it can be used for simulated training. PMID:26938648

  15. The role of simulation in surgical training.

    PubMed Central

    Torkington, J.; Smith, S. G.; Rees, B. I.; Darzi, A.

    2000-01-01

    Surgical training has undergone many changes in the last decade. One outcome of these changes is the interest that has been generated in the possibility of training surgical skills outside the operating theatre. Simulation of surgical procedures and human tissue, if perfect, would allow complete transfer of techniques learnt in a skills laboratory directly to the operating theatre. Several techniques of simulation are available including artificial tissues, animal models and virtual reality computer simulation. Each is discussed in this article and their advantages and disadvantages considered. PMID:10743423

  16. Networking Labs in the Online Environment: Indicators for Success

    ERIC Educational Resources Information Center

    Lahoud, Hilmi A.; Krichen, Jack P.

    2010-01-01

    Several techniques have been used to provide hands-on educational experiences to online learners, including remote labs, simulation software, and virtual labs, which offer a more structured environment, including simulations and scheduled asynchronous access to physical resources. This exploratory study investigated how these methods can be used…

  17. Fourth NASA Inter-Center Control Systems Conference

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Space vehicle control applications are discussed, along with aircraft guidance, control, and handling qualities. System simulation and identification, engine control, advanced propulsion techniques, and advanced control techniques are also included.

  18. A Novel Interfacing Technique for Distributed Hybrid Simulations Combining EMT and Transient Stability Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shu, Dewu; Xie, Xiaorong; Jiang, Qirong

    With the steady increase of power electronic devices and nonlinear dynamic loads in large-scale AC/DC systems, the traditional hybrid simulation method, which incorporates these components into a single EMT subsystem, causes great difficulty for network partitioning and significant deterioration in simulation efficiency. To resolve these issues, a novel distributed hybrid simulation method is proposed in this paper. The key to realizing this method is a distinct interfacing technique, which includes: i) a new approach based on the two-level Schur complement to update the interfaces by taking full consideration of the couplings between different EMT subsystems; and ii) a combined interaction protocol to further improve the efficiency while guaranteeing the simulation accuracy. The advantages of the proposed method in terms of both efficiency and accuracy have been verified by using it for the simulation study of an AC/DC hybrid system including a two-terminal VSC-HVDC and nonlinear dynamic loads.
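The paper's two-level scheme is not reproduced here, but the Schur-complement idea behind such interface updates can be sketched on a small partitioned system: eliminate the subsystem unknowns, solve the interface equations first, then back-substitute. Block sizes and values below are purely illustrative:

```python
def schur_solve_2plus1(A, B, C, D, f, g):
    """Solve the partitioned system [[A, B], [C, D]] [x; y] = [f; g]
    (2x2 subsystem block A, scalar interface block D) by forming the
    Schur complement S = D - C A^{-1} B, solving the interface unknown
    y first, then back-substituting for the subsystem unknowns x."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    Ainv = [[A[1][1] / det, -A[0][1] / det],
            [-A[1][0] / det, A[0][0] / det]]

    def mv(M, v):  # 2x2 matrix-vector product
        return [M[0][0] * v[0] + M[0][1] * v[1],
                M[1][0] * v[0] + M[1][1] * v[1]]

    Ainv_B, Ainv_f = mv(Ainv, B), mv(Ainv, f)
    S = D - (C[0] * Ainv_B[0] + C[1] * Ainv_B[1])        # Schur complement
    y = (g - (C[0] * Ainv_f[0] + C[1] * Ainv_f[1])) / S  # interface first
    x = [Ainv_f[0] - Ainv_B[0] * y, Ainv_f[1] - Ainv_B[1] * y]
    return x, y
```

In a distributed simulator the same elimination lets each EMT subsystem be solved locally while only the small interface system is exchanged.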

  19. Some Techniques for Teaching about the Structure and Function of Chromosomes.

    ERIC Educational Resources Information Center

    Lowery, Roger; Taylor, Neil; Nathan, Subhashni

    2000-01-01

    Presents a teaching activity that uses photographs and diagrams to simulate two microscopic laboratory techniques used to observe the structure of chromosomes. Techniques include observation of squashed onion root tips and the salivary glands of some fruitfly larvae. (WRM)

  20. Integrated Clinical Training for Space Flight Using a High-Fidelity Patient Simulator in a Simulated Microgravity Environment

    NASA Technical Reports Server (NTRS)

    Hurst, Victor; Doerr, Harold K.; Polk, J. D.; Schmid, Josef; Parazynksi, Scott; Kelly, Scott

    2007-01-01

    This viewgraph presentation reviews the use of telemedicine in a simulated microgravity environment using a patient simulator. For decades, telemedicine techniques have been used in terrestrial environments by many cohorts with varied clinical experience. The success of these techniques has been recently expanded to include microgravity environments aboard the International Space Station (ISS). In order to investigate how an astronaut crew medical officer will execute medical tasks in a microgravity environment, while being remotely guided by a flight surgeon, the Medical Operation Support Team (MOST) used the simulated microgravity environment provided aboard DC-9 aircraft. Teams of crew medical officers and remote flight surgeons performed several tasks on a patient simulator.

  1. Program to Optimize Simulated Trajectories (POST). Volume 1: Formulation manual

    NASA Technical Reports Server (NTRS)

    Brauer, G. L.; Cornick, D. E.; Habeger, A. R.; Petersen, F. M.; Stevenson, R.

    1975-01-01

    A general purpose FORTRAN program for simulating and optimizing point mass trajectories (POST) of aerospace vehicles is described. The equations and the numerical techniques used in the program are documented. Topics discussed include: coordinate systems, planet model, trajectory simulation, auxiliary calculations, and targeting and optimization.
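POST itself is a general-purpose FORTRAN program; the core notion of a point-mass trajectory simulation can be sketched in a few lines of Python under drag-free, flat-Earth, constant-gravity assumptions (parameters are hypothetical, not POST's planet model):

```python
import math

def simulate_point_mass(v0, angle_deg, dt=1e-4, g=9.81):
    """Explicit-Euler integration of a drag-free point-mass trajectory,
    returning the downrange distance when the vehicle returns to the
    launch altitude."""
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = 0.0
    while y > 0.0 or vy > 0.0:
        x += vx * dt                # integrate position
        y += vy * dt
        vy -= g * dt                # integrate velocity (gravity only)
    return x
```

With drag omitted the result can be checked against the analytic range v0^2 sin(2*theta)/g; POST adds planet models, aerodynamics, and targeting/optimization on top of this kernel.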

  2. Systems modeling and simulation applications for critical care medicine

    PubMed Central

    2012-01-01

    Critical care delivery is a complex, expensive, error prone, medical specialty and remains the focal point of major improvement efforts in healthcare delivery. Various modeling and simulation techniques offer unique opportunities to better understand the interactions between clinical physiology and care delivery. The novel insights gained from the systems perspective can then be used to develop and test new treatment strategies and make critical care delivery more efficient and effective. However, modeling and simulation applications in critical care remain underutilized. This article provides an overview of major computer-based simulation techniques as applied to critical care medicine. We provide three application examples of different simulation techniques, including a) pathophysiological model of acute lung injury, b) process modeling of critical care delivery, and c) an agent-based model to study interaction between pathophysiology and healthcare delivery. Finally, we identify certain challenges to, and opportunities for, future research in the area. PMID:22703718

  3. Analysis of dense-medium light scattering with applications to corneal tissue: experiments and Monte Carlo simulations.

    PubMed

    Kim, K B; Shanyfelt, L M; Hahn, D W

    2006-01-01

    Dense-medium scattering is explored in the context of providing a quantitative measurement of turbidity, with specific application to corneal haze. A multiple-wavelength scattering technique is proposed to make use of two-color scattering response ratios, thereby providing a means for data normalization. A combination of measurements and simulations is reported to assess this technique, including light-scattering experiments for a range of polystyrene suspensions. Monte Carlo (MC) simulations were performed using a multiple-scattering algorithm based on full Mie scattering theory. The simulations were in excellent agreement with the polystyrene suspension experiments, thereby validating the MC model. The MC model was then used to simulate multiwavelength scattering in a corneal tissue model. Overall, the proposed multiwavelength scattering technique appears to be a feasible approach to quantify dense-medium scattering such as the manifestation of corneal haze, although more complex modeling of keratocyte scattering, and animal studies, are necessary.

  4. Investigation of Techniques for Simulating Communications and Tracking Subsystems on Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Deacetis, Louis A.

    1991-01-01

    The need to reduce the costs of Space Station Freedom has resulted in a major redesign and downsizing of the Station in general, and its Communications and Tracking (C&T) components in particular. Earlier models and simulations of the C&T Space-to-Ground Subsystem (SGS) in particular are no longer valid. There thus exists a general need for updated, high fidelity simulations of C&T subsystems. This project explored simulation techniques and methods that might be used in developing new simulations of C&T subsystems, including the SGS. Three requirements were placed on the simulations to be developed: (1) they run on IBM PC/XT/AT compatible computers; (2) they be written in Ada as much as possible; and (3) since control and monitoring of the C&T subsystems will involve communication via a MIL-STD-1553B serial bus, that the possibility of commanding the simulator and monitoring its sensors via that bus be included in the design of the simulator. The result of the project is a prototype of a simulation of the Assembly/Contingency Transponder of the SGS, written in Ada, which can be controlled from another PC via a MIL-STD-1553B bus.

  5. Hydrogeologic Unit Flow Characterization Using Transition Probability Geostatistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, N L; Walker, J R; Carle, S F

    2003-11-21

    This paper describes a technique for applying the transition probability geostatistics method for stochastic simulation to a MODFLOW model. Transition probability geostatistics has several advantages over traditional indicator kriging methods including a simpler and more intuitive framework for interpreting geologic relationships and the ability to simulate juxtapositional tendencies such as fining upwards sequences. The indicator arrays generated by the transition probability simulation are converted to layer elevation and thickness arrays for use with the new Hydrogeologic Unit Flow (HUF) package in MODFLOW 2000. This makes it possible to preserve complex heterogeneity while using reasonably sized grids. An application of the technique involving probabilistic capture zone delineation for the Aberjona Aquifer in Woburn, MA, is included.
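Transition probability geostatistics builds on Markov-chain statistics of facies successions. A minimal 1-D sketch (not the paper's 3-D implementation; the transition matrix below is an assumed example) generates an indicator sequence from a transition-probability matrix:

```python
import random

def simulate_facies(trans, n, start=0, seed=1):
    """Generate a 1-D categorical (facies) sequence of length n from a
    Markov transition-probability matrix: each new cell's category is
    drawn from the row of the previous cell's category."""
    rng = random.Random(seed)
    seq = [start]
    for _ in range(n - 1):
        row = trans[seq[-1]]
        r, cum, nxt = rng.random(), 0.0, len(row) - 1
        for j, p in enumerate(row):   # inverse-CDF draw over categories
            cum += p
            if r < cum:
                nxt = j
                break
        seq.append(nxt)
    return seq
```

Long runs of the same category (controlled by the diagonal probabilities) are what encode mean lens lengths and juxtapositional tendencies in the full 3-D method.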

  6. Reducing the Risks of Military Aircrew Training through Simulation Technology.

    ERIC Educational Resources Information Center

    Farrow, Douglas R.

    1982-01-01

    This discussion of the types of risks associated with military aircrew training and the varieties of training devices and techniques currently utilized to minimize those risks includes an examination of flight trainer simulators and complex mission simulators for coping with military aviation hazards. Four references are listed. (Author/MER)

  7. Research in Distance Education: A System Modeling Approach.

    ERIC Educational Resources Information Center

    Saba, Farhad; Twitchell, David

    1988-01-01

    Describes how a computer simulation research method can be used for studying distance education systems. Topics discussed include systems research in distance education; a technique of model development using the System Dynamics approach and DYNAMO simulation language; and a computer simulation of a prototype model. (18 references) (LRW)

  8. Modeling and simulation of dust behaviors behind a moving vehicle

    NASA Astrophysics Data System (ADS)

    Wang, Jingfang

    Simulation of physically realistic complex dust behaviors is a difficult and attractive problem in computer graphics. A fast, interactive and visually convincing model of dust behaviors behind moving vehicles is very useful in computer simulation, training, education, art, advertising, and entertainment. In my dissertation, an experimental interactive system has been implemented for the simulation of dust behaviors behind moving vehicles. The system includes physically-based models, particle systems, rendering engines and graphical user interface (GUI). I have employed several vehicle models including tanks, cars, and jeeps to test and simulate in different scenarios and conditions. Calm weather, winding condition, vehicle turning left or right, and vehicle simulation controlled by users from the GUI are all included. I have also tested the factors which play against the physical behaviors and graphics appearances of the dust particles through GUI or off-line scripts. The simulations are done on a Silicon Graphics Octane station. The animation of dust behaviors is achieved by physically-based modeling and simulation. The flow around a moving vehicle is modeled using computational fluid dynamics (CFD) techniques. I implement a primitive variable and pressure-correction approach to solve the three dimensional incompressible Navier Stokes equations in a volume covering the moving vehicle. An alternating- direction implicit (ADI) method is used for the solution of the momentum equations, with a successive-over- relaxation (SOR) method for the solution of the Poisson pressure equation. Boundary conditions are defined and simplified according to their dynamic properties. The dust particle dynamics is modeled using particle systems, statistics, and procedure modeling techniques. 
Graphics and real-time simulation techniques, such as dynamics synchronization, motion blur, blending, and clipping have been employed in the rendering to achieve realistic appearing dust behaviors. In addition, I introduce a temporal smoothing technique to eliminate the jagged effect caused by large simulation time. Several algorithms are used to speed up the simulation. For example, pre-calculated tables and display lists are created to replace some of the most commonly used functions, scripts and processes. The performance study shows that both time and space costs of the algorithms are linear in the number of particles in the system. On a Silicon Graphics Octane, three vehicles with 20,000 particles run at 6-8 frames per second on average. This speed does not include the extra calculations of convergence of the numerical integration for fluid dynamics which usually takes about 4-5 minutes to achieve steady state.
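The pressure-correction step described above requires solving a Poisson equation, for which the dissertation cites an SOR iteration. A minimal SOR solver for the Laplace limit on a unit square (illustrative grid size and relaxation factor; boundary condition chosen so the exact solution u = x is known) looks like:

```python
def sor_laplace(n=10, omega=1.5, tol=1e-10, max_iter=10000):
    """Solve Laplace's equation on the unit square with u = x on the
    boundary using successive over-relaxation (SOR); the exact discrete
    solution is u(x, y) = x, which makes the sketch easy to check."""
    h = 1.0 / (n - 1)
    # Boundary values u = x; interior initialized to zero.
    u = [[i * h if j in (0, n - 1) or i in (0, n - 1) else 0.0
          for i in range(n)] for j in range(n)]
    for _ in range(max_iter):
        diff = 0.0
        for j in range(1, n - 1):
            for i in range(1, n - 1):
                new = (1.0 - omega) * u[j][i] + omega * 0.25 * (
                    u[j][i - 1] + u[j][i + 1] + u[j - 1][i] + u[j + 1][i])
                diff = max(diff, abs(new - u[j][i]))
                u[j][i] = new
        if diff < tol:
            break
    return u
```

The over-relaxation factor omega between 1 and 2 accelerates convergence over plain Gauss-Seidel, which matters when the Poisson solve sits inside every time step of a pressure-correction scheme.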

  9. Wood lens design philosophy based on a binary additive manufacturing technique

    NASA Astrophysics Data System (ADS)

    Marasco, Peter L.; Bailey, Christopher

    2016-04-01

    Using additive manufacturing techniques in optical engineering to construct a gradient index (GRIN) optic may overcome a number of limitations of GRIN technology. Such techniques are maturing quickly, yielding additional design degrees of freedom for the engineer. How best to employ these degrees of freedom is not completely clear at this time. This paper describes a preliminary design philosophy, including assumptions, pertaining to a particular printing technique for GRIN optics. It includes an analysis based on simulation and initial component measurement.

  10. Cost effectiveness as applied to the Viking Lander systems-level thermal development test program

    NASA Technical Reports Server (NTRS)

    Buna, T.; Shupert, T. C.

    1974-01-01

    The economic aspects of thermal testing at the systems-level as applied to the Viking Lander Capsule thermal development program are reviewed. The unique mission profile and pioneering scientific goals of Viking imposed novel requirements on testing, including the development of a simulation technique for the Martian thermal environment. The selected approach included modifications of an existing conventional thermal vacuum facility, and improved test-operational techniques that are applicable to the simulation of the other mission phases as well, thereby contributing significantly to the cost effectiveness of the overall thermal test program.

  11. Simulating Drosophila Genetics with the Computer.

    ERIC Educational Resources Information Center

    Small, James W., Jr.; Edwards, Kathryn L.

    1979-01-01

    Presents some techniques developed to help improve student understanding of Mendelian principles through the use of a computer simulation model of the genetic system of the fruit fly. Includes discussion and evaluation of this computer assisted program. (MA)

  12. Modeling the Performance of Direct-Detection Doppler Lidar Systems in Real Atmospheres

    NASA Technical Reports Server (NTRS)

    McGill, Matthew J.; Hart, William D.; McKay, Jack A.; Spinhirne, James D.

    1999-01-01

    Previous modeling of the performance of spaceborne direct-detection Doppler lidar systems has assumed extremely idealized atmospheric models. Here we develop a technique for modeling the performance of these systems in a more realistic atmosphere, based on actual airborne lidar observations. The resulting atmospheric model contains cloud and aerosol variability that is absent in other simulations of spaceborne Doppler lidar instruments. To produce a realistic simulation of daytime performance, we include solar radiance values that are based on actual measurements and are allowed to vary as the viewing scene changes. Simulations are performed for two types of direct-detection Doppler lidar systems: the double-edge and the multi-channel techniques. Both systems were optimized to measure winds from Rayleigh backscatter at 355 nm. Simulations show that the measurement uncertainty during daytime is degraded by only about 10-20% compared to nighttime performance, provided a proper solar filter is included in the instrument design.

  13. Wind Energy System Time-domain (WEST) analyzers using hybrid simulation techniques

    NASA Technical Reports Server (NTRS)

    Hoffman, J. A.

    1979-01-01

    Two stand-alone analyzers constructed for real time simulation of the complex dynamic characteristics of horizontal-axis wind energy systems are described. Mathematical models for an aeroelastic rotor, including nonlinear aerodynamic and elastic loads, are implemented with high speed digital and analog circuitry. Models for elastic supports, a power train, a control system, and a rotor gimbal system are also included. Limited correlation efforts show good comparisons between results produced by the analyzers and results produced by a large digital simulation. The digital simulation results correlate well with test data.

  14. Advanced sensor-simulation capability

    NASA Astrophysics Data System (ADS)

    Cota, Stephen A.; Kalman, Linda S.; Keller, Robert A.

    1990-09-01

    This paper provides an overview of an advanced simulation capability currently in use for analyzing visible and infrared sensor systems. The software system, called VISTAS (VISIBLE/INFRARED SENSOR TRADES, ANALYSES, AND SIMULATIONS) combines classical image processing techniques with detailed sensor models to produce static and time dependent simulations of a variety of sensor systems including imaging, tracking, and point target detection systems. Systems modelled to date include space-based scanning line-array sensors as well as staring 2-dimensional array sensors which can be used for either imaging or point source detection.

  15. Optimization of dual energy contrast enhanced breast tomosynthesis for improved mammographic lesion detection and diagnosis

    NASA Astrophysics Data System (ADS)

    Saunders, R.; Samei, E.; Badea, C.; Yuan, H.; Ghaghada, K.; Qi, Y.; Hedlund, L. W.; Mukundan, S.

    2008-03-01

    Dual-energy contrast-enhanced breast tomosynthesis has been proposed as a technique to improve the detection of early-stage cancer in young, high-risk women. This study focused on optimizing this technique using computer simulations. The computer simulation used analytical calculations to optimize the signal difference to noise ratio (SdNR) of resulting images from such a technique at constant dose. The optimization included the optimal radiographic technique, optimal distribution of dose between the two single-energy projection images, and the optimal weighting factor for the dual energy subtraction. Importantly, the SdNR included both anatomical and quantum noise sources, as dual energy imaging reduces anatomical noise at the expense of increases in quantum noise. Assuming a tungsten anode, the maximum SdNR at constant dose was achieved for a high energy beam at 49 kVp with 92.5 μm copper filtration and a low energy beam at 49 kVp with 95 μm tin filtration. These analytical calculations were followed by Monte Carlo simulations that included the effects of scattered radiation and detector properties. Finally, the feasibility of this technique was tested in a small animal imaging experiment using a novel iodinated liposomal contrast agent. The results illustrated the utility of dual energy imaging and determined the optimal acquisition parameters for this technique. This work was supported in part by grants from the Komen Foundation (PDF55806), the Cancer Research and Prevention Foundation, and the NIH (NCI R21 CA124584-01). CIVM is a NCRR/NCI National Resource under P41-05959/U24-CA092656.
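The anatomical-noise cancellation at the heart of dual-energy subtraction can be sketched with a simple weighted log-subtraction model: choosing the weight as the ratio of tissue attenuation coefficients removes tissue-thickness contrast while preserving the iodine signal. The attenuation coefficients below are illustrative placeholders, not the study's calibrated values:

```python
import math

def dual_energy_signal(t_tissue, t_iodine,
                       mu_t=(0.8, 0.4), mu_i=(5.0, 15.0)):
    """Weighted log-subtraction of a (low, high) energy image pair.
    Choosing w = mu_t_high / mu_t_low cancels tissue-thickness
    (anatomical) contrast, leaving only the iodine signal."""
    low = math.exp(-mu_t[0] * t_tissue - mu_i[0] * t_iodine)
    high = math.exp(-mu_t[1] * t_tissue - mu_i[1] * t_iodine)
    w = mu_t[1] / mu_t[0]                   # subtraction weighting factor
    return math.log(high) - w * math.log(low)
```

The trade-off the study optimizes is visible here: the subtraction removes anatomical noise, but combining two log images adds their quantum noise, so the dose split and weighting factor must be tuned jointly.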

  16. Recent advances in 3D computed tomography techniques for simulation and navigation in hepatobiliary pancreatic surgery.

    PubMed

    Uchida, Masafumi

    2014-04-01

    A few years ago it could take several hours to complete a 3D image using a 3D workstation. Thanks to advances in computer science, obtaining results of interest now requires only a few minutes. Many recent 3D workstations or multimedia computers are equipped with onboard 3D virtual patient modeling software, which enables patient-specific preoperative assessment and virtual planning, navigation, and tool positioning. Although medical 3D imaging can now be conducted using various modalities, including computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), and ultrasonography (US) among others, the highest quality images are obtained using CT data, and CT images are now the most commonly used source of data for 3D simulation and navigation images. If the 2D source image is bad, no amount of 3D image manipulation in software will provide a quality 3D image. In this exhibition, the recent advances in CT imaging techniques and 3D visualization of hepatobiliary and pancreatic abnormalities are featured, including scan and image reconstruction techniques, contrast-enhanced techniques, new applications of advanced CT scan techniques, and new virtual reality simulation and navigation imaging. © 2014 Japanese Society of Hepato-Biliary-Pancreatic Surgery.

  17. Computing in Secondary Physics at Armdale, W.A.

    ERIC Educational Resources Information Center

    Smith, Clifton L.

    1976-01-01

    An Australian secondary school physics course utilizing an electronic programmable calculator and computer is described. Calculation techniques and functions, programming techniques, and simulation of physical systems are detailed. A summary of student responses to the program is included. (BT)

  18. Computer Simulation of Diffraction Patterns.

    ERIC Educational Resources Information Center

    Dodd, N. A.

    1983-01-01

    Describes an Apple computer program (listing available from author) which simulates Fraunhofer and Fresnel diffraction using vector addition techniques (vector chaining) and allows the user to experiment with different shaped multiple apertures. Graphics output includes vector resultants, phase difference, diffraction patterns, and the Cornu spiral…
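The vector-chaining idea behind such a program (sketched here in Python rather than the original BASIC) is to chain many short equal-length vectors whose direction advances with the Fresnel phase pi*t^2/2; the running tip traces the Cornu spiral and approximates the Fresnel integrals:

```python
import math

def cornu_point(u, n=20000):
    """Vector chaining: sum n short vectors of equal length du whose
    direction turns with the Fresnel phase pi*t^2/2.  The running tip
    traces the Cornu spiral; the result approximates the Fresnel
    integrals (C(u), S(u))."""
    du = u / n
    x = y = 0.0
    for k in range(n):
        t = (k + 0.5) * du              # midpoint of the k-th small vector
        phase = 0.5 * math.pi * t * t
        x += math.cos(phase) * du       # chain the next vector
        y += math.sin(phase) * du
    return x, y
```

Plotting (x, y) for increasing u draws the spiral; the squared distance between two points on it gives the relative Fresnel diffraction intensity.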

  19. Development of simulation techniques suitable for the analysis of air traffic control situations and instrumentation

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A terminal area simulation is described which permits analysis and synthesis of current and advanced air traffic management system configurations including ground and airborne instrumentation and new and modified aircraft characteristics. Ground elements in the simulation include navigation aids, surveillance radars, communication links, air-route structuring, ATC procedures, airport geometries and runway handling constraints. Airborne elements include traffic samples with individual aircraft performance and operating characteristics and aircraft navigation equipment. The simulation also contains algorithms for conflict detection, conflict resolution, sequencing and pilot-controller data links. The simulation model is used to determine the sensitivities of terminal area traffic flow, safety and congestion to aircraft performance characteristics, avionics systems, and other ATC elements.

  20. CALM: Complex Adaptive System (CAS)-Based Decision Support for Enabling Organizational Change

    NASA Astrophysics Data System (ADS)

    Adler, Richard M.; Koehn, David J.

    Guiding organizations through transformational changes such as restructuring or adopting new technologies is a daunting task. Such changes generate workforce uncertainty, fear, and resistance, reducing morale, focus and performance. Conventional project management techniques fail to mitigate these disruptive effects, because social and individual changes are non-mechanistic, organic phenomena. CALM (for Change, Adaptation, Learning Model) is an innovative decision support system for enabling change based on CAS principles. CALM provides a low risk method for validating and refining change strategies that combines scenario planning techniques with "what-if" behavioral simulation. In essence, CALM "test drives" change strategies before rolling them out, allowing organizations to practice and learn from virtual rather than actual mistakes. This paper describes the CALM modeling methodology, including our metrics for measuring organizational readiness to respond to change and other major CALM scenario elements: prospective change strategies; alternate futures; and key situational dynamics. We then describe CALM's simulation engine for projecting scenario outcomes and its associated analytics. CALM's simulator unifies diverse behavioral simulation paradigms including: adaptive agents; system dynamics; Monte Carlo; event- and process-based techniques. CALM's embodiment of CAS dynamics helps organizations reduce risk and improve confidence and consistency in critical strategies for enabling transformations.

  1. Monte Carlo Simulation of Nonlinear Radiation Induced Plasmas. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Wang, B. S.

    1972-01-01

    A Monte Carlo simulation model for radiation induced plasmas with nonlinear properties due to recombination was developed, employing a piecewise linearized predict-correct iterative technique. Several important variance reduction techniques were developed and incorporated into the model, including an antithetic variates technique. This approach is especially efficient for plasma systems with inhomogeneous media, multidimensions, and irregular boundaries. The Monte Carlo code developed has been applied to the determination of the electron energy distribution function and related parameters for a noble gas plasma created by alpha-particle irradiation. The characteristics of the radiation induced plasma involved are given.
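
    The antithetic variates idea credited with the efficiency gains above can be sketched generically (an illustrative toy for a one-dimensional integral, not the plasma code; the function names are invented):

```python
import random

def plain_mc(f, n, rng):
    # Standard Monte Carlo estimate of the integral of f over [0, 1].
    return sum(f(rng.random()) for _ in range(n)) / n

def antithetic_mc(f, n, rng):
    # Antithetic variates: pair each uniform sample u with its mirror 1 - u.
    # For a monotone integrand the two evaluations are negatively
    # correlated, so their average has lower variance at the same cost.
    total = 0.0
    for _ in range(n // 2):
        u = rng.random()
        total += 0.5 * (f(u) + f(1.0 - u))
    return total / (n // 2)
```

    Comparing the two estimators on the same sample budget (e.g. f(x) = x**2, whose exact integral is 1/3) typically shows a markedly smaller spread for the antithetic version.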

  2. A Simulation of AI Programming Techniques in BASIC.

    ERIC Educational Resources Information Center

    Mandell, Alan

    1986-01-01

    Explains the functions of and the techniques employed in expert systems. Offers the program "The Periodic Table Expert," as a model for using artificial intelligence techniques in BASIC. Includes the program listing and directions for its use on: Tandy 1000, 1200, and 2000; IBM PC; PC Jr; TRS-80; and Apple computers. (ML)

  3. Verification of component mode techniques for flexible multibody systems

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1990-01-01

    Investigations were conducted in the modeling aspects of flexible multibodies undergoing large angular displacements. Models were to be generated and analyzed through application of computer simulation packages employing the 'component mode synthesis' techniques. The Multibody Modeling, Verification and Control Laboratory (MMVC) plan was implemented, which includes running experimental tests on flexible multibody test articles. From these tests, data were to be collected for later correlation and verification of the theoretical results predicted by the modeling and simulation process.

  4. Development and application of numerical techniques for general-relativistic magnetohydrodynamics simulations of black hole accretion

    NASA Astrophysics Data System (ADS)

    White, Christopher Joseph

    We describe the implementation of sophisticated numerical techniques for general-relativistic magnetohydrodynamics simulations in the Athena++ code framework. Improvements over many existing codes include the use of advanced Riemann solvers and of staggered-mesh constrained transport. Combined with considerations for computational performance and parallel scalability, these allow us to investigate black hole accretion flows with unprecedented accuracy. The capability of the code is demonstrated by exploring magnetically arrested disks.

  5. Quantum simulation of an ultrathin body field-effect transistor with channel imperfections

    NASA Astrophysics Data System (ADS)

    Vyurkov, V.; Semenikhin, I.; Filippov, S.; Orlikovsky, A.

    2012-04-01

    An efficient program for the all-quantum simulation of nanometer field-effect transistors is elaborated. The model is based on the Landauer-Buttiker approach. Our calculation of transmission coefficients employs a transfer-matrix technique involving arbitrary precision (multiprecision) arithmetic to cope with evanescent modes. Modified in this way, the transfer-matrix technique turns out to be much faster in practical simulations than the scattering-matrix technique. Results of the simulation demonstrate the impact of realistic channel imperfections (random charged centers and wall roughness) on transistor characteristics. The Landauer-Buttiker approach is developed to incorporate calculation of the noise at an arbitrary temperature. We also validate the ballistic Landauer-Buttiker approach for the usual situation in which heavily doped contacts are necessarily included in the simulation region.

  6. Modeling software systems by domains

    NASA Technical Reports Server (NTRS)

    Dippolito, Richard; Lee, Kenneth

    1992-01-01

    The Software Architectures Engineering (SAE) Project at the Software Engineering Institute (SEI) has developed engineering modeling techniques that both reduce the complexity of software for domain-specific computer systems and result in systems that are easier to build and maintain. These techniques allow maximum freedom for system developers to apply their domain expertise to software. We have applied these techniques to several types of applications, including training simulators operating in real time, engineering simulators operating in non-real time, and real-time embedded computer systems. Our modeling techniques result in software that mirrors both the complexity of the application and the domain knowledge requirements. We submit that the proper measure of software complexity reflects neither the number of software component units nor the code count, but the locus of and amount of domain knowledge. As a result of using these techniques, domain knowledge is isolated by fields of engineering expertise and removed from the concern of the software engineer. In this paper, we will describe kinds of domain expertise, describe engineering by domains, and provide relevant examples of software developed for simulator applications using the techniques.

  7. Simulated annealing in orbital flight planning

    NASA Technical Reports Server (NTRS)

    Soller, Jeffrey

    1990-01-01

    Simulated annealing is used to solve a minimum fuel trajectory problem in the space station environment. The environment is unique because the space station will define the first true multivehicle environment in space. The optimization yields surfaces which are potentially complex, with multiple local minima. Because of the likelihood of these local minima, descent techniques are unable to offer robust solutions. Other deterministic optimization techniques were explored without success. The simulated annealing optimization is capable of identifying a minimum-fuel, two-burn trajectory subject to four constraints. Furthermore, the computational efforts involved in the optimization are such that missions could be planned on board the space station. Potential applications could include the on-site planning of rendezvous with a target craft or the emergency rescue of an astronaut. Future research will include multiwaypoint maneuvers, using a knowledge base to guide the optimization.
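
    The simulated-annealing strategy described above can be sketched in a few lines of generic Python (a hedged illustration of the algorithm, not the orbital flight planner; the cost function and schedule parameters are invented):

```python
import math
import random

def simulated_annealing(cost, x0, step, t0=1.0, cooling=0.995, iters=5000, seed=1):
    """Minimise `cost` over the reals: accept uphill moves with
    probability exp(-delta/T) so the search can escape the local
    minima that trap pure descent methods."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = cost(cand)
        delta = fc - fx
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling  # geometric cooling schedule
    return best_x, best_f
```

    On a smooth single-minimum cost the loop behaves like a noisy descent; on multimodal surfaces of the kind described above, the finite temperature lets it hop between basins before the schedule freezes it into the best one found.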

  8. The Numerical Technique for the Landslide Tsunami Simulations Based on Navier-Stokes Equations

    NASA Astrophysics Data System (ADS)

    Kozelkov, A. S.

    2017-12-01

    The paper presents an integral technique simulating all phases of a landslide-driven tsunami. The technique is based on the numerical solution of the system of Navier-Stokes equations for multiphase flows. The numerical algorithm uses a fully implicit approximation method, in which the equations of continuity and momentum conservation are coupled through implicit summands of pressure gradient and mass flow. The method we propose removes severe restrictions on the time step and allows simulation of tsunami propagation to arbitrarily large distances. The landslide origin is simulated as an individual phase being a Newtonian fluid with its own density and viscosity and separated from the water and air phases by an interface. The basic formulas of equation discretization and expressions for coefficients are presented, and the main steps of the computation procedure are described in the paper. To enable simulations of tsunami propagation across wide water areas, we propose a parallel algorithm of the technique implementation, which employs an algebraic multigrid method. The implementation of the multigrid method is based on the global level and cascade collection algorithms that impose no limitations on the paralleling scale and make this technique applicable to petascale systems. We demonstrate the possibility of simulating all phases of a landslide-driven tsunami, including its generation, propagation and uprush. The technique has been verified against the problems supported by experimental data. The paper describes the mechanism of incorporating bathymetric data to simulate tsunamis in real water areas of the world ocean. Results of comparison with the nonlinear dispersion theory, which has demonstrated good agreement, are presented for the case of a historical tsunami of volcanic origin on the Montserrat Island in the Caribbean Sea.

  9. A NEW TECHNIQUE FOR THE PHOTOSPHERIC DRIVING OF NON-POTENTIAL SOLAR CORONAL MAGNETIC FIELD SIMULATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weinzierl, Marion; Yeates, Anthony R.; Mackay, Duncan H.

    2016-05-20

    In this paper, we develop a new technique for driving global non-potential simulations of the Sun’s coronal magnetic field solely from sequences of radial magnetic maps of the solar photosphere. A primary challenge to driving such global simulations is that the required horizontal electric field cannot be uniquely determined from such maps. We show that an “inductive” electric field solution similar to that used by previous authors successfully reproduces specific features of the coronal field evolution in both single and multiple bipole simulations. For these cases, the true solution is known because the electric field was generated from a surface flux-transport model. The match for these cases is further improved by including the non-inductive electric field contribution from surface differential rotation. Then, using this reconstruction method for the electric field, we show that a coronal non-potential simulation can be successfully driven from a sequence of ADAPT maps of the photospheric radial field, without including additional physical observations which are not routinely available.

  10. From classical to quantum and back: Hamiltonian adaptive resolution path integral, ring polymer, and centroid molecular dynamics

    NASA Astrophysics Data System (ADS)

    Kreis, Karsten; Kremer, Kurt; Potestio, Raffaello; Tuckerman, Mark E.

    2017-12-01

    Path integral-based methodologies play a crucial role for the investigation of nuclear quantum effects by means of computer simulations. However, these techniques are significantly more demanding than corresponding classical simulations. To reduce this numerical effort, we recently proposed a method, based on a rigorous Hamiltonian formulation, which restricts the quantum modeling to a small but relevant spatial region within a larger reservoir where particles are treated classically. In this work, we extend this idea and show how it can be implemented along with state-of-the-art path integral simulation techniques, including path-integral molecular dynamics, which allows for the calculation of quantum statistical properties, and ring-polymer and centroid molecular dynamics, which allow the calculation of approximate quantum dynamical properties. To this end, we derive a new integration algorithm that also makes use of multiple time-stepping. The scheme is validated via adaptive classical-path-integral simulations of liquid water. Potential applications of the proposed multiresolution method are diverse and include efficient quantum simulations of interfaces as well as complex biomolecular systems such as membranes and proteins.

  11. Water injection into vapor- and liquid-dominated reservoirs: Modeling of heat transfer and mass transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pruess, K.; Oldenburg, C.; Moridis, G.

    1997-12-31

    This paper summarizes recent advances in methods for simulating water and tracer injection, and presents illustrative applications to liquid- and vapor-dominated geothermal reservoirs. High-resolution simulations of water injection into heterogeneous, vertical fractures in superheated vapor zones were performed. Injected water was found to move in dendritic patterns, and to experience stronger lateral flow effects than predicted from homogeneous medium models. Higher-order differencing methods were applied to modeling water and tracer injection into liquid-dominated systems. Conventional upstream weighting techniques were shown to be adequate for predicting the migration of thermal fronts, while higher-order methods give far better accuracy for tracer transport. A new fluid property module for the TOUGH2 simulator is described which allows a more accurate description of geofluids, and includes mineral dissolution and precipitation effects with associated porosity and permeability change. Comparisons between numerical simulation predictions and data for laboratory and field injection experiments are summarized. Enhanced simulation capabilities include a new linear solver package for TOUGH2, and inverse modeling techniques for automatic history matching and optimization.

  12. MO-D-PinS Room/Hall E-00: MR Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2016-06-15

    MRI, with its excellent soft tissue contrast and its ability to provide physiological as well as anatomical information, is becoming increasingly used in radiation therapy for treatment planning, image-guided radiation therapy, and treatment evaluation. This session will explore solutions to integrating MRI into the simulation process. Obstacles for using MRI for simulation include distortions and artifacts, image acquisition speed, complexity of imaging techniques, and lack of electron density information. Partners in Solutions presents vendor representatives who will present their approaches to meeting these challenges and others. An increased awareness of how MRI simulation works will allow physicists to better understand and use this powerful technique. The speakers are all employees who are presenting information about their company’s products.

  13. Information prioritization for control and automation of space operations

    NASA Technical Reports Server (NTRS)

    Ray, Asock; Joshi, Suresh M.; Whitney, Cynthia K.; Jow, Hong N.

    1987-01-01

    The applicability of a real-time information prioritization technique to the development of a decision support system for control and automation of Space Station operations is considered. The steps involved in the technique are described, including the definition of abnormal scenarios and of attributes, measures of individual attributes, formulation and optimization of a cost function, simulation of test cases on the basis of the cost function, and examination of the simulation scenarios. A list is given comparing the intrinsic importances of various Space Station information data.

  14. An analytical investigation of NO sub x control techniques for methanol fueled spark ignition engines

    NASA Technical Reports Server (NTRS)

    Browning, L. H.; Argenbright, L. A.

    1983-01-01

    A thermokinetic SI engine simulation was used to study the effects of simple nitrogen oxide control techniques on performance and emissions of a methanol fueled engine. As part of this simulation, a ring crevice storage model was formulated to predict UBF emissions. The study included spark retard, two methods of compression ratio increase and EGR. The study concludes that use of EGR in high turbulence, high compression engines will both maximize power and thermal efficiency while minimizing harmful exhaust pollutants.

  15. Acceleration of Monte Carlo simulation of photon migration in complex heterogeneous media using Intel many-integrated core architecture.

    PubMed

    Gorshkov, Anton V; Kirillin, Mikhail Yu

    2015-08-01

    Over two decades, the Monte Carlo technique has become a gold standard in simulation of light propagation in turbid media, including biotissues. Technological solutions provide further advances of this technique. The Intel Xeon Phi coprocessor is a new type of accelerator for highly parallel general purpose computing, which allows execution of a wide range of applications without substantial code modification. We present a technical approach of porting our previously developed Monte Carlo (MC) code for simulation of light transport in tissues to the Intel Xeon Phi coprocessor. We show that employing the accelerator allows reducing computational time of MC simulation and obtaining simulation speed-up comparable to GPU. We demonstrate the performance of the developed code for simulation of light transport in the human head and determination of the measurement volume in near-infrared spectroscopy brain sensing.
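
    A drastically simplified sketch of the Monte Carlo photon-migration technique (one spatial dimension, isotropic scattering, none of the coprocessor specifics; all parameter names are illustrative):

```python
import random

def slab_transmittance(mu_a, mu_s, thickness, n_photons=20000, seed=7):
    """Toy 1-D photon migration in a turbid slab: free paths are
    exponentially distributed with mean 1/(mu_a + mu_s); at each
    interaction the photon survives with probability mu_s/(mu_a + mu_s)
    and scatters isotropically (direction +1 or -1).  Returns the
    fraction of photons exiting through the far face."""
    rng = random.Random(seed)
    mu_t = mu_a + mu_s
    albedo = mu_s / mu_t
    transmitted = 0
    for _ in range(n_photons):
        z, direction = 0.0, 1.0
        while True:
            z += direction * rng.expovariate(mu_t)
            if z >= thickness:
                transmitted += 1
                break
            if z <= 0.0:
                break  # escaped back through the illuminated face
            if rng.random() > albedo:
                break  # absorbed
            direction = rng.choice((1.0, -1.0))
    return transmitted / n_photons
```

    With scattering switched off (mu_s = 0) the estimate reduces to the Beer-Lambert law exp(-mu_a * thickness), which makes a convenient sanity check; production codes like the one described add 3-D geometry, anisotropic phase functions, and weight-based variance reduction.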

  16. State of the Art Assessment of Simulation in Advanced Materials Development

    NASA Technical Reports Server (NTRS)

    Wise, Kristopher E.

    2008-01-01

    Advances in both the underlying theory and in the practical implementation of molecular modeling techniques have increased their value in the advanced materials development process. The objective is to accelerate the maturation of emerging materials by tightly integrating modeling with the other critical processes: synthesis, processing, and characterization. The aims of this report are to summarize the state of the art of existing modeling tools and to highlight a number of areas in which additional development is required. In an effort to maintain focus and limit length, this survey is restricted to classical simulation techniques including molecular dynamics and Monte Carlo simulations.
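
    Of the classical techniques the survey covers, molecular dynamics reduces in its simplest form to integrating Newton's equations with a symplectic scheme such as velocity Verlet; the following minimal sketch (a single harmonic degree of freedom, not taken from the report) shows the integrator's structure:

```python
def velocity_verlet(force, x, v, mass, dt, steps):
    """Velocity-Verlet integration: symplectic and time-reversible,
    so the energy error stays bounded over long trajectories instead
    of drifting as it does for plain Euler integration."""
    a = force(x) / mass
    traj = [(x, v)]
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt * dt   # position update
        a_new = force(x) / mass              # force at the new position
        v = v + 0.5 * (a + a_new) * dt       # velocity from averaged force
        a = a_new
        traj.append((x, v))
    return traj

# Harmonic bond with unit stiffness and mass; total energy should stay near 0.5.
traj = velocity_verlet(lambda x: -x, x=1.0, v=0.0, mass=1.0, dt=0.01, steps=1000)
```

    The same two-half-kick structure underlies production MD engines; they add neighbor lists, thermostats, and constraint algorithms on top.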

  17. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data.

    PubMed

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-12

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.

  18. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data

    NASA Astrophysics Data System (ADS)

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-01

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.

  19. Modeling NIF experimental designs with adaptive mesh refinement and Lagrangian hydrodynamics

    NASA Astrophysics Data System (ADS)

    Koniges, A. E.; Anderson, R. W.; Wang, P.; Gunney, B. T. N.; Becker, R.; Eder, D. C.; MacGowan, B. J.; Schneider, M. B.

    2006-06-01

    Incorporation of adaptive mesh refinement (AMR) into Lagrangian hydrodynamics algorithms allows for the creation of a highly powerful simulation tool effective for complex target designs with three-dimensional structure. We are developing an advanced modeling tool that includes AMR and traditional arbitrary Lagrangian-Eulerian (ALE) techniques. Our goal is the accurate prediction of vaporization, disintegration and fragmentation in National Ignition Facility (NIF) experimental target elements. Although our focus is on minimizing the generation of shrapnel in target designs and protecting the optics, the general techniques are applicable to modern advanced targets that include three-dimensional effects such as those associated with capsule fill tubes. Several essential computations in ordinary radiation hydrodynamics need to be redesigned in order to allow for AMR to work well with ALE, including algorithms associated with radiation transport. Additionally, for our goal of predicting fragmentation, we include elastic/plastic flow into our computations. We discuss the integration of these effects into a new ALE-AMR simulation code. Applications of this newly developed modeling tool as well as traditional ALE simulations in two and three dimensions are applied to NIF early-light target designs.

  20. A Compendium of Assessment Techniques.

    ERIC Educational Resources Information Center

    Knapp, Joan; Sharon, Amiel

    A wide variety of techniques are appropriate for evaluating different types of experiential learning. This monograph includes sections on performance tests, simulations, interviews, ratings, self evaluation, and product assessment. Each section orients the reader to one of these general types of assessment and then provides brief illustrations of…

  1. Opto-electronic characterization of third-generation solar cells.

    PubMed

    Neukom, Martin; Züfle, Simon; Jenatsch, Sandra; Ruhstaller, Beat

    2018-01-01

    We present an overview of opto-electronic characterization techniques for solar cells including light-induced charge extraction by linearly increasing voltage, impedance spectroscopy, transient photovoltage, charge extraction and more. Guidelines for the interpretation of experimental results are derived based on charge drift-diffusion simulations of solar cells with common performance limitations. It is investigated how nonidealities like charge injection barriers, traps and low mobilities among others manifest themselves in each of the studied cell characterization techniques. Moreover, comprehensive parameter extraction for an organic bulk-heterojunction solar cell comprising PCDTBT:PC70BM is demonstrated. The simulations reproduce measured results of 9 different experimental techniques. Parameter correlation is minimized due to the combination of various techniques. Thereby a route to comprehensive and accurate parameter extraction is identified.

  2. Approaches to Classroom-Based Computational Science.

    ERIC Educational Resources Information Center

    Guzdial, Mark

    Computational science includes the use of computer-based modeling and simulation to define and test theories about scientific phenomena. The challenge for educators is to develop techniques for implementing computational science in the classroom. This paper reviews some previous work on the use of simulation alone (without modeling), modeling…

  3. Design Of Combined Stochastic Feedforward/Feedback Control

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim

    1989-01-01

    Methodology accommodates variety of control structures and design techniques. In methodology for combined stochastic feedforward/feedback control, main objectives of feedforward and feedback control laws seen clearly. Inclusion of error-integral feedback, dynamic compensation, rate-command control structure, and the like is integral element of methodology. Another advantage of methodology is flexibility to develop variety of techniques for design of feedback control with arbitrary structures to obtain feedback controller: includes stochastic output feedback, multiconfiguration control, decentralized control, or frequency and classical control methods. Control modes of system include capture and tracking of localizer and glideslope, crab, decrab, and flare. By use of recommended incremental implementation, control laws simulated on digital computer and connected with nonlinear digital simulation of aircraft and its systems.

  4. Efficient morse decompositions of vector fields.

    PubMed

    Chen, Guoning; Mischaikow, Konstantin; Laramee, Robert S; Zhang, Eugene

    2008-01-01

    Existing topology-based vector field analysis techniques rely on the ability to extract individual trajectories such as fixed points, periodic orbits, and separatrices that are sensitive to noise and errors introduced by simulation and interpolation. This can make such vector field analysis unsuitable for rigorous interpretations. We advocate the use of Morse decompositions, which are robust with respect to perturbations, to encode the topological structures of a vector field in the form of a directed graph, called a Morse connection graph (MCG). While an MCG exists for every vector field, it need not be unique. Previous techniques for computing MCGs, while fast, are overly conservative and usually result in MCGs that are too coarse to be useful for the applications. To address this issue, we present a new technique for performing Morse decomposition based on the concept of tau-maps, which typically provides finer MCGs than existing techniques. Furthermore, the choice of tau provides a natural tradeoff between the fineness of the MCGs and the computational costs. We provide efficient implementations of Morse decomposition based on tau-maps, which include the use of forward and backward mapping techniques and an adaptive approach in constructing better approximations of the images of the triangles in the meshes used for simulation. Furthermore, we propose the use of spatial tau-maps in addition to the original temporal tau-maps. These techniques provide additional trade-offs between the quality of the MCGs and the speed of computation. We demonstrate the utility of our technique with various examples in the plane and on surfaces including engine simulation data sets.
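
    The tau-map construction can be illustrated on a one-dimensional toy flow (an invented example, far simpler than the vector fields in the paper): cover the domain with cells, add an edge i -> j whenever the time-tau flow map sends a sample point of cell i into cell j, and keep the cells that can reach themselves through the resulting directed graph; those cells outer-approximate the Morse sets.

```python
def tau_map(x, tau=0.5, n=50):
    # Time-tau flow map of x' = x - x**3, approximated by Euler steps.
    dt = tau / n
    for _ in range(n):
        x += dt * (x - x ** 3)
    return x

def morse_cells(lo=-1.5, hi=1.5, cells=60, samples=8):
    """Outer approximation of the chain-recurrent set: the cells that
    lie on a cycle of the cell-to-cell transition graph of the tau-map."""
    width = (hi - lo) / cells
    graph = {i: set() for i in range(cells)}
    for i in range(cells):
        for s in range(samples):
            x = lo + (i + (s + 0.5) / samples) * width
            j = int((tau_map(x) - lo) / width)
            if 0 <= j < cells:
                graph[i].add(j)
    recurrent = set()
    for i in range(cells):
        stack, seen = list(graph[i]), set()
        while stack:                 # search for a path leading back to i
            u = stack.pop()
            if u == i:
                recurrent.add(i)
                break
            if u not in seen:
                seen.add(u)
                stack.extend(graph[u])
    return recurrent
```

    For x' = x - x**3 the recurrent cells cluster around the three equilibria at -1, 0, and 1, while cells in between are classified as gradient-like transit regions; larger tau gives coarser but cheaper graphs, which is the trade-off the paper formalizes.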

  5. Demonstration of landfill gas enhancement techniques in landfill simulators

    NASA Astrophysics Data System (ADS)

    Walsh, J. J.; Vogt, W. G.

    1982-02-01

    Various techniques to enhance gas production in sanitary landfills were applied to landfill simulators. These techniques include (1) accelerated moisture addition, (2) leachate recycling, (3) buffer addition, (4) nutrient addition, and (5) combinations of the above. Results are compiled through on-going operation and monitoring of sixteen landfill simulators. These test cells contain about 380 kg of municipal solid waste. Quantities of buffer and nutrient materials were placed in selected cells at the time of loading. Water is added to all test cells on a monthly basis; leachate is withdrawn from all cells (and recycled on selected cells) also on a monthly basis. Daily monitoring of gas volumes and refuse temperatures is performed. Gas and leachate samples are collected and analyzed on a monthly basis. Leachate and gas quality and quantity results are presented for the first 18 months of operation.

  6. Computer simulation techniques for artificial modification of the ionosphere. Final report 31 jan 79-30 apr 81

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vance, B.; Mendillo, M.

    1981-04-30

    A three-dimensional model of the ionosphere was developed including chemical reactions and neutral and plasma transport. The model uses Finite Element Simulation to simulate ionospheric modification rather than solving a set of differential equations. The initial conditions of the Los Alamos Scientific Laboratory experiments, Lagopedo Uno and Dos, were input to the model, and these events were simulated. Simulation results were compared to ground and rocketborne electron-content measurements. A simulation of the transport of released SF6 was also made.

  7. Material point method modeling in oil and gas reservoirs

    DOEpatents

    Vanderheyden, William Brian; Zhang, Duan

    2016-06-28

    A computer system and method of simulating the behavior of an oil and gas reservoir including changes in the margins of frangible solids. A system of equations including state equations such as momentum, and conservation laws such as mass conservation and volume fraction continuity, are defined and discretized for at least two phases in a modeled volume, one of which corresponds to frangible material. A material point model technique numerically solves the system of discretized equations to derive fluid flow at each of a plurality of mesh nodes in the modeled volume, and the velocity at each of a plurality of particles representing the frangible material in the modeled volume. A time-splitting technique improves the computational efficiency of the simulation while maintaining accuracy on the deformation scale. The method can be applied to derive accurate upscaled model equations for larger volume scale simulations.

  8. Data mining to support simulation modeling of patient flow in hospitals.

    PubMed

    Isken, Mark W; Rajagopalan, Balaji

    2002-04-01

    Spiraling health care costs in the United States are driving institutions to continually address the challenge of optimizing the use of scarce resources. One of the first steps towards optimizing resources is to utilize capacity effectively. For hospital capacity planning problems such as allocation of inpatient beds, computer simulation is often the method of choice. One of the more difficult aspects of using simulation models for such studies is the creation of a manageable set of patient types to include in the model. The objective of this paper is to demonstrate the potential of using data mining techniques, specifically clustering techniques such as K-means, to help guide the development of patient type definitions for purposes of building computer simulation or analytical models of patient flow in hospitals. Using data from a hospital in the Midwest this study brings forth several important issues that researchers need to address when applying clustering techniques in general and specifically to hospital data.
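
    The clustering step the authors describe can be sketched with a plain K-means loop over hypothetical two-feature patient records (the data, feature names, and choice of k are invented for illustration; the study itself works from real hospital data):

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Plain K-means: alternately assign each point to its nearest
    centroid, then move each centroid to the mean of its cluster."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[j].append(p)
        new = [tuple(sum(v) / len(cl) for v in zip(*cl)) if cl else centroids[j]
               for j, cl in enumerate(clusters)]
        if new == centroids:
            break
        centroids = new
    return centroids, clusters

# Hypothetical records: (length of stay in days, mean daily acuity score).
records = [(1.1, 2.0), (1.3, 2.2), (0.9, 1.8),   # short-stay, low-acuity
           (7.8, 8.1), (8.2, 7.9), (7.9, 8.3)]   # long-stay, high-acuity
centroids, clusters = kmeans(records, k=2)
```

    Each resulting centroid can serve as a candidate patient-type definition for the simulation model; in practice the features would be standardized first so that no single scale dominates the distance.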

  9. Motion-Based Piloted Simulation Evaluation of a Control Allocation Technique to Recover from Pilot Induced Oscillations

    NASA Technical Reports Server (NTRS)

    Craun, Robert W.; Acosta, Diana M.; Beard, Steven D.; Leonard, Michael W.; Hardy, Gordon H.; Weinstein, Michael; Yildiz, Yildiray

    2013-01-01

    This paper describes the maturation of a control allocation technique designed to assist pilots in the recovery from pilot induced oscillations (PIOs). The Control Allocation technique to recover from Pilot Induced Oscillations (CAPIO) is designed to enable next generation high efficiency aircraft designs. Energy efficient next generation aircraft require feedback control strategies that will enable lowering the actuator rate limit requirements for optimal airframe design. One of the common issues flying with actuator rate limits is PIOs caused by the phase lag between the pilot inputs and control surface response. CAPIO utilizes real-time optimization for control allocation to eliminate phase lag in the system caused by control surface rate limiting. System impacts of the control allocator were assessed through a piloted simulation evaluation of a non-linear aircraft simulation in the NASA Ames Vertical Motion Simulator. Results indicate that CAPIO helps reduce oscillatory behavior, including the severity and duration of PIOs, introduced by control surface rate limiting.

  10. A resilient and efficient CFD framework: Statistical learning tools for multi-fidelity and heterogeneous information fusion

    NASA Astrophysics Data System (ADS)

    Lee, Seungjoon; Kevrekidis, Ioannis G.; Karniadakis, George Em

    2017-09-01

    Exascale-level simulations require fault-resilient algorithms that are robust against repeated and expected software and/or hardware failures during computations, which may render the simulation results unsatisfactory. If each processor can share some global information about the simulation from a coarse, limited-accuracy but relatively inexpensive auxiliary simulator, we can effectively fill in the missing spatial data at the required times by a statistical learning technique, multi-level Gaussian process regression, on the fly; this has been demonstrated in previous work [1]. Building on that work, we also employ another (nonlinear) statistical learning technique, Diffusion Maps, that detects computational redundancy in time and hence accelerates the simulation by projective time integration, giving the overall computation a "patch dynamics" flavor. Furthermore, we are now able to perform information fusion with multi-fidelity and heterogeneous data (including stochastic data). Finally, we set the foundations of a new framework in CFD, called patch simulation, that combines information fusion techniques from, in principle, multiple fidelity and resolution simulations (and even experiments) with a new adaptive timestep refinement technique. We present two benchmark problems (the heat equation and the Navier-Stokes equations) to demonstrate the new capability that statistical learning tools can bring to traditional scientific computing algorithms. For each problem, we rely on heterogeneous and multi-fidelity data, either from a coarse simulation of the same equation or from a stochastic, particle-based, more "microscopic" simulation. We consider, as such "auxiliary" models, a Monte Carlo random walk for the heat equation and a dissipative particle dynamics (DPD) model for the Navier-Stokes equations. More broadly, in this paper we demonstrate the symbiotic and synergistic combination of statistical learning, domain decomposition, and scientific computing in exascale simulations.
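
    The fill-in idea, Gaussian process regression reconstructing missing spatial data from surviving samples, can be illustrated with a single-level, NumPy-only sketch. The squared-exponential kernel, length scale, jitter, and sin test function are illustrative assumptions, not the authors' multi-level implementation:

```python
import numpy as np

def rbf(a, b, length=0.5):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def gp_fill(x_obs, y_obs, x_missing, jitter=1e-8):
    """Posterior mean of a zero-mean GP: reconstructs values at
    x_missing from the observations (x_obs, y_obs)."""
    K = rbf(x_obs, x_obs) + jitter * np.eye(len(x_obs))
    return rbf(x_missing, x_obs) @ np.linalg.solve(K, y_obs)

# Pretend one processor lost its patch of the solution field:
# reconstruct sin(x) at x = 1.4 and 1.6 from surviving samples.
x_obs = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0, 1.2,
                  1.8, 2.0, 2.2, 2.4, 2.6, 2.8])
y_obs = np.sin(x_obs)
x_gap = np.array([1.4, 1.6])
y_hat = gp_fill(x_obs, y_obs, x_gap)
```

    For a smooth field, the posterior mean recovers the missing values closely; the full framework extends this with multiple fidelity levels and on-the-fly updates.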

  11. Comparing Three Estimation Methods for the Three-Parameter Logistic IRT Model

    ERIC Educational Resources Information Center

    Lamsal, Sunil

    2015-01-01

    Different estimation procedures have been developed for the unidimensional three-parameter item response theory (IRT) model. These techniques include marginal maximum likelihood estimation, fully Bayesian estimation using Markov chain Monte Carlo simulation techniques, and Metropolis-Hastings Robbins-Monro estimation. With each…
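
    For reference, the three-parameter logistic (3PL) model that all of these estimation procedures target gives the probability of a correct response as P(theta) = c + (1 - c) / (1 + exp(-a(theta - b))), with discrimination a, difficulty b, and guessing parameter c. A minimal sketch:

```python
import math

def p_correct(theta, a, b, c):
    """Three-parameter logistic (3PL) IRT model: probability that an
    examinee with ability theta answers an item correctly, given
    discrimination a, difficulty b, and guessing parameter c."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# At theta == b the examinee is halfway between guessing and certainty:
# c + (1 - c) / 2 = 0.2 + 0.8 * 0.5 = 0.6.
p = p_correct(theta=0.0, a=1.5, b=0.0, c=0.2)   # -> 0.6
```

    The estimation procedures in the abstract differ in how they recover a, b, and c for each item from response data, not in this response function itself.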

  12. Simulating Reflex Induced Changes in the Acoustic Impedance of the Ear.

    ERIC Educational Resources Information Center

    Sirlin, Mindy W.; Levitt, Harry

    1991-01-01

    A simple procedure for measuring changes in the acoustic impedance of the ear is described. The technique has several applications, including simulation using a standard coupler of changes in real ear impedance produced by the acoustic reflex, and calibration of response time of an otoadmittance meter. (Author/DB)

  13. Modeling target normal sheath acceleration using handoffs between multiple simulations

    NASA Astrophysics Data System (ADS)

    McMahon, Matthew; Willis, Christopher; Mitchell, Robert; King, Frank; Schumacher, Douglass; Akli, Kramer; Freeman, Richard

    2013-10-01

    We present a technique to model the target normal sheath acceleration (TNSA) process using full-scale LSP PIC simulations. The technique allows for a realistic laser, full size target and pre-plasma, and sufficient propagation length for the accelerated ions and electrons. A first simulation using a 2D Cartesian grid models the laser-plasma interaction (LPI) self-consistently and includes field ionization. Electrons accelerated by the laser are imported into a second simulation using a 2D cylindrical grid optimized for the initial TNSA process and incorporating an equation of state. Finally, all of the particles are imported to a third simulation optimized for the propagation of the accelerated ions and utilizing a static field solver for initialization. We also show use of 3D LPI simulations. Simulation results are compared to recent ion acceleration experiments using the SCARLET laser at The Ohio State University. This work was performed with support from AFOSR under contract # FA9550-12-1-0341, DARPA, and allocations of computing time from the Ohio Supercomputing Center.

  14. Finite difference model for aquifer simulation in two dimensions with results of numerical experiments

    USGS Publications Warehouse

    Trescott, Peter C.; Pinder, George Francis; Larson, S.P.

    1976-01-01

    The model will simulate ground-water flow in an artesian aquifer, a water-table aquifer, or a combined artesian and water-table aquifer. The aquifer may be heterogeneous and anisotropic and have irregular boundaries. The source term in the flow equation may include well discharge, constant recharge, leakage from confining beds in which the effects of storage are considered, and evapotranspiration as a linear function of depth to water. The theoretical development includes presentation of the appropriate flow equations and derivation of the finite-difference approximations (written for a variable grid). The documentation emphasizes the numerical techniques that can be used for solving the simultaneous equations and describes the results of numerical experiments using these techniques. Of the three numerical techniques available in the model, the strongly implicit procedure, in general, requires less computer time and has fewer numerical difficulties than do the iterative alternating direction implicit procedure and line successive overrelaxation (which includes a two-dimensional correction procedure to accelerate convergence). The documentation includes a flow chart, program listing, an example simulation, and sections on designing an aquifer model and requirements for data input. It illustrates how model results can be presented on the line printer and pen plotters with a program that utilizes the graphical display software available from the Geological Survey Computer Center Division. In addition, the model includes options for reading input data from disk and writing intermediate results to disk.
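
    As an illustration of the successive-overrelaxation family mentioned above, here is a minimal point-SOR solver for steady two-dimensional flow (Laplace's equation) on a uniform grid with fixed-head boundaries. It is a simplified stand-in, not the report's line-SOR code, and the grid size, head values, and relaxation factor are arbitrary:

```python
import numpy as np

def sor_head(h, omega=1.8, tol=1e-8, max_sweeps=5000):
    """Point successive overrelaxation for steady 2-D ground-water flow
    (Laplace's equation) on a uniform grid. Boundary cells of h are
    treated as fixed-head (Dirichlet) conditions; interior cells are
    relaxed in place until the largest update falls below tol."""
    for _ in range(max_sweeps):
        max_change = 0.0
        for i in range(1, h.shape[0] - 1):
            for j in range(1, h.shape[1] - 1):
                # Five-point average of the four neighbors.
                new = 0.25 * (h[i-1, j] + h[i+1, j] + h[i, j-1] + h[i, j+1])
                change = omega * (new - h[i, j])
                h[i, j] += change
                max_change = max(max_change, abs(change))
        if max_change < tol:
            break
    return h

# 20 x 20 grid: head fixed at 10 m along the left boundary, 0 m elsewhere.
h = np.zeros((20, 20))
h[:, 0] = 10.0
h = sor_head(h)
```

    With omega near its optimum, SOR converges in far fewer sweeps than plain Gauss-Seidel; the strongly implicit procedure favored by the report typically does better still on large anisotropic grids.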

  15. Flexible multibody simulation of automotive systems with non-modal model reduction techniques

    NASA Astrophysics Data System (ADS)

    Shiiba, Taichi; Fehr, Jörg; Eberhard, Peter

    2012-12-01

    The stiffness of the body structure of an automobile has a strong relationship with its noise, vibration, and harshness (NVH) characteristics. In this paper, the effect of the stiffness of the body structure upon ride quality is discussed with flexible multibody dynamics. In flexible multibody simulation, the local elastic deformation of the vehicle has traditionally been described with modal shape functions. Recently, linear model reduction techniques from system dynamics and mathematics have come into focus as a way to find more sophisticated elastic shape functions. In this work, the NVH-relevant states of a racing kart are simulated, whereas the elastic shape functions are calculated with modern model reduction techniques like moment matching by projection on Krylov subspaces, singular value decomposition-based reduction techniques, and combinations of those. The whole elastic multibody vehicle model consisting of tyres, steering, axle, etc. is considered, and an excitation with vibration characteristics over a wide frequency range is evaluated in this paper. The accuracy and the calculation performance of these modern model reduction techniques are investigated, including a comparison with the modal reduction approach.
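
    The moment-matching idea can be sketched with a one-sided projection onto a Krylov subspace: the reduced system then reproduces the leading moments of the full transfer function at the expansion point s0. The first-order system matrices below are random stand-ins, not a vehicle model, and production codes use more careful Arnoldi implementations:

```python
import numpy as np

def krylov_reduce(A, b, c, s0=0.0, r=8):
    """One-sided moment matching: orthonormalize the Krylov sequence
    {M b, M^2 b, ...} with M = (s0*I - A)^{-1}, then project. The reduced
    model matches r moments of H(s) = c^T (s I - A)^{-1} b at s = s0."""
    n = len(b)
    M = np.linalg.inv(s0 * np.eye(n) - A)
    V = np.zeros((n, r))
    v = M @ b
    for k in range(r):
        # Gram-Schmidt against the previous basis vectors.
        for j in range(k):
            v = v - (V[:, j] @ v) * V[:, j]
        V[:, k] = v / np.linalg.norm(v)
        v = M @ V[:, k]
    return V.T @ A @ V, V.T @ b, V.T @ c

def tf(A, b, c, s):
    """Transfer function H(s) = c^T (s I - A)^{-1} b."""
    return c @ np.linalg.solve(s * np.eye(len(b)) - A, b)

rng = np.random.default_rng(1)
n = 50
A = -2.0 * np.eye(n) + 0.1 * rng.standard_normal((n, n))  # stable stand-in
b = rng.standard_normal(n)
c = rng.standard_normal(n)
Ar, br, cr = krylov_reduce(A, b, c, s0=0.0, r=8)
```

    The 8-state reduced model reproduces the full 50-state transfer function exactly at the expansion point, which is the defining property of moment matching; SVD-based methods instead control the error uniformly over frequency.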

  16. Protein-membrane electrostatic interactions: Application of the Lekner summation technique

    NASA Astrophysics Data System (ADS)

    Juffer, André H.; Shepherd, Craig M.; Vogel, Hans J.

    2001-01-01

    A model has been developed to calculate the electrostatic interaction between biomolecules and lipid bilayers. The effect of ionic strength is included by means of explicit ions, while water is described as a background continuum. The bilayer is considered at the atomic level. The Lekner summation technique is employed to calculate the long-range electrostatic interactions. The new method is employed to estimate, with thermodynamic integration, the electrostatic contribution to the free energy of binding to lipid bilayers of sandostatin, a cyclic eight-residue analogue of the peptide hormone somatostatin. Monte Carlo simulation techniques were employed to determine ion distributions and peptide orientations. Both neutral and negatively charged lipid bilayers were used. An error analysis to judge the quality of the computation is also presented. The applicability of combining the Lekner summation technique with computer simulation models of the adsorption of peptides (and proteins) into the interfacial region of lipid bilayers is discussed.

  17. The technique for Simulation of Transient Combustion Processes in the Rocket Engine Operating with Gaseous Fuel “Hydrogen and Oxygen”

    NASA Astrophysics Data System (ADS)

    Zubanov, V. M.; Stepanov, D. V.; Shabliy, L. S.

    2017-01-01

    The article describes a method for simulating transient combustion processes in a rocket engine operating on gaseous propellants: oxygen and hydrogen. Combustion simulation was performed using the ANSYS CFX software. Three reaction mechanisms for the stationary mode were considered and described in detail. The reaction mechanisms were taken from several sources and verified. The method for converting ozone properties from the Shomate equation to the NASA-polynomial format is described in detail. A way to obtain quick CFD results with intermediate combustion components using an EDM model was found. Modeling difficulties with the Finite Rate Chemistry combustion model, associated with a large scatter in reference data, were identified and described. The way to generate the Flamelet library with CFX-RIF is described. The reaction mechanisms verified at steady state were also tested for transient simulation, and the Flamelet combustion model was found adequate for the transient mode, with integral parameters close to the values obtained in stationary simulation. A cyclic irregularity of the temperature field, caused by precession of the vortex core, was detected in the chamber with the proposed simulation technique. Investigation of unsteady rocket engine processes, including ignition, is proposed as the area of application for the described simulation technique.

  18. Fixed gain and adaptive techniques for rotorcraft vibration control

    NASA Technical Reports Server (NTRS)

    Roy, R. H.; Saberi, H. A.; Walker, R. A.

    1985-01-01

    The results of an analysis effort performed to demonstrate the feasibility of employing approximate dynamical models and frequency-shaped cost functional control law design techniques for helicopter vibration suppression are presented. Both fixed gain and adaptive control designs based on linear second order dynamical models were implemented in a detailed Rotor Systems Research Aircraft (RSRA) simulation to validate these active vibration suppression control laws. Approximate models of fuselage flexibility were included in the RSRA simulation in order to more accurately characterize the structural dynamics. The results for both the fixed gain and adaptive approaches are promising and provide a foundation for pursuing further validation in more extensive simulation studies and in wind tunnel and/or flight tests.

  19. The Computer in Educational Decision Making. An Introduction and Guide for School Administrators.

    ERIC Educational Resources Information Center

    Sanders, Susan; And Others

    This text provides educational administrators with a working knowledge of the problem-solving techniques of PERT (planning, evaluation, and review technique), Linear Programming, Queueing Theory, and Simulation. The text includes an introduction to decision-making and operations research, four chapters consisting of in-depth explanations of each…

  20. Simulation of FIB-SEM images for analysis of porous microstructures.

    PubMed

    Prill, Torben; Schladitz, Katja

    2013-01-01

    Focused ion beam-scanning electron microscopy (FIB-SEM) tomography yields high-quality three-dimensional images of materials microstructures at the nanometer scale, combining serial sectioning by a focused ion beam with SEM imaging. However, FIB-SEM tomography of highly porous media leads to shine-through artifacts preventing automatic segmentation of the solid component. We simulate the SEM process in order to generate synthetic FIB-SEM image data for developing and validating segmentation methods. Monte Carlo techniques yield accurate results, but are too slow for the simulation of FIB-SEM tomography, which requires hundreds of SEM images for one dataset alone. Nevertheless, a quasi-analytic description of the specimen and various acceleration techniques, including a track compression algorithm and an acceleration for the simulation of secondary electrons, cut down the computing time by orders of magnitude, making it possible for the first time to simulate FIB-SEM tomography. © Wiley Periodicals, Inc.

  1. Pyrite: A blender plugin for visualizing molecular dynamics simulations using industry-standard rendering techniques.

    PubMed

    Rajendiran, Nivedita; Durrant, Jacob D

    2018-05-05

    Molecular dynamics (MD) simulations provide critical insights into many biological mechanisms. Programs such as VMD, Chimera, and PyMOL can produce impressive simulation visualizations, but they lack many advanced rendering algorithms common in the film and video-game industries. In contrast, the modeling program Blender includes such algorithms but cannot import MD-simulation data. MD trajectories often require many gigabytes of memory/disk space, complicating Blender import. We present Pyrite, a Blender plugin that overcomes these limitations. Pyrite allows researchers to visualize MD simulations within Blender, with full access to Blender's cutting-edge rendering techniques. We expect Pyrite-generated images to appeal to students and non-specialists alike. A copy of the plugin is available at http://durrantlab.com/pyrite/, released under the terms of the GNU General Public License Version 3. © 2017 Wiley Periodicals, Inc.

  2. Monte Carlo simulations of particle acceleration at oblique shocks: Including cross-field diffusion

    NASA Technical Reports Server (NTRS)

    Baring, M. G.; Ellison, D. C.; Jones, F. C.

    1995-01-01

    The Monte Carlo technique of simulating diffusive particle acceleration at shocks has made spectral predictions that compare extremely well with particle distributions observed at the quasi-parallel region of the earth's bow shock. The current extension of this work to compare simulation predictions with particle spectra at oblique interplanetary shocks has required the inclusion of significant cross-field diffusion (strong scattering) in the simulation technique, since oblique shocks are intrinsically inefficient in the limit of weak scattering. In this paper, we present results from the method we have developed for the inclusion of cross-field diffusion in our simulations, namely model predictions of particle spectra downstream of oblique subluminal shocks. While the high-energy spectral index is independent of the shock obliquity and the strength of the scattering, the latter is observed to profoundly influence the efficiency of injection of cosmic rays into the acceleration process.

  3. NASA/ASEE Summer Faculty Fellowship Program, 1990, Volume 1

    NASA Technical Reports Server (NTRS)

    Bannerot, Richard B. (Editor); Goldstein, Stanley H. (Editor)

    1990-01-01

    The 1990 Johnson Space Center (JSC) NASA/American Society for Engineering Education (ASEE) Summer Faculty Fellowship Program was conducted by the University of Houston-University Park and JSC. A compilation of the final reports on the research projects is presented. The topics covered include: the Space Station; the Space Shuttle; exobiology; cell biology; culture techniques; control systems design; laser induced fluorescence; spacecraft reliability analysis; reduced gravity; biotechnology; microgravity applications; regenerative life support systems; imaging techniques; cardiovascular system; physiological effects; extravehicular mobility units; mathematical models; bioreactors; computerized simulation; microgravity simulation; and dynamic structural analysis.

  4. Opto-electronic characterization of third-generation solar cells

    PubMed Central

    Jenatsch, Sandra

    2018-01-01

    Abstract We present an overview of opto-electronic characterization techniques for solar cells including light-induced charge extraction by linearly increasing voltage, impedance spectroscopy, transient photovoltage, charge extraction and more. Guidelines for the interpretation of experimental results are derived based on charge drift-diffusion simulations of solar cells with common performance limitations. It is investigated how nonidealities like charge injection barriers, traps and low mobilities among others manifest themselves in each of the studied cell characterization techniques. Moreover, comprehensive parameter extraction for an organic bulk-heterojunction solar cell comprising PCDTBT:PC70BM is demonstrated. The simulations reproduce measured results of 9 different experimental techniques. Parameter correlation is minimized due to the combination of various techniques. Thereby a route to comprehensive and accurate parameter extraction is identified. PMID:29707069

  5. Moving Target Techniques: Leveraging Uncertainty for CyberDefense

    DTIC Science & Technology

    2015-12-15

    cyberattacks is a continual struggle for system managers. Attackers often need only find one vulnerability (a flaw or bug that an attacker can exploit...additional parsing code itself could have security-relevant software bugs. Dynamic Network Techniques in the dynamic network domain change the...evaluation of MT techniques can benefit from a variety of evaluation approaches, including abstract analysis, modeling and simulation, test bed

  6. Solid State Audio/Speech Processor Analysis.

    DTIC Science & Technology

    1980-03-01

    techniques. The techniques were demonstrated to be worthwhile in an efficient realtime AWR system. Finally, microprocessor architectures were designed to...do not include custom chip development, detailed hardware design, construction or testing. ITTDCD is very encouraged by the results obtained in this...California, Berkeley, was responsible for furnishing the simulation data of OD speech analysis techniques and for the design and development of the hardware OD

  7. Impact gages for detecting meteoroid and other orbital debris impacts on space vehicles.

    NASA Technical Reports Server (NTRS)

    Mastandrea, J. R.; Scherb, M. V.

    1973-01-01

    Impacts on space vehicles have been simulated using the McDonnell Douglas Aerophysics Laboratory (MDAL) Light-Gas Guns to launch particles at hypervelocity speeds into scaled space structures. Using impact gages and a triangulation technique, these impacts have been detected and accurately located. This paper describes in detail the various types of impact gages (piezoelectric PZT-5A, quartz, electret, and off-the-shelf plastics) used. This description includes gage design and experimental results for gages installed on single-walled scaled payload carriers, multiple-walled satellites and space stations, and single-walled full-scale Delta tank structures. A brief description of the triangulation technique, the impact simulation, and the data acquisition system is also included.

  8. Rapid Automated Aircraft Simulation Model Updating from Flight Data

    NASA Technical Reports Server (NTRS)

    Brian, Geoff; Morelli, Eugene A.

    2011-01-01

    Techniques to identify aircraft aerodynamic characteristics from flight measurements and compute corrections to an existing simulation model of a research aircraft were investigated. The purpose of the research was to develop a process enabling rapid automated updating of aircraft simulation models using flight data and apply this capability to all flight regimes, including flight envelope extremes. The process presented has the potential to improve the efficiency of envelope expansion flight testing, revision of control system properties, and the development of high-fidelity simulators for pilot training.

  9. Modeling Amorphous Microporous Polymers for CO2 Capture and Separations.

    PubMed

    Kupgan, Grit; Abbott, Lauren J; Hart, Kyle E; Colina, Coray M

    2018-06-13

    This review concentrates on the advances of atomistic molecular simulations to design and evaluate amorphous microporous polymeric materials for CO2 capture and separations. A description of atomistic molecular simulations is provided, including simulation techniques, structural generation approaches, relaxation and equilibration methodologies, and considerations needed for validation of simulated samples. The review provides general guidelines and a comprehensive update of the recent literature (since 2007) to promote the acceleration of the discovery and screening of amorphous microporous polymers for CO2 capture and separation processes.

  10. Fast Learning for Immersive Engagement in Energy Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, Brian W; Bugbee, Bruce; Gruchalla, Kenny M

    The fast computation that is critical for immersive engagement with, and learning from, energy simulations would be furthered by developing a general method for creating rapidly computed, simplified versions of NREL's computation-intensive energy simulations. Created using machine learning techniques, these 'reduced form' simulations can provide statistically sound estimates of the results of the full simulations at a fraction of the computational cost, with response times - typically less than one minute of wall-clock time - suitable for real-time human-in-the-loop design and analysis. Additionally, uncertainty quantification techniques can document the accuracy of the approximate models and their domain of validity. Approximation methods are applicable to a wide range of computational models, including supply-chain models, electric power grid simulations, and building models. These reduced-form representations cannot replace or re-implement existing simulations, but instead supplement them by enabling rapid scenario design and quality assurance for large sets of simulations. We present an overview of the framework and methods we have implemented for developing these reduced-form representations.
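
    The reduced-form idea, fitting a cheap statistical model to a handful of runs of the expensive simulation, can be sketched as follows. The "simulation" here is a stand-in function, and the one-dimensional polynomial surrogate is just one of many possible model classes (NREL's actual surrogates are not specified here):

```python
import numpy as np

def expensive_sim(x):
    """Stand-in for a computation-intensive energy simulation."""
    return np.sin(2 * x) + 0.5 * x ** 2

# Run the full model at a small design of training points ...
x_train = np.linspace(0.0, 2.0, 15)
y_train = expensive_sim(x_train)

# ... then fit a cheap reduced-form surrogate (here: a degree-7
# polynomial in one input; real surrogates would be multivariate).
surrogate = np.poly1d(np.polyfit(x_train, y_train, deg=7))

# The surrogate now answers new queries at negligible cost.
x_new = np.array([0.33, 1.27])
y_approx = surrogate(x_new)
```

    Within the training domain the surrogate tracks the full model closely; the uncertainty quantification step mentioned above would bound this approximation error and flag queries outside the domain of validity.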

  11. What can virtual patient simulation offer mental health nursing education?

    PubMed

    Guise, V; Chambers, M; Välimäki, M

    2012-06-01

    This paper discusses the use of simulation in nursing education and training, including potential benefits and barriers associated with its use. In particular, it addresses the hitherto scant application of diverse simulation devices and dedicated simulation scenarios in psychiatric and mental health nursing. It goes on to describe a low-cost, narrative-based virtual patient simulation technique which has the potential for wide application within health and social care education. An example of the implementation of this technology in a web-based pilot course for acute mental health nurses is given. This particular virtual patient technique is a simulation type ideally suited to promoting essential mental health nursing skills such as critical thinking, communication and decision making. Furthermore, it is argued that it is particularly amenable to e-learning and blended learning environments, as well as being an apt tool where multilingual simulations are required. The continued development, implementation and evaluation of narrative virtual patient simulations across a variety of health and social care programmes would help ascertain their success as an educational tool. © 2011 Blackwell Publishing.

  12. Using Simulations to Investigate the Longitudinal Stability of Alternative Schemes for Classifying and Identifying Children with Reading Disabilities

    ERIC Educational Resources Information Center

    Schatschneider, Christopher; Wagner, Richard K.; Hart, Sara A.; Tighe, Elizabeth L.

    2016-01-01

    The present study employed data simulation techniques to investigate the 1-year stability of alternative classification schemes for identifying children with reading disabilities. Classification schemes investigated include low performance, unexpected low performance, dual-discrepancy, and a rudimentary form of constellation model of reading…

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romanov, Gennady; /Fermilab

    CST Particle Studio combines electromagnetic field simulation, multi-particle tracking, adequate post-processing, and an advanced probabilistic emission model, which is the most important new capability in multipactor simulation. The emission model includes the stochastic properties of emission in the simulation and adds primary electron elastic and inelastic reflection from the surfaces. Simulations of multipactor in coaxial waveguides have been performed to study the effects of these innovations on the multipactor threshold and the range over which multipactor can occur. The results, compared with available previous experiments and simulations, as well as the technique of multipactor simulation with CST PS, are presented and discussed.

  14. Tests for malingering in ophthalmology

    PubMed Central

    Incesu, Ali Ihsan

    2013-01-01

    Simulation can be defined as malingering, or sometimes functional visual loss (FVL). It manifests as either simulating an ophthalmic disease (positive simulation) or denial of an ophthalmic disease (negative simulation). Conscious behavior and compensation or indemnity claims are prominent features of simulation. Since some authors suggest that this is a manifestation of underlying psychopathology, even conversion is included in this context. In today's world, every ophthalmologist can be faced with simulation of an ophthalmic disease or disorder. When simulation is suspected, the physician's responsibility is to consider the disease/disorder first and to establish simulation only by exclusion. In simulation examinations, the physician should be firm and astute in selecting appropriate test(s) to convince not only the subject, but also the judge in case of indemnity or compensation trials. Almost all ophthalmic sensory and motor functions, including visual acuity, visual field, color vision and night vision, can be the subject of simulation, and the examiner must be skillful in selecting the most appropriate test. Beyond those in the literature, we included all kinds of simulation in ophthalmology, along with simulation examination techniques such as the use of optical coherence tomography, frequency doubling perimetry (FDP), and modified polarization tests. In this review, we conducted a thorough literature search and added our own experience to give readers up-to-date information on malingering or simulation in ophthalmology. PMID:24195054

  15. A review of recent developments in flight test techniques at the Ames Research Center, Dryden Flight Research Facility

    NASA Technical Reports Server (NTRS)

    Layton, G. P.

    1984-01-01

    New flight test techniques in use at Ames Dryden are reviewed. The use of the pilot in combination with ground and airborne computational capabilities to maximize data return is discussed, including the remotely piloted research vehicle technique for high-risk testing, the remotely augmented vehicle technique for handling qualities research, and the use of ground-computed flight director information to fly unique profiles such as constant Reynolds number profiles through the transonic flight regime. Techniques used for checkout and design verification of systems-oriented aircraft are discussed, including descriptions of the various simulations, iron bird setups, and vehicle tests. Some newly developed techniques to support the aeronautical research disciplines are discussed, including a new approach to position-error determination, and the use of a large skin friction balance for the measurement of drag caused by various excrescences.

  16. Simulation of FRET dyes allows quantitative comparison against experimental data

    NASA Astrophysics Data System (ADS)

    Reinartz, Ines; Sinner, Claude; Nettels, Daniel; Stucki-Buchli, Brigitte; Stockmar, Florian; Panek, Pawel T.; Jacob, Christoph R.; Nienhaus, Gerd Ulrich; Schuler, Benjamin; Schug, Alexander

    2018-03-01

    Fully understanding biomolecular function requires detailed insight into the systems' structural dynamics. Powerful experimental techniques such as single molecule Förster Resonance Energy Transfer (FRET) provide access to such dynamic information yet have to be carefully interpreted. Molecular simulations can complement these experiments but typically face limits in accessing slow time scales and large or unstructured systems. Here, we introduce a coarse-grained simulation technique that tackles these challenges. While requiring only few parameters, we maintain full protein flexibility and include all heavy atoms of proteins, linkers, and dyes. We are able to sufficiently reduce computational demands to simulate large or heterogeneous structural dynamics and ensembles on slow time scales found in, e.g., protein folding. The simulations allow for calculating FRET efficiencies which quantitatively agree with experimentally determined values. By providing atomically resolved trajectories, this work supports the planning and microscopic interpretation of experiments. Overall, these results highlight how simulations and experiments can complement each other leading to new insights into biomolecular dynamics and function.
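
    The FRET efficiencies such simulations compute follow, in the idealized case, from the Foerster relation E = 1 / (1 + (r / R0)^6), where r is the donor-acceptor distance and R0 the Foerster radius. A minimal sketch (distances in arbitrary consistent units; real analyses average over dye orientations and conformational ensembles):

```python
def fret_efficiency(r, r0):
    """Ideal single-pair FRET efficiency for donor-acceptor distance r
    and Foerster radius r0 (same units): E = 1 / (1 + (r/r0)^6)."""
    return 1.0 / (1.0 + (r / r0) ** 6)

# At r == r0, energy transfer is exactly 50% efficient.
e_half = fret_efficiency(5.0, 5.0)   # -> 0.5

# The sixth-power dependence makes efficiency fall off steeply:
# at twice the Foerster radius, E = 1 / (1 + 2**6) = 1/65.
e_far = fret_efficiency(10.0, 5.0)
```

    This steep distance dependence is what makes FRET a sensitive molecular ruler, and why simulated trajectories that resolve dye positions can be compared quantitatively against measured efficiencies.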

  17. Robust Nonlinear Feedback Control of Aircraft Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Garrard, William L.; Balas, Gary J.; Litt, Jonathan (Technical Monitor)

    2001-01-01

    This is the final report on the research performed under NASA Glenn grant NASA/NAG-3-1975 concerning feedback control of the Pratt & Whitney (PW) STF 952, a twin-spool, mixed-flow, afterburning turbofan engine. The research focused on the design of linear and gain-scheduled, multivariable inner-loop controllers for the PW turbofan engine using H-infinity and linear parameter-varying (LPV) control techniques. The nonlinear turbofan engine simulation was provided by PW within the NASA Rocket Engine Transient Simulator (ROCETS) simulation software environment. ROCETS was used to generate linearized models of the turbofan engine for control design and analysis, as well as the simulation environment to evaluate the performance and robustness of the controllers. Comparisons are made between the H-infinity and LPV controllers and the baseline multivariable controller developed by Pratt & Whitney engineers and included in the ROCETS simulation. Simulation results indicate that the H-infinity and LPV techniques effectively achieve the desired response to commanded values with minimal cross coupling and are very robust to unmodeled dynamics and sensor noise.

  18. Status of the Flooding Fragility Testing Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, C. L.; Savage, B.; Bhandari, B.

    2016-06-01

    This report provides an update on research addressing nuclear power plant component reliability under flooding conditions. The research includes use of the Component Flooding Evaluation Laboratory (CFEL), where individual components and component subassemblies will be tested to failure under various flooding conditions. The resulting component reliability data can then be incorporated with risk simulation strategies to provide a more thorough representation of overall plant risk. The CFEL development strategy consists of four interleaved phases. Phase 1 addresses design and application of CFEL with water rise and water spray capabilities, allowing testing of passive and active components including fully electrified components. Phase 2 addresses research into wave generation techniques followed by the design and addition of the wave generation capability to CFEL. Phase 3 addresses methodology development activities including small scale component testing, development of full scale component testing protocol, and simulation techniques including Smoothed Particle Hydrodynamics (SPH) based computer codes. Phase 4 involves full scale component testing, beginning with work in a surrogate CFEL testing apparatus.
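The SPH method named in Phase 3 is built on a smoothing kernel and a summation density estimate. Below is a minimal sketch using the standard Monaghan cubic spline kernel; it is generic SPH textbook material, not tied to any code discussed in the report:

```python
import math

def cubic_spline_kernel(r, h):
    """Standard Monaghan cubic-spline smoothing kernel in 3-D (support radius 2h)."""
    q = r / h
    sigma = 1.0 / (math.pi * h ** 3)  # 3-D normalization constant
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q ** 2 + 0.75 * q ** 3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0  # outside the compact support

def density(i, positions, masses, h):
    """SPH summation density at particle i: rho_i = sum_j m_j * W(|x_i - x_j|, h)."""
    xi, yi, zi = positions[i]
    rho = 0.0
    for (x, y, z), m in zip(positions, masses):
        r = math.sqrt((x - xi) ** 2 + (y - yi) ** 2 + (z - zi) ** 2)
        rho += m * cubic_spline_kernel(r, h)
    return rho

# Three equal-mass particles on a line: the middle particle sees both
# neighbours inside the kernel support, so its density estimate is highest.
pos = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
m = [1.0, 1.0, 1.0]
```

Because the kernel has compact support, each density sum only involves nearby particles, which is what makes SPH attractive for violently deforming free-surface flows such as waves impacting components.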

  19. Quantified, Interactive Simulation of AMCW ToF Camera Including Multipath Effects

    PubMed Central

    Lambers, Martin; Kolb, Andreas

    2017-01-01

    In the last decade, Time-of-Flight (ToF) range cameras have gained increasing popularity in robotics, the automotive industry, and home entertainment. Despite technological developments, ToF cameras still suffer from error sources such as multipath interference or motion artifacts. Thus, simulation of ToF cameras, including these artifacts, is important to improve camera and algorithm development. This paper presents a physically-based, interactive simulation technique for amplitude modulated continuous wave (AMCW) ToF cameras, which, among other error sources, includes single bounce indirect multipath interference based on an enhanced image-space approach. The simulation accounts for physical units down to the charge level accumulated in sensor pixels. Furthermore, we present the first quantified comparison for ToF camera simulators. We present bidirectional reflectance distribution function (BRDF) measurements for selected, purchasable materials in the near-infrared (NIR) range, craft real and synthetic scenes out of these materials and quantitatively compare the range sensor data. PMID:29271888

  20. Quantified, Interactive Simulation of AMCW ToF Camera Including Multipath Effects.

    PubMed

    Bulczak, David; Lambers, Martin; Kolb, Andreas

    2017-12-22

    In the last decade, Time-of-Flight (ToF) range cameras have gained increasing popularity in robotics, the automotive industry, and home entertainment. Despite technological developments, ToF cameras still suffer from error sources such as multipath interference or motion artifacts. Thus, simulation of ToF cameras, including these artifacts, is important to improve camera and algorithm development. This paper presents a physically-based, interactive simulation technique for amplitude modulated continuous wave (AMCW) ToF cameras, which, among other error sources, includes single bounce indirect multipath interference based on an enhanced image-space approach. The simulation accounts for physical units down to the charge level accumulated in sensor pixels. Furthermore, we present the first quantified comparison for ToF camera simulators. We present bidirectional reflectance distribution function (BRDF) measurements for selected, purchasable materials in the near-infrared (NIR) range, craft real and synthetic scenes out of these materials and quantitatively compare the range sensor data.
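The AMCW principle underlying such simulators can be sketched with the standard four-bucket demodulation: phase is recovered from four correlation samples taken at 0°, 90°, 180° and 270° internal phase offsets, and converted to depth. This is the generic textbook relation, not the paper's simulator; it ignores the noise and multipath effects the paper actually models:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def depth_from_samples(a0, a1, a2, a3, f_mod):
    """Recover depth from the four correlation samples of an ideal AMCW ToF pixel."""
    # a0 - a2 = 2*A*cos(phi), a3 - a1 = 2*A*sin(phi), so atan2 yields the phase.
    phi = math.atan2(a3 - a1, a0 - a2) % (2.0 * math.pi)
    return C * phi / (4.0 * math.pi * f_mod)

def simulate_samples(depth, f_mod, amplitude=1.0, offset=2.0):
    """Ideal (noise- and multipath-free) correlation samples for a given depth."""
    phi = 4.0 * math.pi * f_mod * depth / C
    return [offset + amplitude * math.cos(phi + i * math.pi / 2.0) for i in range(4)]

# Round-trip: a 3 m target within the unambiguous range (c / 2f = 7.49 m at 20 MHz).
samples = simulate_samples(depth=3.0, f_mod=20e6)
print(round(depth_from_samples(*samples, f_mod=20e6), 6))  # -> 3.0
```

Multipath interference corrupts exactly these four samples with extra, delayed cosine contributions, which is why the paper's single-bounce indirect illumination model operates at this level.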

  1. Simulators IV; Proceedings of the SCS Conference, Orlando, FL, Apr. 6-9, 1987

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fairchild, B.T.

    1987-01-01

    The conference presents papers on the applicability of AI techniques to simulation models, the simulation of a reentry vehicle on Simstar, Simstar missile simulation, measurement issues associated with simulator sickness, and tracing the etiology of simulator sickness. Consideration is given to a simulation of steam generator tube bundle response to a blowdown transient, a census of simulators for fossil fueled boiler and gas turbine plant operation training, and a new approach for flight simulator visual systems. Other topics include past and present simulated aircraft maintenance trainers, an AI-simulation based approach for aircraft maintenance training, simulator qualification using EPRI methodology, and the role of instinct in organizational dysfunction.

  2. Methodologies for extracting kinetic constants for multiphase reacting flow simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, S.L.; Lottes, S.A.; Golchert, B.

    1997-03-01

    Flows in industrial reactors often involve complex reactions of many species. A computational fluid dynamics (CFD) computer code, ICRKFLO, was developed to simulate multiphase, multi-species reacting flows. ICRKFLO uses a hybrid technique to calculate species concentration and reaction for a large number of species in a reacting flow. This technique combines a hydrodynamic and reacting flow simulation, with a small but sufficient number of lumped reactions to compute flow field properties, followed by a calculation of local reaction kinetics and transport of many subspecies (on the order of 10 to 100). Kinetic rate constants of the numerous subspecies chemical reactions are difficult to determine. A methodology has been developed to extract kinetic constants from experimental data efficiently. A flow simulation of a fluid catalytic cracking (FCC) riser was successfully used to demonstrate this methodology.
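A common way to extract kinetic constants from rate measurements (not necessarily the specific methodology of this paper) is a least-squares Arrhenius fit: ln k = ln A − Ea/(R·T) is linear in 1/T, so an ordinary linear regression recovers the pre-exponential factor A and activation energy Ea. A sketch with synthetic data:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def fit_arrhenius(temps, rates):
    """Least-squares fit of ln k = ln A - Ea/(R*T); returns (A, Ea)."""
    xs = [1.0 / t for t in temps]
    ys = [math.log(k) for k in rates]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return math.exp(intercept), -slope * R

# Synthetic "experimental" data generated from known constants are recovered.
A_true, Ea_true = 1.0e7, 50_000.0
temps = [600.0, 700.0, 800.0, 900.0]
rates = [A_true * math.exp(-Ea_true / (R * t)) for t in temps]
A_fit, Ea_fit = fit_arrhenius(temps, rates)
print(round(Ea_fit))  # -> 50000
```

With real FCC riser data the rates are inferred indirectly from species profiles, so the fit would sit inside an outer loop that compares simulated and measured concentrations rather than fitting k values directly.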

  3. Exploring novel objective functions for simulating muscle coactivation in the neck.

    PubMed

    Mortensen, J; Trkov, M; Merryweather, A

    2018-04-11

    Musculoskeletal modeling allows for analysis of individual muscles in various situations. However, current techniques are inadequate for realistically simulating muscle response when significant intentional coactivation is required, such as stiffening the neck or spine through muscle coactivation in preparation for perturbations or impacts. Muscle coactivation has been modeled previously in the neck and spine using optimization techniques that seek to maximize the joint stiffness by maximizing total muscle activation or muscle force. These approaches have not sought to replicate human response, but rather to explore the possible effects of active muscle. Coactivation remains a challenging feature to include in musculoskeletal models, and may be improved by extracting optimization objective functions from experimental data. However, the components of such an objective function must be known before fitting to experimental data. This study explores the effect of components in several objective functions, in order to recommend components to be used for fitting to experimental data. Four novel approaches to modeling coactivation through optimization techniques are presented, two of which produce greater levels of stiffness than previous techniques. Simulations were performed using OpenSim and MATLAB cooperatively. Results show that maximizing the moment generated by a particular muscle appears analogous to maximizing joint stiffness. The approach of optimizing for maximum moment generated by individual muscles may be a good candidate for developing objective functions that accurately simulate muscle coactivation in complex joints. This new approach will be the focus of future studies with human subjects. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Application of Land Surface Data Assimilation to Simulations of Sea Breeze Circulations

    NASA Technical Reports Server (NTRS)

    Mackaro, Scott; Lapenta, William M.; Blackwell, Keith; Suggs, Ron; McNider, Richard T.; Jedlovec, Gary; Kimball, Sytske

    2003-01-01

    A technique has been developed for assimilating GOES-derived skin temperature tendencies and insolation into the surface energy budget equation of a mesoscale model so that the simulated rate of temperature change closely agrees with the satellite observations. A critical assumption of the technique is that the availability of moisture (either from the soil or vegetation) is the least known term in the model's surface energy budget. Therefore, the simulated latent heat flux, which is a function of surface moisture availability, is adjusted based upon differences between the modeled and satellite-observed skin temperature tendencies. An advantage of this technique is that satellite temperature tendencies are assimilated in an energetically consistent manner that avoids energy imbalances and surface stability problems that arise from direct assimilation of surface shelter temperatures. The fact that the rate of change of the satellite skin temperature is used rather than the absolute temperature means that sensor calibration is not as critical. The sea/land breeze is a well-documented mesoscale circulation that affects many coastal areas of the world including the northern Gulf Coast of the United States. The focus of this paper is to examine how the satellite assimilation technique impacts the simulation of a sea breeze circulation observed along the Mississippi/Alabama coast in the spring of 2001. The technique is implemented within the PSU/NCAR MM5 V3-5 and applied at spatial resolutions of 12- and 4-km. It is recognized that even 4-km grid spacing is too coarse to explicitly resolve the detailed, mesoscale structure of sea breezes. Nevertheless, the model can forecast certain characteristics of the observed sea breeze including a thermally direct circulation that results from differential low-level heating across the land-sea interface.
Our intent is to determine the sensitivity of the circulation to the differential land surface forcing produced via the assimilation of GOES skin temperature tendencies. Results will be quantified through statistical verification techniques.
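The moisture adjustment the abstract describes can be illustrated with a toy surface energy budget, C·dT/dt = Rn − H − M·LEpot, solved for the moisture availability M that reproduces the satellite-observed tendency. This is a deliberately simplified sketch, not the MM5 implementation; all numbers and names are hypothetical:

```python
def adjust_moisture(rn, h, le_pot, heat_cap, dtdt_obs):
    """Solve the toy budget C*dT/dt = Rn - H - M*LE_pot for the moisture
    availability M that reproduces the observed skin-temperature tendency,
    clamped to the physical range [0, 1]."""
    m = (rn - h - heat_cap * dtdt_obs) / le_pot
    return min(1.0, max(0.0, m))

def tendency(rn, h, le_pot, heat_cap, m):
    """Modeled skin-temperature tendency for a given moisture availability m."""
    return (rn - h - m * le_pot) / heat_cap

# With Rn = 500, H = 150, potential latent heat flux 400 (W/m^2) and an
# observed tendency of 0.02 K/s per unit heat capacity, the budget closes
# at M = 0.375, and the adjusted model reproduces the observed tendency.
m = adjust_moisture(rn=500.0, h=150.0, le_pot=400.0, heat_cap=1.0e4, dtdt_obs=0.02)
print(round(m, 6))                                       # -> 0.375
print(round(tendency(500.0, 150.0, 400.0, 1.0e4, m), 6))  # -> 0.02
```

The key property the sketch preserves is energetic consistency: the tendency is matched by repartitioning the available energy between sensible and latent flux, not by nudging the temperature directly.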

  5. Advances in Heavy Ion Beam Probe Technology and Operation on MST

    NASA Astrophysics Data System (ADS)

    Demers, D. R.; Connor, K. A.; Schoch, P. M.; Radke, R. J.; Anderson, J. K.; Craig, D.; den Hartog, D. J.

    2003-10-01

    A technique to map the magnetic field of a plasma via spectral imaging is being developed with the Heavy Ion Beam Probe on the Madison Symmetric Torus. The technique will utilize two-dimensional images of the ion beam in the plasma, acquired by two CCD cameras, to generate a three-dimensional reconstruction of the beam trajectory. This trajectory, and the known beam ion mass, energy and charge-state, will be used to determine the magnetic field of the plasma. A suitable emission line has not yet been observed since radiation from the MST plasma is both broadband and intense. An effort to raise the emission intensity from the ion beam by increasing beam focus and current has been undertaken. Simulations of the accelerator ion optics and beam characteristics led to a technique, confirmed by experiment, that achieves a narrower beam and marked increase in ion current near the plasma surface. The improvements arising from these simulations will be discussed. Realization of the magnetic field mapping technique is contingent upon accurate reconstruction of the beam trajectory from the camera images. Simulations of two camera CCD images, including the interior of MST, its various landmarks and beam trajectories have been developed. These simulations accept user input such as camera locations, resolution via pixellization and noise. The quality of the images simulated with these and other variables will help guide the selection of viewing port pairs, image size and camera specifications. The results of these simulations will be presented.

  6. Innovations in surgery simulation: a review of past, current and future techniques

    PubMed Central

    Burtt, Karen; Solorzano, Carlos A.; Carey, Joseph N.

    2016-01-01

    As a result of recent work-hours limitations and concerns for patient safety, innovations in extraclinical surgical simulation have become a desired part of residency education. Current simulation models, including cadaveric, animal, bench-top, virtual reality (VR) and robotic simulators are increasingly used in surgical training programs. Advances in telesurgery, three-dimensional (3D) printing, and the incorporation of patient-specific anatomy are paving the way for simulators to become integral components of medical training in the future. Evidence from the literature highlights the benefits of including simulations in surgical training; skills acquired through simulations translate into improvements in operating room performance. Moreover, simulations are rapidly incorporating new medical technologies and offer increasingly high-fidelity recreations of procedures. As a result, both novice and expert surgeons are able to benefit from their use. As dedicated, structured curricula are developed that incorporate simulations into daily resident training, simulated surgeries will strengthen the surgeon’s skill set, decrease hospital costs, and improve patient outcomes. PMID:28090509

  7. Innovations in surgery simulation: a review of past, current and future techniques.

    PubMed

    Badash, Ido; Burtt, Karen; Solorzano, Carlos A; Carey, Joseph N

    2016-12-01

    As a result of recent work-hours limitations and concerns for patient safety, innovations in extraclinical surgical simulation have become a desired part of residency education. Current simulation models, including cadaveric, animal, bench-top, virtual reality (VR) and robotic simulators are increasingly used in surgical training programs. Advances in telesurgery, three-dimensional (3D) printing, and the incorporation of patient-specific anatomy are paving the way for simulators to become integral components of medical training in the future. Evidence from the literature highlights the benefits of including simulations in surgical training; skills acquired through simulations translate into improvements in operating room performance. Moreover, simulations are rapidly incorporating new medical technologies and offer increasingly high-fidelity recreations of procedures. As a result, both novice and expert surgeons are able to benefit from their use. As dedicated, structured curricula are developed that incorporate simulations into daily resident training, simulated surgeries will strengthen the surgeon's skill set, decrease hospital costs, and improve patient outcomes.

  8. Parallel discrete event simulation using shared memory

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1988-01-01

    With traditional event-list techniques, evaluating a detailed discrete-event simulation model can often require hours or even days of computation time. By eliminating the event list and maintaining only sufficient synchronization to ensure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared-memory experiments, using the Chandy-Misra distributed-simulation algorithm to simulate networks of queues, is presented. Parameters of the study include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation of most queueing network models.
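For contrast with the parallel approach, the sequential event-list technique the abstract starts from can be sketched as a minimal discrete-event simulation of a single FIFO server, a toy stand-in for the queueing networks studied. All names are illustrative:

```python
import heapq

def simulate_queue(arrivals, service_time):
    """Sequential event-list simulation of a single FIFO server with a
    fixed service time. Events are (time, seq, kind) tuples kept in a
    heap; the unique seq field breaks timestamp ties."""
    events = [(t, i, "arrive") for i, t in enumerate(arrivals)]
    heapq.heapify(events)
    seq = len(arrivals)
    busy_until = 0.0
    departures = []
    while events:
        t, _, kind = heapq.heappop(events)  # always the earliest event
        if kind == "arrive":
            start = max(t, busy_until)       # wait if the server is busy
            busy_until = start + service_time
            seq += 1
            heapq.heappush(events, (busy_until, seq, "depart"))
        else:
            departures.append(t)
    return departures

# Three arrivals, 0.5 time units of service each: the third arrival waits.
print(simulate_queue([0.0, 1.0, 1.2], 0.5))  # -> [0.5, 1.5, 2.0]
```

The global heap is the serialization bottleneck: every event passes through it in timestamp order. The Chandy-Misra scheme removes it by giving each network node its own clock and blocking only when causality across channels would otherwise be violated.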

  9. Parallel computational fluid dynamics '91; Conference Proceedings, Stuttgart, Germany, Jun. 10-12, 1991

    NASA Technical Reports Server (NTRS)

    Reinsch, K. G. (Editor); Schmidt, W. (Editor); Ecer, A. (Editor); Haeuser, Jochem (Editor); Periaux, J. (Editor)

    1992-01-01

    A conference was held on parallel computational fluid dynamics and produced related papers. Topics discussed in these papers include: parallel implicit and explicit solvers for compressible flow, parallel computational techniques for Euler and Navier-Stokes equations, grid generation techniques for parallel computers, and aerodynamic simulation on massively parallel systems.

  10. Computational Issues Associated with Temporally Deforming Geometries Such as Thrust Vectoring Nozzles

    NASA Technical Reports Server (NTRS)

    Boyalakuntla, Kishore; Soni, Bharat K.; Thornburg, Hugh J.; Yu, Robert

    1996-01-01

    During the past decade, computational simulation of fluid flow around complex configurations has progressed significantly and many notable successes have been reported; however, unsteady time-dependent solutions are not easily obtainable. The present effort involves unsteady time-dependent simulation of temporally deforming geometries. Grid generation for a complex configuration can be a time consuming process, and temporally varying geometries necessitate the regeneration of such grids for every time step. Traditional grid generation techniques have been tried and shown to be inadequate for such simulations. Non-Uniform Rational B-splines (NURBS) based techniques provide a compact and accurate representation of the geometry. This definition can be coupled with a distribution mesh for a user defined spacing. The present method greatly reduces CPU requirements for time-dependent remeshing, facilitating the simulation of more complex unsteady problems. A thrust vectoring nozzle has been chosen to demonstrate the capability, as it is of current interest in the aerospace industry for better maneuverability of fighter aircraft in close combat and in post stall regimes. This effort is the first step toward multidisciplinary design optimization, which involves coupling aerodynamic, heat transfer, and structural analysis techniques. Applications include simulation of temporally deforming bodies and aeroelastic problems.
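The compactness of a NURBS geometry definition can be illustrated with a direct Cox-de Boor evaluation of a curve point. This is generic NURBS machinery, not the authors' remeshing code; the classic quarter-circle example below shows how a handful of control points, weights, and knots represent an exact conic:

```python
import math

def basis(i, p, u, knots):
    """Cox-de Boor recursion for the B-spline basis function N_{i,p}(u).
    (Half-open intervals; the parametric end point u = knots[-1] would
    need special handling, which this sketch omits.)"""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    d1 = knots[i + p] - knots[i]
    if d1 > 0.0:
        left = (u - knots[i]) / d1 * basis(i, p - 1, u, knots)
    d2 = knots[i + p + 1] - knots[i + 1]
    if d2 > 0.0:
        right = (knots[i + p + 1] - u) / d2 * basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_point(u, p, knots, ctrl, weights):
    """Evaluate a NURBS curve point as a weighted rational combination."""
    num_x = num_y = den = 0.0
    for i, ((x, y), w) in enumerate(zip(ctrl, weights)):
        b = basis(i, p, u, knots) * w
        num_x += b * x
        num_y += b * y
        den += b
    return num_x / den, num_y / den

# Quarter circle as a degree-2 NURBS: evaluated points lie on the unit circle.
ctrl = [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
weights = [1.0, math.sqrt(2.0) / 2.0, 1.0]
knots = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
x, y = nurbs_point(0.5, 2, knots, ctrl, weights)
print(round(x * x + y * y, 9))  # -> 1.0
```

Because the deforming surface is captured by a few control points and weights rather than a full grid, only this compact definition needs updating per time step before the distribution mesh is re-applied.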

  11. Applications of Low Density Flow Techniques and Catalytic Recombination at the Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Scott, Carl D.

    2000-01-01

    The talk presents a brief background on definitions of catalysis and effects associated with chemical nonequilibrium and low-density flows of aerospace interest. Applications of catalytic recombination on surfaces in dissociated flow are given, including aero heating on reentry spacecraft thermal protection surfaces and the effect of reflected plume flow on pressure distributions associated with the space station. Examples include aero heating predictions for the X-38 test vehicle, the inlet of a proposed gas-sampling probe used in high enthalpy test facilities, and a parabolic body at angle of attack. The effect of accommodation coefficients on thruster-induced pressure distributions is also included. Examples of tools used include simple aero heating formulas based on boundary layer solutions, an engineering approximation that uses axisymmetric viscous shock layer flow to simulate full three-dimensional flow, full computational fluid dynamics, and direct simulation Monte Carlo calculations. Methods of determining catalytic recombination rates in arc jet flow are discussed. An area of catalysis not fully understood is the formation of single-wall carbon nanotubes (SWNT) with gas-phase or nano-size metal particles. The Johnson Space Center is making SWNTs using both a laser ablation technique and an electric arc vaporization technique.

  12. Progress Toward an Efficient and General CFD Tool for Propulsion Design/Analysis

    NASA Technical Reports Server (NTRS)

    Cox, C. F.; Cinnella, P.; Westmoreland, S.

    1996-01-01

    The simulation of propulsive flows inherently involves chemical activity. Recent years have seen substantial strides made in the development of numerical schemes for reacting flowfields, in particular those involving finite-rate chemistry. However, finite-rate calculations are computationally intensive and require knowledge of the actual kinetics, which are not always known with sufficient accuracy. Alternatively, flow simulations based on the assumption of local chemical equilibrium are capable of obtaining physically reasonable results at far less computational cost. The present study summarizes the development of efficient numerical techniques for the simulation of flows in local chemical equilibrium, whereby a 'black box' chemical equilibrium solver is coupled to the usual gasdynamic equations. The generality of the methods enables the modelling of any arbitrary mixture of thermally perfect gases, including air, combustion mixtures and plasmas. As a demonstration of the potential of the methodologies, several solutions involving reacting and perfect gas flows are presented, including a preliminary simulation of the SSME startup transient. Future enhancements to the proposed techniques are discussed, including more efficient finite-rate and hybrid (partial equilibrium) schemes. The algorithms that have been developed and are being optimized provide an efficient and general tool for the design and analysis of propulsion systems.

  13. Comparative evaluation of twenty pilot workload assessment measures using a psychomotor task in a moving base aircraft simulator

    NASA Technical Reports Server (NTRS)

    Connor, S. A.; Wierwille, W. W.

    1983-01-01

    A comparison of the sensitivity and intrusion of twenty pilot workload assessment techniques was conducted using a psychomotor loading task in a three degree of freedom moving base aircraft simulator. The twenty techniques included opinion measures, spare mental capacity measures, physiological measures, eye behavior measures, and primary task performance measures. The primary task was an instrument landing system (ILS) approach and landing. All measures were recorded between the outer marker and the middle marker on the approach. Three levels (low, medium, and high) of psychomotor load were obtained by the combined manipulation of windgust disturbance level and simulated aircraft pitch stability. Six instrument rated pilots participated in four sessions lasting approximately three hours each.

  14. A simulation study of emergency lunar escape to orbit using several simplified manual guidance and control techniques

    NASA Technical Reports Server (NTRS)

    Middleton, D. B.; Hurt, G. J., Jr.

    1971-01-01

    A fixed-base piloted simulator investigation has been made of the feasibility of using any of several manual guidance and control techniques for emergency lunar escape to orbit with very simplified, lightweight vehicle systems. The escape-to-orbit vehicles accommodate two men, but one man performs all of the guidance and control functions. Three basic attitude-control modes and four manually executed trajectory-guidance schemes were used successfully during approximately 125 simulated flights under a variety of conditions. These conditions included thrust misalignment, uneven propellant drain, and a vehicle moment-of-inertia range of 250 to 12,000 slug-square feet. Two types of results are presented: orbit characteristics and pilot ratings of vehicle handling qualities.

  15. Evaluating "Baby Think It Over" Infant Simulators: A Comparison Group Study

    ERIC Educational Resources Information Center

    Barnett, Jerrold E.

    2006-01-01

    To test the efficacy of Baby-Think-It-Over (BTIO) infant simulators, two versions of a sexuality education program were compared. While the program was designed to include BTIO as an important teaching technique, two schools (49 students) opted not to use them. These students completed all elements of the program except the BTIO activities. Their…

  16. Decision rules for unbiased inventory estimates

    NASA Technical Reports Server (NTRS)

    Argentiero, P. D.; Koch, D.

    1979-01-01

    An efficient and accurate procedure for estimating inventories from remote sensing scenes is presented. In place of the conventional and expensive full dimensional Bayes decision rule, a one-dimensional feature extraction and classification technique was employed. It is shown that this efficient decision rule can be used to develop unbiased inventory estimates and that for large sample sizes typical of satellite derived remote sensing scenes, resulting accuracies are comparable or superior to more expensive alternative procedures. Mathematical details of the procedure are provided in the body of the report and in the appendix. Results of a numerical simulation of the technique using statistics obtained from an observed LANDSAT scene are included. The simulation demonstrates the effectiveness of the technique in computing accurate inventory estimates.
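One standard way to obtain an unbiased inventory estimate from a (possibly biased) one-dimensional classifier is to invert the expected confusion rates: the raw fraction of pixels labeled as the class of interest satisfies E[f] = p·P(TP) + (1 − p)·P(FP), which can be solved for the true proportion p. This is the generic correction, not necessarily the exact decision rule of the report; the numbers below are hypothetical:

```python
def unbiased_proportion(classified_fraction, p_true_pos, p_false_pos):
    """Invert E[f] = p*P(TP) + (1-p)*P(FP) to get an unbiased estimate
    of the true class proportion p from the raw classified fraction f.
    P(TP) and P(FP) would come from training data."""
    return (classified_fraction - p_false_pos) / (p_true_pos - p_false_pos)

# A classifier that detects the target class 90% of the time and misfires
# on 10% of other pixels labels 0.34 of the scene as the target class;
# the unbiased proportion is (0.34 - 0.10) / (0.90 - 0.10) = 0.30.
print(round(unbiased_proportion(0.34, 0.90, 0.10), 6))  # -> 0.3
```

For the large sample sizes typical of satellite scenes, the sampling variance of f is small, so this simple inversion yields accurate acreage estimates despite the cheap one-dimensional classifier.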

  17. Wide range operation of advanced low NOx aircraft gas turbine combustors

    NASA Technical Reports Server (NTRS)

    Roberts, P. B.; Fiorito, R. J.; Butze, H. F.

    1978-01-01

    The paper summarizes the results of an experimental test rig program designed to define and demonstrate techniques that would allow the jet-induced circulation and vortex air blast combustors to operate stably with acceptable emissions at simulated engine idle without compromising the low NOx emissions under the high-altitude supersonic cruise condition. The discussion focuses on the test results of the key combustor modifications for both the simulated engine idle and cruise conditions. Several range-augmentation techniques are demonstrated that allow the lean-reaction premixed aircraft gas turbine combustor to operate with low NOx emissions at engine cruise and acceptable CO and UHC levels at engine idle. These techniques involve several combinations, including variable geometry and fuel switching designs.

  18. Health Science Education

    ERIC Educational Resources Information Center

    Hartsell, Horace C.

    1970-01-01

    Briefly describes several instructional techniques including computer-aided simulation of the medical encounter, media-based approaches for teaching doctor-patient relationships, and programed media for teaching decision-making to nursing students. (Author/AA)

  19. On testing VLSI chips for the big Viterbi decoder

    NASA Technical Reports Server (NTRS)

    Hsu, I. S.

    1989-01-01

    A general technique that can be used in testing very large scale integrated (VLSI) chips for the Big Viterbi Decoder (BVD) system is described. The test technique is divided into functional testing and fault-coverage testing. The purpose of functional testing is to verify that the design works functionally. Functional test vectors are converted from outputs of software simulations which simulate the BVD functionally. Fault-coverage testing is used to detect and, in some cases, to locate faulty components caused by bad fabrication. This type of testing is useful in screening out bad chips. Finally, design for testability, which is included in the BVD VLSI chip design, is described in considerable detail. Both the observability and controllability of a VLSI chip are greatly enhanced by including the design-for-testability feature.

  20. Recent research related to prediction of stall/spin characteristics of fighter aircraft

    NASA Technical Reports Server (NTRS)

    Nguyen, L. T.; Anglin, E. L.; Gilbert, W. P.

    1976-01-01

    The NASA Langley Research Center is currently engaged in a stall/spin research program to provide the fundamental information and design guidelines required to predict the stall/spin characteristics of fighter aircraft. The prediction methods under study include theoretical spin prediction techniques and piloted simulation studies. The paper discusses the overall status of theoretical techniques including: (1) input data requirements, (2) math model requirements, and (3) correlation between theoretical and experimental results. The Langley Differential Maneuvering Simulator (DMS) facility has been used to evaluate the spin susceptibility of several current fighters during typical air combat maneuvers and to develop and evaluate the effectiveness of automatic departure/spin prevention concepts. The evaluation procedure is described and some of the more significant results of the studies are presented.

  1. Finite Element Modeling, Simulation, Tools, and Capabilities at Superform

    NASA Astrophysics Data System (ADS)

    Raman, Hari; Barnes, A. J.

    2010-06-01

    Over the past thirty years Superform has been a pioneer in the SPF arena, having developed a keen understanding of the process and a range of unique forming techniques to meet varying market needs. Superform’s high-profile list of customers includes Boeing, Airbus, Aston Martin, Ford, and Rolls Royce. One of the more recent additions to Superform’s technical know-how is finite element modeling and simulation. Finite element modeling is a powerful numerical technique which, when applied to SPF, provides a host of benefits including accurate prediction of strain levels in a part, presence of wrinkles, and pressure cycles optimized for time and part thickness. This paper outlines a brief history of finite element modeling applied to SPF and then reviews some of the modeling tools and techniques that Superform has applied, and continues to apply, to successfully form complex-shaped parts superplastically. The advantages of employing modeling at the design stage are discussed and illustrated with real-world examples.

  2. Application of multi-objective nonlinear optimization technique for coordinated ramp-metering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haj Salem, Habib; Farhi, Nadir; Lebacque, Jean Patrick, E-mail: abib.haj-salem@ifsttar.fr, E-mail: nadir.frahi@ifsttar.fr, E-mail: jean-patrick.lebacque@ifsttar.fr

    2015-03-10

    This paper aims at developing a multi-objective nonlinear optimization algorithm applied to coordinated motorway ramp metering. The multi-objective function includes two components: traffic and safety. Off-line simulation studies were performed on the A4 motorway in France, including four on-ramps.

  3. A mechanical adapter for installing mission equipment on large space structures

    NASA Technical Reports Server (NTRS)

    Lefever, A. E.; Totah, R. S.

    1980-01-01

    A mechanical attachment adapter was designed, constructed, and tested. The adapter was included in a simulation program that investigated techniques for assembling erectable structures under simulated zero-g conditions by pressure-suited subjects in a simulated EVA mode. The adapter was utilized as an interface attachment between a simulated equipment module and one node point of a tetrahedral structural cell. The mating performance of the adapter, a self-energized mechanism, was easily and quickly demonstrated and required little effort on the part of the test subjects.

  4. Jurassic Diabase from Leesburg, VA: A Proposed Lunar Simulant

    NASA Technical Reports Server (NTRS)

    Taylor, Patrick T.; Lowman, P. D.; Nagihara, Seiichi; Milam, M. B.; Nakamura, Yosio

    2008-01-01

    A study of future lunar seismology and heat flow is being carried out as part of the NASA Lunar Sortie Science Program. This study will include new lunar drilling techniques, using a regolith simulant, for emplacement of instruments. Previous lunar simulants, such as JSC-1 and MLS-1, were not available when the study began, so a local simulant source was required. Diabase from a quarry at Leesburg, Virginia, was obtained from the Luck Stone Corporation. We report here initial results of a petrographic examination of this rock, GSC-1 henceforth.

  5. Jurassic Diabase from Leesburg, VA: A Proposed Lunar Simulant

    NASA Technical Reports Server (NTRS)

    Taylor, P. T.; Lowman, P. D.; Nagihara, Seiichi; Milam, M. B.; Nakamura, Yosio

    2008-01-01

A study of future lunar seismology and heat flow is being carried out as part of the NASA Lunar Sortie Science Program [1]. This study will include new lunar drilling techniques, using a regolith simulant, for emplacement of instruments. Previous lunar simulants, such as JSC-1 and MLS-1, were not available when the study began, so a local simulant source was required. Diabase from a quarry at Leesburg, Virginia, was obtained from the Luck Stone Corporation. We report here initial results of a petrographic examination of this rock, GSC-1 henceforth.

  6. Using Simulations to Investigate Decision Making in Airline Operations

    NASA Technical Reports Server (NTRS)

    Bruce, Peter J.; Gray, Judy H.

    2003-01-01

This paper examines a range of methods to collect data for the investigation of decision-making in airline Operations Control Centres (OCCs). A study was conducted of 52 controllers in five OCCs of both domestic and international airlines in the Asia-Pacific region. A range of methods was used including: surveys, interviews, observations, simulations, and think-aloud protocol. The paper compares and evaluates the suitability of these techniques for gathering data and provides recommendations on the application of simulations. Keywords: Data Collection, Decision-Making, Research Methods, Simulation, Think-Aloud Protocol.

  7. Development of an OSSE Framework for a Global Atmospheric Data Assimilation System

    NASA Technical Reports Server (NTRS)

    Gelaro, Ronald; Errico, Ronald M.; Prive, N.

    2012-01-01

Observing system simulation experiments (OSSEs) are powerful tools for estimating the usefulness of various configurations of envisioned observing systems and data assimilation techniques. Their utility stems from their being conducted in an entirely simulated context, utilizing simulated observations having simulated errors and drawn from a simulation of the earth's environment. Observations are generated by applying physically based algorithms to the simulated state, such as performed during data assimilation or using other appropriate algorithms. Adding realistic instrument plus representativeness errors, including their biases and correlations, can be critical for obtaining realistic assessments of the impact of a proposed observing system or analysis technique. If estimates of the expected accuracy of proposed observations are realistic, then the OSSE can also be used to learn how best to utilize the new information, accelerating its transition to operations once the real data are available. As with any inferences from simulations, however, it is first imperative that some baseline OSSEs are performed and well validated against corresponding results obtained with a real observing system. This talk provides an overview of, and highlights critical issues related to, the development of an OSSE framework for the tropospheric weather prediction component of the NASA GEOS-5 global atmospheric data assimilation system. The framework includes all existing observations having significant impact on short-term forecast skill. Its validity has been carefully assessed using a range of metrics that can be evaluated in both the OSSE and real contexts, including adjoint-based estimates of observation impact. A preliminary application to the Aeolus Doppler wind lidar mission, scheduled for launch by the European Space Agency in 2014, has also been investigated.

  8. Parallel discrete event simulation: A shared memory approach

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1987-01-01

With traditional event list techniques, evaluating a detailed discrete event simulation model can often require hours or even days of computation time. Parallel simulation mimics the interacting servers and queues of a real system by assigning each simulated entity to a processor. By eliminating the event list and maintaining only sufficient synchronization to ensure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared memory experiments is presented using the Chandy-Misra distributed simulation algorithm to simulate networks of queues. Parameters include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation of most queueing network models.
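
The conservative synchronization rule at the heart of the Chandy-Misra algorithm can be illustrated with a minimal sketch (hypothetical class and method names, not the paper's implementation): a logical process may consume the smallest-timestamp message at the head of its input channels only when every channel supplies a lower bound on future arrivals, with null messages providing that bound when a channel would otherwise block.

```python
from collections import deque

class ServerLP:
    """A logical process with two input channels, synchronized by the
    conservative Chandy-Misra rule: the head event with the smallest
    timestamp may be processed only when every channel has a known
    lower bound (a real or null message) at least that large."""
    def __init__(self, service_time):
        self.channels = [deque(), deque()]   # FIFO, timestamp-ordered per channel
        self.clock = 0.0
        self.service = service_time
        self.processed = []

    def receive(self, ch, t, job):
        self.channels[ch].append((t, job))   # job=None encodes a null message

    def step(self):
        if not all(self.channels):
            return False                     # a channel is empty: conservatively blocked
        heads = [c[0] for c in self.channels]
        ch = min(range(2), key=lambda i: heads[i][0])
        t, job = self.channels[ch].popleft()
        if job is None:
            return True                      # null message: advances knowledge only
        self.clock = max(self.clock, t) + self.service
        self.processed.append((job, self.clock))
        return True
```

With one channel empty the process deadlocks on real messages alone; a null message carrying only a timestamp bound unblocks it, which is exactly the deadlock-avoidance mechanism of the conservative scheme.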

  9. Verification of Eulerian-Eulerian and Eulerian-Lagrangian simulations for turbulent fluid-particle flows

    DOE PAGES

    Patel, Ravi G.; Desjardins, Olivier; Kong, Bo; ...

    2017-09-01

Here, we present a verification study of three simulation techniques for fluid-particle flows, including an Euler-Lagrange approach (EL) inspired by Jackson's seminal work on fluidized particles, a quadrature-based moment method based on the anisotropic Gaussian closure (AG), and the traditional two-fluid model (TFM). We perform simulations of two problems: particles in frozen homogeneous isotropic turbulence (HIT) and cluster-induced turbulence (CIT). For verification, we evaluate various techniques for extracting statistics from EL and study the convergence properties of the three methods under grid refinement. The convergence is found to depend on the simulation method and on the problem, with CIT simulations posing fewer difficulties than HIT. Specifically, EL converges under refinement for both HIT and CIT, but statistics exhibit dependence on the postprocessing parameters. For CIT, AG produces similar results to EL. For HIT, converging both TFM and AG poses challenges. Overall, extracting converged, parameter-independent Eulerian statistics remains a challenge for all methods.

  10. Verification of Eulerian-Eulerian and Eulerian-Lagrangian simulations for turbulent fluid-particle flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patel, Ravi G.; Desjardins, Olivier; Kong, Bo

Here, we present a verification study of three simulation techniques for fluid-particle flows, including an Euler-Lagrange approach (EL) inspired by Jackson's seminal work on fluidized particles, a quadrature-based moment method based on the anisotropic Gaussian closure (AG), and the traditional two-fluid model (TFM). We perform simulations of two problems: particles in frozen homogeneous isotropic turbulence (HIT) and cluster-induced turbulence (CIT). For verification, we evaluate various techniques for extracting statistics from EL and study the convergence properties of the three methods under grid refinement. The convergence is found to depend on the simulation method and on the problem, with CIT simulations posing fewer difficulties than HIT. Specifically, EL converges under refinement for both HIT and CIT, but statistics exhibit dependence on the postprocessing parameters. For CIT, AG produces similar results to EL. For HIT, converging both TFM and AG poses challenges. Overall, extracting converged, parameter-independent Eulerian statistics remains a challenge for all methods.

  11. A data-driven dynamics simulation framework for railway vehicles

    NASA Astrophysics Data System (ADS)

    Nie, Yinyu; Tang, Zhao; Liu, Fengjia; Chang, Jian; Zhang, Jianjun

    2018-03-01

The finite element (FE) method is essential for simulating vehicle dynamics in fine detail, especially for train crash simulations. However, factors such as the complexity of meshes and the distortion involved in large deformations undermine its calculation efficiency. An alternative, multi-body (MB) dynamics simulation, provides satisfying time efficiency but limited accuracy when a highly nonlinear dynamic process is involved. To combine the advantages of both methods, this paper proposes a data-driven simulation framework for dynamics simulation of railway vehicles. This framework uses machine learning techniques to extract nonlinear features from training data generated by FE simulations, so that specific mesh structures can be formulated by one or more surrogate elements that replace the original mechanical elements, and the dynamics simulation can be implemented by co-simulation with the surrogate elements embedded in an MB model. This framework consists of a series of techniques including data collection, feature extraction, training data sampling, surrogate element building, and model evaluation and selection. To verify the feasibility of this framework, we present two case studies, a vertical dynamics simulation and a longitudinal dynamics simulation, based on co-simulation with MATLAB/Simulink and Simpack, and a further comparison with a popular data-driven model (the Kriging model) is provided. The simulation results show that using the Legendre polynomial regression model to build surrogate elements can greatly reduce simulation time without sacrificing accuracy.
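
As a rough sketch of the surrogate idea (with made-up training data standing in for FE output, not the paper's vehicle models), a Legendre-basis polynomial can be fitted to sampled force-deflection behaviour and then evaluated cheaply inside the MB co-simulation loop:

```python
import numpy as np
from numpy.polynomial import legendre as L

# Hypothetical training data standing in for FE simulation samples:
# deflection (m) -> restoring force (N) of a nonlinear suspension element.
x_train = np.linspace(-0.05, 0.05, 41)
y_train = 8.0e5 * x_train + 3.0e9 * x_train**3    # stiffening spring, noise-free

# Fit a Legendre-basis polynomial surrogate (degree is a tuning choice;
# the basis is orthogonal on the fitted domain, which conditions the fit well).
surrogate = L.Legendre.fit(x_train, y_train, deg=5)

# The surrogate replaces the expensive element inside the MB time-stepping loop:
force = surrogate(0.02)
```

In the paper's framework this evaluation stands where the FE element would otherwise be called, which is the source of the speedup.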

  12. Upgrades for the CMS simulation

    DOE PAGES

    Lange, D. J.; Hildreth, M.; Ivantchenko, V. N.; ...

    2015-05-22

Over the past several years, the CMS experiment has made significant changes to its detector simulation application. The geometry has been generalized to include modifications being made to the CMS detector for 2015 operations, as well as model improvements to the simulation geometry of the current CMS detector and the implementation of a number of approved and possible future detector configurations. These include both completely new tracker and calorimetry systems. We have completed the transition to Geant4 version 10, and we have made significant progress in reducing the CPU resources required to run our Geant4 simulation. These gains have been achieved through both technical improvements and numerical techniques. Substantial speed improvements have been achieved without changing the physics validation benchmarks that the experiment uses to validate our simulation application for use in production. We discuss the methods that we implemented and the corresponding performance improvements deployed for our 2015 simulation application.

  13. The transesophageal echocardiography simulator based on computed tomography images.

    PubMed

    Piórkowski, Adam; Kempny, Aleksander

    2013-02-01

Simulators are a new tool in education in many fields, including medicine, where they greatly improve familiarity with medical procedures, reduce costs, and, importantly, cause no harm to patients. This is so in the case of transesophageal echocardiography (TEE), in which the use of a simulator facilitates spatial orientation and helps in case studies. The aim of the project described in this paper is to simulate an examination by TEE. This research makes use of available computed tomography data to simulate the corresponding echocardiographic view. This paper describes the essential characteristics that distinguish these two modalities and the key principles of the wave phenomena that should be considered in the simulation process, taking into account the conditions specific to echocardiography. The construction of the CT2TEE (Web-based TEE simulator) is also presented. The considerations include ray-tracing and ray-casting techniques in the context of ultrasound beam and artifact simulation. Important aspects of the interaction with the user are also discussed.
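
A minimal sketch of the ray-casting idea (a deliberately simplistic echo model with hypothetical parameters, not the CT2TEE algorithm): march a ray through a CT-derived attenuation grid, record an echo proportional to the local attenuation, and decay the carried intensity in Beer-Lambert fashion.

```python
import numpy as np

def cast_ray(attenuation, origin, direction, step=0.5):
    """March a ray through a 2D attenuation grid (a stand-in for CT-derived
    tissue properties), recording a simplistic echo (scatter proportional to
    local attenuation) and attenuating the carried beam intensity."""
    pos = np.array(origin, dtype=float)
    d = np.array(direction, dtype=float)
    d /= np.linalg.norm(d)
    h, w = attenuation.shape
    intensity, profile = 1.0, []
    while 0 <= pos[0] < h and 0 <= pos[1] < w:
        mu = attenuation[int(pos[0]), int(pos[1])]
        profile.append(intensity * mu)        # echo returned from this sample
        intensity *= np.exp(-mu * step)       # Beer-Lambert decay along the ray
        pos += d * step
    return profile
```

A real ultrasound simulator must additionally model reflection at impedance boundaries, speckle, and shadowing artifacts; this sketch only shows the marching-and-attenuation skeleton.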

  14. Extending radiative transfer models by use of Bayes rule. [in atmospheric science

    NASA Technical Reports Server (NTRS)

    Whitney, C.

    1977-01-01

    This paper presents a procedure that extends some existing radiative transfer modeling techniques to problems in atmospheric science where curvature and layering of the medium and dynamic range and angular resolution of the signal are important. Example problems include twilight and limb scan simulations. Techniques that are extended include successive orders of scattering, matrix operator, doubling, Gauss-Seidel iteration, discrete ordinates and spherical harmonics. The procedure for extending them is based on Bayes' rule from probability theory.
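
In its generic form (a reminder of the rule itself, not the paper's specific derivation), Bayes' rule relates the conditional probabilities that such an extension procedure manipulates:

```latex
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}
```

Applied to radiative transfer, conditioning on intermediate scattering events in this way is what lets the listed techniques be chained across layers and geometries.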

  15. Study and simulation of low rate video coding schemes

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Chen, Yun-Chung; Kipp, G.

    1992-01-01

    The semiannual report is included. Topics covered include communication, information science, data compression, remote sensing, color mapped images, robust coding scheme for packet video, recursively indexed differential pulse code modulation, image compression technique for use on token ring networks, and joint source/channel coder design.

  16. Simulation training and resident performance of singleton vaginal breech delivery.

    PubMed

    Deering, Shad; Brown, Jill; Hodor, Jonathon; Satin, Andrew J

    2006-01-01

To determine whether simulation training improves resident competency in the management of a simulated vaginal breech delivery. Without advance notice or training, residents from 2 obstetrics and gynecology residency programs participated in a standardized simulation scenario of management of an imminent term vaginal breech delivery. The scenario used an obstetric birth simulator and human actors, with the encounters digitally recorded. Residents then received a training session with the simulator on the proper techniques for vaginal breech delivery. Two weeks later they were retested using a similar simulation scenario. A physician, blinded to training status, graded the residents' performance using a standardized evaluation sheet. Statistical analysis included the Wilcoxon signed rank test, McNemar chi-square test, regression analysis, and paired t test as appropriate, with a P value of less than .05 considered significant. Twenty residents from 2 institutions completed all parts of the study protocol. Trained residents had significantly higher scores in 8 of 12 critical delivery components (P < .05). Overall performance of the delivery and safety in performing the delivery also improved significantly (P = .001 for both). Simulation training improved resident performance in the management of a simulated vaginal breech delivery. Performance of a term breech vaginal delivery is well suited for simulation training, because it is uncommon yet inevitable, and improper technique may result in significant injury. II-2.

  17. RFA Guardian: Comprehensive Simulation of Radiofrequency Ablation Treatment of Liver Tumors.

    PubMed

    Voglreiter, Philip; Mariappan, Panchatcharam; Pollari, Mika; Flanagan, Ronan; Blanco Sequeiros, Roberto; Portugaller, Rupert Horst; Fütterer, Jurgen; Schmalstieg, Dieter; Kolesnik, Marina; Moche, Michael

    2018-01-15

    The RFA Guardian is a comprehensive application for high-performance patient-specific simulation of radiofrequency ablation of liver tumors. We address a wide range of usage scenarios. These include pre-interventional planning, sampling of the parameter space for uncertainty estimation, treatment evaluation and, in the worst case, failure analysis. The RFA Guardian is the first of its kind that exhibits sufficient performance for simulating treatment outcomes during the intervention. We achieve this by combining a large number of high-performance image processing, biomechanical simulation and visualization techniques into a generalized technical workflow. Further, we wrap the feature set into a single, integrated application, which exploits all available resources of standard consumer hardware, including massively parallel computing on graphics processing units. This allows us to predict or reproduce treatment outcomes on a single personal computer with high computational performance and high accuracy. The resulting low demand for infrastructure enables easy and cost-efficient integration into the clinical routine. We present a number of evaluation cases from the clinical practice where users performed the whole technical workflow from patient-specific modeling to final validation and highlight the opportunities arising from our fast, accurate prediction techniques.

  18. Simulation of wind turbine wakes using the actuator line technique

    PubMed Central

    Sørensen, Jens N.; Mikkelsen, Robert F.; Henningson, Dan S.; Ivanell, Stefan; Sarmast, Sasan; Andersen, Søren J.

    2015-01-01

    The actuator line technique was introduced as a numerical tool to be employed in combination with large eddy simulations to enable the study of wakes and wake interaction in wind farms. The technique is today largely used for studying basic features of wakes as well as for making performance predictions of wind farms. In this paper, we give a short introduction to the wake problem and the actuator line methodology and present a study in which the technique is employed to determine the near-wake properties of wind turbines. The presented results include a comparison of experimental results of the wake characteristics of the flow around a three-bladed model wind turbine, the development of a simple analytical formula for determining the near-wake length behind a wind turbine and a detailed investigation of wake structures based on proper orthogonal decomposition analysis of numerically generated snapshots of the wake. PMID:25583862
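
The core mechanics of the actuator line technique can be sketched as follows (a schematic 2D version with hypothetical names, not the authors' LES code): blade forces computed at points along a rotating line are spread onto the flow grid with a Gaussian regularization kernel, so they enter the momentum equations as smooth body forces.

```python
import numpy as np

def project_line_force(grid_x, grid_y, points, forces, eps):
    """Spread discrete actuator-line point forces onto a 2D flow grid using
    the Gaussian regularization kernel typical of actuator line methods,
    eta(r) = exp(-(r/eps)^2) / (pi * eps^2) in its 2D form, so the kernel
    integrates to one and the total projected force is conserved."""
    X, Y = np.meshgrid(grid_x, grid_y, indexing="ij")
    body_force = np.zeros_like(X)
    for (px, py), f in zip(points, forces):
        r2 = (X - px) ** 2 + (Y - py) ** 2
        body_force += f * np.exp(-r2 / eps**2) / (np.pi * eps**2)
    return body_force
```

The smearing width eps controls the trade-off the wake literature discusses at length: too small and the force field excites grid-scale oscillations, too large and the blade loading (and hence the near wake) is smeared out.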

  19. A technique to remove the tensile instability in weakly compressible SPH

    NASA Astrophysics Data System (ADS)

    Xu, Xiaoyang; Yu, Peng

    2018-01-01

When smoothed particle hydrodynamics (SPH) is directly applied to the numerical simulation of transient viscoelastic free surface flows, a numerical problem called tensile instability arises. In this paper, we develop an optimized particle shifting technique to remove the tensile instability in SPH. The basic equations governing free surface flow of an Oldroyd-B fluid are considered and approximated by an improved SPH scheme, which includes a correction of the kernel gradient and the introduction of a Rusanov flux into the continuity equation. To verify the effectiveness of the optimized particle shifting technique in removing the tensile instability, three benchmark problems are simulated: the impacting drop, the injection molding of a C-shaped cavity, and the extrudate swell. The numerical results obtained are compared with those simulated by other numerical methods. A comparison among different numerical techniques (e.g., the artificial stress method) to remove the tensile instability is further performed. All numerical results agree well with the available data.
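
The generic particle-shifting idea (a 1D sketch under simplifying assumptions, not the paper's optimized scheme) is that each particle moves down the gradient of the kernel-estimated particle concentration, which relaxes clustered particles toward a uniform distribution and so removes the clumping associated with tensile instability:

```python
import numpy as np

def shift_particles(x, h, coeff=0.5):
    """One pass of a simple 1D particle-shifting correction: estimate the
    particle concentration C_i = sum_j W(x_i - x_j) with a Gaussian kernel
    and move each particle down its gradient (scaled by h^2 for units)."""
    dx = x[:, None] - x[None, :]
    W = np.exp(-(dx / h) ** 2)                 # Gaussian kernel (unnormalized)
    gradC = (-2 * dx / h**2 * W).sum(axis=1)   # d/dx_i of sum_j W(x_i - x_j)
    return x - coeff * h**2 * gradC
```

Production schemes (including the paper's) add free-surface handling and corrected kernel gradients; the sketch only shows the anti-clustering mechanism itself.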

  20. Atomic oxygen effects on thin film space coatings studied by spectroscopic ellipsometry, atomic force microscopy, and laser light scattering

    NASA Technical Reports Server (NTRS)

    Synowicki, R. A.; Hale, Jeffrey S.; Woollam, John A.

    1992-01-01

The University of Nebraska is currently evaluating Low Earth Orbit (LEO) simulation techniques as well as a variety of thin film protective coatings to withstand atomic oxygen (AO) degradation. Both oxygen plasma ashers and an electron cyclotron resonance (ECR) source are being used for LEO simulation. Thin film coatings are characterized by optical techniques including variable angle spectroscopic ellipsometry, optical spectrophotometry, and laser light scatterometry. Atomic Force Microscopy (AFM) is also used to characterize surface morphology. Results on diamondlike carbon (DLC) films show that DLC degrades with simulated AO exposure at a rate comparable to Kapton polyimide. Since DLC is not as susceptible to environmental factors such as moisture absorption, it could potentially provide more accurate measurements of AO fluence on short space flights.

  1. Steady-State Electrodiffusion from the Nernst-Planck Equation Coupled to Local Equilibrium Monte Carlo Simulations.

    PubMed

    Boda, Dezső; Gillespie, Dirk

    2012-03-13

    We propose a procedure to compute the steady-state transport of charged particles based on the Nernst-Planck (NP) equation of electrodiffusion. To close the NP equation and to establish a relation between the concentration and electrochemical potential profiles, we introduce the Local Equilibrium Monte Carlo (LEMC) method. In this method, Grand Canonical Monte Carlo simulations are performed using the electrochemical potential specified for the distinct volume elements. An iteration procedure that self-consistently solves the NP and flux continuity equations with LEMC is shown to converge quickly. This NP+LEMC technique can be used in systems with diffusion of charged or uncharged particles in complex three-dimensional geometries, including systems with low concentrations and small applied voltages that are difficult for other particle simulation techniques.
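
The Nernst-Planck side of the scheme can be illustrated in one dimension (a plain finite-difference sketch with Dirichlet boundary concentrations and constant diffusivity; the paper's closure via LEMC and the self-consistent flux iteration are not reproduced here):

```python
import numpy as np

def steady_state_np(c_left, c_right, phi, z=1.0):
    """Finite-difference sketch of the 1D steady-state Nernst-Planck equation
        d/dx [ dc/dx + z c dphi/dx ] = 0
    (phi in units of kT/e, constant diffusivity) on a uniform grid in [0, 1]
    with fixed boundary concentrations."""
    n = len(phi)
    h = 1.0 / (n - 1)
    dphi = np.gradient(phi, h)
    A = np.zeros((n, n))
    b = np.zeros(n)
    A[0, 0] = A[-1, -1] = 1.0           # Dirichlet boundary conditions
    b[0], b[-1] = c_left, c_right
    for i in range(1, n - 1):
        # central differences for c'' and for (c * phi')'
        A[i, i - 1] = 1.0 / h**2 - z * dphi[i - 1] / (2 * h)
        A[i, i] = -2.0 / h**2
        A[i, i + 1] = 1.0 / h**2 + z * dphi[i + 1] / (2 * h)
    return np.linalg.solve(A, b)
```

With a vanishing electric field the equation reduces to Fick's law and the profile is linear; in the NP+LEMC method, the electrochemical potential supplied by the Monte Carlo step would replace this prescribed phi.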

  2. An introduction to three-dimensional climate modeling

    NASA Technical Reports Server (NTRS)

    Washington, W. M.; Parkinson, C. L.

    1986-01-01

The development and use of three-dimensional computer models of the earth's climate are discussed. The processes and interactions of the atmosphere, oceans, and sea ice are examined. The basic theory of climate simulation, which includes the fundamental equations, models, and numerical techniques for simulating the atmosphere, oceans, and sea ice, is described. Simulated wind, temperature, precipitation, ocean current, and sea ice distribution data are presented and compared to observational data. The responses of the climate to various environmental changes, such as variations in solar output or increases in atmospheric carbon dioxide, are modeled. Future developments in climate modeling are considered. Information is also provided on the derivation of the energy equation, the finite difference barotropic forecast model, the spectral transform technique, and the finite difference shallow water wave equation model.
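
Of the numerical techniques listed, the spectral transform technique is the easiest to illustrate in one dimension (a generic FFT sketch, not the book's spherical-harmonic atmospheric formulation): derivatives of a periodic field are obtained by multiplying its Fourier coefficients by ik and transforming back.

```python
import numpy as np

def spectral_derivative(f, L=2 * np.pi):
    """Differentiate a periodic field of length L by the spectral transform
    technique: FFT to wavenumber space, multiply by ik, inverse FFT."""
    n = len(f)
    k = np.fft.fftfreq(n, d=L / n) * 2 * np.pi   # angular wavenumbers
    return np.real(np.fft.ifft(1j * k * np.fft.fft(f)))
```

For smooth periodic fields this is accurate to machine precision, which is why spectral transforms dominate global atmospheric dynamical cores, while finite differences remain standard for bounded ocean basins.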

  3. Modeling the Environmental Impact of Air Traffic Operations

    NASA Technical Reports Server (NTRS)

    Chen, Neil

    2011-01-01

There is increased interest in understanding and mitigating the impacts of air traffic on the climate, since greenhouse gases, nitrogen oxides, and contrails generated by air traffic can have adverse climate effects. The models described in this presentation are useful for quantifying these impacts and for studying alternative environmentally aware operational concepts. These models have been developed by leveraging and building upon existing simulation and optimization techniques developed for the design of efficient traffic flow management strategies. Specific enhancements to the existing simulation and optimization techniques include new models that simulate aircraft fuel flow, emissions, and contrails. To ensure that these new models are beneficial to the larger climate research community, their outputs are compatible with existing global climate modeling tools like the FAA's Aviation Environmental Design Tool.

  4. Model-Based Economic Evaluation of Treatments for Depression: A Systematic Literature Review.

    PubMed

    Kolovos, Spyros; Bosmans, Judith E; Riper, Heleen; Chevreul, Karine; Coupé, Veerle M H; van Tulder, Maurits W

    2017-09-01

    An increasing number of model-based studies that evaluate the cost effectiveness of treatments for depression are being published. These studies have different characteristics and use different simulation methods. We aimed to systematically review model-based studies evaluating the cost effectiveness of treatments for depression and examine which modelling technique is most appropriate for simulating the natural course of depression. The literature search was conducted in the databases PubMed, EMBASE and PsycInfo between 1 January 2002 and 1 October 2016. Studies were eligible if they used a health economic model with quality-adjusted life-years or disability-adjusted life-years as an outcome measure. Data related to various methodological characteristics were extracted from the included studies. The available modelling techniques were evaluated based on 11 predefined criteria. This methodological review included 41 model-based studies, of which 21 used decision trees (DTs), 15 used cohort-based state-transition Markov models (CMMs), two used individual-based state-transition models (ISMs), and three used discrete-event simulation (DES) models. Just over half of the studies (54%) evaluated antidepressants compared with a control condition. The data sources, time horizons, cycle lengths, perspectives adopted and number of health states/events all varied widely between the included studies. DTs scored positively in four of the 11 criteria, CMMs in five, ISMs in six, and DES models in seven. There were substantial methodological differences between the studies. Since the individual history of each patient is important for the prognosis of depression, DES and ISM simulation methods may be more appropriate than the others for a pragmatic representation of the course of depression. However, direct comparisons between the available modelling techniques are necessary to yield firm conclusions.
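
The mechanics of a cohort-based state-transition Markov model (CMM), the most common technique in the review after decision trees, can be sketched with hypothetical depression states, transition probabilities, and utilities (illustrative numbers only, not values from any included study):

```python
import numpy as np

# Hypothetical three-state depression model with yearly cycles.
states = ["depressed", "remission", "dead"]
P = np.array([[0.60, 0.35, 0.05],     # rows: from-state, columns: to-state
              [0.20, 0.78, 0.02],
              [0.00, 0.00, 1.00]])    # death is absorbing
utility = np.array([0.6, 0.85, 0.0])  # QALY weight accrued per cycle in each state

cohort = np.array([1.0, 0.0, 0.0])    # the whole cohort starts depressed
qalys = 0.0
for _ in range(10):                   # 10 yearly cycles
    cohort = cohort @ P               # redistribute the cohort
    qalys += cohort @ utility         # accumulate quality-adjusted life-years
```

This cohort formulation carries no individual history, which is exactly the limitation the review raises when arguing that DES and individual-based state-transition models may suit depression better.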

  5. Cryotherapy simulator for localized prostate cancer.

    PubMed

    Hahn, James K; Manyak, Michael J; Jin, Ge; Kim, Dongho; Rewcastle, John; Kim, Sunil; Walsh, Raymond J

    2002-01-01

    Cryotherapy is a treatment modality that uses a technique to selectively freeze tissue and thereby cause controlled tissue destruction. The procedure involves placement of multiple small diameter probes through the perineum into the prostate tissue at selected spatial intervals. Transrectal ultrasound is used to properly position the cylindrical probes before activation of the liquid Argon cooling element, which lowers the tissue temperature below -40 degrees Centigrade. Tissue effect is monitored by transrectal ultrasound changes as well as thermocouples placed in the tissue. The computer-based cryotherapy simulation system mimics the major surgical steps involved in the procedure. The simulated real-time ultrasound display is generated from 3-D ultrasound datasets where the interaction of the ultrasound with the instruments as well as the frozen tissue is simulated by image processing. The thermal and mechanical simulations of the tissue are done using a modified finite-difference/finite-element method optimized for real-time performance. The simulator developed is a part of a comprehensive training program, including a computer-based learning system and hands-on training program with a proctor, designed to familiarize the physician with the technique and equipment involved.
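
The thermal side of such a simulator can be sketched with a one-dimensional explicit finite-difference (FTCS) scheme and made-up parameters (the actual system uses a modified finite-difference/finite-element method in three dimensions with real tissue properties):

```python
import numpy as np

def freeze_profile(n=21, alpha=1.0, dx=1.0, dt=0.25, steps=5000,
                   T_probe=-40.0, T_body=37.0):
    """Explicit FTCS update of the 1D heat equation T_t = alpha * T_xx,
    with the left boundary clamped to the cryoprobe temperature and the
    right boundary held at body temperature."""
    r = alpha * dt / dx**2
    assert r <= 0.5, "FTCS stability limit violated"
    T = np.full(n, T_body)
    T[0] = T_probe
    for _ in range(steps):
        T[1:-1] += r * (T[2:] - 2 * T[1:-1] + T[:-2])
        T[0], T[-1] = T_probe, T_body      # re-impose boundary conditions
    return T
```

Run long enough, the profile relaxes to the linear steady state between the probe and body temperatures; real-time use, as the abstract notes, requires schemes optimized well beyond this sketch (and freezing adds latent-heat terms omitted here).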

  6. A Generic Inner-Loop Control Law Structure for Six-Degree-of-Freedom Conceptual Aircraft Design

    NASA Technical Reports Server (NTRS)

    Cox, Timothy H.; Cotting, M. Christopher

    2005-01-01

    A generic control system framework for both real-time and batch six-degree-of-freedom simulations is presented. This framework uses a simplified dynamic inversion technique to allow for stabilization and control of any type of aircraft at the pilot interface level. The simulation, designed primarily for the real-time simulation environment, also can be run in a batch mode through a simple guidance interface. Direct vehicle-state acceleration feedback is required with the simplified dynamic inversion technique. The estimation of surface effectiveness within real-time simulation timing constraints also is required. The generic framework provides easily modifiable control variables, allowing flexibility in the variables that the pilot commands. A direct control allocation scheme is used to command aircraft effectors. Primary uses for this system include conceptual and preliminary design of aircraft, when vehicle models are rapidly changing and knowledge of vehicle six-degree-of-freedom performance is required. A simulated airbreathing hypersonic vehicle and simulated high-performance fighter aircraft are used to demonstrate the flexibility and utility of the control system.
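
One way acceleration feedback can drive a simplified inversion law is the incremental form sketched below (hypothetical matrices and gains; the paper's actual formulation, control allocation scheme, and effectiveness estimation are not reproduced here):

```python
import numpy as np

def inversion_step(u_prev, a_cmd, a_meas, B_est):
    """Incremental dynamic-inversion update: correct the effector command
    through the estimated control-effectiveness matrix, using directly
    measured vehicle acceleration as feedback."""
    return u_prev + np.linalg.pinv(B_est) @ (a_cmd - a_meas)

# Tiny closed loop in which the true plant differs from the onboard estimate.
B_true = np.array([[2.0, 0.3], [0.1, 1.5]])   # actual surface effectiveness
B_est = np.array([[1.8, 0.0], [0.0, 1.6]])    # imperfect onboard estimate
bias = np.array([0.5, -0.2])                  # unmodeled dynamics
a_cmd = np.array([1.0, 0.0])                  # commanded acceleration
u = np.zeros(2)
for _ in range(50):
    a_meas = B_true @ u + bias                # direct acceleration feedback
    u = inversion_step(u, a_cmd, a_meas, B_est)
```

Because the measured acceleration already contains the unmodeled dynamics, the loop drives the acceleration error to zero even with a rough effectiveness estimate, which is why this style of inversion suits rapidly changing conceptual-design vehicle models.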

  7. A Generic Inner-Loop Control Law Structure for Six-Degree-of-Freedom Conceptual Aircraft Design

    NASA Technical Reports Server (NTRS)

    Cox, Timothy H.; Cotting, Christopher

    2005-01-01

    A generic control system framework for both real-time and batch six-degree-of-freedom (6-DOF) simulations is presented. This framework uses a simplified dynamic inversion technique to allow for stabilization and control of any type of aircraft at the pilot interface level. The simulation, designed primarily for the real-time simulation environment, also can be run in a batch mode through a simple guidance interface. Direct vehicle-state acceleration feedback is required with the simplified dynamic inversion technique. The estimation of surface effectiveness within real-time simulation timing constraints also is required. The generic framework provides easily modifiable control variables, allowing flexibility in the variables that the pilot commands. A direct control allocation scheme is used to command aircraft effectors. Primary uses for this system include conceptual and preliminary design of aircraft, when vehicle models are rapidly changing and knowledge of vehicle 6-DOF performance is required. A simulated airbreathing hypersonic vehicle and simulated high-performance fighter aircraft are used to demonstrate the flexibility and utility of the control system.

  8. Support Center for Regulatory Atmospheric Modeling (SCRAM)

    EPA Pesticide Factsheets

    This technical site provides access to air quality models (including computer code, input data, and model processors) and other mathematical simulation techniques used in assessing air emissions control strategies and source impacts.

  9. Simulating Optical Fibers.

    ERIC Educational Resources Information Center

    Edgar, Dale

    1988-01-01

    Described is a demonstration of Snell's law using a laser beam and an optical fiber. Provided are the set-up method of the demonstration apparatus and some practical suggestions including "index matching" technique using vaseline. (YP)

  10. Proposal and verification numerical simulation for a microwave forward scattering technique at upper hybrid resonance for the measurement of electron gyroscale density fluctuations in the electron cyclotron frequency range in magnetized plasmas

    NASA Astrophysics Data System (ADS)

    Kawamori, E.; Igami, H.

    2017-11-01

A diagnostic technique for detecting the wavenumbers of electron density fluctuations at electron gyro-scales in an electron cyclotron frequency range is proposed, and the validity of the idea is checked by means of a particle-in-cell (PIC) numerical simulation. The technique is a modified version of the scattering technique invented by Novik et al. [Plasma Phys. Controlled Fusion 36, 357-381 (1994)] and Gusakov et al. [Plasma Phys. Controlled Fusion 41, 899-912 (1999)]. The novel method adopts forward scattering of injected extraordinary probe waves at the upper hybrid resonance layer instead of the backward scattering adopted by the original method, enabling the measurement of the wavenumbers of fine-scale density fluctuations in the electron-cyclotron frequency band by means of phase measurement of the scattered waves. The verification numerical simulation with the PIC method shows that the technique has the potential to be applied to the detection of electron gyro-scale fluctuations in laboratory plasmas if the upper-hybrid resonance layer is accessible to the probe wave. The technique is a suitable means to detect electron Bernstein waves excited via linear mode conversion from electromagnetic waves in torus plasma experiments. The numerical simulations also reveal problems that remain to be resolved, including the influence of nonlinear processes such as the parametric decay instability of the probe wave in the scattering process.

  11. 3D integrated HYDRA simulations of hohlraums including fill tubes

    NASA Astrophysics Data System (ADS)

    Marinak, M. M.; Milovich, J.; Hammel, B. A.; Macphee, A. G.; Smalyuk, V. A.; Kerbel, G. D.; Sepke, S.; Patel, M. V.

    2017-10-01

    Measurements of fill tube perturbations from hydro growth radiography (HGR) experiments on the National Ignition Facility show spoke perturbations in the ablator radiating from the base of the tube. These correspond to the shadow of the 10 μm diameter glass fill tube cast by hot spots at early time. We present 3D integrated HYDRA simulations of these experiments which include the fill tube. Meshing techniques are described which were employed to resolve the fill tube structure and associated perturbations in the simulations. We examine the extent to which the specific illumination geometry necessary to accommodate a backlighter in the HGR experiment contributes to the spoke pattern. Simulations presented include high resolution calculations run on the Trinity machine operated by the Alliance for Computing at Extreme Scale (ACES) partnership. This work was performed under the auspices of the Lawrence Livermore National Security, LLC, (LLNS) under Contract No. DE-AC52-07NA27344.

  12. SSAGES: Software Suite for Advanced General Ensemble Simulations.

    PubMed

    Sidky, Hythem; Colón, Yamil J; Helfferich, Julian; Sikora, Benjamin J; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S; Reid, Daniel R; Sevgen, Emre; Thapar, Vikram; Webb, Michael A; Whitmer, Jonathan K; de Pablo, Juan J

    2018-01-28

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques-including adaptive biasing force, string methods, and forward flux sampling-that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.

  13. Techniques for hot structures testing

    NASA Technical Reports Server (NTRS)

    Deangelis, V. Michael; Fields, Roger A.

    1990-01-01

    Hot structures testing has been going on since the early 1960s, beginning with the Mach 6 X-15 airplane. Early hot structures test programs at NASA Ames-Dryden focused on operational testing required to support the X-15 flight test program, and early hot structures research projects focused on developing laboratory test techniques to simulate flight thermal profiles. More recent efforts involved numerous large and small hot structures test programs that served to develop test methods and measurement techniques and to provide data that promoted the correlation of test data with results from analytical codes. In November 1988 a workshop was sponsored that focused on the correlation of hot structures test data with analysis. Limited material is drawn from the workshop, and more formal documentation is provided of topics that focus on hot structures test techniques used at NASA Ames-Dryden. Topics covered include data acquisition and test control, the quartz lamp heater systems, current strain and temperature sensors, and hot structures test techniques used to simulate the flight thermal environment in the laboratory.

  14. C-arm technique using distance driven method for nephrolithiasis and kidney stones detection

    NASA Astrophysics Data System (ADS)

    Malalla, Nuhad; Sun, Pengfei; Chen, Ying; Lipkin, Michael E.; Preminger, Glenn M.; Qin, Jun

    2016-04-01

    Distance-driven projection is a state-of-the-art method used for reconstruction in x-ray imaging. C-arm tomography is an x-ray imaging technique that provides three-dimensional information about the object by moving the C-shaped gantry around the patient. With a limited view angle, the C-arm system was investigated to generate volumetric data of the object with low radiation dosage and examination time. This paper is a new simulation study of two reconstruction methods based on the distance-driven approach: the simultaneous algebraic reconstruction technique (SART) and maximum likelihood expectation maximization (MLEM). Distance-driven projection is an efficient method that has low computational cost and is free of the artifacts associated with other approaches such as ray-driven and pixel-driven methods. Projection images of spherical objects were simulated with a virtual C-arm system with a total view angle of 40 degrees. Results show the ability of the limited-angle C-arm technique to generate three-dimensional images with distance-driven reconstruction.
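The SART iteration named above can be illustrated on a toy system; the matrix, object, and iteration count below are invented for illustration, a minimal sketch rather than the authors' distance-driven implementation.

```python
import numpy as np

def sart_step(A, x, b, lam=1.0):
    """One SART update: x <- x + lam * V^-1 A^T W (b - A x),
    with W = 1/row-sums and V = column-sums of the system matrix A."""
    row_sums = A.sum(axis=1)
    col_sums = A.sum(axis=0)
    residual = (b - A @ x) / np.where(row_sums > 0, row_sums, 1.0)
    return x + lam * (A.T @ residual) / np.where(col_sums > 0, col_sums, 1.0)

# Toy system: three "rays" through a two-pixel object
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
x_true = np.array([2.0, 3.0])
b = A @ x_true                      # noiseless projections

x = np.zeros(2)
for _ in range(200):
    x = sart_step(A, x, b)          # converges to x_true
```

The row and column normalizations are what distinguish SART from plain gradient descent on the least-squares residual.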

  15. NDE and SHM Simulation for CFRP Composites

    NASA Technical Reports Server (NTRS)

    Leckey, Cara A. C.; Parker, F. Raymond

    2014-01-01

    Ultrasound-based nondestructive evaluation (NDE) is a common technique for damage detection in composite materials. There is a need for advanced NDE that goes beyond damage detection to damage quantification and characterization in order to enable data-driven prognostics. The damage types that exist in carbon fiber-reinforced polymer (CFRP) composites include microcracking and delaminations, and can be initiated and grown via impact forces (due to ground vehicles, tool drops, bird strikes, etc.), fatigue, and extreme environmental changes. X-ray microfocus computed tomography data, among other methods, have shown that these damage types often result in voids/discontinuities of a complex volumetric shape. The specific damage geometry and location within ply layers affect damage growth. Realistic three-dimensional NDE and structural health monitoring (SHM) simulations can aid in the development and optimization of damage quantification and characterization techniques. This paper is an overview of ongoing work towards realistic NDE and SHM simulation tools for composites, and also discusses NASA's need for such simulation tools in aeronautics and spaceflight. The paper describes the development and implementation of a custom ultrasound simulation tool that is used to model ultrasonic wave interaction with realistic three-dimensional damage in CFRP composites. The custom code uses the elastodynamic finite integration technique and is parallelized to run efficiently on computing clusters or multicore machines.
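A full elastodynamic finite integration code is beyond the scope of an abstract, but the explicit time-stepping idea underlying such wave solvers can be sketched with a 1D scalar wave equation; all parameters below are illustrative assumptions, not values from the NASA tool.

```python
import numpy as np

# 1D wave equation u_tt = c^2 u_xx, explicit second-order finite differences.
c, dx, dt = 1.0, 0.01, 0.005        # wave speed, grid spacing, time step
nx, nt = 200, 50
r2 = (c * dt / dx) ** 2             # squared Courant number (stable if <= 1)

x = np.arange(nx) * dx
u = np.exp(-((x - 1.0) / 0.05) ** 2)   # initial Gaussian pulse at x = 1.0
u_prev = u.copy()                      # zero initial velocity

for _ in range(nt):
    u_next = np.zeros(nx)              # fixed (u = 0) boundaries
    u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                    + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
    u_prev, u = u, u_next

# After t = nt*dt = 0.25, the pulse has split into two half-amplitude
# pulses centered near x = 0.75 and x = 1.25, as d'Alembert's solution predicts.
```

The 3D elastodynamic case adds stress and velocity fields on staggered grids, but the update structure is the same, which is why it parallelizes well across domain slabs.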

  16. An object-oriented simulator for 3D digital breast tomosynthesis imaging system.

    PubMed

    Seyyedi, Saeed; Cengiz, Kubra; Kamasak, Mustafa; Yildirim, Isa

    2013-01-01

    Digital breast tomosynthesis (DBT) is an innovative imaging modality that provides 3D reconstructed images of the breast for detecting breast cancer. Projections obtained with an X-ray source moving in a limited angle interval are used to reconstruct a 3D image of the breast. Several reconstruction algorithms are available for DBT imaging. The filtered back projection algorithm has traditionally been used to reconstruct images from projections. Iterative reconstruction algorithms such as the algebraic reconstruction technique (ART) were later developed. Recently, compressed sensing based methods have been proposed for the tomosynthesis imaging problem. We have developed an object-oriented simulator for a 3D digital breast tomosynthesis (DBT) imaging system using the C++ programming language. The simulator is capable of implementing different iterative and compressed sensing based reconstruction methods on 3D digital tomosynthesis data sets and phantom models. A user-friendly graphical user interface (GUI) helps users select and run the desired methods on the designed phantom models or real data sets. The simulator has been tested on a phantom study that simulates the breast tomosynthesis imaging problem. Results obtained with various methods, including the algebraic reconstruction technique (ART) and total variation regularized reconstruction (ART+TV), are presented. Reconstruction results of the methods are compared both visually and quantitatively by evaluating their performance using mean structural similarity (MSSIM) values.
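MSSIM averages the SSIM statistic over local windows; a single global-window version (a simplified sketch, not the simulator's implementation, with invented test images) shows the formula's ingredients:

```python
import numpy as np

def global_ssim(a, b, L=1.0):
    """SSIM computed over the whole image (MSSIM instead averages this
    statistic over local windows). L is the dynamic range of the data."""
    c1, c2 = (0.01 * L) ** 2, (0.03 * L) ** 2
    mu_a, mu_b = a.mean(), b.mean()
    var_a, var_b = a.var(), b.var()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    return (((2 * mu_a * mu_b + c1) * (2 * cov + c2))
            / ((mu_a ** 2 + mu_b ** 2 + c1) * (var_a + var_b + c2)))

rng = np.random.default_rng(0)
ref = rng.random((32, 32))                         # stand-in reconstruction
noisy = ref + 0.2 * rng.standard_normal((32, 32))  # degraded version
s_same = global_ssim(ref, ref)     # identical images give 1.0
s_noisy = global_ssim(ref, noisy)  # degradation pushes the score below 1
```

The luminance, contrast, and structure terms are folded into the two factors; identical images score exactly 1, and added noise lowers the covariance term.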

  17. An Object-Oriented Simulator for 3D Digital Breast Tomosynthesis Imaging System

    PubMed Central

    Cengiz, Kubra

    2013-01-01

    Digital breast tomosynthesis (DBT) is an innovative imaging modality that provides 3D reconstructed images of the breast for detecting breast cancer. Projections obtained with an X-ray source moving in a limited angle interval are used to reconstruct a 3D image of the breast. Several reconstruction algorithms are available for DBT imaging. The filtered back projection algorithm has traditionally been used to reconstruct images from projections. Iterative reconstruction algorithms such as the algebraic reconstruction technique (ART) were later developed. Recently, compressed sensing based methods have been proposed for the tomosynthesis imaging problem. We have developed an object-oriented simulator for a 3D digital breast tomosynthesis (DBT) imaging system using the C++ programming language. The simulator is capable of implementing different iterative and compressed sensing based reconstruction methods on 3D digital tomosynthesis data sets and phantom models. A user-friendly graphical user interface (GUI) helps users select and run the desired methods on the designed phantom models or real data sets. The simulator has been tested on a phantom study that simulates the breast tomosynthesis imaging problem. Results obtained with various methods, including the algebraic reconstruction technique (ART) and total variation regularized reconstruction (ART+TV), are presented. Reconstruction results of the methods are compared both visually and quantitatively by evaluating their performance using mean structural similarity (MSSIM) values. PMID:24371468

  18. Analysis of multiple instructional techniques on the understanding and retention of select mechanical topics

    NASA Astrophysics Data System (ADS)

    Fetsco, Sara Elizabeth

    There are several topics that introductory physics students typically have difficulty understanding. The purpose of this thesis is to investigate if multiple instructional techniques will help students to better understand and retain the material. The three units analyzed in this study are graphing motion, projectile motion, and conservation of momentum. For each unit students were taught using new or altered instructional methods including online laboratory simulations, inquiry labs, and interactive demonstrations. Additionally, traditional instructional methods such as lecture and problem sets were retained. Effectiveness was measured through pre- and post-tests and student opinion surveys. Results suggest that incorporating multiple instructional techniques into teaching will improve student understanding and retention. Students stated that they learned well from all of the instructional methods used except the online simulations.

  19. Reducing statistical uncertainties in simulated organ doses of phantoms immersed in water

    DOE PAGES

    Hiller, Mauritius M.; Veinot, Kenneth G.; Easterly, Clay E.; ...

    2016-08-13

    In this study, methods are addressed to reduce the computational time needed to compute organ-dose rate coefficients using Monte Carlo techniques. Several variance reduction techniques are compared, including the reciprocity method, importance sampling, weight windows, and the use of the ADVANTG software package. For low-energy photons, the runtime was reduced by a factor of 10^5 when using the reciprocity method for kerma computation for immersion of a phantom in contaminated water. This is particularly significant since impractically long simulation times are required to achieve reasonable statistical uncertainties in organ dose for low-energy photons in this source medium and geometry. Although the MCNP Monte Carlo code is used in this paper, the reciprocity technique can be used equally well with other Monte Carlo codes.
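Among the variance-reduction techniques compared above, importance sampling is the easiest to illustrate outside a transport code: a rare-event probability is estimated by sampling from a shifted distribution and reweighting. This is a generic textbook sketch, not the MCNP or ADVANTG machinery.

```python
import numpy as np

rng = np.random.default_rng(0)
n, t = 100_000, 4.0                 # sample count; estimate P(Z > 4), Z ~ N(0,1)

# Naive ("analog") Monte Carlo: almost no samples land in the tail,
# so the estimate has huge relative uncertainty.
naive = np.mean(rng.standard_normal(n) > t)

# Importance sampling: draw from the shifted proposal N(t, 1) and reweight
# by the density ratio N(0,1)/N(t,1) = exp(-t*x + t^2/2); tail samples
# are now plentiful and the weights keep the estimator unbiased.
x = rng.standard_normal(n) + t
w = np.exp(-t * x + t * t / 2.0)
tail_est = np.mean((x > t) * w)     # close to the true P(Z > 4) = 3.167e-5
```

The same principle, biasing the sampling toward the region that matters and correcting with weights, is what weight windows automate inside a transport calculation.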

  20. Analysis of thin plates with holes by using exact geometrical representation within XFEM.

    PubMed

    Perumal, Logah; Tso, C P; Leng, Lim Thong

    2016-05-01

    This paper presents an analysis of thin plates with holes within the context of XFEM. New integration techniques are developed for exact geometrical representation of the holes. Numerical and exact integration techniques are presented, with some limitations noted for the exact integration technique. Simulation results show that the proposed techniques help to reduce the solution error, due to the exact geometrical representation of the holes and utilization of appropriate quadrature rules. A discussion of the minimum integration order needed to achieve good accuracy and convergence for the presented techniques is also included.
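The point about minimum integration order can be seen directly with Gauss-Legendre quadrature: an n-point rule integrates polynomials up to degree 2n-1 exactly, so under-integrating a higher-degree integrand leaves an error. A generic sketch (not the paper's XFEM-specific rules):

```python
import numpy as np

def gauss_integrate(f, n):
    """Integrate f over [-1, 1] with an n-point Gauss-Legendre rule."""
    pts, wts = np.polynomial.legendre.leggauss(n)
    return float(np.sum(wts * f(pts)))

f = lambda x: x ** 6          # exact integral over [-1, 1] is 2/7
low = gauss_integrate(f, 2)   # exact only to degree 3: noticeable error
high = gauss_integrate(f, 4)  # exact to degree 7: machine precision
```

Choosing the quadrature order to match the degree of the enriched basis is exactly the trade-off the paper's convergence discussion addresses.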

  1. Diagnostic techniques in deflagration and detonation studies.

    PubMed

    Proud, William G; Williamson, David M; Field, John E; Walley, Stephen M

    2015-12-01

    Advances in experimental, high-speed techniques can be used to explore the processes occurring within energetic materials. This review describes techniques used to study a wide range of processes: hot-spot formation, ignition thresholds, deflagration, sensitivity, and finally the detonation process. As this is a wide field, the focus will be on small-scale experiments and quantitative studies. It is important that such studies are linked to predictive models, which inform the experimental design process. The range of stimuli includes thermal ignition, drop-weight, Hopkinson bar, and plate impact studies. Studies made with inert simulants are also included, as these are important in differentiating between reactive response and purely mechanical behaviour.

  2. Optical technique to study the impact of heavy rain on aircraft performance

    NASA Technical Reports Server (NTRS)

    Hess, C. F.; Li, F.

    1985-01-01

    A laser based technique was investigated and shown to have the potential to obtain measurements of the size and velocity of water droplets used in a wind tunnel to simulate rain. A theoretical model was developed which included some simple effects due to droplet nonsphericity. Parametric studies included the variation of collection distance (up to 4 m), angle of collection, effect of beam interference by the spray, and droplet shape. Accurate measurements were obtained under extremely high liquid water content and spray interference. The technique finds applications in the characterization of two phase flows where the size and velocity of particles are needed.

  3. Simulated Real-Life Experiences Using Classified Ads in the Classroom.

    ERIC Educational Resources Information Center

    Hechler, Ellen

    This guide contains activities to help teachers give middle school students experience in practical life skills. Techniques include role playing and using classified advertisements from newspapers. The five lessons include teacher tips on conducting the activities. Lessons contain objectives, materials needed, discussion, and suggested dialogue.…

  4. Adaptive coding of MSS imagery. [Multi Spectral band Scanners

    NASA Technical Reports Server (NTRS)

    Habibi, A.; Samulon, A. S.; Fultz, G. L.; Lumb, D.

    1977-01-01

    A number of adaptive data compression techniques are considered for reducing the bandwidth of multispectral data. They include adaptive transform coding, adaptive DPCM, adaptive cluster coding, and a hybrid method. The techniques are simulated and their performance in compressing the bandwidth of Landsat multispectral images is evaluated and compared using signal-to-noise ratio and classification consistency as fidelity criteria.
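Of the techniques listed, DPCM is the simplest to sketch. The closed loop below (with a fixed rather than adaptive step size, chosen arbitrarily here) quantizes the difference between each sample and the decoder's previous reconstruction, so quantization error does not accumulate:

```python
import numpy as np

def dpcm(signal, step=2.0):
    """Closed-loop DPCM: encode the quantized difference between each
    sample and the previous *reconstructed* sample."""
    codes = np.empty(len(signal), dtype=int)
    recon = np.empty(len(signal), dtype=float)
    pred = 0.0
    for i, s in enumerate(signal):
        codes[i] = int(round((s - pred) / step))  # transmitted symbol
        recon[i] = pred + codes[i] * step         # decoder's reconstruction
        pred = recon[i]
    return codes, recon

x = np.array([10.0, 12.0, 13.0, 13.0, 11.0])      # toy pixel row
codes, recon = dpcm(x)
# Per-sample error stays bounded by step/2 because the encoder tracks
# the decoder's state rather than the original samples.
```

An adaptive variant, as studied for Landsat imagery, would additionally adjust the step size to the local signal statistics.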

  5. Systems Biology in Immunology – A Computational Modeling Perspective

    PubMed Central

    Germain, Ronald N.; Meier-Schellersheim, Martin; Nita-Lazar, Aleksandra; Fraser, Iain D. C.

    2011-01-01

    Systems biology is an emerging discipline that combines high-content, multiplexed measurements with informatic and computational modeling methods to better understand biological function at various scales. Here we present a detailed review of the methods used to create computational models and conduct simulations of immune function. We provide descriptions of the key data-gathering techniques employed to generate the quantitative and qualitative data required for such modeling and simulation and summarize the progress to date in applying these tools and techniques to questions of immunological interest, including infectious disease. We include comments on what insights modeling can provide that complement information obtained from the more familiar experimental discovery methods used by most investigators, and why quantitative methods are needed to eventually produce a better understanding of immune system operation in health and disease. PMID:21219182

  6. Neutron spectrometry for UF6 enrichment verification in storage cylinders

    DOE PAGES

    Mengesha, Wondwosen; Kiff, Scott D.

    2015-01-29

    Verification of declared UF6 enrichment and mass in storage cylinders is of great interest in nuclear material nonproliferation. Nondestructive assay (NDA) techniques are commonly used during safeguards inspections to ensure accountancy of declared nuclear materials. Common NDA techniques include gamma-ray spectrometry and both passive and active neutron measurements. In the present study, neutron spectrometry was investigated for verification of UF6 enrichment in 30B storage cylinders based on an unattended and passive measurement approach. MCNP5- and Geant4-simulated neutron spectra, for selected UF6 enrichments and filling profiles, were used in the investigation. The simulated neutron spectra were analyzed using principal component analysis (PCA). PCA is a well-established technique with a wide range of applications, including feature analysis, outlier detection, and gamma-ray spectral analysis. The results obtained demonstrate that neutron spectrometry supported by spectral feature analysis has potential for assaying UF6 enrichment in storage cylinders. The results also showed that difficulties associated with the UF6 filling profile, observed in other unattended passive neutron measurements, can possibly be overcome using the approach presented.
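The PCA step described above reduces each simulated spectrum to a few feature scores. A minimal numpy sketch on synthetic stand-in spectra (the spectral shape, scaling, and noise level are invented for illustration, not the MCNP5/Geant4 outputs):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for simulated neutron spectra: one row per cylinder
# configuration, one column per energy bin.
bins = np.linspace(0.0, 5.0, 64)
base = np.exp(-bins)
spectra = np.array([base * (1.0 + 0.1 * k)
                    + 0.01 * rng.standard_normal(64) for k in range(10)])

# PCA via SVD of the mean-centered data matrix
centered = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ Vt.T              # feature scores per spectrum
explained = s ** 2 / np.sum(s ** 2)   # variance fraction per component
# One component dominates: the spectra differ mainly by overall scale.
```

In the study, such scores would then be correlated with enrichment and filling profile rather than with the artificial scale factor used here.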

  7. Study on the tumor-induced angiogenesis using mathematical models.

    PubMed

    Suzuki, Takashi; Minerva, Dhisa; Nishiyama, Koichi; Koshikawa, Naohiko; Chaplain, Mark Andrew Joseph

    2018-01-01

    We studied angiogenesis using mathematical models describing the dynamics of tip cells. We reviewed the basic ideas of angiogenesis models and their numerical simulation techniques for producing realistic computer graphics images of sprouting angiogenesis. We examined the classical model of Anderson-Chaplain using fundamental concepts of mass transport and chemical reaction, with ECM degradation included. We then constructed two types of numerical schemes, model-faithful and model-driven ones, in which new techniques of numerical simulation are introduced, such as transient probability, particle velocity, and Boolean variables. © 2017 The Authors. Cancer Science published by John Wiley & Sons Australia, Ltd on behalf of Japanese Cancer Association.

  8. Modeling of Convective-Stratiform Precipitation Processes: Sensitivity to Partitioning Methods and Numerical Advection Schemes

    NASA Technical Reports Server (NTRS)

    Lang, Steve; Tao, W.-K.; Simpson, J.; Ferrier, B.; Einaudi, Franco (Technical Monitor)

    2001-01-01

    Six different convective-stratiform separation techniques, including a new technique that utilizes the ratio of vertical and terminal velocities, are compared and evaluated using two-dimensional numerical simulations of a tropical [Tropical Ocean Global Atmosphere Coupled Ocean-Atmosphere Response Experiment (TOGA COARE)] and a midlatitude continental [Preliminary Regional Experiment for STORM-Central (PRESTORM)] squall line. The simulations are made using two different numerical advection schemes: fourth-order and positive definite advection. Comparisons are made in terms of rainfall, cloud coverage, mass fluxes, apparent heating and moistening, mean hydrometeor profiles, CFADs (Contoured Frequency with Altitude Diagrams), microphysics, and latent heating retrieval. Overall, it was found that the different separation techniques produced results that qualitatively agreed. However, the quantitative differences were significant. Observational comparisons were unable to conclusively evaluate the performance of the techniques. Latent heating retrieval was shown to be sensitive to the choice of separation technique, mainly due to the stratiform region for methods that found very little stratiform rain. The midlatitude PRESTORM simulation was found to be nearly invariant with respect to advection type for most quantities, while for TOGA COARE fourth-order advection produced numerous shallow convective cores and positive definite advection produced fewer cells that were both broader and deeper penetrating above the freezing level.
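The new separation technique keys on the ratio of vertical to terminal velocity; a schematic column classifier in that spirit (the threshold and the toy fields are invented here, not the paper's values):

```python
import numpy as np

def classify_columns(w, vt, ratio_threshold=0.5):
    """Label each model column convective (True) or stratiform (False)
    from the maximum |w|/Vt ratio in the column.
    w and vt have shape (levels, columns), in m/s."""
    ratio = np.abs(w) / np.maximum(vt, 1e-6)   # guard against Vt = 0
    return ratio.max(axis=0) > ratio_threshold

# Toy fields: column 0 holds a strong updraft, column 1 is weak ascent.
w = np.array([[5.0, 0.1],
              [3.0, 0.2],
              [1.0, 0.1]])
vt = np.full((3, 2), 4.0)                      # hydrometeor fall speeds
labels = classify_columns(w, vt)
```

The physical idea is that where updrafts rival or exceed particle fall speeds, hydrometeors grow in place (convective), whereas where they are much weaker, particles settle through the column (stratiform).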

  9. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed when optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general-purpose framework for the optimization of terminating discrete-event simulation models. The methodology combines a chance-constraint approach for problem formulation with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.
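A chance constraint requires P(response <= limit) >= p; with replicated simulation outputs it can be checked through a lower confidence bound on the observed success proportion. A generic sketch (the normal approximation and all numbers are illustrative assumptions, not the paper's formulation):

```python
import numpy as np

def chance_constraint_met(samples, limit, p=0.9, z=1.645):
    """Conclude P(Y <= limit) >= p only if the one-sided 95% lower
    confidence bound on the observed proportion clears p."""
    n = len(samples)
    phat = float(np.mean(samples <= limit))
    lower = phat - z * np.sqrt(phat * (1.0 - phat) / n)
    return lower >= p

rng = np.random.default_rng(0)
y = rng.normal(10.0, 1.0, size=1000)           # replicated simulation responses
loose = chance_constraint_met(y, limit=13.0)   # nearly all mass below 13
tight = chance_constraint_met(y, limit=10.0)   # only about half below 10
```

Embedding such a feasibility check inside a search over design variables is the essence of optimizing a stochastic simulation under chance constraints.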

  10. Post-coronagraphic tip-tilt sensing for vortex phase masks: The QACITS technique

    NASA Astrophysics Data System (ADS)

    Huby, E.; Baudoz, P.; Mawet, D.; Absil, O.

    2015-12-01

    Context. Small inner working angle coronagraphs, such as the vortex phase mask, are essential to exploit the full potential of ground-based telescopes in the context of exoplanet detection and characterization. However, the drawback of this attractive feature is a high sensitivity to pointing errors, which degrades the performance of the coronagraph. Aims: We propose a tip-tilt retrieval technique based on the analysis of the final coronagraphic image, hereafter called Quadrant Analysis of Coronagraphic Images for Tip-tilt Sensing (QACITS). Methods: Under the assumption of small phase aberrations, we show that the behavior of the vortex phase mask can be simply described from the entrance pupil to the Lyot stop plane with Zernike polynomials. This convenient formalism is used to establish the theoretical basis of the QACITS technique. We performed simulations to demonstrate the validity and limits of the technique, including the case of a centrally obstructed pupil. Results: The QACITS technique principle is validated with experimental results in the case of an unobstructed circular aperture, as well as simulations in the presence of a central obstruction. The typical configuration of the Keck telescope (24% central obstruction) has been simulated with additional high-order aberrations. In these conditions, our simulations show that the QACITS technique is still adapted to centrally obstructed pupils and performs tip-tilt retrieval with a precision of 5 × 10^-2 λ/D when wavefront errors amount to λ/14 rms, and 10^-2 λ/D for λ/70 rms errors (with λ the wavelength and D the pupil diameter). Conclusions: We have developed and demonstrated a tip-tilt sensing technique for vortex coronagraphs. The implementation of the QACITS technique is based on the analysis of the scientific image and does not require any modification of the original setup. Current facilities equipped with a vortex phase mask can thus directly benefit from this technique to improve the contrast performance close to the axis.
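The quadrant analysis at the heart of QACITS reduces to differential fluxes between halves of the image. A toy version on a displaced Gaussian spot (the spot model and geometry are placeholders; the real estimator uses the vortex response derived in the paper):

```python
import numpy as np

def half_plane_asymmetry(img):
    """Normalized flux differences between the right/left and top/bottom
    halves of the image; QACITS maps such asymmetries to tip and tilt."""
    ny, nx = img.shape
    total = img.sum()
    dx = (img[:, nx // 2:].sum() - img[:, :nx // 2].sum()) / total
    dy = (img[ny // 2:, :].sum() - img[:ny // 2, :].sum()) / total
    return dx, dy

# Toy PSF: a Gaussian spot displaced along +x, on a pixel-centered grid.
yy, xx = np.mgrid[-32:32, -32:32]
y, x = yy + 0.5, xx + 0.5
img = np.exp(-((x - 4.0) ** 2 + y ** 2) / (2.0 * 6.0 ** 2))
dx, dy = half_plane_asymmetry(img)   # dx > 0 flags the +x offset; dy ~ 0
```

In the actual technique the asymmetries of the coronagraphic image are calibrated against known tip-tilt offsets, turning (dx, dy) into a pointing correction.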

  11. Using Simulation to Improve Systems-Based Practices.

    PubMed

    Gardner, Aimee K; Johnston, Maximilian; Korndorffer, James R; Haque, Imad; Paige, John T

    2017-09-01

    Ensuring the safe, effective management of patients requires efficient processes of care within a smoothly operating system in which highly reliable teams of talented, skilled health care providers are able to use the vast array of high-technology resources and intensive care techniques available. Simulation can play a unique role in exploring and improving the complex perioperative system by proactively identifying latent safety threats and mitigating their damage to ensure that all those who work in this critical health care environment can provide optimal levels of patient care. A panel of five experts from a wide range of institutions was brought together to discuss the added value of simulation-based training for improving systems-based aspects of the perioperative service line. Panelists shared the ways in which simulation was being used at their institutions. The themes discussed by each panel member were delineated into four avenues through which simulation-based techniques have been used. Simulation-based techniques are being used in (1) testing new clinical workspaces and facilities before they open to identify potential latent conditions; (2) practicing how to identify the deteriorating patient and escalate care in an effective manner; (3) performing prospective root cause analyses to address system weaknesses leading to sentinel events; and (4) evaluating the efficiency and effectiveness of the electronic health record in the perioperative setting. This focused review of simulation-based interventions to test and improve components of the perioperative microsystem, which includes literature that has emerged since the panel's presentation, highlights the broad-based utility of simulation-based technologies in health care. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.

  12. Parallel processing for nonlinear dynamics simulations of structures including rotating bladed-disk assemblies

    NASA Technical Reports Server (NTRS)

    Hsieh, Shang-Hsien

    1993-01-01

    The principal objective of this research is to develop, test, and implement coarse-grained, parallel-processing strategies for nonlinear dynamic simulations of practical structural problems. There are contributions to four main areas: finite element modeling and analysis of rotational dynamics, numerical algorithms for parallel nonlinear solutions, automatic partitioning techniques to effect load-balancing among processors, and an integrated parallel analysis system.

  13. Simulating a Measurement of the 2nd Knee in the Cosmic Ray Spectrum with an Atmospheric Fluorescence Telescope Tower Array

    PubMed Central

    Liu, Jiali; Yang, Qunyu; Bai, Yunxiang; Cao, Zhen

    2014-01-01

    A fluorescence telescope tower array has been designed to measure cosmic rays in the energy range of 10^17-10^18 eV. A full Monte Carlo simulation, including air shower production, light generation and propagation, detector response, electronics, and trigger system, has been developed for that purpose. Using such a simulation tool, the detector configuration, which includes one main tower array and two side-trigger arrays, 24 telescopes in total, has been optimized. The aperture and the event rate have been estimated. Furthermore, the performance of the Xmax technique in measuring composition has also been studied. PMID:24737964

  14. A model for including thermal conduction in molecular dynamics simulations

    NASA Technical Reports Server (NTRS)

    Wu, Yue; Friauf, Robert J.

    1989-01-01

    A technique is introduced for including thermal conduction in molecular dynamics simulations of solids. A model is developed to allow energy flow between the computational cell and the bulk of the solid when periodic boundary conditions cannot be used. Thermal conduction is achieved by scaling the velocities of atoms in a transitional boundary layer. The scaling factor is obtained from the thermal diffusivity, and the results show good agreement with the solution for a continuous medium at long times. The effects of different temperatures and system sizes, and of variations in strength parameter, atomic mass, and thermal diffusivity, were investigated. In all cases, no significant change in simulation results was found.
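The scaling step can be sketched as follows. In the paper the factor is derived from the thermal diffusivity, whereas here it is simply chosen to drive the boundary-layer kinetic temperature to a target value (an illustrative assumption):

```python
import numpy as np

def rescale_layer(v, kT_target, mass=1.0):
    """Rescale boundary-layer atom velocities so that their kinetic
    temperature (kT = m<v^2> per degree of freedom) matches a target,
    mimicking heat exchange with the bulk solid."""
    kT_current = mass * np.mean(v ** 2)        # 2*KE / (number of dofs)
    return v * np.sqrt(kT_target / kT_current)

rng = np.random.default_rng(0)
v = rng.standard_normal((50, 3))               # velocities of layer atoms
v_new = rescale_layer(v, kT_target=0.5)
# The layer's kinetic temperature now equals the target exactly.
```

Because only the transitional layer is rescaled, the interior of the computational cell evolves under unmodified Newtonian dynamics while heat flows in or out through the boundary.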

  15. EVA manipulation and assembly of space structure columns

    NASA Technical Reports Server (NTRS)

    Loughead, T. E.; Pruett, E. C.

    1980-01-01

    Assembly techniques and hardware configurations used in assembly of the basic tetrahedral cell by A7LB pressure-suited subjects in a neutral buoyancy simulator were studied. Eleven subjects participated in assembly procedures which investigated two types of structural members and two configurations of attachment hardware. The assembly was accomplished through extravehicular activity (EVA) only, EVA with a simulated manned maneuvering unit (MMU), and EVA with a simulated MMU and simulated remote manipulator system (RMS). Assembly times as low as 10.20 minutes per tetrahedron were achieved. Task element data, as well as assembly procedures, are included.

  16. Evaporation kinetics of Mg2SiO4 crystals and melts from molecular dynamics simulations

    NASA Technical Reports Server (NTRS)

    Kubicki, J. D.; Stolper, E. M.

    1993-01-01

    Computer simulations based on the molecular dynamics (MD) technique were used to study the mechanisms and kinetics of free evaporation from crystalline and molten forsterite (i.e., Mg2SiO4) on an atomic level. The interatomic potential employed for these simulations reproduces the energetics of bonding in forsterite and in gas-phase MgO and SiO2 reasonably accurately. Results of the simulation include predicted evaporation rates, diffusion rates, and reaction mechanisms for Mg2SiO4(s or l) yields 2Mg(g) + 2O(g) + SiO2(g).

  17. Multiple-access phased array antenna simulator for a digital beam-forming system investigation

    NASA Technical Reports Server (NTRS)

    Kerczewski, Robert J.; Yu, John; Walton, Joanne C.; Perl, Thomas D.; Andro, Monty; Alexovich, Robert E.

    1992-01-01

    Future versions of data relay satellite systems are currently being planned by NASA. Being given consideration for implementation are on-board digital beamforming techniques which will allow multiple users to simultaneously access a single S-band phased array antenna system. To investigate the potential performance of such a system, a laboratory simulator has been developed at NASA's Lewis Research Center. This paper describes the system simulator, and in particular, the requirements, design and performance of a key subsystem, the phased array antenna simulator, which provides realistic inputs to the digital processor including multiple signals, noise, and nonlinearities.

  19. Crash Certification by Analysis - Are We There Yet?

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E.; Fasanella, Edwin L.; Lyle, Karen H.

    2006-01-01

    This paper addresses the issue of crash certification by analysis. This broad topic encompasses many ancillary issues including model validation procedures, uncertainty in test data and analysis models, probabilistic techniques for test-analysis correlation, verification of the mathematical formulation, and establishment of appropriate qualification requirements. This paper will focus on certification requirements for crashworthiness of military helicopters; capabilities of the current analysis codes used for crash modeling and simulation, including some examples of simulations from the literature to illustrate the current approach to model validation; and future directions needed to achieve "crash certification by analysis."

  20. A New Computational Technique for the Generation of Optimised Aircraft Trajectories

    NASA Astrophysics Data System (ADS)

    Chircop, Kenneth; Gardi, Alessandro; Zammit-Mangion, David; Sabatini, Roberto

    2017-12-01

    A new computational technique based on Pseudospectral Discretisation (PSD) and adaptive bisection ɛ-constraint methods is proposed to solve multi-objective aircraft trajectory optimisation problems formulated as nonlinear optimal control problems. This technique is applicable to a variety of next-generation avionics and Air Traffic Management (ATM) Decision Support Systems (DSS) for strategic and tactical replanning operations. These include the future Flight Management Systems (FMS) and the 4-Dimensional Trajectory (4DT) planning and intent negotiation/validation tools envisaged by SESAR and NextGen for a global implementation. In particular, after describing the PSD method, the adaptive bisection ɛ-constraint method is presented to allow an efficient solution of problems in which two or more performance indices are to be minimized simultaneously. Initial simulation case studies were performed adopting suitable aircraft dynamics models and addressing a classical vertical trajectory optimisation problem with two objectives simultaneously. Subsequently, a more advanced 4DT simulation case study is presented with a focus on representative ATM optimisation objectives in the Terminal Manoeuvring Area (TMA). The simulation results are analysed in-depth and corroborated by flight performance analysis, supporting the validity of the proposed computational techniques.
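The ɛ-constraint idea with bisection on ɛ can be sketched on a hypothetical one-dimensional bi-objective problem (the paper's PSD discretisation and optimal-control machinery are not reproduced here; the objective functions, grid search, and bisection rule below are illustrative assumptions):

```python
import numpy as np

# Toy bi-objective problem (stand-ins for, e.g., fuel burn vs. flight time).
f1 = lambda x: x ** 2
f2 = lambda x: (x - 2.0) ** 2

def eps_constraint_point(eps, xs):
    """epsilon-constraint scalarisation: minimise f1 over the candidates
    that satisfy f2(x) <= eps (here by direct search on a 1-D grid)."""
    feasible = xs[f2(xs) <= eps]
    best = feasible[np.argmin(f1(feasible))]
    return f1(best), f2(best)

xs = np.linspace(0.0, 2.0, 2001)
# Bisecting on eps between the single-objective extremes places Pareto
# points adaptively rather than on a fixed eps grid.
eps_lo, eps_hi = 0.0, f2(0.0)   # f2 at the unconstrained f1-optimum
front = []
for _ in range(4):
    eps_mid = 0.5 * (eps_lo + eps_hi)
    front.append(eps_constraint_point(eps_mid, xs))
    eps_hi = eps_mid            # refine toward the f1-optimal end
print(front)
```

Each tightening of ɛ trades a worse first objective for a better second one, tracing points along the Pareto front.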

  1. a Computer Simulation Study of Coherent Optical Fibre Communication Systems

    NASA Astrophysics Data System (ADS)

    Urey, Zafer

    Available from UMI in association with The British Library. A computer simulation study of coherent optical fibre communication systems is presented in this thesis. The Wiener process is proposed as the simulation model of laser phase noise and verified to be a good one. This model is included in the simulation experiments along with the other noise sources (i.e., shot noise, thermal noise and laser intensity noise) and the models that represent the various waveform-processing blocks in a system, such as filtering, demodulation, etc. A novel mixed semi-analytical simulation procedure is designed and successfully applied to the estimation of bit error rates as low as 10^{-10}. In this technique the noise processes and the ISI effects at the decision time are characterized from simulation experiments, but the probability of error is calculated by numerically integrating the noise statistics over the error region using analytical expressions. With this approach, simulation of only 4096 bits is found to give estimates of BERs corresponding to received optical power within 1 dB of the theoretical calculations. This number is very small compared with pure simulation techniques; hence, the technique proves very efficient in terms of computation time and memory requirements. A command-driven simulation software package, which runs on a DEC VAX computer under the UNIX operating system, was written by the author, and a series of simulation experiments was carried out using it. In particular, the effects of IF filtering on the performance of PSK heterodyne receivers with synchronous demodulation are examined when both the phase noise and the shot noise are included in the simulations. The BER curves of this receiver are estimated for the first time for various cases of IF filtering using the mixed semi-analytical approach.
At a power penalty of 1 dB, the IF linewidth requirement of this receiver with the matched filter is estimated to be less than 650 kHz at a modulation rate of 1 Gbps and a BER of 10^{-9}. The IF linewidth requirements for the other IF filtering cases are also estimated and are found to differ little from the matched-filter case. It is therefore concluded that IF filtering does not help to reduce the effect of phase noise in PSK heterodyne systems with synchronous demodulation.
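The mixed semi-analytical idea, characterising the decision-time statistics from a short simulation and then integrating the error probability analytically, can be sketched as follows (the signal level, noise figure, and Gaussian decision model below are illustrative assumptions, not the thesis's system model):

```python
import math, random

def q_function(x):
    """Gaussian tail probability Q(x) via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

# "Simulation" stage: characterise the decision statistic from a short run
# instead of waiting for rare errors to occur in a brute-force simulation.
random.seed(1)
samples = [1.0 + random.gauss(0.0, 0.2) for _ in range(4096)]  # mark-bit statistic
mu = sum(samples) / len(samples)
var = sum((s - mu) ** 2 for s in samples) / (len(samples) - 1)

# "Analytical" stage: with symmetric levels +/-mu and a threshold at zero,
# the bit error rate follows from the fitted Gaussian tail directly.
ber = q_function(mu / math.sqrt(var))
print(ber)   # far below anything 4096 brute-force bits could resolve
```

This is why a few thousand simulated bits suffice: the rare-event tail is evaluated analytically rather than sampled.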

  2. Flash Infrared Thermography Contrast Data Analysis Technique

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
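A minimal sketch of extracting a normalised contrast evolution from defect and reference pixel histories (the normalisation convention and the toy cooling curve below are assumptions for illustration, not the paper's exact definition):

```python
import numpy as np

def normalized_contrast(defect_px, ref_px, pre_flash):
    """Subtract the pre-flash (cold) level from both pixel histories,
    then take (defect - reference) / reference as the contrast."""
    d = defect_px - pre_flash
    r = ref_px - pre_flash
    return (d - r) / r

t = np.arange(1, 6, dtype=float)                     # frames after the flash
ref = 20.0 + 10.0 / np.sqrt(t)                       # sound-area cooling (toy)
defect = ref + np.array([0.0, 0.5, 1.5, 2.0, 2.2])   # heat trapped over a void
c = normalized_contrast(defect, ref, pre_flash=20.0)
print(np.round(c, 3))
```

Features such as the peak contrast and its time of occurrence would then be read off this evolution and matched against the calibrated flat-bottom-hole simulation.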

  3. Operational prediction of rip currents using numerical model and nearshore bathymetry from video images

    NASA Astrophysics Data System (ADS)

    Sembiring, L.; Van Ormondt, M.; Van Dongeren, A. R.; Roelvink, J. A.

    2017-07-01

    Rip currents are one of the most dangerous coastal hazards for swimmers. To minimize the risk, an operational, process-based coastal model system can be used to forecast nearshore waves and currents that may endanger beachgoers. In this paper, an operational model for rip current prediction using nearshore bathymetry obtained from video imagery is demonstrated. For the nearshore-scale model, XBeach is used, with which tidal currents and wave-induced currents (including the effect of wave groups) can be simulated simultaneously. Up-to-date bathymetry is obtained using the video-based cBathy technique. The system is tested for the beach at Egmond aan Zee, located in the northern part of the Dutch coastline. The applicability of video-derived bathymetry as input to the numerical modelling system is tested by comparing simulation results obtained with surveyed bathymetry against results obtained with video bathymetry. Results show that the video technique produces bathymetry converging towards the ground-truth observations. This bathymetry validation is followed by an example of an operational, forecasting-type simulation predicting rip currents. Rip-current flow fields simulated over measured and modelled bathymetries are compared in order to assess the performance of the proposed forecast system.
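Before the hydrodynamic run, the video-derived depth grid is typically scored against the surveyed grid; a minimal sketch of that comparison (the grids and values below are hypothetical):

```python
import numpy as np

def bathy_skill(surveyed, video):
    """Bias and RMSE of a video-derived (cBathy-style) depth grid
    against the surveyed ground truth."""
    diff = video - surveyed
    return float(diff.mean()), float(np.sqrt((diff ** 2).mean()))

surveyed = np.array([[1.0, 2.0], [3.0, 4.0]])   # depths in metres (toy grid)
video = np.array([[1.2, 1.9], [3.1, 4.2]])      # video-derived estimate
bias, rmse = bathy_skill(surveyed, video)
print(bias, rmse)
```

A small bias and RMSE would support feeding the video bathymetry into the model in place of a survey.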

  4. The SELGIFS data challenge: generating synthetic observations of CALIFA galaxies from hydrodynamical simulations

    NASA Astrophysics Data System (ADS)

    Guidi, G.; Casado, J.; Ascasibar, Y.; García-Benito, R.; Galbany, L.; Sánchez-Blázquez, P.; Sánchez, S. F.; Rosales-Ortega, F. F.; Scannapieco, C.

    2018-06-01

    In this work we present a set of synthetic observations that mimic the properties of the Integral Field Spectroscopy (IFS) survey CALIFA, generated using radiative transfer techniques applied to hydrodynamical simulations of galaxies in a cosmological context. The simulated spatially-resolved spectra include stellar and nebular emission, kinematic broadening of the lines, and dust extinction and scattering. The results of the radiative transfer simulations have been post-processed to reproduce the main properties of the CALIFA V500 and V1200 observational setups. The data has been further formatted to mimic the CALIFA survey in terms of field of view size, spectral range and sampling. We have included the effect of the spatial and spectral Point Spread Functions affecting CALIFA observations, and added detector noise after characterizing it on a sample of 367 galaxies. The simulated datacubes are suited to be analysed by the same algorithms used on real IFS data. In order to provide a benchmark to compare the results obtained applying IFS observational techniques to our synthetic datacubes, and test the calibration and accuracy of the analysis tools, we have computed the spatially-resolved properties of the simulations. Hence, we provide maps derived directly from the hydrodynamical snapshots or the noiseless spectra, in a way that is consistent with the values recovered by the observational analysis algorithms. Both the synthetic observations and the product datacubes are public and can be found in the collaboration website http://astro.ft.uam.es/selgifs/data_challenge/.

  5. [Development of computer aided forming techniques in manufacturing scaffolds for bone tissue engineering].

    PubMed

    Wei, Xuelei; Dong, Fuhui

    2011-12-01

    To review recent advances in the research and application of computer aided forming techniques for constructing bone tissue engineering scaffolds. The literature concerning computer aided forming techniques for constructing bone tissue engineering scaffolds in recent years was reviewed extensively and summarized. Several studies over the last decade have focused on computer aided forming techniques for bone scaffold construction using various scaffold materials, based on computer aided design (CAD) and rapid prototyping (RP) of bone scaffolds. CAD approaches include medical CAD, STL, and reverse design; reverse design can fully reproduce normal bone tissue and is therefore very useful for CAD. RP techniques include fused deposition modeling, three-dimensional printing, selective laser sintering, three-dimensional bioplotting, and low-temperature deposition manufacturing. These techniques provide a new way to construct bone tissue engineering scaffolds with complex internal structures. With the rapid development of molding and forming techniques, computer aided forming techniques are expected to provide ideal bone tissue engineering scaffolds.

  6. Intensity mapping the Universe

    NASA Astrophysics Data System (ADS)

    Croft, Rupert

    Intensity mapping (IM) is the use of one or more emission lines to trace out the structure of the Universe without needing to resolve individual objects (such as galaxies or gas clouds). It is one of the most promising ways to radically extend the sky survey revolution in cosmology. By making spectra of the entire sky, rather than the one part in one million captured by current fiber spectrographs, one would be sensitive to all structure. There are potentially huge discoveries to be made in the vast majority of the sky that is currently spectrally unmapped, and also great gains in the signal-to-noise of cosmological clustering measurements. Intensity mapping with the 21cm radio line has been explored theoretically by many, and instruments are being built, particularly targeting the epoch of reionization. In the UV, visible, and infrared, however, other lines hold enormous promise and will be exploited by a range of future NASA missions including WFIRST, Euclid, and the proposed SPHEREx instrument, a dedicated intensity mapping satellite. The first measurement of large-scale structure outside the radio (using Lyman-alpha emission) was recently made by the PI and collaborators. The Ly-a absorption line also traces a continuous cosmological field, the Lyman-alpha forest, and the enormous recent increase in the number of observed quasar spectra has made it possible to interpolate between quasar sightlines to create three-dimensional maps. Being able to trace the same cosmic structure in emission and absorption offers huge advantages when we seek to understand the processes involved. It will help us make comprehensive maps of the Universe's contents and offer us the opportunity to create powerful new cosmological tests. In our proposed work we will explore the possibilities afforded by taking grism and integral field spectra of large volumes of the Universe, using state-of-the-art cosmological hydrodynamic simulations.
We will make use of analysis techniques developed for the Lyman-alpha forest, as well as forest data itself to test them. Our aim is to develop intensity mapping as a cosmological tool and show how it can be used to answer questions about the contents of the Universe and the formation of structure that are not accessible to traditional techniques. The project will involve both direct sampling of cosmic structure and cross-correlations of line intensity and objects (including galaxies, quasars and absorption lines). Emission (e.g., H-alpha emission) and absorption (Ly alpha forest) will be viewed as continuous fields. Using large volume cosmological simulations combined with population synthesis techniques we will make simulated spectral data sets. The techniques to analyse these cosmological data cubes will be developed. The expected outcomes are the following: (a) Predictions for the large-scale structure of strong emission lines (including Ha, Hb, Lya, OII, OIII) in the Universe using hydrodynamic simulations including the contribution from all components, from quasars to diffuse emission. (b) Simulations of realistic examples of the use of IM as a cosmological probe, including Baryon Oscillations and weak gravitational lensing. (c) Tests of techniques to detect and quantify the low surface brightness Universe, leading to a complete census of the cosmic intensity in specific lines such as OII and Ha. (d) Development of techniques to extract redshifts for individual galaxies from low angular resolution IM spectroscopy. (e) Mock catalogs for SPHEREx, Euclid and WFIRST spectroscopy of diffuse emission, as well as for the GALEX grism survey, and tests of analysis techniques on data from the latter.
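The cross-correlation of an intensity map with a tracer field, one of the statistics described above, can be sketched on gridded toy data (the field sizes, noise levels, and FFT-based estimator below are illustrative assumptions, not the proposal's pipeline):

```python
import numpy as np

def cross_power(field_a, field_b):
    """Cross-power of two gridded fields (e.g. a line-intensity map
    against a galaxy overdensity map), the basic IM x tracer statistic."""
    fa, fb = np.fft.fftn(field_a), np.fft.fftn(field_b)
    return (fa * np.conj(fb)).real / field_a.size

rng = np.random.default_rng(0)
common = rng.normal(size=(16, 16))                 # shared large-scale structure
map_a = common + 0.5 * rng.normal(size=(16, 16))   # intensity map + noise
map_b = common + 0.5 * rng.normal(size=(16, 16))   # tracer map + independent noise
pk = cross_power(map_a, map_b)
print(pk.mean() > 0)   # correlated fields give positive mean cross-power
```

Because the noise in the two maps is independent, it averages out of the cross-power; this is what makes cross-correlation robust for low surface-brightness signals.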

  7. egs_brachy: a versatile and fast Monte Carlo code for brachytherapy

    NASA Astrophysics Data System (ADS)

    Chamberland, Marc J. P.; Taylor, Randle E. P.; Rogers, D. W. O.; Thomson, Rowan M.

    2016-12-01

    egs_brachy is a versatile and fast Monte Carlo (MC) code for brachytherapy applications. It is based on the EGSnrc code system, enabling simulation of photons and electrons. Complex geometries are modelled using the EGSnrc C++ class library and egs_brachy includes a library of geometry models for many brachytherapy sources, in addition to eye plaques and applicators. Several simulation efficiency enhancing features are implemented in the code. egs_brachy is benchmarked by comparing TG-43 source parameters of three source models to previously published values. 3D dose distributions calculated with egs_brachy are also compared to ones obtained with the BrachyDose code. Well-defined simulations are used to characterize the effectiveness of many efficiency improving techniques, both as an indication of the usefulness of each technique and to find optimal strategies. Efficiencies and calculation times are characterized through single source simulations and simulations of idealized and typical treatments using various efficiency improving techniques. In general, egs_brachy shows agreement within uncertainties with previously published TG-43 source parameter values. 3D dose distributions from egs_brachy and BrachyDose agree at the sub-percent level. Efficiencies vary with radionuclide and source type, number of sources, phantom media, and voxel size. The combined effects of efficiency-improving techniques in egs_brachy lead to short calculation times: simulations approximating prostate and breast permanent implant (both with (2 mm)³ voxels) and eye plaque (with (1 mm)³ voxels) treatments take between 13 and 39 s, on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core, to achieve 2% average statistical uncertainty on doses within the PTV. egs_brachy will be released as free and open source software to the research community.
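The efficiency characterised in such studies is commonly defined as ε = 1/(s²T), with s² the variance of the scored mean and T the calculation time; a small sketch under that assumption (the runs below are synthetic stand-ins, not egs_brachy output):

```python
import random

def mc_efficiency(values, cpu_time):
    """Monte Carlo efficiency eps = 1 / (s^2 * T), where s^2 is the
    variance of the mean and T the calculation time: halving either
    factor, with the other fixed, doubles the efficiency."""
    n = len(values)
    mean = sum(values) / n
    s2 = sum((v - mean) ** 2 for v in values) / ((n - 1) * n)  # var of the mean
    return 1.0 / (s2 * cpu_time)

random.seed(0)
plain = [random.gauss(1.0, 0.5) for _ in range(1000)]     # analog scoring
reduced = [random.gauss(1.0, 0.1) for _ in range(1000)]   # variance-reduced run,
eff_plain = mc_efficiency(plain, 1.0)                     # but 1.5x the cost
eff_reduced = mc_efficiency(reduced, 1.5)
print(eff_reduced > eff_plain)   # variance reduction wins despite the overhead
```

This is how competing variance-reduction techniques are ranked: a technique pays off only if the variance drops faster than the per-history cost grows.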

  9. Design and evaluation of a DAMQ multiprocessor network with self-compacting buffers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, J.; O'Krafka, B.W.O.; Vassiliadis, S.

    1994-12-31

    This paper describes a new approach to implementing Dynamically Allocated Multi-Queue (DAMQ) switching elements using a technique called "self-compacting buffers". This technique is efficient in that the amount of hardware required to manage the buffers is relatively small; it offers high performance since it is an implementation of a DAMQ. The first part of this paper describes the self-compacting buffer architecture in detail and compares it against a competing DAMQ switch design. The second part presents extensive simulation results comparing the performance of a self-compacting buffer switch against an ideal switch, including several examples of k-ary n-cubes and delta networks. In addition, simulation results show how the performance of an entire network can be quickly and accurately approximated by simulating just a single switching element.
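A self-compacting buffer can be pictured as one linear memory holding a contiguous region per output queue, with every read closing the gap it creates so that free space always sits at the end. The following is a behavioural Python sketch of that idea, not the paper's hardware design (queue layout and method names are assumptions):

```python
class SelfCompactingBuffer:
    """Behavioural sketch: one linear memory holds a contiguous region
    per output queue; removing a flit shifts later regions forward so
    all free space stays at the end (the "self-compacting" property)."""

    def __init__(self, capacity, num_queues):
        self.mem = []                          # shared linear storage
        self.capacity = capacity
        self.bounds = [0] * (num_queues + 1)   # region q is mem[bounds[q]:bounds[q+1]]

    def enqueue(self, q, flit):
        if len(self.mem) >= self.capacity:
            return False                       # buffer full: reject the flit
        self.mem.insert(self.bounds[q + 1], flit)  # append at end of region q
        for i in range(q + 1, len(self.bounds)):
            self.bounds[i] += 1                # later regions slide back
        return True

    def dequeue(self, q):
        if self.bounds[q] == self.bounds[q + 1]:
            return None                        # queue q is empty
        flit = self.mem.pop(self.bounds[q])    # take the head of region q...
        for i in range(q + 1, len(self.bounds)):
            self.bounds[i] -= 1                # ...and compact behind it
        return flit

buf = SelfCompactingBuffer(capacity=8, num_queues=2)
buf.enqueue(0, "a0"); buf.enqueue(0, "a1"); buf.enqueue(1, "b0")
print(buf.dequeue(0), buf.dequeue(1), buf.dequeue(0))  # a0 b0 a1
```

The dynamic sharing is visible in the single `capacity` limit: any queue may use all free space, which is what distinguishes a DAMQ from statically partitioned FIFOs.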

  10. Student perception of two different simulation techniques in oral and maxillofacial surgery undergraduate training.

    PubMed

    Lund, Bodil; Fors, Uno; Sejersen, Ronny; Sallnäs, Eva-Lotta; Rosén, Annika

    2011-10-12

    Yearly surveys among the undergraduate students in oral and maxillofacial surgery at Karolinska Institutet have conveyed a wish for increased clinical training, and in particular, in surgical removal of mandibular third molars. Due to lack of resources, this kind of clinical supervision has so far not been possible to implement. One possible solution to this problem might be to introduce simulation into the curriculum. The purpose of this study was to investigate undergraduate students' perception of two different simulation methods for practicing clinical reasoning skills and technical skills in oral and maxillofacial surgery. Forty-seven students participating in the oral and maxillofacial surgery course at Karolinska Institutet during their final year were included. Three different oral surgery patient cases were created in a Virtual Patient (VP) Simulation system (Web-SP) and used for training clinical reasoning. A mandibular third molar surgery simulator with tactile feedback, providing hands on training in the bone removal and tooth sectioning in third molar surgery, was also tested. A seminar was performed using the combination of these two simulators where students' perception of the two different simulation methods was assessed by means of a questionnaire. The response rate was 91.5% (43/47). The students were positive to the VP cases, although they rated their possible improvement of clinical reasoning skills as moderate. The students' perception of improved technical skills after training in the mandibular third molar surgery simulator was rated high. The majority of the students agreed that both simulation techniques should be included in the curriculum and strongly agreed that it was a good idea to use the two simulators in concert. The importance of feedback from the senior experts during simulator training was emphasised. 
The two tested simulation methods were well accepted and most students agreed that the future curriculum would benefit from permanent inclusion of these exercises, especially when used in combination. The results also stress the importance of teaching technical skills and clinical reasoning in concert.

  11. Operational forest management planning methods: proceedings, meeting of steering systems project group, International Union of Forestry Research Organizations, Bucharest, Romania, June 18-24, 1978

    Treesearch

    Daniel Navon

    1978-01-01

    These 14 papers were submitted to a conference of Project Group P4.07 Division IV, International Union of Forestry Research Organizations. Topics discussed included the uses of simulations, analytical techniques, and mathematical programming techniques in land management planning, reforestation programs, intensive forestry, timber management and production, tree growth...

  12. Algorithmic developments of the kinetic activation-relaxation technique: Accessing long-time kinetics of larger and more complex systems

    NASA Astrophysics Data System (ADS)

    Trochet, Mickaël; Sauvé-Lacoursière, Alecsandre; Mousseau, Normand

    2017-10-01

    In spite of the considerable computer speed increase of the last decades, long-time atomic simulations remain a challenge, and most molecular dynamics simulations in condensed matter and materials science are limited to 1 μs at the very best. There is a need, therefore, for accelerated methods that can bridge the gap between the full dynamical description of molecular dynamics and experimentally relevant time scales. This is the goal of the kinetic Activation-Relaxation Technique (k-ART), an off-lattice kinetic Monte Carlo method with on-the-fly catalog-building capabilities, based on the topological tool NAUTY and the open-ended search method Activation-Relaxation Technique (ART nouveau), that has been applied with success to the study of long-time kinetics of complex materials, including grain boundaries, alloys, and amorphous materials. We present a number of recent algorithmic additions, including the use of local force calculations, two-level parallelization, improved topological description, and biased sampling, and show how they perform in two applications linked to defect diffusion and relaxation after ion bombardment in Si.

  13. Evaluation of gravimetric techniques to estimate the microvascular filtration coefficient

    PubMed Central

    Dongaonkar, R. M.; Laine, G. A.; Stewart, R. H.

    2011-01-01

    Microvascular permeability to water is characterized by the microvascular filtration coefficient (Kf). Conventional gravimetric techniques to estimate Kf rely on data obtained from either transient or steady-state increases in organ weight in response to increases in microvascular pressure. The two techniques yield considerably different estimates, and neither accounts for interstitial fluid storage and lymphatic return. We therefore developed a theoretical framework to evaluate Kf estimation techniques by 1) comparing conventional techniques to a novel technique that includes effects of interstitial fluid storage and lymphatic return, 2) evaluating the ability of conventional techniques to reproduce Kf from simulated gravimetric data generated by a realistic interstitial fluid balance model, 3) analyzing new data collected from rat intestine, and 4) analyzing previously reported data. These approaches revealed that the steady-state gravimetric technique yields estimates that are not directly related to Kf and are in some cases directly proportional to interstitial compliance. However, the transient gravimetric technique yields accurate estimates in some organs, because the typical experimental duration minimizes the effects of interstitial fluid storage and lymphatic return. Furthermore, our analytical framework reveals that the supposed requirement of tying off all draining lymphatic vessels for the transient technique is unnecessary. Finally, our numerical simulations indicate that our comprehensive technique accurately reproduces the value of Kf in all organs, is not confounded by interstitial storage and lymphatic return, and provides corroboration of the estimate from the transient technique. PMID:21346245
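The transient gravimetric technique amounts to fitting the initial slope of the organ-weight record after a pressure step; a minimal sketch with made-up numbers (the data, units, and fit window are hypothetical, and the real analysis is more involved):

```python
import numpy as np

def kf_transient(t, weight, delta_p, fit_window=3):
    """Transient gravimetric estimate: Kf = (initial dW/dt) / (pressure
    step). Only the first few points are fitted, before interstitial
    storage and lymphatic return flatten the weight record."""
    slope = np.polyfit(t[:fit_window], weight[:fit_window], 1)[0]
    return slope / delta_p

t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])        # min after the pressure step
w = np.array([10.0, 10.4, 10.8, 11.1, 11.3])   # organ weight, g (toy data)
kf = kf_transient(t, w, delta_p=10.0)          # 10 mmHg venous pressure step
print(kf)   # ~0.04 g/min per mmHg
```

Restricting the fit to the earliest points is exactly why, per the abstract, the transient technique is less confounded by storage and lymphatic return than the steady-state one.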

  14. Effects of including surface depressions in the application of the Precipitation-Runoff Modeling System in the Upper Flint River Basin, Georgia

    USGS Publications Warehouse

    Viger, Roland J.; Hay, Lauren E.; Jones, John W.; Buell, Gary R.

    2010-01-01

    This report documents an extension of the Precipitation Runoff Modeling System that accounts for the effect of a large number of water-holding depressions in the land surface on the hydrologic response of a basin. Several techniques for developing the inputs needed by this extension also are presented. These techniques include the delineation of the surface depressions, the generation of volume estimates for the surface depressions, and the derivation of model parameters required to describe these surface depressions. This extension is valuable for applications in basins where surface depressions are too small or numerous to conveniently model as discrete spatial units, but where the aggregated storage capacity of these units is large enough to have a substantial effect on streamflow. In addition, this report documents several new model concepts that were evaluated in conjunction with the depression storage functionality, including "hydrologically effective" imperviousness, rates of hydraulic conductivity, and daily streamflow routing. All of these techniques are demonstrated as part of an application in the Upper Flint River Basin, Georgia. Simulated solar radiation, potential evapotranspiration, and water balances match observations well, with small errors in the first two quantities for June and August because temperatures in those months differed between the calibration and evaluation periods. Daily runoff simulations show increasing accuracy with streamflow and a good fit overall. Including surface depression storage in the model has the effect of decreasing daily streamflow for all but the lowest flow values. The report discusses the choices, and their resultant effects, involved in delineating and parameterizing these features. The remaining enhancements to the model and its application provide a more realistic description of basin geography and hydrology that serve to constrain the calibration process to more physically realistic parameter values.
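The aggregated effect of depression storage on runoff can be sketched as a simple fill-and-spill bucket (a toy model under assumed units, not the Precipitation-Runoff Modeling System extension itself, which also handles evaporation and seepage from the depressions):

```python
def route_with_depressions(precip, dep_capacity, dep_storage=0.0):
    """Fill-and-spill: each step's surface water first tops up the
    depression storage; only the spill becomes runoff, so all but the
    largest flows are reduced."""
    runoff = []
    for p in precip:
        dep_storage += p
        spill = max(0.0, dep_storage - dep_capacity)
        dep_storage -= spill
        runoff.append(spill)
    return runoff, dep_storage

flows, stored = route_with_depressions([5.0, 2.0, 8.0], dep_capacity=6.0)
print(flows, stored)  # [0.0, 1.0, 8.0] 6.0
```

Once the depressions are full, additional water passes through unattenuated, which matches the reported behaviour of decreased streamflow for all but the highest flows.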

  15. Petascale Kinetic Simulations in Space Sciences: New Simulations and Data Discovery Techniques and Physics Results

    NASA Astrophysics Data System (ADS)

    Karimabadi, Homa

    2012-03-01

    Recent advances in simulation technology and hardware are enabling breakthrough science where many longstanding problems can now be addressed for the first time. In this talk, we focus on kinetic simulations of the Earth's magnetosphere and the magnetic reconnection process, the key mechanism that breaks the protective shield of the Earth's dipole field, allowing the solar wind to enter the Earth's magnetosphere. This leads to so-called space weather, in which storms on the Sun can affect space-borne and ground-based technological systems on Earth. The talk will consist of three parts: (a) an overview of a new multi-scale simulation technique in which each computational grid is updated based on its own unique timestep; (b) presentation of a new approach to data analysis that we refer to as physics mining, which entails combining data mining and computer vision algorithms with scientific visualization to extract physics from the resulting massive data sets; and (c) presentation of several recent discoveries in studies of space plasmas, including the role of vortex formation and resulting turbulence in magnetized plasmas.

  16. Measurements of Deposition, Lung Surface Area and Lung Fluid for Simulation of Inhaled Compounds.

    PubMed

    Fröhlich, Eleonore; Mercuri, Annalisa; Wu, Shengqian; Salar-Behzadi, Sharareh

    2016-01-01

    Modern strategies in drug development employ in silico techniques in the design of compounds as well as in estimating pharmacokinetic, pharmacodynamic and toxicity parameters. The quality of the results depends on the software algorithm, the data library, and the input data. Compared to simulations of absorption, distribution, metabolism, excretion, and toxicity of oral drug compounds, relatively few studies report predictions of the pharmacokinetics and pharmacodynamics of inhaled substances. For calculation of the drug concentration at the absorption site, the pulmonary epithelium, physiological parameters such as lung surface area and distribution volume (lung lining fluid) have to be known. These parameters can only be determined by invasive techniques and by postmortem studies, and very different values have been reported in the literature. This review addresses the state of software programs for simulation of orally inhaled substances and focuses on problems in the determination of particle deposition, lung surface area and lung lining fluid. The different surface areas for deposition and for drug absorption are difficult to include directly in the simulations. As drug levels are influenced by multiple parameters, the role of single parameters in the simulations cannot be identified easily.

  17. Studies of the Low-energy Gamma Background

    NASA Astrophysics Data System (ADS)

    Bikit, K.; Mrđa, D.; Bikit, I.; Slivka, J.; Veskovic, M.; Knezevic, D.

    Investigation of the contributions to the low-energy part of the background gamma spectrum (below 100 keV), and knowledge of the detection efficiency in this region, are important for both fundamental and applied research. In this work, the components contributing to the low-energy region of the background gamma spectrum of a shielded detector are analyzed, including the production and spectral distribution of muon-induced continuous low-energy radiation in the vicinity of a high-purity germanium detector. In addition, the detection efficiency for the low-energy gamma region is determined using the GEANT4 simulation package. This technique offers an excellent opportunity to predict the detector response in this region. Unfortunately, the often poorly known dead-layer thickness on the surface of the extended-range detector, as well as some processes not incorporated in the simulation (e.g. charge collection from the detector active volume), may limit the reliability of the simulation technique. Thus, the 14, 17, 21, 26, 33 and 59.5 keV transitions of a calibrated 241Am point source were used to check the simulated efficiencies.

  18. Demonstrating Newton's Third Law: Changing Aristotelian Viewpoints.

    ERIC Educational Resources Information Center

    Roach, Linda E.

    1992-01-01

    Suggests techniques to help eliminate students' misconceptions involving Newton's Third Law. Approaches suggested include teaching physics from a historical perspective, using computer programs with simulations, rewording the law, drawing free-body diagrams, and using demonstrations and examples. (PR)

  19. Retrieval of Precipitation Profiles from Multiresolution, Multifrequency, Active and Passive Microwave Observations

    NASA Technical Reports Server (NTRS)

    Grecu, Mircea; Anagnostou, Emmanouil N.; Olson, William S.; Starr, David OC. (Technical Monitor)

    2002-01-01

    In this study, a technique for estimating vertical profiles of precipitation from multifrequency, multiresolution active and passive microwave observations is investigated using both simulated and airborne data. The technique is applicable to the Tropical Rainfall Measuring Mission (TRMM) satellite multifrequency active and passive observations. These observations are characterized by various spatial and sampling resolutions, which makes the retrieval problem mathematically more difficult and ill-determined because the quality of information decreases with decreasing resolution. A model that, given reflectivity profiles and a small set of parameters (including the cloud water content, the drop size distribution intercept, and a variable describing the frozen hydrometeor properties), simulates high-resolution brightness temperatures is used. The high-resolution simulated brightness temperatures are convolved to the real sensor resolution. An optimal estimation procedure is used to minimize the differences between simulated and observed brightness temperatures. The retrieval technique is investigated using cloud model synthetic data and airborne data from the Fourth Convection And Moisture Experiment. Simulated high-resolution brightness temperatures and reflectivities, as well as the airborne observations, are convolved to the resolution of the TRMM instruments, and retrievals are performed and analyzed relative to the reference data used in the observation synthesis. An illustration of the possible use of the technique in satellite rainfall estimation is presented through an application to TRMM data. The study suggests improvements in combined active and passive retrievals even when the instrument resolutions are significantly different. Future work needs to better quantify retrieval performance, especially in connection with satellite applications, and the uncertainty of the models used in the retrieval.

  20. Design of a digital voice data compression technique for orbiter voice channels

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Candidate techniques were investigated for digital voice compression to a transmission rate of 8 kbps. Good voice quality, speaker recognition, and robustness in the presence of error bursts were considered. The technique of delayed-decision adaptive predictive coding is described and compared with conventional adaptive predictive coding. Results include a set of experimental simulations recorded on analog tape. The two FM broadcast segments produced show the delayed-decision technique to be virtually undegraded or minimally degraded at 0.001 and 0.01 Viterbi decoder bit error rates, respectively. Preliminary estimates of the hardware complexity of this technique indicate potential for implementation in space shuttle orbiters.
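As a rough illustration of the prediction principle behind adaptive predictive coding, the sketch below fits a single-tap linear predictor to a correlated signal and shows that the prediction residual carries far less energy than the signal itself, which is what makes low-rate transmission of the residual feasible. The test signal, the first-order predictor, and all names are illustrative assumptions, not the report's actual coder (which adapts higher-order predictors and adds delayed-decision search).

```python
import math

def lpc1_residual(signal):
    """First-order linear prediction: predict s[n] as a*s[n-1].

    Returns (a, residual), where a minimizes the residual energy.
    Toy sketch of the prediction step underlying adaptive predictive
    coding; real APC re-fits higher-order predictors block by block.
    """
    r0 = sum(x * x for x in signal[:-1])
    r1 = sum(signal[n] * signal[n - 1] for n in range(1, len(signal)))
    a = r1 / r0 if r0 else 0.0
    residual = [signal[0]] + [signal[n] - a * signal[n - 1]
                              for n in range(1, len(signal))]
    return a, residual

# A strongly correlated test signal: a low-frequency sinusoid.
s = [math.sin(2 * math.pi * 0.01 * n) for n in range(1000)]
a, e = lpc1_residual(s)
energy_in = sum(x * x for x in s)
energy_out = sum(x * x for x in e)
# For correlated speech-like signals the residual energy is far smaller,
# so coding the residual needs far fewer bits than coding the waveform.
```

For this sinusoid the optimal tap is close to 1 and the residual retains well under 5% of the input energy; speech is less predictable, hence the need for adaptation.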

  1. The formation of disc galaxies in high-resolution moving-mesh cosmological simulations

    NASA Astrophysics Data System (ADS)

    Marinacci, Federico; Pakmor, Rüdiger; Springel, Volker

    2014-01-01

    We present cosmological hydrodynamical simulations of eight Milky Way-sized haloes that had previously been studied with dark matter only in the Aquarius project. For the first time, we employ the moving-mesh code AREPO in zoom simulations combined with a comprehensive model for galaxy formation physics designed for large cosmological simulations. In most of the eight haloes, our simulations form strongly disc-dominated systems with realistic rotation curves, close-to-exponential surface density profiles, a stellar mass to halo mass ratio that matches expectations from abundance matching techniques, and galaxy sizes and ages consistent with expectations from large galaxy surveys in the local Universe. There is no evidence for any dark matter core formation in our simulations, even though they include repeated baryonic outflows by supernova-driven winds and black hole quasar feedback. For one of our haloes, the object studied in the recent `Aquila' code comparison project, we carried out a resolution study with our techniques, covering a dynamic range of 64 in mass resolution. Without any change in our feedback parameters, the final galaxy properties are reassuringly similar, in contrast to other modelling techniques used in the field that are inherently resolution dependent. This success in producing realistic disc galaxies is achieved, in the context of our interstellar medium treatment, without resorting to a high density threshold for star formation, a low star formation efficiency, or early stellar feedback, factors deemed crucial for disc formation by other recent numerical studies.

  2. Modeling the effects of forest management on in situ and ex situ longleaf pine forest carbon stocks

    Treesearch

    C.A. Gonzalez-Benecke; L.J. Samuelson; T.A. Martin; W.P. Cropper Jr; Kurt Johnsen; T.A. Stokes; John Butnor; P.H. Anderson

    2015-01-01

    Assessment of forest carbon storage dynamics requires a variety of techniques including simulation models. We developed a hybrid model to assess the effects of silvicultural management systems on carbon (C) budgets in longleaf pine (Pinus palustris Mill.) plantations in the southeastern U.S. To simulate in situ C pools, the model integrates a growth and yield model...

  3. Status of the Electroforming Shield Design (ESD) project

    NASA Technical Reports Server (NTRS)

    Fletcher, R. E.

    1977-01-01

    The utilization of a digital computer to augment electrodeposition/electroforming processes, in which nonconducting shielding controls the local cathodic current distribution, is reported. The physical principles underlying electrodeposition are presented, along with the technical approach taken to analytically simulate electrolytic tank variables. A FORTRAN computer program has been developed and implemented; it uses finite element techniques and electrostatic theory to simulate electropotential fields and ionic transport.

  4. Asphalt pavement aging and temperature dependent properties using functionally graded viscoelastic model

    NASA Astrophysics Data System (ADS)

    Dave, Eshan V.

    Asphalt concrete pavements are inherently graded viscoelastic structures; oxidative aging of the asphalt binder and temperature cycling due to climatic conditions are the major causes of this non-homogeneity. Current pavement analysis and simulation procedures rely on a layered approach to account for these non-homogeneities. The conventional finite-element modeling (FEM) technique discretizes the problem domain into smaller elements, each with a unique constitutive property. However, the assignment of a unique material property description to each element makes the standard FEM approach unattractive for simulating problems with material non-homogeneities. Specialized elements known as "graded elements" allow non-homogeneous material property definitions within an element. This dissertation describes the development of a graded viscoelastic finite element analysis method and its application to the analysis of asphalt concrete pavements. Results show that the present research improves the efficiency and accuracy of simulations of asphalt pavement systems. Practical implications of this work include the new technique's capability for accurate analysis and design of asphalt pavements and overlay systems and for the determination of pavement performance under varying climatic conditions and in-service age. Other application areas include simulation of functionally graded fiber-reinforced concrete, geotechnical materials, metals and metal composites at high temperatures, polymers, and several other naturally occurring and engineered materials.

  5. Simulation of realistic retinoscopic measurement

    NASA Astrophysics Data System (ADS)

    Tan, Bo; Chen, Ying-Ling; Baker, K.; Lewis, J. W.; Swartz, T.; Jiang, Y.; Wang, M.

    2007-03-01

    Realistic simulation of ophthalmic measurements on normal and diseased eyes is presented. We use clinical data from ametropic and keratoconus patients to construct anatomically accurate three-dimensional eye models and simulate the measurement of a streak retinoscope with all of its optical elements. The results reproduce the clinical observations, including the anomalous motion in high myopia and the scissors reflex in keratoconus. The demonstrated technique can be applied to other ophthalmic instruments and to other, more extensively abnormal eye conditions. It provides promising features for medical training and for evaluating and developing ocular instruments.

  6. Modeling of turbulent separated flows for aerodynamic applications

    NASA Technical Reports Server (NTRS)

    Marvin, J. G.

    1983-01-01

    Steady, high speed, compressible separated flows modeled through numerical simulations resulting from solutions of the mass-averaged Navier-Stokes equations are reviewed. Emphasis is placed on benchmark flows that represent simplified (but realistic) aerodynamic phenomena. These include impinging shock waves, compression corners, glancing shock waves, trailing edge regions, and supersonic high angle of attack flows. A critical assessment of modeling capabilities is provided by comparing the numerical simulations with experiment. The importance of combining experiment, numerical algorithm, grid, and turbulence model to effectively develop this potentially powerful simulation technique is stressed.

  7. The statistical significance of error probability as determined from decoding simulations for long codes

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1976-01-01

    The very low error probability obtained with long error-correcting codes results in a very small number of observed errors in simulation studies of practical size and renders the usual confidence interval techniques inapplicable to the observed error probability. A natural extension of the notion of a 'confidence interval' is made and applied to such determinations of error probability by simulation. An example is included to show the surprisingly great significance of as few as two decoding errors in a very large number of decoding trials.
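One standard exact construction for this situation (not necessarily the paper's own extension) is the Clopper-Pearson bound: the largest error probability that would still plausibly produce so few errors. The hedged sketch below computes the exact binomial upper confidence limit by bisection on the binomial CDF; the specific numbers (2 errors in 10^5 trials, 95% confidence) are illustrative and not taken from the paper.

```python
import math

def log_binom_pmf(k, n, p):
    """Log of the binomial pmf, using lgamma so large n is safe."""
    if p <= 0.0:
        return 0.0 if k == 0 else float("-inf")
    if p >= 1.0:
        return 0.0 if k == n else float("-inf")
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(p) + (n - k) * math.log1p(-p))

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p); only k+1 terms are needed."""
    return sum(math.exp(log_binom_pmf(i, n, p)) for i in range(k + 1))

def clopper_pearson_upper(k, n, alpha=0.05):
    """Exact upper confidence bound for an error probability when only
    k errors were observed in n trials: the largest p such that
    P(X <= k; n, p) >= alpha/2, found by bisection."""
    lo, hi = k / n, 1.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if binom_cdf(k, n, mid) >= alpha / 2:
            lo = mid
        else:
            hi = mid
    return hi

# Two decoding errors in 10^5 trials: the point estimate is 2e-5, but the
# exact 95% upper bound is roughly 3.6 times larger.
upper = clopper_pearson_upper(2, 100_000)
```

This makes the abstract's point concrete: with only two observed errors, the true error probability could be several times the observed rate, so the handful of errors is highly significant but the point estimate alone is not trustworthy.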

  8. Stochastic Effects in Computational Biology of Space Radiation Cancer Risk

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Pluth, Janis; Harper, Jane; O'Neill, Peter

    2007-01-01

    Estimating risk from space radiation poses important questions on the radiobiology of protons and heavy ions. We are considering systems biology models to study radiation-induced repair foci (RIRF) at low doses, in which, on average, less than one track traverses the cell, and the subsequent DNA damage processing and signal transduction events. Computational approaches for describing protein regulatory networks coupled to DNA and oxidative damage sites include systems of differential equations, stochastic equations, and Monte Carlo simulations. We review recent developments in the mathematical description of protein regulatory networks and possible approaches to radiation effects simulation. These include robustness, which states that regulatory networks maintain their functions against external and internal perturbations due to compensating properties of redundancy and molecular feedback controls, and modularity, which leads to general theorems for considering molecules that interact through a regulatory mechanism without exchange of matter, leading to a block-diagonal reduction of the connecting pathways. Identifying rate-limiting steps, robustness, and modularity in pathways perturbed by radiation damage is shown to be a valid technique for reducing large molecular systems to realistic computer simulations. Other techniques studied are the use of steady-state analysis and the introduction of composite molecules or rate constants to represent small collections of reactants. Applications of these techniques to describe spatial and temporal distributions of RIRF and cell populations following low-dose irradiation are described.
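Of the approaches listed, the stochastic/Monte Carlo one is easy to sketch. Below is a minimal Gillespie-style stochastic simulation of a hypothetical two-reaction model: an unresolved damage site becomes a visible repair focus at one rate, and a focus is resolved at another. The species, rate constants and initial counts are illustrative assumptions, not the paper's model.

```python
import random

def gillespie_foci(damage_sites=50, k_form=0.5, k_resolve=0.1,
                   t_end=50.0, seed=1):
    """Minimal Gillespie stochastic simulation algorithm for two
    reactions (hypothetical rates): site -> focus at rate k_form per
    site, focus -> resolved at rate k_resolve per focus.
    Returns the trajectory [(t, foci), ...]."""
    rng = random.Random(seed)
    t, sites, foci = 0.0, damage_sites, 0
    traj = [(t, foci)]
    while t < t_end:
        a1 = k_form * sites      # propensity: focus formation
        a2 = k_resolve * foci    # propensity: focus resolution
        a0 = a1 + a2
        if a0 == 0.0:
            break                # nothing left to react
        t += rng.expovariate(a0)         # exponential time to next event
        if rng.random() < a1 / a0:       # pick which reaction fires
            sites -= 1
            foci += 1
        else:
            foci -= 1
        traj.append((t, foci))
    return traj

traj = gillespie_foci()
```

At low doses the copy numbers are small, which is exactly the regime where such discrete stochastic trajectories differ qualitatively from the deterministic ODE solution.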

  9. N-S/DSMC hybrid simulation of hypersonic flow over blunt body including wakes

    NASA Astrophysics Data System (ADS)

    Li, Zhonghua; Li, Zhihui; Li, Haiyan; Yang, Yanguang; Jiang, Xinyu

    2014-12-01

    A hybrid N-S/DSMC method is presented and applied to solve three-dimensional hypersonic transitional flows, employing the Modular Particle-Continuum (MPC) technique based on the N-S equations and the DSMC method. A sub-relaxation technique is adopted to handle information transfer between the N-S and DSMC regions. The hypersonic flows over a 70-deg spherically blunted cone at different Knudsen numbers are simulated using the CFD, DSMC and hybrid N-S/DSMC methods. The present computations are found to be in good agreement with DSMC and experimental results. The present method provides an efficient way to predict hypersonic aerodynamics in the near-continuum transitional flow regime.

  10. Simulation of wind turbine wakes using the actuator line technique.

    PubMed

    Sørensen, Jens N; Mikkelsen, Robert F; Henningson, Dan S; Ivanell, Stefan; Sarmast, Sasan; Andersen, Søren J

    2015-02-28

    The actuator line technique was introduced as a numerical tool to be employed in combination with large eddy simulations to enable the study of wakes and wake interaction in wind farms. The technique is today largely used for studying basic features of wakes as well as for making performance predictions of wind farms. In this paper, we give a short introduction to the wake problem and the actuator line methodology and present a study in which the technique is employed to determine the near-wake properties of wind turbines. The presented results include a comparison of experimental results of the wake characteristics of the flow around a three-bladed model wind turbine, the development of a simple analytical formula for determining the near-wake length behind a wind turbine and a detailed investigation of wake structures based on proper orthogonal decomposition analysis of numerically generated snapshots of the wake.

  11. Discrete-time modelling of musical instruments

    NASA Astrophysics Data System (ADS)

    Välimäki, Vesa; Pakarinen, Jyri; Erkut, Cumhur; Karjalainen, Matti

    2006-01-01

    This article describes physical modelling techniques that can be used for simulating musical instruments. The methods are closely related to digital signal processing. They discretize the system with respect to time, because the aim is to run the simulation using a computer. The physics-based modelling methods can be classified as mass-spring, modal, wave digital, finite difference, digital waveguide and source-filter models. We present the basic theory and a discussion on possible extensions for each modelling technique. For some methods, a simple model example is chosen from the existing literature demonstrating a typical use of the method. For instance, in the case of the digital waveguide modelling technique a vibrating string model is discussed, and in the case of the wave digital filter technique we present a classical piano hammer model. We tackle some nonlinear and time-varying models and include new results on the digital waveguide modelling of a nonlinear string. Current trends and future directions in physical modelling of musical instruments are discussed.
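The digital waveguide string mentioned above can be sketched in a few lines: a delay line whose length sets the pitch, with a lossy lowpass reflection in the feedback loop (the Karplus-Strong form of the idea). The sampling rate, loss factor and noise-burst excitation below are illustrative assumptions, not a model from the article.

```python
import random

def waveguide_pluck(freq=440.0, fs=44100, seconds=0.5, decay=0.996, seed=0):
    """Toy digital waveguide / Karplus-Strong string: a delay line of
    length fs/freq with lowpass-filtered feedback.  Averaging adjacent
    samples models frequency-dependent losses at the terminations;
    `decay` is a hypothetical overall loss factor."""
    rng = random.Random(seed)
    n = int(fs / freq)                                  # delay-line length
    line = [rng.uniform(-1.0, 1.0) for _ in range(n)]   # pluck = noise burst
    out = []
    for _ in range(int(fs * seconds)):
        s = line.pop(0)
        out.append(s)
        # reflection: decayed average of the two newest loop samples
        line.append(decay * 0.5 * (s + line[0]))
    return out

samples = waveguide_pluck()
```

The averaging filter damps high harmonics faster than the fundamental, which is why the tone decays from a bright attack to a nearly pure sinusoid, much like a real plucked string.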

  12. Land Surface Data Assimilation and the Northern Gulf Coast Land/Sea Breeze

    NASA Technical Reports Server (NTRS)

    Lapenta, William M.; Blackwell, Keith; Suggs, Ron; McNider, Richard T.; Jedlovec, Gary; Kimball, Sytske; Arnold, James E. (Technical Monitor)

    2002-01-01

    A technique has been developed for assimilating GOES-derived skin temperature tendencies and insolation into the surface energy budget equation of a mesoscale model so that the simulated rate of temperature change closely agrees with the satellite observations. A critical assumption of the technique is that the availability of moisture (either from the soil or vegetation) is the least known term in the model's surface energy budget. Therefore, the simulated latent heat flux, which is a function of surface moisture availability, is adjusted based upon differences between the modeled and satellite-observed skin temperature tendencies. An advantage of this technique is that satellite temperature tendencies are assimilated in an energetically consistent manner that avoids the energy imbalances and surface stability problems that arise from direct assimilation of surface shelter temperatures. Because the rate of change of the satellite skin temperature is used rather than the absolute temperature, sensor calibration is not as critical. The sea/land breeze is a well-documented mesoscale circulation that affects many coastal areas of the world, including the northern Gulf Coast of the United States. The focus of this paper is to examine how the satellite assimilation technique impacts the simulation of a sea breeze circulation observed along the Mississippi/Alabama coast in the spring of 2001. The technique is implemented within the PSU/NCAR MM5 V3-4 and applied on a 4-km domain for this particular application. It is recognized that a 4-km grid spacing is too coarse to explicitly resolve the detailed, mesoscale structure of sea breezes. Nevertheless, the model can forecast certain characteristics of the observed sea breeze, including a thermally direct circulation that results from differential low-level heating across the land-sea interface. Our intent is to determine the sensitivity of the circulation to the differential land surface forcing produced via the assimilation of GOES skin temperature tendencies. Results will be quantified through statistical verification techniques.
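The adjustment idea, stripped of all the meteorology, is a feedback loop: if the modeled skin temperature tendency exceeds the satellite-observed one, raise the moisture availability so that extra latent cooling reduces the tendency, and vice versa. The toy below nudges a scalar moisture availability M toward agreement with an observed tendency. The linear tendency model, its coefficients, and the gain are illustrative assumptions, not the MM5 surface energy budget.

```python
def assimilate_moisture(m_init, dT_obs, steps=200, gain=0.05):
    """Toy version of the tendency-assimilation idea: the modeled skin
    temperature tendency depends on moisture availability M in [0, 1]
    (more moisture -> more latent cooling -> smaller warming tendency),
    and M is nudged until modeled and observed tendencies agree."""
    def model_tendency(m):
        # hypothetical linear model: 2.0 K/h dry, 0.4 K/h saturated
        return 2.0 - 1.6 * m

    m = m_init
    for _ in range(steps):
        mismatch = model_tendency(m) - dT_obs   # model too warm if > 0
        m += gain * mismatch                    # raise M -> more latent flux
        m = min(1.0, max(0.0, m))               # keep M physical
    return m

# If the satellite-observed tendency is 1.2 K/h, the loop converges to
# the M that makes the model tendency match, here M = 0.5.
m = assimilate_moisture(m_init=0.1, dT_obs=1.2)
```

The same recovered M results from any starting guess, which mirrors the appeal of the real technique: the poorly known moisture term is inferred from well-observed tendencies rather than prescribed.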

  13. Determination of the transmission coefficients for quantum structures using FDTD method.

    PubMed

    Peng, Yangyang; Wang, Xiaoying; Sui, Wenquan

    2011-12-01

    The purpose of this work is to develop a simple method to incorporate quantum effects into traditional finite-difference time-domain (FDTD) simulators, which would make it possible to co-simulate systems that include both quantum structures and traditional components. In this paper, the tunneling transmission coefficient is calculated by solving the time-domain Schrödinger equation with a developed FDTD technique, called the FDTD-S method. To validate the feasibility of the method, a simple resonant tunneling diode (RTD) structure has been simulated using the proposed method. The good agreement between the numerical and analytical results demonstrates its accuracy. The effectiveness and accuracy of this approach make it a potential method for the analysis and design of hybrid systems that include quantum structures and traditional components.
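A common way to step the time-domain Schrödinger equation on an FDTD-style grid is to split the wavefunction into real and imaginary parts and update them alternately, in direct analogy with the E/H leapfrog of electromagnetic FDTD. The sketch below is our illustrative reconstruction of that idea for free 1-D propagation of a Gaussian packet (hbar = m = 1); it is not the paper's FDTD-S implementation, and a transmission-coefficient study would additionally place a barrier in V and compare transmitted to incident probability flux.

```python
import math

def schrodinger_fdtd(nx=400, dx=0.1, dt=0.001, steps=500):
    """Alternate-update FDTD scheme for i dpsi/dt = H psi with
    H = -(1/2) d2/dx2 + V:  dpsi_r/dt = H psi_i, dpsi_i/dt = -H psi_r.
    Returns (initial_norm, final_norm); near-conservation of the norm
    is a basic sanity check on the scheme.  Grid sizes and the
    Gaussian packet are illustrative."""
    V = [0.0] * nx   # free propagation; put a barrier here for tunneling
    x0, sigma, k0 = nx * dx / 4, 1.0, 2.0    # packet centre, width, momentum
    psi_r, psi_i = [0.0] * nx, [0.0] * nx
    for i in range(nx):
        x = i * dx
        g = math.exp(-((x - x0) ** 2) / (2 * sigma ** 2))
        psi_r[i] = g * math.cos(k0 * x)
        psi_i[i] = g * math.sin(k0 * x)
    norm0 = sum(r * r + im * im for r, im in zip(psi_r, psi_i)) * dx

    c = 0.5 / dx ** 2
    for _ in range(steps):
        for i in range(1, nx - 1):           # update real part from imag
            lap = psi_i[i + 1] - 2 * psi_i[i] + psi_i[i - 1]
            psi_r[i] += dt * (-c * lap + V[i] * psi_i[i])
        for i in range(1, nx - 1):           # update imag part from new real
            lap = psi_r[i + 1] - 2 * psi_r[i] + psi_r[i - 1]
            psi_i[i] -= dt * (-c * lap + V[i] * psi_r[i])
    norm = sum(r * r + im * im for r, im in zip(psi_r, psi_i)) * dx
    return norm0, norm

n0, n1 = schrodinger_fdtd()
```

The staggered update is stable for sufficiently small dt relative to dx**2, and the total probability stays essentially constant over the run, which is the minimal correctness check before adding a potential barrier.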

  14. Space Simulation, 7th. [facilities and testing techniques

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Space simulation facilities and techniques are outlined that encompass thermal scale modeling, computerized simulations, reentry materials, spacecraft contamination, solar simulation, vacuum tests, and heat transfer studies.

  15. Interpretative Communities in Conflict: A Master Syllabus for Political Communication.

    ERIC Educational Resources Information Center

    Smith, Craig Allen

    1992-01-01

    Advocates the interpretive communities approach to teaching political communication. Discusses philosophical issues in the teaching of political communication courses, and pedagogical techniques (including concepts versus cases, clustering examples, C-SPAN video examples, and simulations and games). (SR)

  16. The OSSE Framework at the NASA Global Modeling and Assimilation Office (GMAO)

    NASA Astrophysics Data System (ADS)

    Moradi, I.; Prive, N.; McCarty, W.; Errico, R. M.; Gelaro, R.

    2017-12-01

    This abstract summarizes the OSSE framework developed at the Global Modeling and Assimilation Office at the National Aeronautics and Space Administration (NASA/GMAO). Some of the OSSE techniques developed at GMAO, including the simulation of realistic observations (e.g., adding errors to simulated observations), are now widely used by the community to evaluate the impact of new observations on weather forecasts. This talk presents some of the recent progress and challenges in simulating realistic observations, radiative transfer modeling support for the GMAO OSSE activities, assimilation of OSSE observations into data assimilation systems, and evaluation of the impact of simulated observations on forecast skill.

  17. The OSSE Framework at the NASA Global Modeling and Assimilation Office (GMAO)

    NASA Technical Reports Server (NTRS)

    Moradi, Isaac; Prive, Nikki; McCarty, Will; Errico, Ronald M.; Gelaro, Ron

    2017-01-01

    This abstract summarizes the OSSE framework developed at the Global Modeling and Assimilation Office at the National Aeronautics and Space Administration (NASA/GMAO). Some of the OSSE techniques developed at GMAO, including the simulation of realistic observations (e.g., adding errors to simulated observations), are now widely used by the community to evaluate the impact of new observations on weather forecasts. This talk presents some of the recent progress and challenges in simulating realistic observations, radiative transfer modeling support for the GMAO OSSE activities, assimilation of OSSE observations into data assimilation systems, and evaluation of the impact of simulated observations on forecast skill.

  18. Hawaiian Electric Advanced Inverter Test Plan - Result Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoke, Anderson; Nelson, Austin; Prabakar, Kumaraguru

    This presentation is intended to share the results of lab testing of five PV inverters with the Hawaiian Electric Companies and other stakeholders and interested parties. The tests included baseline testing of advanced inverter grid support functions, as well as distribution circuit-level tests to examine the impact of the PV inverters on simulated distribution feeders using power hardware-in-the-loop (PHIL) techniques.

  19. Fluid Structure Interaction Techniques For Extrusion And Mixing Processes

    NASA Astrophysics Data System (ADS)

    Valette, Rudy; Vergnes, Bruno; Coupez, Thierry

    2007-05-01

    This work focuses on the development of numerical techniques devoted to the simulation of mixing processes of complex fluids, such as twin-screw extrusion or batch mixing. In mixing process simulation, the absence of symmetry of the moving boundaries (the screws or the rotors) implies that their rigid-body motion has to be taken into account by a special treatment. We therefore use a mesh immersion technique (MIT), which consists in using a P1+/P1 (MINI-element) mixed finite element method to solve the velocity-pressure problem, and then solving the problem in the whole barrel cavity by imposing a rigid motion (rotation) on nodes located inside the so-called immersed domain, each sub-domain (screw, rotor) being represented by a surface CAD mesh (or by its mathematical equation in simple cases). The independent meshes are immersed into a unique background computational mesh by computing the distance function to their boundaries. Intersections of meshes are accounted for, which allows a fill factor to be computed and used as in the VOF methodology. This technique, combined with the use of parallel computing, allows the computation of the time-dependent flow of generalized Newtonian fluids, including yield-stress fluids, in a complex system such as a twin-screw extruder, including moving free surfaces, which are treated by a level-set and Hamilton-Jacobi method.
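The immersion step can be illustrated in two dimensions: describe the moving part by a signed distance function on a fixed background grid, then derive a per-cell fill factor from the sign of the distance at the cell corners. In the sketch below the "part" is a disc in the unit square standing in for a rotor cross-section; the corner-counting rule is a crude, assumed stand-in for the real mesh-intersection computation.

```python
import math

def fill_factors(r=0.3, n=40):
    """Toy immersion step: a disc of radius r centred in the unit
    square is described by a signed distance function phi (> 0 inside),
    and each cell of an n x n background grid gets a fill factor from
    the number of its corners lying inside the part."""
    h = 1.0 / n
    def phi(x, y):                      # signed distance to the disc boundary
        return r - math.hypot(x - 0.5, y - 0.5)
    fractions = []
    for i in range(n):
        for j in range(n):
            corners = [(i * h, j * h), ((i + 1) * h, j * h),
                       (i * h, (j + 1) * h), ((i + 1) * h, (j + 1) * h)]
            inside = sum(1 for x, y in corners if phi(x, y) > 0.0)
            fractions.append(inside / 4.0)
    return fractions

f = fill_factors()
# Averaging the fill factors recovers the immersed area fraction,
# which should approximate pi * r**2 ≈ 0.2827 for the disc.
area = sum(f) / len(f)
```

Refining the grid sharpens the interface: cells are mostly 0 or 1, with fractional values confined to a band one cell wide around the boundary, exactly the information a VOF-style treatment needs.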

  20. Study of ceramic products and processing techniques in space. [using computerized simulation

    NASA Technical Reports Server (NTRS)

    Markworth, A. J.; Oldfield, W.

    1974-01-01

    An analysis of the solidification kinetics of beta alumina in a zero-gravity environment was carried out, using computer-simulation techniques, in order to assess the feasibility of producing high-quality single crystals of this material in space. The two coupled transport processes included were movement of the solid-liquid interface and diffusion of sodium atoms in the melt. Results of the simulation indicate that appreciable crystal-growth rates can be attained in space. Considerations were also made of the advantages offered by high-quality single crystals of beta alumina for use as a solid electrolyte; these clearly indicate that space-grown materials are superior in many respects to analogous terrestrially-grown crystals. Likewise, economic considerations, based on the rapidly expanding technological applications for beta alumina and related fast ionic conductors, reveal that the many superior qualities of space-grown material justify the added expense and experimental detail associated with space processing.

  1. Resonant-type MEMS transducers excited by two acoustic emission simulation techniques

    NASA Astrophysics Data System (ADS)

    Ozevin, Didem; Greve, David W.; Oppenheim, Irving J.; Pessiki, Stephen

    2004-07-01

    Acoustic emission testing is a passive nondestructive testing technique used to identify the onset and characteristics of damage through the detection and analysis of transient stress waves. Successful detection and implementation of acoustic emission require good coupling, high transducer sensitivity and the ability to discriminate noise from real signals. We report here the detection of simulated acoustic emission signals using a MEMS chip fabricated in the multi-user polysilicon surface micromachining (MUMPs) process. The chip includes 18 different transducers with 10 different resonant frequencies in the range of 100 kHz to 1 MHz. It was excited by two different source simulation techniques: pencil lead break and impact loading. The former was accomplished by breaking a 0.5 mm lead on the ceramic package. Four transducer outputs were collected simultaneously using a multi-channel oscilloscope. The impact loading was repeated with ball bearings of five different diameters. Traditional acoustic emission waveform analysis methods were applied to both data sets to illustrate the identification of different source mechanisms. In addition, a sliding-window Fourier transform was performed to differentiate frequencies in the time-frequency-amplitude domain. The arrival times and energy content of each resonant frequency were investigated in time-magnitude plots. The advantages of the simultaneous excitation of resonant transducers on one chip are discussed and compared with broadband acoustic emission transducers.
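The sliding-window Fourier transform used here is the ordinary short-time Fourier transform: window a segment, transform it, move the window, and track how the spectrum evolves. The sketch below computes a brute-force windowed DFT and reports the dominant frequency per window; the window length, hop size, Hann window and 200 kHz test tone are illustrative assumptions (a real analysis would use an FFT and the measured transducer signals).

```python
import math

def sliding_dft_peaks(signal, fs, win=64, hop=32):
    """Sliding-window DFT (a plain short-time Fourier transform):
    for each Hann-windowed segment, return (start time, frequency of
    the largest-magnitude bin below Nyquist)."""
    peaks = []
    for start in range(0, len(signal) - win + 1, hop):
        seg = [signal[start + n] * (0.5 - 0.5 * math.cos(2 * math.pi * n / win))
               for n in range(win)]
        best_k, best_mag = 0, -1.0
        for k in range(win // 2):        # brute-force DFT magnitude per bin
            re = sum(seg[n] * math.cos(2 * math.pi * k * n / win)
                     for n in range(win))
            im = -sum(seg[n] * math.sin(2 * math.pi * k * n / win)
                      for n in range(win))
            mag = math.hypot(re, im)
            if mag > best_mag:
                best_k, best_mag = k, mag
        peaks.append((start / fs, best_k * fs / win))
    return peaks

# A steady 200 kHz tone sampled at 2 MHz (within the chip's 100 kHz to
# 1 MHz band) should peak in the bin nearest 200 kHz in every window.
fs = 2_000_000
sig = [math.sin(2 * math.pi * 200_000 * n / fs) for n in range(512)]
peaks = sliding_dft_peaks(sig, fs)
```

Tracking when each resonant bin first rises above the noise floor is exactly the arrival-time information the time-magnitude plots in the paper convey.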

  2. Evaluation of nursing students' work technique after proficiency training in patient transfer methods during undergraduate education.

    PubMed

    Johnsson, A Christina E; Kjellberg, Anders; Lagerström, Monica I

    2006-05-01

    The aim of this study was to investigate whether nursing students improved their work technique when assisting a simulated patient from bed to wheelchair after proficiency training, and whether there was a correlation between the nursing students' work technique and the simulated patients' perceptions of the transfer. Seventy-one students participated in the study, 35 in the intervention group and 36 in the comparison group. The students assisted a simulated patient to move from a bed to a wheelchair. In the intervention group the students made one transfer before and one after training; in the comparison group they made two transfers before training. Six variables were evaluated: work technique score; the nursing students' ratings of comfort, work technique and exertion; and the simulated patients' perceptions of comfort and safety during the transfer. The results showed that the nursing students improved their work technique, and that there was a correlation between the work technique and the simulated patients' subjective ratings of the transfer. In conclusion, nursing students improved their work technique after training in patient transfer methods, and the work technique affected the simulated patients' perceptions of the transfer.

  3. Protein Dynamics from NMR and Computer Simulation

    NASA Astrophysics Data System (ADS)

    Wu, Qiong; Kravchenko, Olga; Kemple, Marvin; Likic, Vladimir; Klimtchuk, Elena; Prendergast, Franklyn

    2002-03-01

    Proteins exhibit internal motions from the millisecond to sub-nanosecond time scale. The challenge is to relate these internal motions to biological function. A strategy to address this aim is to apply a combination of several techniques including high-resolution NMR, computer simulation of molecular dynamics (MD), molecular graphics, and finally molecular biology, the latter to generate appropriate samples. Two difficulties that arise are: (1) the time scale which is most directly biologically relevant (ms to μs) is not readily accessible by these techniques and (2) the techniques focus on local and not collective motions. We will outline methods using ^13C-NMR to help alleviate the second problem, as applied to intestinal fatty acid binding protein, a relatively small intracellular protein believed to be involved in fatty acid transport and metabolism. This work is supported in part by PHS Grant GM34847 (FGP) and by a fellowship from the American Heart Association (QW).

  4. System Identification for the Clipper Liberty C96 Wind Turbine

    NASA Astrophysics Data System (ADS)

    Showers, Daniel

    System identification techniques are powerful tools that help improve modeling capabilities of real-world dynamic systems. These techniques are well established and have been successfully used on countless systems in many areas. However, wind turbines provide a unique challenge for system identification because of the difficulty of measuring their primary input: wind. This thesis first motivates the problem by demonstrating the challenges of wind turbine system identification using both simulations and real data. It then suggests techniques for successfully identifying a dynamic wind turbine model, including the notion of an effective wind speed and how it might be measured. Various levels of simulation complexity are explored for insights into calculating an effective wind speed. In addition, measurements taken from the University of Minnesota's Clipper Liberty C96 research wind turbine are used for a preliminary investigation into the effective wind speed calculation and system identification of a real-world wind turbine.
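The basic mechanics of system identification can be sketched with the simplest case: fit a first-order ARX model y[k] = a*y[k-1] + b*u[k-1] to input-output data by least squares. Here the input u is a random excitation standing in for a measured effective wind speed; the model order, true parameters, and noise-free setting are illustrative assumptions, not the thesis's turbine model.

```python
import random

def identify_arx(u, y):
    """Least-squares fit of a first-order ARX model
    y[k] = a*y[k-1] + b*u[k-1]; solves the 2x2 normal equations
    directly.  The simplest instance of the identification step."""
    s_yy = s_yu = s_uu = s_y1 = s_u1 = 0.0
    for k in range(1, len(y)):
        s_yy += y[k - 1] * y[k - 1]
        s_yu += y[k - 1] * u[k - 1]
        s_uu += u[k - 1] * u[k - 1]
        s_y1 += y[k - 1] * y[k]      # cross terms with the output y[k]
        s_u1 += u[k - 1] * y[k]
    det = s_yy * s_uu - s_yu * s_yu
    a = (s_uu * s_y1 - s_yu * s_u1) / det
    b = (s_yy * s_u1 - s_yu * s_y1) / det
    return a, b

# Simulate a known system (a=0.8, b=0.5) driven by a persistently
# exciting input, then recover the parameters from the data alone.
rng = random.Random(0)
u = [rng.uniform(-1, 1) for _ in range(500)]
y = [0.0]
for k in range(1, 500):
    y.append(0.8 * y[k - 1] + 0.5 * u[k - 1])
a, b = identify_arx(u, y)
```

The catch the thesis highlights is visible here: this recovery works because u is known exactly, whereas for a turbine the wind input is itself uncertain, which is what motivates estimating an effective wind speed first.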

  5. Estimation of Soil Moisture with L-band Multi-polarization Radar

    NASA Technical Reports Server (NTRS)

    Shi, J.; Chen, K. S.; Kim, Chung-Li Y.; Van Zyl, J. J.; Njoku, E.; Sun, G.; O'Neill, P.; Jackson, T.; Entekhabi, D.

    2004-01-01

    Through analyses of the model-simulated database, we developed a technique to estimate surface soil moisture under the HYDROS radar sensor (L-band multi-polarization and 40° incidence) configuration. This technique includes two steps. First, it decomposes the total backscattering signals into two components - the surface scattering components (the bare-surface backscattering signals attenuated by the overlying vegetation layer) and the sum of the direct volume scattering components and surface-volume interaction components at different polarizations. On the model-simulated database, our decomposition technique works quite well in estimating the surface scattering components, with RMSEs of 0.12, 0.25, and 0.55 dB for VV, HH, and VH polarizations, respectively. Second, we use the decomposed surface backscattering signals to estimate the soil moisture and the combined surface roughness and vegetation attenuation correction factors with all three polarizations.

  6. HST3D; a computer code for simulation of heat and solute transport in three-dimensional ground-water flow systems

    USGS Publications Warehouse

    Kipp, K.L.

    1987-01-01

    The Heat- and Solute-Transport Program (HST3D) simulates groundwater flow and associated heat and solute transport in three dimensions. The three governing equations are coupled through the interstitial pore velocity, the dependence of the fluid density on pressure, temperature, and the solute-mass fraction, and the dependence of the fluid viscosity on temperature and solute-mass fraction. The solute-transport equation is for only a single solute species with possible linear equilibrium sorption and linear decay. Finite difference techniques are used to discretize the governing equations using a point-distributed grid. The flow, heat- and solute-transport equations are solved, in turn, after a partial Gauss-reduction scheme is used to modify them. The modified equations are more tightly coupled and have better stability for the numerical solutions. The basic source-sink term represents wells. A complex well flow model may be used to simulate specified flow rate and pressure conditions at the land surface or within the aquifer, with or without pressure and flow rate constraints. Boundary condition types offered include specified value, specified flux, leakage, heat conduction, an approximate free surface, and two types of aquifer influence functions. All boundary conditions can be functions of time. Two techniques are available for solution of the finite difference matrix equations. One technique is a direct-elimination solver, using equations reordered by alternating diagonal planes. The other technique is an iterative solver, using two-line successive over-relaxation. A restart option is available for storing intermediate results and restarting the simulation at an intermediate time with modified boundary conditions. This feature also can be used as protection against computer system failure. Data input and output may be in metric (SI) units or inch-pound units. Output may include tables of dependent variables and parameters, zoned-contour maps, and plots of the dependent variables versus time. (Lantz-PTT)
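The two-line successive over-relaxation solver mentioned in this record can be illustrated in miniature with its simpler point-SOR relative; a minimal sketch on a 1-D Laplace problem (the grid size, relaxation factor, and boundary values are arbitrary choices for the sketch, not HST3D's):

```python
def sor_laplace(n=32, omega=1.7, tol=1e-10, max_sweeps=10000):
    """Point successive over-relaxation for the 1-D Laplace problem u'' = 0
    on a uniform grid with u(0) = 0 and u(1) = 1 (exact solution: u = x)."""
    u = [0.0] * (n + 1)
    u[n] = 1.0  # Dirichlet boundary values
    for _ in range(max_sweeps):
        biggest = 0.0
        for i in range(1, n):
            # Gauss-Seidel value, pushed further by the relaxation factor omega
            correction = omega * (0.5 * (u[i - 1] + u[i + 1]) - u[i])
            u[i] += correction
            biggest = max(biggest, abs(correction))
        if biggest < tol:
            break
    return u

u = sor_laplace()
```

For 1 < omega < 2 the over-relaxed correction accelerates convergence over plain Gauss-Seidel (omega = 1); the two-line variant used by HST3D applies the same idea to pairs of grid lines at a time.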

  7. Mabs monograph air blast instrumentation, 1943 - 1993. Measurement techniques and instrumentation. Volume 3. Air blast structural target and gage calibration. Technical report, 17 September 1993-31 May 1994, FLD04

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reisler, R.E.; Keefer, J.H.; Ethridge, N.H.

    1995-08-01

    Structural response measurement techniques and instrumentation developed by Military Applications of Blast Simulators (MABS) participating countries for field tests over the period 1943 through 1993 are summarized. Electronic and non-electronic devices deployed on multi-ton nuclear and high-explosive events are presented with calibration techniques. The country and the year the gage was introduced are included with the description. References for each are also provided.

  8. Calibrating the ChemCam LIBS for Carbonate Minerals on Mars

    DOE R&D Accomplishments Database

    Wiens, Roger C.; Clegg, Samuel M.; Ollila, Ann M.; Barefield, James E.; Lanza, Nina; Newsom, Horton E.

    2009-01-01

    The ChemCam instrument suite on board the NASA Mars Science Laboratory (MSL) rover includes the first LIBS instrument for extraterrestrial applications. Here we examine carbonate minerals in a simulated martian environment using the LIBS technique in order to better understand the in situ signature of these materials on Mars. Both chemical composition and rock type are determined using multivariate analysis (MVA) techniques. Composition is confirmed using scanning electron microscopy (SEM) techniques. Our initial results suggest that ChemCam can recognize and differentiate between carbonate materials on Mars.

  9. Error coding simulations in C

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1994-01-01

    When data is transmitted through a noisy channel, errors are produced within the data, rendering it indecipherable. Through the use of error control coding techniques, the bit error rate can be reduced to any desired level without sacrificing the transmission data rate. The Astrionics Laboratory at Marshall Space Flight Center has decided to use a modular, end-to-end telemetry data simulator to simulate the transmission of data from flight to ground and various methods of error control. The simulator includes modules for random data generation, data compression, Consultative Committee for Space Data Systems (CCSDS) transfer frame formation, error correction/detection, error generation and error statistics. The simulator utilizes a concatenated coding scheme which includes the CCSDS standard (255,223) Reed-Solomon (RS) code over GF(2(exp 8)) with interleave depth of 5 as the outermost code, a (7, 1/2) convolutional code as an inner code and the CCSDS recommended (n, n-16) cyclic redundancy check (CRC) code as the innermost code, where n is the number of information bits plus 16 parity bits. The received signal-to-noise ratio required for a desired bit error rate is greatly reduced through the use of forward error correction techniques. Even greater coding gain is provided through the use of a concatenated coding scheme. Interleaving/deinterleaving is necessary to randomize burst errors which may appear at the input of the RS decoder. The burst correction capability length is increased in proportion to the interleave depth. The modular nature of the simulator allows for inclusion or exclusion of modules as needed. This paper describes the development and operation of the simulator, the verification of a C-language Reed-Solomon code, and the possibility of using Comdisco SPW(tm) as a tool for determining optimal error control schemes.
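The interleaving/deinterleaving step described in this record is what spreads a channel burst across Reed-Solomon codewords; a toy block interleaver sketch (illustrative only - the symbol values, row width, and flat-list layout are assumptions, not the CCSDS implementation):

```python
def interleave(symbols, depth):
    """Block interleaver: write symbols row-wise into `depth` rows, read
    column-wise, so consecutive channel symbols come from different rows."""
    assert len(symbols) % depth == 0
    width = len(symbols) // depth
    rows = [symbols[r * width:(r + 1) * width] for r in range(depth)]
    return [rows[r][c] for c in range(width) for r in range(depth)]

def deinterleave(symbols, depth):
    """Inverse of interleave: scatter received symbols back into their rows."""
    width = len(symbols) // depth
    out = [None] * len(symbols)
    k = 0
    for c in range(width):
        for r in range(depth):
            out[r * width + c] = symbols[k]
            k += 1
    return out

data = list(range(20))           # stand-in for Reed-Solomon code symbols
tx = interleave(data, depth=5)   # interleave depth 5, as in the scheme above
```

A burst of five consecutive channel errors then lands as one symbol error in each of the five rows (codewords), which is exactly the randomization the RS decoder needs.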

  10. Effects of interactive instructional techniques in a web-based peripheral nervous system component for human anatomy.

    PubMed

    Allen, Edwin B; Walls, Richard T; Reilly, Frank D

    2008-02-01

    This study investigated the effects of interactive instructional techniques in a web-based peripheral nervous system (PNS) component of a first year medical school human anatomy course. Existing data from 9 years of instruction involving 856 students were used to determine (1) the effect of web-based interactive instructional techniques on written exam item performance and (2) differences between student opinions of the benefit level of five different types of interactive learning objects used. The interactive learning objects included Patient Case studies, review Games, Simulated Interactive Patients (SIP), Flashcards, and unit Quizzes. Exam item analysis scores were found to be significantly higher (p < 0.05) for students receiving the instructional treatment incorporating the web-based interactive learning objects than for students not receiving this treatment. Questionnaires using a five-point Likert scale were analysed to determine student opinion ratings of the interactive learning objects. Students reported favorably on the benefit level of all learning objects. Students rated the benefit level of the Simulated Interactive Patients (SIP) highest, and this rating was significantly higher (p < 0.05) than all other learning objects. This study suggests that web-based interactive instructional techniques improve student exam performance. Students indicated a strong acceptance of Simulated Interactive Patient learning objects.

  11. 3D Reconstruction of Chick Embryo Vascular Geometries Using Non-invasive High-Frequency Ultrasound for Computational Fluid Dynamics Studies.

    PubMed

    Tan, Germaine Xin Yi; Jamil, Muhammad; Tee, Nicole Gui Zhen; Zhong, Liang; Yap, Choon Hwai

    2015-11-01

    Recent animal studies have provided evidence that prenatal blood flow fluid mechanics may play a role in the pathogenesis of congenital cardiovascular malformations. To further this research, it is important to have an imaging technique for small animal embryos with sufficient resolution to support computational fluid dynamics studies, and that is also non-invasive and non-destructive to allow for subject-specific, longitudinal studies. In the current study, we developed such a technique, based on ultrasound biomicroscopy scans on chick embryos. Our technique included a motion cancelation algorithm to negate embryonic body motion, a temporal averaging algorithm to differentiate blood spaces from tissue spaces, and 3D reconstruction of blood volumes in the embryo. The accuracy of the reconstructed models was validated with direct stereoscopic measurements. A computational fluid dynamics simulation was performed to model fluid flow in the generated construct of a Hamburger-Hamilton (HH) stage 27 embryo. Simulation results showed that there were divergent streamlines and a low shear region at the carotid duct, which may be linked to the carotid duct's eventual regression and disappearance by HH stage 34. We show that our technique has sufficient resolution to produce accurate geometries for computational fluid dynamics simulations to quantify embryonic cardiovascular fluid mechanics.
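The temporal averaging step used to differentiate blood from tissue can be sketched as a per-pixel variance threshold over the frame stack; the traces, threshold, and intensity model below are synthetic stand-ins invented for the sketch, not values from the paper:

```python
import random
from statistics import pvariance

def blood_mask(frames, threshold):
    """Label a pixel as blood when its intensity varies strongly over time:
    flowing blood decorrelates the ultrasound speckle from frame to frame,
    while tissue stays comparatively static. `frames` is a list of
    equal-length pixel rows, one row per time point."""
    n_pixels = len(frames[0])
    return [pvariance([frame[p] for frame in frames]) > threshold
            for p in range(n_pixels)]

rng = random.Random(0)
# synthetic traces: pixels 0-3 mimic static tissue, pixels 4-7 mimic flowing blood
frames = [[100 + rng.gauss(0, 1) for _ in range(4)]
          + [rng.uniform(0, 200) for _ in range(4)]
          for _ in range(50)]
mask = blood_mask(frames, threshold=100.0)
```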

  12. Interactive Videodisc Learning Systems.

    ERIC Educational Resources Information Center

    Currier, Richard L.

    1983-01-01

    Discussion of capabilities of interactive videodisc, which combines video images recorded on disc and random-access, highlights interactivity; teaching techniques with videodiscs (including masking, disassembly, movie maps, tactical maps, action code, and simulation); costs; and games. Illustrative material is provided. (High Technology, P. O. Box…

  13. The role of simulation in the development of technical competence during surgical training: a literature review

    PubMed Central

    2013-01-01

    Objectives To establish the current state of knowledge on the effect of surgical simulation on the development of technical competence during surgical training. Methods Using a defined search strategy, the medical and educational literature was searched to identify empirical research that uses simulation as an educational intervention with surgical trainees. Included studies were analysed according to guidelines adapted from a Best Evidence in Medical Education review. Results A total of 32 studies were analysed, across 5 main categories of surgical simulation technique - use of bench models and box trainers (9 studies); Virtual Reality (14 studies); human cadavers (4 studies); animal models (2 studies) and robotics (3 studies). An improvement in technical skill was seen within the simulated environment across all five categories. This improvement was seen to transfer to the real patient in the operating room in all categories except the use of animals. Conclusions Based on current evidence, surgical trainees should be confident in the effects of using simulation, and should have access to formal, structured simulation as part of their training. Surgical simulation should incorporate the use of bench models and box trainers, with the use of Virtual Reality where resources allow. Alternatives to cadaveric and animal models should be considered due to the ethical and moral issues surrounding their use, and due to their equivalency with other simulation techniques. However, any use of surgical simulation must be tailored to the individual needs of trainees, and should be accompanied by feedback from expert tutors.

  14. Adaptive Core Simulation Employing Discrete Inverse Theory - Part II: Numerical Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdel-Khalik, Hany S.; Turinsky, Paul J.

    2005-07-15

    Use of adaptive simulation is intended to improve the fidelity and robustness of important core attribute predictions such as core power distribution, thermal margins, and core reactivity. Adaptive simulation utilizes a selected set of past and current reactor measurements of reactor observables, i.e., in-core instrumentation readings, to adapt the simulation in a meaningful way. The companion paper, ''Adaptive Core Simulation Employing Discrete Inverse Theory - Part I: Theory,'' describes in detail the theoretical background of the proposed adaptive techniques. This paper, Part II, demonstrates several computational experiments conducted to assess the fidelity and robustness of the proposed techniques. The intent is to check the ability of the adapted core simulator model to predict future core observables that are not included in the adaption or core observables that are recorded at core conditions that differ from those at which adaption is completed. Also, this paper demonstrates successful utilization of an efficient sensitivity analysis approach to calculate the sensitivity information required to perform the adaption for millions of input core parameters. Finally, this paper illustrates a useful application for adaptive simulation - reducing the inconsistencies between two different core simulator code systems, where the multitudes of input data to one code are adjusted to enhance the agreement between both codes for important core attributes, i.e., core reactivity and power distribution. Also demonstrated is the robustness of such an application.

  15. Simulation verification techniques study: Simulation self test hardware design and techniques report

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The final results of the hardware verification task are presented. The basic objectives of the various subtasks are reviewed, along with the ground rules under which the overall task was conducted and which impacted the approach taken in deriving techniques for hardware self test. The results of the first subtask and the definition of simulation hardware are presented. The hardware definition is based primarily on a brief review of the simulator configurations anticipated for the shuttle training program. The results of the survey of current self test techniques are presented. The data sources that were considered in the search for current techniques are reviewed, and results of the survey are presented in terms of the specific types of tests that are of interest for training simulator applications. Specifically, these types of tests are readiness tests, fault isolation tests and incipient fault detection techniques. The most applicable techniques were structured into software flows that are then referenced in discussions of techniques for specific subsystems.

  16. Impact of Simulation Technology on Die and Stamping Business

    NASA Astrophysics Data System (ADS)

    Stevens, Mark W.

    2005-08-01

    Over the last ten years, we have seen an explosion in the use of simulation-based techniques to improve the engineering, construction, and operation of GM production tools. The impact has been as profound as the overall switch to CAD/CAM from the old manual design and construction methods. The changeover to N/C machining from duplicating milling machines brought advances in accuracy and speed to our construction activity. It also brought significant reductions in fitting sculptured surfaces. Changing over to CAD design brought similar advances in accuracy, and today's use of solid modeling has enhanced that accuracy gain while finally leading to the reduction in lead time and cost through the development of parametric techniques. Elimination of paper drawings for die design, along with the process of blueprinting and distribution, provided the savings required to install high capacity computer servers, high-speed data transmission lines and integrated networks. These historic changes in the application of CAE technology in manufacturing engineering paved the way for the implementation of simulation to all aspects of our business. The benefits are being realized now, and the future holds even greater promise as the simulation techniques mature and expand. Every new line of dies is verified prior to casting for interference free operation. Sheet metal forming simulation validates the material flow, eliminating the high costs of physical experimentation dependent on trial and error methods of the past. Integrated forming simulation and die structural analysis and optimization has led to a reduction in die size and weight on the order of 30% or more. The latest techniques in factory simulation enable analysis of automated press lines, including all stamping operations with corresponding automation. 
This leads to manufacturing lines capable of running at higher levels of throughput, with actual results providing the capability of two or more additional strokes per minute. As we spread these simulation techniques to the balance of our business, from blank de-stacking to the racking of parts, we anticipate continued reduction in lead-time and engineering expense while improving quality and start-up execution. The author will provide an overview of technology and business evolution of the math-based process that brought an historical transition and revitalization to the die and stamping industry in the past decade. Finally, the author will give an outlook for future business needs and technology development directions.

  17. Modeling a maintenance simulation of the geosynchronous platform

    NASA Technical Reports Server (NTRS)

    Kleiner, A. F., Jr.

    1980-01-01

    A modeling technique used to conduct a simulation study comparing various maintenance routines for a space platform is discussed. A system model is described and illustrated, the basic concepts of a simulation pass are detailed, and sections on failures and maintenance are included. The operation of the system across time is best modeled by a discrete event approach with two basic events - failure and maintenance of the system. Each overall simulation run consists of introducing a particular model of the physical system, together with a maintenance policy, demand function, and mission lifetime. The system is then run through many passes, each pass corresponding to one mission, and the model is re-initialized before each pass. Statistics are compiled at the end of each pass, and after the last pass a report is printed. Items of interest typically include the time to first maintenance, total number of maintenance trips for each pass, average capability of the system, etc.
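The pass structure described above can be sketched as a tiny discrete event loop; the mission time, mean time to failure, and instantaneous-maintenance assumption are all invented for illustration:

```python
import random

def simulate_pass(mission_time, mttf):
    """One pass: advance from failure event to failure event; each failure
    inside the mission window triggers one (instantaneous) maintenance trip.
    Returns (time of first maintenance or None, number of trips)."""
    t, trips, first = 0.0, 0, None
    while True:
        t += random.expovariate(1.0 / mttf)  # time to next failure
        if t >= mission_time:
            return first, trips
        trips += 1
        if first is None:
            first = t

random.seed(1)
passes = [simulate_pass(mission_time=10.0, mttf=2.0) for _ in range(1000)]
mean_trips = sum(trips for _, trips in passes) / len(passes)
```

With a mean time to failure of 2 and a mission of 10, the expected number of maintenance trips per pass is about 5; statistics such as time to first maintenance are compiled across passes in the same way.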

  18. Real-time maritime scene simulation for ladar sensors

    NASA Astrophysics Data System (ADS)

    Christie, Chad L.; Gouthas, Efthimios; Swierkowski, Leszek; Williams, Owen M.

    2011-06-01

    Continuing interest exists in the development of cost-effective synthetic environments for testing Laser Detection and Ranging (ladar) sensors. In this paper we describe a PC-based system for real-time ladar scene simulation of ships and small boats in a dynamic maritime environment. In particular, we describe the techniques employed to generate range imagery accompanied by passive radiance imagery. Our ladar scene generation system is an evolutionary extension of the VIRSuite infrared scene simulation program and includes all previous features such as ocean wave simulation, the physically-realistic representation of boat and ship dynamics, wake generation and simulation of whitecaps, spray, wake trails and foam. A terrain simulation extension is also under development. In this paper we outline the development, capabilities and limitations of the VIRSuite extensions.

  19. A chemical EOR benchmark study of different reservoir simulators

    NASA Astrophysics Data System (ADS)

    Goudarzi, Ali; Delshad, Mojdeh; Sepehrnoori, Kamy

    2016-09-01

    Interest in chemical EOR processes has intensified in recent years due to the advancements in chemical formulations and injection techniques. Injecting Polymer (P), surfactant/polymer (SP), and alkaline/surfactant/polymer (ASP) are techniques for improving sweep and displacement efficiencies with the aim of improving oil production in both secondary and tertiary floods. There has been great interest in chemical flooding recently for different challenging situations. These include high temperature reservoirs, formations with extreme salinity and hardness, naturally fractured carbonates, and sandstone reservoirs with heavy and viscous crude oils. More oil reservoirs are reaching maturity where secondary polymer floods and tertiary surfactant methods have become increasingly important. This significance has added to the industry's interest in using reservoir simulators as tools for reservoir evaluation and management to minimize costs and increase the process efficiency. Reservoir simulators with special features are needed to represent coupled chemical and physical processes present in chemical EOR processes. The simulators need to be first validated against well controlled lab and pilot scale experiments to reliably predict the full field implementations. The available data from laboratory scale include 1) phase behavior and rheological data; and 2) results of secondary and tertiary coreflood experiments for P, SP, and ASP floods under reservoir conditions, i.e. chemical retentions, pressure drop, and oil recovery. Data collected from corefloods are used as benchmark tests comparing numerical reservoir simulators with chemical EOR modeling capabilities such as STARS of CMG, ECLIPSE-100 of Schlumberger, REVEAL of Petroleum Experts. The research UTCHEM simulator from The University of Texas at Austin is also included since it has been the benchmark for chemical flooding simulation for over 25 years. 
The results of this benchmark comparison will be utilized to improve chemical design for field-scale studies using commercial simulators. The benchmark tests illustrate the potential of commercial simulators for chemical flooding projects and provide a comprehensive table of strengths and limitations of each simulator for a given chemical EOR process. Mechanistic simulations of chemical EOR processes will provide predictive capability and can aid in optimization of the field injection projects. The objective of this paper is not to compare the computational efficiency and solution algorithms; it only focuses on the process modeling comparison.

  20. Improved importance sampling technique for efficient simulation of digital communication systems

    NASA Technical Reports Server (NTRS)

    Lu, Dingqing; Yao, Kung

    1988-01-01

    A new, improved importance sampling (IIS) approach to simulation is considered. Some basic concepts of IS are introduced, and detailed derivations of simulation estimation variances for Monte Carlo (MC) and IS simulations are given. The general results obtained from these derivations are applied to the specific previously known conventional importance sampling (CIS) technique and the new IIS technique. The derivation for a linear system with no signal random memory is considered in some detail. For the CIS technique, the optimum input scaling parameter is found, while for the IIS technique, the optimum translation parameter is found. The results are generalized to a linear system with memory and signals. Specific numerical and simulation results are given which show the advantages of CIS over MC and IIS over CIS for simulations of digital communications systems.
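The CIS idea of scaling the input noise and reweighting can be sketched on the simplest rare-event problem, a Gaussian tail probability standing in for a bit error rate (the threshold and scale are illustrative; the paper's IIS translation variant is not shown):

```python
import math, random

def cis_tail_probability(threshold, scale, n):
    """Conventional importance sampling estimate of P(N(0,1) > threshold):
    draw from the scaled density N(0, scale^2) so threshold crossings are
    frequent, then weight each crossing by the likelihood ratio f(x)/f*(x)."""
    total = 0.0
    for _ in range(n):
        x = random.gauss(0.0, scale)
        if x > threshold:
            # ratio of the true N(0,1) pdf to the biased N(0, scale^2) pdf
            w = scale * math.exp(-0.5 * x * x + 0.5 * (x / scale) ** 2)
            total += w
    return total / n

random.seed(0)
est = cis_tail_probability(threshold=4.0, scale=4.0, n=200_000)
```

Unbiased sampling would see roughly 3 threshold crossings per 100,000 trials here; the scaled density crosses constantly, and the likelihood-ratio weights restore an unbiased estimate (the true value is about 3.2e-5) with far lower variance.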

  1. Generalized Green's function molecular dynamics for canonical ensemble simulations

    NASA Astrophysics Data System (ADS)

    Coluci, V. R.; Dantas, S. O.; Tewary, V. K.

    2018-05-01

    The need for small integration time steps (~1 fs) in conventional molecular dynamics simulations is an important issue that inhibits the study of physical, chemical, and biological systems on realistic timescales. Additionally, to simulate those systems in contact with a thermal bath, thermostating techniques are usually applied. In this work, we generalize the Green's function molecular dynamics technique to allow simulations within the canonical ensemble. By applying this technique to one-dimensional systems, we were able to correctly describe important thermodynamic properties such as the temperature fluctuations, the temperature distribution, and the velocity autocorrelation function. We show that the proposed technique also allows the use of time steps one order of magnitude larger than those typically used in conventional molecular dynamics simulations. We expect that this technique can be used in long-timescale molecular dynamics simulations.
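The velocity autocorrelation function reported in this record can be computed from any trajectory by averaging over time origins; a minimal single-particle sketch, checked against a cosine test signal with a known analytic autocorrelation (the test signal is a stand-in, not MD output):

```python
import math

def vacf(v, max_lag):
    """Normalized velocity autocorrelation C(lag) = <v(t) v(t+lag)> / <v(t)^2>,
    averaged over all time origins of a single-particle trajectory."""
    n = len(v)
    c = [sum(v[t] * v[t + lag] for t in range(n - lag)) / (n - lag)
         for lag in range(max_lag)]
    return [ck / c[0] for ck in c]

# sanity check against an analytic case: for v(t) = cos(w t) the long-trajectory
# autocorrelation tends to cos(w * lag * dt)
dt, w = 0.01, 1.0
v = [math.cos(w * i * dt) for i in range(5000)]
c = vacf(v, max_lag=400)
```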

  2. A scalable parallel black oil simulator on distributed memory parallel computers

    NASA Astrophysics Data System (ADS)

    Wang, Kun; Liu, Hui; Chen, Zhangxin

    2015-11-01

    This paper presents our work on developing a parallel black oil simulator for distributed memory computers based on our in-house parallel platform. The parallel simulator is designed to overcome the performance issues of common simulators that are implemented for personal computers and workstations. The finite difference method is applied to discretize the black oil model. In addition, some advanced techniques are employed to strengthen the robustness and parallel scalability of the simulator, including an inexact Newton method, matrix decoupling methods, and algebraic multigrid methods. A new multi-stage preconditioner is proposed to accelerate the solution of linear systems from the Newton methods. Numerical experiments show that our simulator is scalable and efficient, and is capable of simulating extremely large-scale black oil problems with tens of millions of grid blocks using thousands of MPI processes on parallel computers.
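Among the techniques listed, the inexact Newton method is the easiest to miniaturize: each Newton correction comes from an approximate linear solve. In the sketch below a few Jacobi sweeps stand in for the simulator's preconditioned solvers, and the 2x2 nonlinear test system is invented for the illustration:

```python
import math

def jacobi_solve(J, b, sweeps):
    """Approximately solve the 2x2 system J s = b with a few Jacobi sweeps."""
    s = [0.0, 0.0]
    for _ in range(sweeps):
        s = [(b[0] - J[0][1] * s[1]) / J[0][0],
             (b[1] - J[1][0] * s[0]) / J[1][1]]
    return s

def residual(x):
    """A small nonlinear test system with a diagonally dominant Jacobian."""
    return [3 * x[0] - math.cos(x[1]) - 1,
            3 * x[1] - math.sin(x[0]) - 2]

def inexact_newton(x, outer=30, sweeps=3):
    for _ in range(outer):
        f = residual(x)
        J = [[3.0, math.sin(x[1])],
             [-math.cos(x[0]), 3.0]]
        s = jacobi_solve(J, [-f[0], -f[1]], sweeps)  # inexact linear solve
        x = [x[0] + s[0], x[1] + s[1]]
    return x

x = inexact_newton([0.0, 0.0])
```

Because the test Jacobian is diagonally dominant, three Jacobi sweeps already solve the correction equation to roughly 2% accuracy, which is enough for the outer Newton iteration to converge.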

  3. Spacecraft VHF Radio Propagation Analysis in Ocean Environments Including Atmospheric Effects

    NASA Technical Reports Server (NTRS)

    Hwu, Shian; Moreno, Gerardo; Desilva, Kanishka; Jih, Cindy

    2010-01-01

    The Communication Systems Simulation Laboratory (CSSL) at the National Aeronautics and Space Administration (NASA)/Johnson Space Center (JSC) is tasked to perform spacecraft and ground network communication system simulations. The CSSL has developed simulation tools that model spacecraft communication systems and the space/ground environment in which they operate. This paper analyzes a spacecraft's very high frequency (VHF) radio signal propagation and the impact on performance when landing in an ocean. Very little research work has been done for VHF radio systems in a maritime environment. Rigorous Radio Frequency (RF) modeling/simulation techniques were employed for various environmental effects. The simulation results illustrate the significance of the environmental effects on the VHF radio system performance.

  4. Simulation in teaching regional anesthesia: current perspectives.

    PubMed

    Udani, Ankeet D; Kim, T Edward; Howard, Steven K; Mariano, Edward R

    2015-01-01

    The emerging subspecialty of regional anesthesiology and acute pain medicine represents an opportunity to evaluate critically the current methods of teaching regional anesthesia techniques and the practice of acute pain medicine. To date, there have been a wide variety of simulation applications in this field, and efficacy has largely been assumed. However, a thorough review of the literature reveals that effective teaching strategies, including simulation, in regional anesthesiology and acute pain medicine are not yet completely established. Future research should be directed toward the comparative effectiveness of simulation versus other accepted teaching methods, exploring the combination of procedural training with realistic clinical scenarios, and the application of simulation-based teaching curricula to a wider range of learners, from the student to the practicing physician.

  5. Simulation in teaching regional anesthesia: current perspectives

    PubMed Central

    Udani, Ankeet D; Kim, T Edward; Howard, Steven K; Mariano, Edward R

    2015-01-01

    The emerging subspecialty of regional anesthesiology and acute pain medicine represents an opportunity to evaluate critically the current methods of teaching regional anesthesia techniques and the practice of acute pain medicine. To date, there have been a wide variety of simulation applications in this field, and efficacy has largely been assumed. However, a thorough review of the literature reveals that effective teaching strategies, including simulation, in regional anesthesiology and acute pain medicine are not yet completely established. Future research should be directed toward the comparative effectiveness of simulation versus other accepted teaching methods, exploring the combination of procedural training with realistic clinical scenarios, and the application of simulation-based teaching curricula to a wider range of learners, from the student to the practicing physician. PMID:26316812

  6. The Instagram: A Novel Sounding Technique for Enhanced HF Propagation Advice

    DTIC Science & Technology

    2010-05-01

    precautions are necessary before such a scheme is attempted, and an ultimate aim might be to have the technique as close to subliminal as possible...waveform on reception has been reduced to subliminal levels. Figure 4 PSD plot of the weighted instagram waveform as received on a...wideband system with simulated background noise level included. Although the signal appears subliminal to other users it can still be extracted with

  7. A review of the use of simulation in dental education.

    PubMed

    Perry, Suzanne; Bridges, Susan Margaret; Burrow, Michael Francis

    2015-02-01

    In line with the advances in technology and communication, medical simulations are being developed to support the acquisition of requisite psychomotor skills before real-life clinical applications. This review article aimed to give a general overview of simulation in a cognate field, clinical dental education. Simulations in dentistry are not a new phenomenon; however, recent developments in virtual-reality technology using computer-generated medical simulations of 3-dimensional images or environments are providing more optimal practice conditions to smooth the transition from the traditional model-based simulation laboratory to the clinic. Evidence for the positive aspects of virtual reality includes increased effectiveness in comparison with traditional simulation teaching techniques, more efficient learning, objective and reproducible feedback, unlimited training hours, and enhanced cost-effectiveness for teaching establishments. Negative aspects have been indicated as initial setup costs, faculty training, and the lack of a variety of content in current educational simulation programs.

  8. GPU-accelerated computational tool for studying the effectiveness of asteroid disruption techniques

    NASA Astrophysics Data System (ADS)

    Zimmerman, Ben J.; Wie, Bong

    2016-10-01

    This paper presents the development of a new Graphics Processing Unit (GPU) accelerated computational tool for asteroid disruption techniques. Numerical simulations are completed using the high-order spectral difference (SD) method. Due to the compact nature of the SD method, it is well suited for implementation with the GPU architecture, hence solutions are generated orders of magnitude faster than with the Central Processing Unit (CPU) counterpart. A multiphase model integrated with the SD method is introduced, and several asteroid disruption simulations are conducted, including kinetic-energy impactors, multi-kinetic energy impactor systems, and nuclear options. Results illustrate the benefits of using multi-kinetic energy impactor systems when compared to a single impactor system. In addition, the effectiveness of nuclear options is observed.

  9. Hydrogeologic unit flow characterization using transition probability geostatistics.

    PubMed

    Jones, Norman L; Walker, Justin R; Carle, Steven F

    2005-01-01

    This paper describes a technique for applying the transition probability geostatistics method for stochastic simulation to a MODFLOW model. Transition probability geostatistics has some advantages over traditional indicator kriging methods including a simpler and more intuitive framework for interpreting geologic relationships and the ability to simulate juxtapositional tendencies such as fining upward sequences. The indicator arrays generated by the transition probability simulation are converted to layer elevation and thickness arrays for use with the new Hydrogeologic Unit Flow package in MODFLOW 2000. This makes it possible to preserve complex heterogeneity while using reasonably sized grids and/or grids with nonuniform cell thicknesses.
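The Markov-chain view underlying transition probability geostatistics can be illustrated with a toy one-dimensional vertical facies simulator. This is only a sketch: the transition matrix, facies labels, and function name below are invented for illustration and are not the T-PROGS/MODFLOW workflow described in the paper.

```python
import random

def simulate_facies(tpm, start, n, seed=0):
    """Draw a 1-D vertical facies sequence from a transition-probability
    matrix: row i holds the probabilities of the next cell's facies,
    given facies i in the current cell (rows must sum to 1)."""
    rng = random.Random(seed)
    seq, state = [start], start
    for _ in range(n - 1):
        r, acc = rng.random(), 0.0
        for j, p in enumerate(tpm[state]):
            acc += p
            if r < acc:
                state = j
                break
        seq.append(state)
    return seq
```

A matrix with strong diagonal entries produces thick, persistent units, while asymmetric off-diagonal entries reproduce juxtapositional tendencies such as fining-upward sequences, which is the intuitive advantage over indicator kriging noted in the abstract.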

  10. Faster simulation plots

    NASA Technical Reports Server (NTRS)

    Fowell, Richard A.

    1989-01-01

    Most simulation plots are heavily oversampled. Ignoring unnecessary data points dramatically reduces plot time with imperceptible effect on quality. The technique is suited to most plot devices. The speed of the department's laser printer was tripled for large simulation plots by data thinning. This reduced printer delays without the expense of a faster laser printer. Surprisingly, it saved computer time as well. All plot data are now thinned, including PostScript and terminal plots. The problem, solution, and conclusions are described. The thinning algorithm is described and performance studies are presented. To obtain FORTRAN 77 or C source listings, mail a SASE to the author.
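The abstract does not give the thinning algorithm itself; a minimal sketch of one plausible approach (keep a point only when it deviates from the chord between its neighbours by more than a tolerance, in the spirit of Ramer-Douglas-Peucker) might look like:

```python
def thin(points, tol):
    """Greedy thinning of an (x, y) polyline: drop a point if it lies
    within `tol` of the chord joining the last kept point and the next
    candidate.  Collinear oversampled runs collapse to their endpoints."""
    if len(points) <= 2:
        return list(points)
    kept = [points[0]]
    for i in range(1, len(points) - 1):
        x0, y0 = kept[-1]
        x1, y1 = points[i]
        x2, y2 = points[i + 1]
        # perpendicular distance of (x1, y1) from the chord (x0,y0)-(x2,y2)
        dx, dy = x2 - x0, y2 - y0
        num = abs(dy * x1 - dx * y1 + x2 * y0 - y2 * x0)
        den = (dx * dx + dy * dy) ** 0.5 or 1.0
        if num / den > tol:
            kept.append(points[i])
    kept.append(points[-1])
    return kept
```

On a heavily oversampled straight segment this keeps only the two endpoints, which is why the paper can report large speedups with imperceptible visual change.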

  11. Seventh symposium on systems analysis in forest resources; 1997 May 28-31; Traverse City, MI.

    Treesearch

    J. Michael Vasievich; Jeremy S. Fried; Larry A. Leefers

    2000-01-01

    This international symposium included presentations by representatives from government, academic, and private institutions. Topics covered management objectives; information systems: modeling, optimization, simulation and decision support techniques; spatial methods; timber supply; and economic and operational analyses.

  12. GERTS GQ User's Manual.

    ERIC Educational Resources Information Center

    Akiba, Y.; And Others

    This user's manual for the simulation program Graphical Evaluation and Review Technique (GERT) GQ contains sections on nodes, branches, program input description and format, and program output, as well as examples. Also included is a programmer's manual which contains information on scheduling, subroutine descriptions, COMMON Variables, and…

  13. Accelerating the Gillespie Exact Stochastic Simulation Algorithm using hybrid parallel execution on graphics processing units.

    PubMed

    Komarov, Ivan; D'Souza, Roshan M

    2012-01-01

    The Gillespie Stochastic Simulation Algorithm (GSSA) and its variants are cornerstone techniques to simulate reaction kinetics in situations where the concentration of the reactant is too low to allow deterministic techniques such as differential equations. The inherent limitations of the GSSA include the time required for executing a single run and the need for multiple runs for parameter sweep exercises due to the stochastic nature of the simulation. Even very efficient variants of GSSA are prohibitively expensive to compute and perform parameter sweeps. Here we present a novel variant of the exact GSSA that is amenable to acceleration by using graphics processing units (GPUs). We parallelize the execution of a single realization across threads in a warp (fine-grained parallelism). A warp is a collection of threads that are executed synchronously on a single multi-processor. Warps executing in parallel on different multi-processors (coarse-grained parallelism) simultaneously generate multiple trajectories. Novel data-structures and algorithms reduce memory traffic, which is the bottleneck in computing the GSSA. Our benchmarks show an 8×-120× performance gain over various state-of-the-art serial algorithms when simulating different types of models.
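For reference, the serial exact direct method that the GPU variant accelerates can be sketched in a few lines. This is the textbook Gillespie algorithm, not the authors' warp-parallel data structures; the function name and example reaction are illustrative.

```python
import math
import random

def gillespie_direct(x, reactions, propensity, t_end, seed=0):
    """Exact direct method: draw the waiting time from an exponential
    with rate a0 = sum of propensities, then pick reaction j with
    probability a_j / a0 and apply its state update."""
    rng = random.Random(seed)
    t, traj = 0.0, [(0.0, dict(x))]
    while t < t_end:
        a = [propensity(j, x) for j in range(len(reactions))]
        a0 = sum(a)
        if a0 <= 0.0:          # no reaction can fire; system is frozen
            break
        t += -math.log(1.0 - rng.random()) / a0   # exponential waiting time
        r, acc = rng.random() * a0, 0.0
        for j, aj in enumerate(a):
            acc += aj
            if acc >= r:
                break
        for species, change in reactions[j].items():
            x[species] += change
        traj.append((t, dict(x)))
    return traj
```

Example: irreversible decay A → ∅ with rate constant 0.5 fires exactly once per molecule of A, so a run started with 50 molecules records 50 events before the propensity reaches zero.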

  14. A multi-scaled approach for simulating chemical reaction systems.

    PubMed

    Burrage, Kevin; Tian, Tianhai; Burrage, Pamela

    2004-01-01

    In this paper we give an overview of some very recent work, as well as presenting a new approach, on the stochastic simulation of multi-scaled systems involving chemical reactions. In many biological systems (such as genetic regulation and cellular dynamics) there is a mix between small numbers of key regulatory proteins, and medium and large numbers of molecules. In addition, it is important to be able to follow the trajectories of individual molecules by taking proper account of the randomness inherent in such a system. We describe different types of simulation techniques (including the stochastic simulation algorithm, Poisson Runge-Kutta methods and the balanced Euler method) for treating simulations in the three different reaction regimes: slow, medium and fast. We then review some recent techniques on the treatment of coupled slow and fast reactions for stochastic chemical kinetics and present a new approach which couples the three regimes mentioned above. We then apply this approach to a biologically inspired problem involving the expression and activity of LacZ and LacY proteins in E. coli, and conclude with a discussion on the significance of this work. Copyright 2004 Elsevier Ltd.
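The "medium" regime mentioned above is typically handled with Poisson leap steps; a minimal sketch follows, assuming a fixed step size and a pure-Python Poisson sampler (the paper's Poisson Runge-Kutta and balanced Euler methods are more elaborate than this plain explicit leap).

```python
import math
import random

def poisson(lam, rng):
    """Knuth's method for Poisson sampling; adequate for small means."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def tau_leap(x, stoich, propensity, tau, n_steps, seed=0):
    """Explicit Poisson (Euler) tau-leaping: over each fixed step tau,
    fire every reaction a Poisson-distributed number of times with mean
    a_j(x) * tau, then apply all stoichiometric updates at once."""
    rng = random.Random(seed)
    x = list(x)
    for _ in range(n_steps):
        counts = [poisson(propensity(j, x) * tau, rng)
                  for j in range(len(stoich))]
        for j, nu in enumerate(stoich):
            for i, v in enumerate(nu):
                x[i] += counts[j] * v
        # clamp negative populations, a known artefact of plain tau-leaping
        x = [max(0, xi) for xi in x]
    return x
```

The clamping line hints at why balanced and implicit variants exist: a plain leap can overshoot into negative populations when propensities change quickly within a step.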

  15. Measurements of Deposition, Lung Surface Area and Lung Fluid for Simulation of Inhaled Compounds

    PubMed Central

    Fröhlich, Eleonore; Mercuri, Annalisa; Wu, Shengqian; Salar-Behzadi, Sharareh

    2016-01-01

    Modern strategies in drug development employ in silico techniques in the design of compounds as well as estimations of pharmacokinetics, pharmacodynamics and toxicity parameters. The quality of the results depends on the software algorithm, data library and input data. Compared to simulations of absorption, distribution, metabolism, excretion, and toxicity of oral drug compounds, relatively few studies report predictions of pharmacokinetics and pharmacodynamics of inhaled substances. For calculation of the drug concentration at the absorption site, the pulmonary epithelium, physiological parameters such as lung surface area and distribution volume (lung lining fluid) have to be known. These parameters can only be determined by invasive techniques and by postmortem studies. Very different values have been reported in the literature. This review addresses the state of software programs for simulation of orally inhaled substances and focuses on problems in the determination of particle deposition, lung surface area and lung lining fluid. The different surface areas for deposition and for drug absorption are difficult to include directly into the simulations. As drug levels are influenced by multiple parameters, the role of single parameters in the simulations cannot be identified easily. PMID:27445817

  16. ORAC: a molecular dynamics simulation program to explore free energy surfaces in biomolecular systems at the atomistic level.

    PubMed

    Marsili, Simone; Signorini, Giorgio Federico; Chelli, Riccardo; Marchi, Massimo; Procacci, Piero

    2010-04-15

    We present the new release of the ORAC engine (Procacci et al., Comput Chem 1997, 18, 1834), a FORTRAN suite to simulate complex biosystems at the atomistic level. The previous release of the ORAC code included multiple time steps integration, smooth particle mesh Ewald method, constant pressure and constant temperature simulations. The present release has been supplemented with the most advanced techniques for enhanced sampling in atomistic systems including replica exchange with solute tempering, metadynamics and steered molecular dynamics. All these computational technologies have been implemented for parallel architectures using the standard MPI communication protocol. ORAC is an open-source program distributed free of charge under the GNU general public license (GPL) at http://www.chim.unifi.it/orac. 2009 Wiley Periodicals, Inc.

  17. Telerobotic Surgery: An Intelligent Systems Approach to Mitigate the Adverse Effects of Communication Delay. Chapter 4

    NASA Technical Reports Server (NTRS)

    Cardullo, Frank M.; Lewis, Harold W., III; Panfilov, Peter B.

    2007-01-01

    An extremely innovative approach has been presented, which is to have the surgeon operate through a simulator running in real-time enhanced with an intelligent controller component to enhance the safety and efficiency of a remotely conducted operation. The use of a simulator enables the surgeon to operate in a virtual environment free from the impediments of telecommunication delay. The simulator functions as a predictor and periodically the simulator state is corrected with truth data. Three major research areas must be explored in order to ensure achieving the objectives. They are: simulator as predictor, image processing, and intelligent control. Each is equally necessary for success of the project and each of these involves a significant intelligent component in it. These are diverse, interdisciplinary areas of investigation, thereby requiring a highly coordinated effort by all the members of our team, to ensure an integrated system. The following is a brief discussion of those areas. Simulator as a predictor: The delays encountered in remote robotic surgery will be greater than any encountered in human-machine systems analysis, with the possible exception of remote operations in space. Therefore, novel compensation techniques will be developed. Included will be the development of the real-time simulator, which is at the heart of our approach. The simulator will present real-time, stereoscopic images and artificial haptic stimuli to the surgeon. Image processing: Because of the delay and the possibility of insufficient bandwidth a high level of novel image processing is necessary. This image processing will include several innovative aspects, including image interpretation, video to graphical conversion, texture extraction, geometric processing, image compression and image generation at the surgeon station. 
Intelligent control: Since the approach we propose is in a sense predictor based, albeit a very sophisticated predictor, a controller, which not only optimizes end effector trajectory but also avoids error, is essential. We propose to investigate two different approaches to the controller design. One approach employs an optimal controller based on modern control theory; the other one involves soft computing techniques, i.e. fuzzy logic, neural networks, genetic algorithms and hybrids of these.

  18. Discrete Optimization of Electronic Hyperpolarizabilities in a Chemical Subspace

    DTIC Science & Technology

    2009-05-01

    molecular design. Methods for optimization in discrete spaces have been studied extensively and recently reviewed (5). Optimization methods include...integer programming, as in branch-and-bound techniques (including dead-end elimination [6]), simulated annealing (7), and genetic algorithms (8)...These algorithms have found renewed interest and application in molecular and materials design (9-12). Recently, new approaches have been

  19. Advanced EUV mask and imaging modeling

    NASA Astrophysics Data System (ADS)

    Evanschitzky, Peter; Erdmann, Andreas

    2017-10-01

    The exploration and optimization of image formation in partially coherent EUV projection systems with complex source shapes requires flexible, accurate, and efficient simulation models. This paper reviews advanced mask diffraction and imaging models for the highly accurate and fast simulation of EUV lithography systems, addressing important aspects of the current technical developments. The simulation of light diffraction from the mask employs an extended rigorous coupled wave analysis (RCWA) approach, which is optimized for EUV applications. In order to be able to deal with current EUV simulation requirements, several additional models are included in the extended RCWA approach: a field decomposition and a field stitching technique enable the simulation of larger complex structured mask areas. An EUV multilayer defect model including a database approach makes the fast and fully rigorous defect simulation and defect repair simulation possible. A hybrid mask simulation approach combining real and ideal mask parts allows the detailed investigation of the origin of different mask 3-D effects. The image computation is done with a fully vectorial Abbe-based approach. Arbitrary illumination and polarization schemes and adapted rigorous mask simulations guarantee a high accuracy. A fully vectorial sampling-free description of the pupil with Zernikes and Jones pupils and an optimized representation of the diffraction spectrum enable the computation of high-resolution images with high accuracy and short simulation times. A new pellicle model supports the simulation of arbitrary membrane stacks, pellicle distortions, and particles/defects on top of the pellicle. Finally, an extension for highly accurate anamorphic imaging simulations is included. The application of the models is demonstrated by typical use cases.

  20. Simulation training: a systematic review of simulation in arthroscopy and proposal of a new competency-based training framework.

    PubMed

    Tay, Charison; Khajuria, Ankur; Gupte, Chinmay

    2014-01-01

    Traditional orthopaedic training has followed an apprenticeship model whereby trainees enhance their skills by operating under guidance. However, the introduction of limitations on training hours and shorter training programmes means that alternative training strategies are required. To perform a literature review on simulation training in arthroscopy and devise a framework that structures different simulation techniques that could be used in arthroscopic training. A systematic search of Medline, Embase, Google Scholar and the Cochrane Databases was performed. Search terms included "virtual reality OR simulator OR simulation" and "arthroscopy OR arthroscopic". 14 studies evaluating simulators in knee, shoulder and hip arthroscopy were included. The majority of the studies demonstrated construct and transference validity but only one showed concurrent validity. More studies are required to assess their potential as training and assessment tools, skills transference between simulators and to determine the extent of skills decay from prolonged delays in training. We also devised a "ladder of arthroscopic simulation" that provides a competency-based framework to implement different simulation strategies. The incorporation of simulation into an orthopaedic curriculum will depend on a coordinated approach between many bodies, but the successful integration of simulators in other areas of surgery supports a possible role for simulation in advancing orthopaedic education. Copyright © 2014 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.

  1. Analysis and application of Fourier transform spectroscopy in atmospheric remote sensing

    NASA Technical Reports Server (NTRS)

    Park, J. H.

    1984-01-01

    An analysis method for Fourier transform spectroscopy is summarized with applications to various types of distortion in atmospheric absorption spectra. This analysis method includes the fast Fourier transform method for simulating the interferometric spectrum and the nonlinear least-squares method for retrieving the information from a measured spectrum. It is shown that spectral distortions can be simulated quite well and that the correct information can be retrieved from a distorted spectrum by this analysis technique.

  2. Airborne Chemical Sensing with Mobile Robots

    PubMed Central

    Lilienthal, Achim J.; Loutfi, Amy; Duckett, Tom

    2006-01-01

    Airborne chemical sensing with mobile robots has been an active research area since the beginning of the 1990s. This article presents a review of research work in this field, including gas distribution mapping, trail guidance, and the different subtasks of gas source localisation. Due to the difficulty of modelling gas distribution in a real world environment with currently available simulation techniques, we focus largely on experimental work and do not consider publications that are purely based on simulations.

  3. Verification of Spin Magnetic Attitude Control System using air-bearing-based attitude control simulator

    NASA Astrophysics Data System (ADS)

    Ousaloo, H. S.; Nodeh, M. T.; Mehrabian, R.

    2016-09-01

    This paper verifies and validates a Spin Magnetic Attitude Control System (SMACS) program and performs Hardware-In-the-Loop (HIL) air-bearing experiments. A study of a closed-loop magnetic spin controller is presented using only magnetic rods as actuators. The magnetic spin rate control approach is able to perform spin rate control, and it is verified with an Attitude Control System (ACS) air-bearing MATLAB® SIMULINK® model and a hardware-embedded LABVIEW® algorithm that controls the spin rate of the test platform on a spherical air-bearing table. The SIMULINK® model includes the dynamic model of the air bearing, its disturbances, actuator emulation, and the time delays caused by on-board calculations. The air-bearing simulator is employed to develop, improve, and carry out objective tests of magnetic torque rods and the spin rate control algorithm in the experimental framework, and to provide a more realistic demonstration of expected attitude control performance as compared with software-based architectures. Six sets of two torque rods are used as actuators for the SMACS. The system is implemented and simulated to fulfill mission requirements, including spinning the satellite up to 12 deg/s around the z-axis. These techniques are documented for the full nonlinear equations of motion of the system, and the performances of these techniques are compared in several simulations.

  4. Perceptually relevant parameters for virtual listening simulation of small room acoustics

    PubMed Central

    Zahorik, Pavel

    2009-01-01

    Various physical aspects of room-acoustic simulation techniques have been extensively studied and refined, yet the perceptual attributes of the simulations have received relatively little attention. Here a method of evaluating the perceptual similarity between rooms is described and tested using 15 small-room simulations based on binaural room impulse responses (BRIRs) either measured from a real room or estimated using simple geometrical acoustic modeling techniques. Room size and surface absorption properties were varied, along with aspects of the virtual simulation including the use of individualized head-related transfer function (HRTF) measurements for spatial rendering. Although differences between BRIRs were evident in a variety of physical parameters, a multidimensional scaling analysis revealed that when at-the-ear signal levels were held constant, the rooms differed along just two perceptual dimensions: one related to reverberation time (T60) and one related to interaural coherence (IACC). Modeled rooms were found to differ from measured rooms in this perceptual space, but the differences were relatively small and should be easily correctable through adjustment of T60 and IACC in the model outputs. Results further suggest that spatial rendering using individualized HRTFs offers little benefit over nonindividualized HRTF rendering for room simulation applications where source direction is fixed. PMID:19640043
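The interaural coherence dimension above can be quantified as the IACC, commonly defined as the maximum normalized cross-correlation between the left-ear and right-ear impulse responses over lags of ±1 ms. The sketch below uses that common textbook definition; the paper's exact analysis window and weighting are not given in the abstract.

```python
def iacc(left, right, fs, max_lag_ms=1.0):
    """Interaural cross-correlation coefficient of two ear signals
    sampled at rate fs: the peak magnitude of their normalized
    cross-correlation over lags within +/- max_lag_ms."""
    norm = (sum(l * l for l in left) * sum(r * r for r in right)) ** 0.5
    if norm == 0.0:
        return 0.0
    max_lag = int(fs * max_lag_ms / 1000.0)
    best = 0.0
    for lag in range(-max_lag, max_lag + 1):
        s = sum(left[i] * right[i + lag]
                for i in range(len(left))
                if 0 <= i + lag < len(right))
        best = max(best, abs(s) / norm)
    return best
```

Identical signals at both ears give an IACC of 1 (fully coherent, a "dry" percept), while diffuse reverberation decorrelates the ears and drives the value toward 0, which is what makes IACC a usable perceptual axis for room simulations.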

  5. A Cartesian cut cell method for rarefied flow simulations around moving obstacles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dechristé, G., E-mail: Guillaume.Dechriste@math.u-bordeaux1.fr; CNRS, IMB, UMR 5251, F-33400 Talence; Mieussens, L., E-mail: Luc.Mieussens@math.u-bordeaux1.fr

    2016-06-01

    For accurate simulations of rarefied gas flows around moving obstacles, we propose a cut cell method on Cartesian grids: it allows exact conservation and accurate treatment of boundary conditions. Our approach is designed to treat Cartesian cells and various kinds of cut cells by the same algorithm, with no need to identify the specific shape of each cut cell. This makes the implementation quite simple, and allows a direct extension to 3D problems. Such simulations are also made possible by using an adaptive mesh refinement technique and a hybrid parallel implementation. This is illustrated by several test cases, including a 3D unsteady simulation of the Crookes radiometer.

  6. An orbit simulation study of a geopotential research mission including satellite-to-satellite tracking and disturbance compensation systems

    NASA Technical Reports Server (NTRS)

    Antreasian, Peter G.

    1988-01-01

    Two orbit simulations, one representing the actual Geopotential Research Mission (GRM) orbit and the other representing the orbit estimated from orbit determination techniques, are presented. A computer algorithm was created to simulate GRM's drag compensation mechanism so the fuel expenditure and proof mass trajectories relative to the spacecraft centroid could be calculated for the mission. The results of the GRM DISCOS simulation demonstrated that the spacecraft can essentially be drag-free. The results showed that the centroid of the spacecraft can be controlled so that it will not deviate more than 1.0 mm in any direction from the centroid of the proof mass.

  7. Robot graphic simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.

    1991-01-01

    The objective of this research was twofold. First, the basic capabilities of ROBOSIM (graphical simulation system) were improved and extended by taking advantage of advanced graphic workstation technology and artificial intelligence programming techniques. Second, the scope of the graphic simulation testbed was extended to include general problems of Space Station automation. Hardware support for 3-D graphics and high processing performance make high resolution solid modeling, collision detection, and simulation of structural dynamics computationally feasible. The Space Station is a complex system with many interacting subsystems. Design and testing of automation concepts demand modeling of the affected processes, their interactions, and that of the proposed control systems. The automation testbed was designed to facilitate studies in Space Station automation concepts.

  8. Simulation of Mirror Electron Microscopy Caustic Images in Three-Dimensions

    NASA Astrophysics Data System (ADS)

    Kennedy, S. M.; Zheng, C. X.; Jesson, D. E.

    A full, three-dimensional (3D) ray tracing approach is developed to simulate the caustics visible in mirror electron microscopy (MEM). The method reproduces MEM image contrast resulting from 3D surface relief. To illustrate the potential of the simulation methods, we study the evolution of crater contrast associated with a movie of GaAs structures generated by the droplet epitaxy technique. Specifically, we simulate the image contrast resulting from both a precursor stage and the final crater morphology, which is consistent with an inverted pyramid consisting of (111) facet walls. The method therefore facilitates the study of how self-assembled quantum structures evolve with time and, in particular, the development of anisotropic features including faceting.

  9. Simulation of ultrasonic arrays for industrial and civil engineering applications including validation

    NASA Astrophysics Data System (ADS)

    Spies, M.; Rieder, H.; Orth, Th.; Maack, S.

    2012-05-01

    In this contribution we address the beam field simulation of 2D ultrasonic arrays using the Generalized Point Source Synthesis technique. Aiming at the inspection of cylindrical components (e.g. pipes), the influence of concave and convex surface curvatures, respectively, has been evaluated for a commercial probe. We have compared these results with those obtained using a commercial simulation tool. In civil engineering, the ultrasonic inspection of highly attenuating concrete structures has been advanced by the development of dry contact point transducers, mainly applied in array arrangements. Our respective simulations for a widely used commercial probe are validated using experimental results acquired on concrete half-spheres with diameters from 200 mm up to 650 mm.

  10. [Existing laparoscopic simulators and their benefit for the surgeon].

    PubMed

    Kalvach, J; Ryska, O; Ryska, M

    2016-01-01

    Nowadays, laparoscopic operations are a common part of surgical practice. However, they have their own characteristics and require a specific method of preparation. Recently, simulation techniques have been increasingly used for the training of skills. The aim of this review is to provide a summary of available literature on the topic of laparoscopic simulators, to assess their contribution to the training of surgeons, and to identify the most effective type of simulation. The PubMed database, Web of Science and Cochrane Library were used to search for relevant publications. The keywords "laparoscopy, simulator, surgery, assessment" were used in the search. The search was limited to prospective studies published in the last 5 years in the English language. From a total of 354 studies found, we included in the survey 26 that matched our criteria. Nine studies compared individual simulators to one another. Five studies evaluated "high and low fidelity" (virtual vs. box) simulators as equally effective (EBM 2a). In three cases the "low fidelity" box simulator was found to be more efficient (EBM 2a-3b). Only one study preferred the virtual simulator (VR) (EBM 2b). Thirteen studies evaluated the benefits of simulators for practice. Twelve found training on a simulator to be an effective method of preparation (EBM 1b-3b). In contrast, one study did not find any difference between the training simulator and traditional preparation (EBM 3b). Nine studies directly evaluated one of the methods of assessing laparoscopic skills. Three studies evaluated the VR simulator as a useful assessment tool. Other studies evaluated the scoring system GOALS-GH as successful. The hand motion analysis model was successful in one case. Most studies were observational (EBM 3b) and only 2 studies were of higher quality (EBM 2b). Simulators are an effective tool for practicing laparoscopic techniques (EBM 1b). It cannot be determined based on available data which of the simulators is most effective. 
The virtual simulator, however, still remains the most self-sufficient unit, suitable for teaching as well as evaluation of laparoscopic techniques (EBM 2b-3b). Further studies are needed to find an effective system and parameters for an objective evaluation of skills. Keywords: laparoscopy, simulator, surgery, assessment.

  11. BMP analysis system for watershed-based stormwater management.

    PubMed

    Zhen, Jenny; Shoemaker, Leslie; Riverson, John; Alvi, Khalid; Cheng, Mow-Soung

    2006-01-01

    Best Management Practices (BMPs) are measures for mitigating nonpoint source (NPS) pollution caused mainly by stormwater runoff. Established urban and newly developing areas must develop cost-effective means for restoring or minimizing impacts and planning future growth. Prince George's County in Maryland, USA, a fast-growing region in the Washington, DC metropolitan area, has developed a number of tools to support analysis and decision making for stormwater management planning and design at the watershed level. These tools support watershed analysis, innovative BMPs, and optimization. Application of these tools can help achieve environmental goals and lead to significant cost savings. This project includes software development that utilizes GIS information and technology, integrates BMP process simulation models, and applies system optimization techniques for BMP planning and selection. The system employs ESRI ArcGIS as the platform, and provides GIS-based visualization and support for developing networks including sequences of land uses, BMPs, and stream reaches. The system also provides interfaces for BMP placement, BMP attribute data input, and decision optimization management. The system includes a stand-alone BMP simulation and evaluation module, which complements both research and regulatory nonpoint source control assessment efforts and allows flexibility in examining various BMP design alternatives. Process-based simulation of BMPs provides a technique that is sensitive to local climate and rainfall patterns. The system incorporates a meta-heuristic optimization technique to find the most cost-effective BMP placement and implementation plan given a control target or a fixed cost. A case study is presented to demonstrate the application of the Prince George's County system. The case study involves a highly urbanized area in the Anacostia River (a tributary to the Potomac River) watershed southeast of Washington, DC. 
An innovative system of management practices is proposed to minimize runoff, improve water quality, and provide water reuse opportunities. Proposed management techniques include bioretention, green roof, and rooftop runoff collection (rain barrel) systems. The modeling system was used to identify the most cost-effective combinations of management practices to help minimize frequency and size of runoff events and resulting combined sewer overflows to the Anacostia River.

  12. Computer simulation of a space SAR using a range-sequential processor for soil moisture mapping

    NASA Technical Reports Server (NTRS)

    Fujita, M.; Ulaby, F. (Principal Investigator)

    1982-01-01

    The ability of a spaceborne synthetic aperture radar (SAR) to detect soil moisture was evaluated by means of a computer simulation technique. The computer simulation package includes coherent processing of the SAR data using a range-sequential processor, which can be set up through hardware implementations, thereby reducing the amount of telemetry involved. With such a processing approach, it is possible to monitor the earth's surface on a continuous basis, since data storage requirements can be easily met through the use of currently available technology. The development of the simulation package is described, followed by an examination of the application of the technique to actual environments. The results indicate that in estimating soil moisture content with a four-look processor, the difference between the assumed and estimated values of soil moisture is within ±20% of field capacity for 62% of the pixels for agricultural terrain and for 53% of the pixels for hilly terrain. The estimation accuracy for soil moisture may be improved by reducing the effect of fading through non-coherent averaging.

  13. An analytical study of reduced-gravity liquid reorientation using a simplified marker and cell technique

    NASA Technical Reports Server (NTRS)

    Betts, W. S., Jr.

    1972-01-01

    A computer program called HOPI was developed to predict reorientation flow dynamics, wherein liquid moves from one end of a closed, partially filled, rigid container to the other under the influence of container acceleration. The program uses the simplified marker and cell numerical technique and, using explicit finite differencing, solves the Navier-Stokes equations for an incompressible viscous fluid. The effects of turbulence are also simulated in the program. HOPI can consider curved as well as straight-walled boundaries. Both free-surface and confined flows can be calculated. The program was used to simulate five liquid reorientation cases. Three of these cases simulated actual NASA LeRC drop tower test conditions, while two simulated full-scale Centaur tank conditions. It was concluded that while HOPI can be used to analytically determine the fluid motion in a typical settling problem, there is a current need to optimize it by reducing both the computer time and the core storage required for a given problem size.
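The explicit finite-differencing approach the abstract mentions can be illustrated far more simply than HOPI itself. The sketch below, an assumption-laden miniature rather than the program's method, applies a forward-in-time, centred-in-space update to 1-D viscous diffusion, including the stability constraint that limits the time step in any explicit scheme.

```python
# Minimal explicit finite-difference sketch (not HOPI): 1-D diffusion
# u_t = nu * u_xx with the explicit stability limit nu*dt/dx**2 <= 0.5.
import numpy as np

nu, dx, dt, steps = 0.1, 0.1, 0.04, 50
r = nu * dt / dx**2
assert r <= 0.5, "explicit scheme would be unstable"

u = np.zeros(51)
u[25] = 1.0                     # initial spike mid-domain
for _ in range(steps):
    # forward in time, centred second difference in space
    u[1:-1] += r * (u[2:] - 2 * u[1:-1] + u[:-2])

# The spike spreads out; away from the walls the integral is preserved.
print(round(u.sum(), 4), round(u.max(), 4))
```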

  14. Progress in fast, accurate multi-scale climate simulations

    DOE PAGES

    Collins, W. D.; Johansen, H.; Evans, K. J.; ...

    2015-06-01

    We present a survey of physical and computational techniques that have the potential to contribute to the next generation of high-fidelity, multi-scale climate simulations. Examples of the climate science problems that can be investigated in more depth with these computational improvements include the capture of remote forcings of localized hydrological extreme events, an accurate representation of cloud features over a range of spatial and temporal scales, and parallel, large ensembles of simulations to more effectively explore model sensitivities and uncertainties. Numerical techniques, such as adaptive mesh refinement, implicit time integration, and separate treatment of fast physical time scales, are enabling improved accuracy and fidelity in simulation of dynamics and allowing more complete representations of climate features at the global scale. At the same time, partnerships with computer science teams have focused on taking advantage of evolving computer architectures such as many-core processors and GPUs. As a result, approaches which were previously considered prohibitively costly have become both more efficient and scalable. In combination, progress in these three critical areas is poised to transform climate modeling in the coming decades.

  15. REVEAL: An Extensible Reduced Order Model Builder for Simulation and Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agarwal, Khushbu; Sharma, Poorva; Ma, Jinliang

    2013-04-30

    Many science domains need to build computationally efficient and accurate representations of high-fidelity, computationally expensive simulations. These computationally efficient versions are known as reduced-order models. This paper presents the design and implementation of a novel reduced-order model (ROM) builder, the REVEAL toolset. This toolset generates ROMs based on science- and engineering-domain specific simulations executed on high performance computing (HPC) platforms. The toolset encompasses a range of sampling and regression methods that can be used to generate a ROM, automatically quantifies the ROM accuracy, and provides support for an iterative approach to improve ROM accuracy. REVEAL is designed to be extensible in order to utilize the core functionality with any simulator that has published input and output formats. It also defines programmatic interfaces to include new sampling and regression techniques so that users can ‘mix and match’ mathematical techniques to best suit the characteristics of their model. In this paper, we describe the architecture of REVEAL and demonstrate its usage with a computational fluid dynamics model used in carbon capture.
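The sample-then-regress loop that a ROM builder automates can be sketched in a few lines. This is a toy version of the workflow, not REVEAL's API: a cheap quadratic stands in for the expensive simulator, training inputs are sampled over the design range, a polynomial surrogate is fit, and its accuracy is quantified on held-out points.

```python
# Toy ROM-building loop: sample -> run simulator -> fit surrogate -> verify.
import numpy as np

rng = np.random.default_rng(1)

def expensive_simulator(x):
    """Placeholder for a high-fidelity simulation (illustrative quadratic)."""
    return 3.0 * x**2 - 2.0 * x + 0.5

# 1. Sampling: draw training inputs over the design range.
x_train = rng.uniform(-1.0, 1.0, size=30)
y_train = expensive_simulator(x_train)

# 2. Regression: fit a polynomial reduced-order model.
rom = np.polynomial.Polynomial.fit(x_train, y_train, deg=2)

# 3. Accuracy quantification on held-out samples.
x_test = rng.uniform(-1.0, 1.0, size=10)
err = np.max(np.abs(rom(x_test) - expensive_simulator(x_test)))
print(err)
```

In a real toolset the simulator runs on an HPC platform and the sampling plan (e.g., Latin hypercube) and regression family are swappable; here the surrogate recovers the quadratic essentially exactly.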

  16. Model-Based GN and C Simulation and Flight Software Development for Orion Missions beyond LEO

    NASA Technical Reports Server (NTRS)

    Odegard, Ryan; Milenkovic, Zoran; Henry, Joel; Buttacoli, Michael

    2014-01-01

    For Orion missions beyond low Earth orbit (LEO), the Guidance, Navigation, and Control (GN&C) system is being developed using a model-based approach for simulation and flight software. Lessons learned from the development of GN&C algorithms and flight software for the Orion Exploration Flight Test One (EFT-1) vehicle have been applied to the development of further capabilities for Orion GN&C beyond EFT-1. Continuing the use of a Model-Based Development (MBD) approach with the Matlab®/Simulink® tool suite, the process for GN&C development and analysis has been largely improved. Furthermore, a model-based simulation environment in Simulink, rather than an external C-based simulation, greatly eases the process for development of flight algorithms. The benefits seen by employing lessons learned from EFT-1 are described, as well as the approach for implementing additional MBD techniques. Also detailed are the key enablers for improvements to the MBD process, including enhanced configuration management techniques for model-based software systems, automated code and artifact generation, and automated testing and integration.

  17. Advanced system for 3D dental anatomy reconstruction and 3D tooth movement simulation during orthodontic treatment

    NASA Astrophysics Data System (ADS)

    Monserrat, Carlos; Alcaniz-Raya, Mariano L.; Juan, M. Carmen; Grau Colomer, Vincente; Albalat, Salvador E.

    1997-05-01

    This paper describes a new method for 3D orthodontic treatment simulation developed for an orthodontics planning system (MAGALLANES). We developed an original system for 3D capture and reconstruction of dental anatomy that avoids the use of dental casts in orthodontic treatment. Two original techniques are presented: one direct, in which data are acquired directly from the patient's mouth by means of low-cost 3D digitizers, and one mixed, in which data are obtained by 3D digitizing of hydrocolloid molds. For this purpose we designed and manufactured an optimized optical measuring system based on laser structured light. We apply these 3D dental models to simulate the 3D movement of teeth, including rotations, during orthodontic treatment. The proposed algorithms make it possible to quantify the effect of orthodontic appliances on tooth movement. The developed techniques have been integrated in a system named MAGALLANES. This original system presents several tools for 3D simulation and planning of orthodontic treatments. The prototype system has been tested in several orthodontic clinics with very good results.

  18. Modeling and Design of GaN High Electron Mobility Transistors and Hot Electron Transistors through Monte Carlo Particle-based Device Simulations

    NASA Astrophysics Data System (ADS)

    Soligo, Riccardo

    In this work, the insight provided by our sophisticated full-band Monte Carlo simulator is used to analyze the behavior of state-of-the-art devices such as GaN High Electron Mobility Transistors and Hot Electron Transistors. Chapter 1 is dedicated to the description of the simulation tool used to obtain the results shown in this work. Moreover, a separate section is dedicated to setting up a procedure for validating the tunneling algorithm recently implemented in the simulator. Chapter 2 introduces High Electron Mobility Transistors (HEMTs), state-of-the-art devices characterized by highly nonlinear transport phenomena that require the use of advanced simulation methods. The techniques for device modeling are described and applied to a recent GaN-HEMT, and they are validated with experimental measurements. The main characterization techniques are also described, including the original contribution provided by this work. Chapter 3 focuses on a popular technique to enhance HEMT performance: the down-scaling of the device dimensions. In particular, this chapter is dedicated to lateral scaling and the calculation of a limiting cutoff frequency for a device of vanishing length. Finally, Chapter 4 and Chapter 5 describe the modeling of Hot Electron Transistors (HETs). The simulation approach is validated by matching the current characteristics with the experimental ones before variations of the layouts are proposed to increase the current gain to values suitable for amplification. The frequency response of these layouts is calculated and modeled by a small-signal circuit. For this purpose, a method to directly calculate the capacitance is developed, which provides a graphical picture of the capacitive phenomena that limit the frequency response in devices. In Chapter 5 the properties of the hot electrons are investigated for different injection energies, which are obtained by changing the layout of the emitter barrier. 
Moreover, the large signal characterization of the HET is shown for different layouts, where the collector barrier was scaled.

  19. An Introduction to 3-D Sound

    NASA Technical Reports Server (NTRS)

    Begault, Durand R.; Null, Cynthia H. (Technical Monitor)

    1997-01-01

    This talk will overview the basic technologies related to the creation of virtual acoustic images and the potential of including spatial auditory displays in human-machine interfaces. Research into the perceptual error inherent in both natural and virtual spatial hearing is reviewed, since the development of improved technologies is tied to psychoacoustic research. This includes a discussion of Head Related Transfer Function (HRTF) measurement techniques (the HRTF provides important perceptual cues within a virtual acoustic display). Many commercial applications of virtual acoustics have so far focused on games and entertainment; in this review, other types of applications are examined, including aeronautic safety, voice communications, virtual reality, and room acoustic simulation. In particular, the notion is examined that realistic simulation is optimized within a virtual acoustic display when head motion and reverberation cues are included within a perceptual model.

  20. Simulating New Drop Test Vehicles and Test Techniques for the Orion CEV Parachute Assembly System

    NASA Technical Reports Server (NTRS)

    Morris, Aaron L.; Fraire, Usbaldo, Jr.; Bledsoe, Kristin J.; Ray, Eric; Moore, Jim W.; Olson, Leah M.

    2011-01-01

    The Crew Exploration Vehicle Parachute Assembly System (CPAS) project is engaged in a multi-year design and test campaign to qualify a parachute recovery system for human use on the Orion Spacecraft. Test and simulation techniques have evolved concurrently to keep up with the demands of a challenging and complex system. The primary simulations used for preflight predictions and post-test data reconstructions are the Decelerator System Simulation (DSS), the Decelerator System Simulation Application (DSSA), and the Drop Test Vehicle Simulation (DTV-SIM). The goal of this paper is to provide a roadmap for future programs on the test technique challenges and obstacles involved in executing a large-scale, multi-year parachute test program. The focus is on flight simulation modeling and its correlation with the test techniques executed to obtain parachute performance parameters.

  1. Open source Matrix Product States: Opening ways to simulate entangled many-body quantum systems in one dimension

    NASA Astrophysics Data System (ADS)

    Jaschke, Daniel; Wall, Michael L.; Carr, Lincoln D.

    2018-04-01

    Numerical simulations are a powerful tool for studying quantum systems beyond exactly solvable models, for which no analytic expression is available. For one-dimensional entangled quantum systems, tensor network methods, among them Matrix Product States (MPSs), have attracted interest from different fields of quantum physics, ranging from solid state systems to quantum simulators and quantum computing. Our open source MPS code provides the community with a toolset to analyze the statics and dynamics of one-dimensional quantum systems. Here, we present our open source library, Open Source Matrix Product States (OSMPS), of MPS methods implemented in Python and Fortran2003. The library includes tools for ground state calculation and excited states via the variational ansatz. We also support ground states for infinite systems with translational invariance. Dynamics are simulated with different algorithms, including three algorithms with support for long-range interactions. Convenient features include built-in support for fermionic systems and number conservation with rotational U(1) and discrete Z2 symmetries for finite systems, as well as data parallelism with MPI. We explain the principles and techniques used in this library along with examples of how to efficiently use the general interfaces to analyze the Ising and Bose-Hubbard models. This description includes the preparation of simulations as well as their dispatching and post-processing.
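The core MPS idea the library builds on can be shown in miniature, independently of OSMPS: a many-body wavefunction is split into local tensors by singular value decomposition, so amplitudes are recovered as matrix products and the singular (Schmidt) values expose the entanglement. The two-spin Bell state below is the smallest non-trivial example.

```python
# Minimal MPS illustration (independent of OSMPS): split a two-site
# wavefunction into local tensors with an SVD.
import numpy as np

# Unnormalised Bell state |00> + |11> stored as psi[s1, s2].
psi = np.array([[1.0, 0.0],
                [0.0, 1.0]])

# SVD yields the two MPS site tensors: A carries u*s, B carries vh.
u, s, vh = np.linalg.svd(psi)
A = u * s              # shape (2, bond)
B = vh                 # shape (bond, 2)

# Each amplitude is recovered as the matrix product A[s1, :] @ B[:, s2].
recon = np.array([[A[s1] @ B[:, s2] for s2 in range(2)] for s1 in range(2)])
print(np.allclose(recon, psi))

# Two equal Schmidt values: the entangled Bell state needs bond dimension 2.
print(np.round(s, 6))
```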

  2. Realistic training scenario simulations and simulation techniques

    DOEpatents

    Dunlop, William H.; Koncher, Tawny R.; Luke, Stanley John; Sweeney, Jerry Joseph; White, Gregory K.

    2017-12-05

    In one embodiment, a system includes a signal generator operatively coupleable to one or more detectors; and a controller, the controller being both operably coupled to the signal generator and configured to cause the signal generator to: generate one or more signals each signal being representative of at least one emergency event; and communicate one or more of the generated signal(s) to a detector to which the signal generator is operably coupled. In another embodiment, a method includes: receiving data corresponding to one or more emergency events; generating at least one signal based on the data; and communicating the generated signal(s) to a detector.

  3. Analysis of the Lenticular Jointed MARSIS Antenna Deployment

    NASA Technical Reports Server (NTRS)

    Mobrem, Mehran; Adams, Douglas S.

    2006-01-01

    This paper summarizes important milestones in a yearlong comprehensive effort which culminated in successful deployments of the MARSIS antenna booms in May and June of 2005. Experimentally measured straight section and hinge properties are incorporated into specialized modeling techniques that are used to simulate the boom lenticular joints. System level models are exercised to understand the boom deployment dynamics and spacecraft level implications. Discussion includes a comparison of ADAMS simulation results to measured flight data taken during the three boom deployments. Important parameters that govern lenticular joint behavior are outlined and a short summary of lessons learned and recommendations is included to better understand future applications of this technology.

  4. Damage Detection in Composite Structures with Wavenumber Array Data Processing

    NASA Technical Reports Server (NTRS)

    Tian, Zhenhua; Leckey, Cara; Yu, Lingyu

    2013-01-01

    Guided ultrasonic waves (GUW) have the potential to be an efficient and cost-effective method for rapid damage detection and quantification of large structures. Attractive features include sensitivity to a variety of damage types and the capability of traveling relatively long distances. They have proven to be an efficient approach for crack detection and localization in isotropic materials. However, techniques must be pushed beyond isotropic materials in order to be valid for composite aircraft components. This paper presents our study on GUW propagation and interaction with delamination damage in composite structures using wavenumber array data processing, together with advanced wave propagation simulations. The parallel elastodynamic finite integration technique (EFIT) is used for the example simulations. A multi-dimensional Fourier transform is used to convert time-space wavefield data into the frequency-wavenumber domain. Wave propagation in the wavenumber-frequency domain shows clear distinction among the guided wave modes that are present. This allows a guided wave mode to be extracted through filtering and reconstruction techniques. The presence of delamination changes the spectrum accordingly. Results from 3D CFRP guided wave simulations with delamination damage in flat-plate specimens are used to study wave interaction with structural defects.
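The time-space to frequency-wavenumber conversion described above can be sketched with a 2-D FFT. The synthetic single-mode wavefield below is invented for illustration (the frequency, speed, and sampling are arbitrary choices, not values from the paper); the point is that a propagating mode appears as a spectral peak on its dispersion line in the f-k domain, where modes can then be separated by filtering.

```python
# Sketch of wavenumber array processing: 2-D FFT of u(t, x) into the
# frequency-wavenumber (f-k) domain. Single synthetic mode, invented values.
import numpy as np

dt, dx = 1e-6, 1e-3                  # time (s) and space (m) sampling
t = np.arange(256) * dt
x = np.arange(128) * dx
f0, c = 200e3, 1500.0                # assumed mode frequency (Hz) and speed (m/s)
k0 = f0 / c                          # wavenumber of the mode (1/m)

# u(t, x): one guided-wave mode travelling in +x.
u = np.cos(2 * np.pi * (f0 * t[:, None] - k0 * x[None, :]))

# Multi-dimensional Fourier transform into the f-k domain.
U = np.fft.fftshift(np.fft.fft2(u))
freqs = np.fft.fftshift(np.fft.fftfreq(len(t), dt))
wavenums = np.fft.fftshift(np.fft.fftfreq(len(x), dx))

# The spectral peak sits on the mode's dispersion line at (|f0|, |k0|).
i, j = np.unravel_index(np.argmax(np.abs(U)), U.shape)
print(abs(freqs[i]), abs(wavenums[j]))
```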

  5. Transoesophageal detection of heart graft rejection by electrical impedance: Using finite element method simulations

    NASA Astrophysics Data System (ADS)

    Giovinazzo, G.; Ribas, N.; Cinca, J.; Rosell-Ferrer, J.

    2010-04-01

    Previous studies have shown that it is possible to evaluate the heart graft rejection level using a bioimpedance technique by means of an intracavitary catheter. However, this technique does not present relevant advantages compared to the gold standard for the detection of heart rejection, which is biopsy of the endomyocardial tissue. We propose to use a less invasive technique that consists in the use of a transoesophageal catheter and two standard ECG electrodes on the thorax. The aim of this work is to evaluate different parameters affecting the impedance measurement, including sensitivity to the electrical conductivity and permittivity of different organs in the thorax, lung edema, and pleural water. From these results, we deduce the best estimator for cardiac rejection detection, and we obtain the tools to identify possible false positives of heart rejection due to other factors. To achieve these objectives we created a thoracic model and simulated, with a FEM program, different situations at the frequencies of 13, 30, 100, 300 and 1000 kHz. Our simulation demonstrates that the phase, at 100 and 300 kHz, has the highest sensitivity to changes in the electrical parameters of the heart muscle.

  6. Evaluation of the phase properties of hydrating cement composite using simulated nanoindentation technique

    NASA Astrophysics Data System (ADS)

    Gautham, S.; Sindu, B. S.; Sasmal, Saptarshi

    2017-10-01

    Properties and distribution of the products formed during the hydration of cementitious composites at the microlevel are investigated using a nanoindentation technique. First, numerical nanoindentation using nonlinear contact mechanics is carried out on three different phase compositions of cement paste, viz. mono-phase (Tri-calcium Silicate (C3S), Di-calcium Silicate (C2S) and Calcium-Silicate-Hydrate (CSH) individually), bi-phase (C3S-CSH, C2S-CSH) and multi-phase (more than 10 individual phases including water pores). To reflect the multi-phase characteristics of hydrating cement composite, a discretized multi-phase microstructural model of cement composite during the progression of hydration is developed. Further, a grid indentation technique for simulated nanoindentation is established, and employed to evaluate the mechanical characteristics of the hydrated multi-phase cement paste. The properties obtained from the numerical studies are compared with those obtained from experimental grid nanoindentation. The influence of composition and distribution of individual phase properties on the properties obtained from indentation is closely investigated. The study paves the way to establishing the procedure for simulated grid nanoindentation to evaluate the mechanical properties of heterogeneous composites, and facilitates the design of experimental nanoindentation.

  7. High accuracy switched-current circuits using an improved dynamic mirror

    NASA Technical Reports Server (NTRS)

    Zweigle, G.; Fiez, T.

    1991-01-01

    The switched-current technique, a recently developed circuit approach to analog signal processing, has emerged as an alternative/complement to the well-established switched-capacitor circuit technique. High-speed switched-current circuits offer potential cost and power savings over slower switched-capacitor circuits. Accuracy improvements are a primary concern at this stage in the development of the switched-current technique. Use of the dynamic current mirror has produced circuits that are insensitive to transistor matching errors. The dynamic current mirror has been limited by other sources of error, including clock-feedthrough and voltage transient errors. In this paper we present an improved switched-current building block using the dynamic current mirror. Utilizing current feedback, the errors due to current imbalance in the dynamic current mirror are reduced. Simulations indicate that this feedback can reduce total harmonic distortion by as much as 9 dB. Additionally, we have developed a clock-feedthrough reduction scheme for which simulations reveal a potential 10 dB total harmonic distortion improvement. The clock-feedthrough reduction scheme also significantly reduces offset errors and allows for cancellation with a constant current source. Experimental results confirm the simulated improvements.

  8. Wideband piezoelectric energy harvester for low-frequency application with plucking mechanism

    NASA Astrophysics Data System (ADS)

    Hiraki, Yasuhiro; Masuda, Arata; Ikeda, Naoto; Katsumura, Hidenori; Kagata, Hiroshi; Okumura, Hidenori

    2015-04-01

    Wireless sensor networks need energy harvesting from the vibrational environment for their power supply. Conventional resonance-type vibration energy harvesters, however, are not always effective for low-frequency applications. The purpose of this paper is to propose a high-efficiency energy harvester for low-frequency applications by utilizing plucking and SSHI techniques, and to investigate the effects of applying those techniques in terms of energy harvesting efficiency. First, we derived an approximate formulation of the energy harvesting efficiency of the plucking device by theoretical analysis. Next, it was confirmed that the improved efficiency agreed with numerical and experimental results. Also, a parallel SSHI, a switching circuit technique to improve the performance of the harvester, was introduced and examined by numerical simulations and experiments. Contrary to the simulated results, in which the efficiency was improved from 13.1% to 22.6% by introducing the SSHI circuit, the efficiency obtained in the experiment was only 7.43%. This is likely due to the internal resistance of the inductors and photo MOS relays in the switching circuit; a simulation including this factor revealed its large negative influence. This result suggests that reducing the switching resistance is critically important to the implementation of SSHI.

  9. Numerically based design of an orifice plate flowmetering system for human respiratory flow monitoring.

    PubMed

    Fortuna, A O; Gurd, J R

    1999-01-01

    During certain medical procedures, it is important to continuously measure the respiratory flow of a patient, as lack of proper ventilation can cause brain damage and ultimately death. The monitoring of the ventilatory condition of a patient is usually performed with the aid of flowmeters. However, water and other secretions present in the expired air can build up and ultimately block a traditional, restriction-based flowmeter; by using an orifice plate flowmeter, such blockages are minimized. This paper describes the design of an orifice plate flowmetering system including, especially, a description of the numerical and computational techniques adopted in order to simulate human respiratory and sinusoidal air flow across various possible designs for the orifice plate flowmeter device. Parallel computation and multigrid techniques were employed in order to reduce execution time. The simulated orifice plate was later built and tested under unsteady sinusoidal flows. Experimental tests show reasonable agreement with the numerical simulation, thereby reinforcing the general hypothesis that computational exploration of the design space is sufficiently accurate to allow designers of such systems to use this in preference to the more traditional, mechanical prototyping techniques.
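The physical principle an orifice plate flowmeter relies on can be stated compactly: volumetric flow is inferred from the measured pressure drop across the plate via Q = Cd · A · sqrt(2·ΔP/ρ). The sketch below uses this simplified textbook relation (it neglects the velocity-of-approach factor), and the discharge coefficient, orifice diameter, and air density are illustrative values, not the paper's design.

```python
# Simplified orifice-plate relation: Q = Cd * A * sqrt(2*dP/rho).
# Cd, diameter, and rho below are illustrative, not the paper's values.
import math

def orifice_flow(dp_pa, d_orifice_m, cd=0.6, rho=1.2):
    """Volumetric flow (m^3/s) through an orifice from a pressure drop (Pa)."""
    area = math.pi * (d_orifice_m / 2) ** 2
    return cd * area * math.sqrt(2 * dp_pa / rho)

# Example: a 100 Pa drop across a 10 mm orifice in air.
q = orifice_flow(100.0, 0.010)
print(round(q * 1000, 3), "L/s")   # → 0.608 L/s
```

Note the square-root relation: flow resolution degrades at low pressure drops, which is one reason orifice-meter design for respiratory monitoring needs the numerical exploration the paper describes.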

  10. Use of HyperCard to Simulate a Tissue Culture Laboratory.

    ERIC Educational Resources Information Center

    Nester, Bradley S.; Turney, Tully H.

    1992-01-01

    Describes the use of a Macintosh computer and HyperCard software to create an introduction to cell culture techniques that closely approximates a hands-on laboratory experiment. Highlights include data acquisition, data analysis, the generation of growth curves, and electronic modeling. (LRW)

  11. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
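The event-queue execution model described above, where scheduled events fire in time order until the queue is emptied, can be sketched with a priority queue. The two-mode "valve" process below is a toy example invented for illustration, not the patented tool: an invocation (open) has an effect and a time delay that schedules a mode transition (close).

```python
# Minimal discrete event simulation loop: pop events in time order and
# execute them until the queue empties. The "valve" process is a toy example.
import heapq

events = []          # priority queue of (time, sequence, action)
counter = 0          # tie-breaker so equal-time events stay ordered

def schedule(time, action):
    global counter
    heapq.heappush(events, (time, counter, action))
    counter += 1

log = []

def open_valve(t):
    log.append((t, "open"))
    schedule(t + 5.0, close_valve)   # mode transition after a fixed delay

def close_valve(t):
    log.append((t, "closed"))

schedule(0.0, open_valve)
schedule(2.0, open_valve)

while events:                        # run until the event queue is emptied
    t, _, action = heapq.heappop(events)
    action(t)

print(log)
```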

  12. Computational materials design of crystalline solids.

    PubMed

    Butler, Keith T; Frost, Jarvist M; Skelton, Jonathan M; Svane, Katrine L; Walsh, Aron

    2016-11-07

    The modelling of materials properties and processes from first principles is becoming sufficiently accurate as to facilitate the design and testing of new systems in silico. Computational materials science is both valuable and increasingly necessary for developing novel functional materials and composites that meet the requirements of next-generation technology. A range of simulation techniques are being developed and applied to problems related to materials for energy generation, storage and conversion including solar cells, nuclear reactors, batteries, fuel cells, and catalytic systems. Such techniques may combine crystal-structure prediction (global optimisation), data mining (materials informatics) and high-throughput screening with elements of machine learning. We explore the development process associated with computational materials design, from setting the requirements and descriptors to the development and testing of new materials. As a case study, we critically review progress in the fields of thermoelectrics and photovoltaics, including the simulation of lattice thermal conductivity and the search for Pb-free hybrid halide perovskites. Finally, a number of universal chemical-design principles are advanced.

  13. A generalized vortex lattice method for subsonic and supersonic flow applications

    NASA Technical Reports Server (NTRS)

    Miranda, L. R.; Elliot, R. D.; Baker, W. M.

    1977-01-01

    If the discrete vortex lattice is considered as an approximation to the surface-distributed vorticity, then the concept of the generalized principal part of an integral yields a residual term to the vorticity-induced velocity field. The proper incorporation of this term into the velocity field generated by the discrete vortex lines renders the present vortex lattice method valid for supersonic flow. Special techniques for simulating nonzero-thickness lifting surfaces and fusiform bodies with vortex lattice elements are included. Thickness effects of wing-like components are simulated by a double (biplanar) vortex lattice layer, and fusiform bodies are represented by a vortex grid arranged on a series of concentric cylindrical surfaces. The analysis of sideslip effects by the subject method is described. Numerical considerations peculiar to the application of these techniques are also discussed. The method has been implemented in a digital computer code. A user's manual is included along with a complete FORTRAN compilation, an executed case, and conversion programs for transforming input for the NASA wave drag program.
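The elementary building block a vortex lattice method assembles is the velocity induced by a straight vortex segment. The sketch below implements only the textbook incompressible Biot-Savart form (the supersonic residual term discussed in the abstract is not included), with arbitrary example endpoints and circulation.

```python
# Textbook Biot-Savart induced velocity of a straight vortex segment,
# the incompressible building block of a vortex lattice method.
import numpy as np

def segment_velocity(p, a, b, gamma=1.0):
    """Velocity induced at point p by a vortex segment from a to b."""
    r1, r2 = p - a, p - b
    r0 = b - a
    cross = np.cross(r1, r2)
    norm2 = np.dot(cross, cross)
    k = gamma / (4 * np.pi * norm2)
    return k * cross * np.dot(r0, r1 / np.linalg.norm(r1) - r2 / np.linalg.norm(r2))

# Point 1 m to the side of a unit-circulation segment from (-1,0,0) to (1,0,0):
v = segment_velocity(np.array([0.0, 1.0, 0.0]),
                     np.array([-1.0, 0.0, 0.0]),
                     np.array([1.0, 0.0, 0.0]))
print(v)   # purely out-of-plane, magnitude sqrt(2)/(4*pi)
```

Summing this kernel over every lattice element and enforcing flow tangency at control points yields the usual influence-coefficient system the method solves.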

  14. Path-integral simulation of solids.

    PubMed

    Herrero, C P; Ramírez, R

    2014-06-11

    The path-integral formulation of the statistical mechanics of quantum many-body systems is described, with the purpose of introducing practical techniques for the simulation of solids. Monte Carlo and molecular dynamics methods for distinguishable quantum particles are presented, with particular attention to the isothermal-isobaric ensemble. Applications of these computational techniques to different types of solids are reviewed, including noble-gas solids (helium and heavier elements), group-IV materials (diamond and elemental semiconductors), and molecular solids (with emphasis on hydrogen and ice). Structural, vibrational, and thermodynamic properties of these materials are discussed. Applications also include point defects in solids (structure and diffusion), as well as nuclear quantum effects in solid surfaces and adsorbates. Different phenomena are discussed, such as solid-to-solid and orientational phase transitions, rates of quantum processes, classical-to-quantum crossover, and various finite-temperature anharmonic effects (thermal expansion, isotopic effects, electron-phonon interactions). Nuclear quantum effects are most remarkable in the presence of light atoms, so special emphasis is placed on solids containing hydrogen as a constituent element or as an impurity.
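The path-integral Monte Carlo technique the review introduces can be demonstrated on the smallest possible problem: a single quantum harmonic oscillator (units hbar = m = omega = 1), whose ⟨x²⟩ has a known continuum value of 0.5/tanh(beta/2) ≈ 1.082 at beta = 1. The quantum particle maps onto a ring polymer of P "beads" coupled by kinetic springs; the sketch below, a minimal illustration rather than a production method, samples bead moves and whole-path shifts with Metropolis acceptance on the primitive discretised action.

```python
# Minimal path-integral Monte Carlo for one harmonic oscillator
# (hbar = m = omega = 1): ring polymer of P beads, primitive action.
import numpy as np

rng = np.random.default_rng(2)
beta, P, sweeps = 1.0, 16, 6000
tau = beta / P
x = np.zeros(P)
samples = []

def local_action(xj, prev, nxt):
    """Discretised-action terms that involve a single bead."""
    kinetic = ((xj - prev) ** 2 + (nxt - xj) ** 2) / (2 * tau)
    return kinetic + tau * 0.5 * xj ** 2       # V(x) = x^2 / 2

for sweep in range(sweeps):
    for j in range(P):                         # single-bead Metropolis moves
        prev, nxt = x[j - 1], x[(j + 1) % P]   # periodic imaginary time
        trial = x[j] + rng.normal(0.0, 0.5)
        dS = local_action(trial, prev, nxt) - local_action(x[j], prev, nxt)
        if dS < 0 or rng.random() < np.exp(-dS):
            x[j] = trial
    # whole-path shift (kinetic springs unchanged) to decorrelate the centroid
    shift = rng.normal(0.0, 0.5)
    dS = tau * 0.5 * np.sum((x + shift) ** 2 - x ** 2)
    if dS < 0 or rng.random() < np.exp(-dS):
        x += shift
    if sweep >= 500:                           # discard burn-in
        samples.append(np.mean(x ** 2))

est = np.mean(samples)
print(est)   # continuum exact value: 0.5 / tanh(0.5) ≈ 1.082
```

Real simulations of solids use many interacting particles, better actions, and smarter collective moves (e.g., staging), but the bead-and-spring structure is the same.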

  15. Demonstration of automated proximity and docking technologies

    NASA Astrophysics Data System (ADS)

    Anderson, Robert L.; Tsugawa, Roy K.; Bryan, Thomas C.

    An autodock was demonstrated using straightforward techniques and real sensor hardware. A simulation testbed was established and validated. The sensor design was refined with improved optical performance and image-processing noise mitigation techniques, and the sensor is ready for production from off-the-shelf components. The autonomous spacecraft architecture is defined. The areas of sensors, docking hardware, propulsion, and avionics are included in the design. The Guidance, Navigation and Control architecture and requirements are developed. Modular structures suitable for automated control are used. The spacecraft system manager functions, including configuration, resource, and redundancy management, are defined. The requirements for an autonomous spacecraft executive are defined. High-level decision-making, mission planning, and mission contingency recovery are a part of this. The next step is to do flight demonstrations. After the presentation the following question was asked: How do you define validation? There are two components to the validation definition: software simulation with formal and rigorous validation, and hardware and facility performance validated with respect to software already validated against an analytical profile.

  16. Monte Carlo simulation of X-ray imaging and spectroscopy experiments using quadric geometry and variance reduction techniques

    NASA Astrophysics Data System (ADS)

    Golosio, Bruno; Schoonjans, Tom; Brunetti, Antonio; Oliva, Piernicola; Masala, Giovanni Luca

    2014-03-01

    The simulation of X-ray imaging experiments is often performed using deterministic codes, which can be relatively fast and easy to use. However, such codes are generally not suitable for the simulation of even slightly more complex experimental conditions, involving, for instance, first-order or higher-order scattering, X-ray fluorescence emissions, or more complex geometries, particularly for experiments that combine spatial resolution with spectral information. In such cases, simulations are often performed using codes based on the Monte Carlo method. In a simple Monte Carlo approach, the interaction position of an X-ray photon and the state of the photon after an interaction are sampled directly from the theoretical probability distributions. This approach may be quite inefficient because the final channels of interest may include only a limited region of space or photons produced by a rare interaction, e.g., fluorescent emission from elements with very low concentrations. In the field of X-ray fluorescence spectroscopy, this problem has been solved by combining the Monte Carlo method with variance reduction techniques, which can reduce the computation time by several orders of magnitude. In this work, we present a C++ code for the general simulation of X-ray imaging and spectroscopy experiments, based on the application of the Monte Carlo method in combination with variance reduction techniques, with a description of sample geometry based on quadric surfaces. We describe the benefits of the object-oriented approach in terms of code maintenance, the flexibility of the program for the simulation of different experimental conditions, and the possibility of easily adding new modules. Sample applications in the fields of X-ray imaging and X-ray spectroscopy are discussed.

    Catalogue identifier: AERO_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERO_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License version 3
    No. of lines in distributed program, including test data, etc.: 83617
    No. of bytes in distributed program, including test data, etc.: 1038160
    Distribution format: tar.gz
    Programming language: C++
    Computer: Tested on several PCs and on Mac
    Operating system: Linux, Mac OS X, Windows (native and Cygwin)
    RAM: Dependent on the input data, but usually between 1 and 10 MB
    Classification: 2.5, 21.1
    External routines: XrayLib (https://github.com/tschoonj/xraylib/wiki)
    Nature of problem: Simulation of a wide range of X-ray imaging and spectroscopy experiments using different types of sources and detectors.
    Solution method: XRMC is a versatile program that is useful for the simulation of a wide range of X-ray imaging and spectroscopy experiments. It enables the simulation of monochromatic and polychromatic X-ray sources, with unpolarised or partially/completely polarised radiation. Single-element detectors as well as two-dimensional pixel detectors can be used in the simulations, with several acquisition options. In the current version of the program, the sample is modelled by combining convex three-dimensional objects demarcated by quadric surfaces, such as planes, ellipsoids and cylinders. The Monte Carlo approach makes XRMC able to accurately simulate X-ray photon transport and interactions with matter up to any order of interaction. The differential cross-sections and all other quantities related to the interaction processes (photoelectric absorption, fluorescence emission, elastic and inelastic scattering) are computed using the xraylib software library, which is currently the most complete and up-to-date software library for X-ray parameters. The use of variance reduction techniques makes XRMC able to reduce the simulation time by several orders of magnitude compared to other general-purpose Monte Carlo simulation programs.
    Running time: Dependent on the complexity of the simulation. For the examples distributed with the code, it ranges from less than 1 s to a few minutes.
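    The gain from variance reduction mentioned above can be illustrated with a toy forced-interaction estimator for a rare fluorescence channel. The probabilities below are invented for the example and unrelated to XRMC's actual implementation.

```python
import random

random.seed(0)
p_f = 1e-4       # probability a photon history triggers fluorescence (assumed)
p_det = 0.3      # probability a fluorescent photon reaches the detector (assumed)
N = 50_000       # histories per estimator

# Analog Monte Carlo: almost every history contributes nothing.
analog = sum(1.0 for _ in range(N)
             if random.random() < p_f and random.random() < p_det) / N

# Forced fluorescence: every history emits, carrying the weight p_f.
forced = sum(p_f * (1.0 if random.random() < p_det else 0.0)
             for _ in range(N)) / N

exact = p_f * p_det
print(analog, forced, exact)
```

With the same number of histories, the forced estimator lands within about one percent of the exact answer, while the analog estimator sees only a handful of events, which is the orders-of-magnitude speed-up the abstract refers to.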

  17. Reduction and analysis of data collected during the electromagnetic tornado experiment

    NASA Technical Reports Server (NTRS)

    Davisson, L. D.

    1976-01-01

    Techniques for data processing and analysis are described to support tornado detection by analysis of radio frequency interference in various frequency bands, and sea state determination from short pulse radar measurements. Activities included: strip chart recording of tornado data; the development and implementation of computer programs for digitization and analysis of the data; data reduction techniques for short pulse radar data; and the simulation of radar returns from the sea surface by computer models.

  18. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    The development of system models that can provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer are described. Specific topics covered include: system models; performability evaluation; capability and functional dependence; computation of trajectory set probabilities; and hierarchical modeling of an air transport mission.

  19. Ideal, nonideal, and no-marker variables: The confirmatory factor analysis (CFA) marker technique works when it matters.

    PubMed

    Williams, Larry J; O'Boyle, Ernest H

    2015-09-01

    A persistent concern in the management and applied psychology literature is the effect of common method variance on observed relations among variables. Recent work (i.e., Richardson, Simmering, & Sturman, 2009) evaluated 3 analytical approaches to controlling for common method variance, including the confirmatory factor analysis (CFA) marker technique. Their findings indicated significant problems with this technique, especially with nonideal marker variables (those with theoretical relations with substantive variables). Based on their simulation results, Richardson et al. concluded that not correcting for method variance provides more accurate estimates than using the CFA marker technique. We reexamined the effects of using marker variables in a simulation study and found the degree of error in estimates of a substantive factor correlation was relatively small in most cases, and much smaller than the error associated with making no correction. Further, in instances in which the error was large, the correlations between the marker and substantive scales were higher than those found in organizational research with marker variables. We conclude that in most practical settings, the CFA marker technique yields parameter estimates close to their true values, and the criticisms made by Richardson et al. are overstated. (c) 2015 APA, all rights reserved.
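    The correction logic at issue can be sketched with a toy simulation, assuming an ideal marker: two substantive measures share a common method factor, and partialling out the marker removes most of the inflation. This is the simple partial-correlation analogue, not the full CFA marker model, and all loadings are invented for illustration.

```python
import math
import random

random.seed(7)
n, r_true, lam = 5000, 0.30, 0.40   # sample size, true correlation, method loading

def corr(a, b):
    """Pearson correlation of two equal-length sequences."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb)

xs, ys, ms = [], [], []
for _ in range(n):
    f1 = random.gauss(0, 1)                                  # substantive factor 1
    f2 = r_true * f1 + math.sqrt(1 - r_true**2) * random.gauss(0, 1)
    meth = random.gauss(0, 1)                                # shared method factor
    xs.append(f1 + lam * meth + 0.3 * random.gauss(0, 1))
    ys.append(f2 + lam * meth + 0.3 * random.gauss(0, 1))
    ms.append(meth + 0.2 * random.gauss(0, 1))               # ideal marker

r_xy, r_xm, r_ym = corr(xs, ys), corr(xs, ms), corr(ys, ms)
r_partial = (r_xy - r_xm * r_ym) / math.sqrt((1 - r_xm**2) * (1 - r_ym**2))
print(r_xy, r_partial)   # method-inflated vs. marker-corrected estimate
```

With these loadings the observed correlation is inflated to about 0.37, while the marker-corrected value lands near the method-free attenuated correlation of roughly 0.28.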

  20. The MHD simulation of interplanetary space and heliosphere by using the boundary conditions of time-varying magnetic field and IPS-based plasma

    NASA Astrophysics Data System (ADS)

    Hayashi, K.; Tokumaru, M.; Kojima, M.; Fujiki, K.

    2008-12-01

    We present our new boundary treatment for introducing the temporal variation of observation-based magnetic field and plasma parameters on the inner boundary sphere (at 30 to 50 Rs) into MHD simulations of interplanetary space, together with the simulation results. The boundary treatment for inducing the time variation of the magnetic field, including the radial component, is essentially the same as shown in our previous AGU meetings, and is newly modified so that the model can also include the variation of the plasma variables detected by IPS (interplanetary scintillation) observation, a ground-based remote-sensing technique for the solar wind plasma. We used WSO (Wilcox Solar Observatory at Stanford University) data for the solar magnetic field input. By using the time-varying boundary condition, smooth variations of heliospheric MHD variables over several Carrington solar rotation periods are obtained. The simulation movie will show how changes in the inner heliosphere observable by ground-based instruments propagate outward and affect the outer heliosphere. For validation, the simulated MHD variables are compared with Ulysses in-situ measurement data, including data taken during its travel from the Earth to Jupiter, and we obtain better agreement than with simulations using fixed boundary conditions.

  1. Domain Immersion Technique And Free Surface Computations Applied To Extrusion And Mixing Processes

    NASA Astrophysics Data System (ADS)

    Valette, Rudy; Vergnes, Bruno; Basset, Olivier; Coupez, Thierry

    2007-04-01

    This work focuses on the development of numerical techniques devoted to the simulation of mixing processes of complex fluids, such as twin-screw extrusion or batch mixing. In mixing process simulation, the absence of symmetry of the moving boundaries (the screws or the rotors) implies that their rigid-body motion has to be taken into account by a special treatment. We therefore use a mesh immersion technique (MIT), which consists in using a P1+/P1-based (MINI-element) mixed finite element method for solving the velocity-pressure problem, and then solving the problem in the whole barrel cavity by imposing a rigid motion (rotation) on nodes located inside the so-called immersed domain, each subdomain (screw, rotor) being represented by a surface CAD mesh (or its mathematical equation in simple cases). The independent meshes are immersed into a unique background computational mesh by computing the distance function to their boundaries. Intersections of meshes are accounted for, allowing computation of a fill factor usable as in the VOF methodology. This technique, combined with the use of parallel computing, allows computation of the time-dependent flow of generalized Newtonian fluids, including yield-stress fluids, in a complex system such as a twin-screw extruder, including moving free surfaces, which are treated by a "level set" and Hamilton-Jacobi method.
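    A minimal sketch of the immersion step, assuming a circular rotor cross-section described by its mathematical equation: background-grid nodes with negative signed distance to the boundary are tagged as immersed and assigned the rigid-body rotation velocity. Geometry and spin rate are illustrative values.

```python
import math

cx, cy, R, omega = 0.5, 0.5, 0.25, 1.0   # rotor centre, radius, spin (assumed)
n = 11                                    # background grid nodes per side on [0,1]^2
immersed = []
for i in range(n):
    for j in range(n):
        x, y = i / (n - 1), j / (n - 1)
        d = math.hypot(x - cx, y - cy) - R        # signed distance to the boundary
        if d < 0:                                  # node lies in the immersed domain
            vx = -omega * (y - cy)                 # imposed rigid-body rotation
            vy = omega * (x - cx)
            immersed.append((i, j, vx, vy))
print(len(immersed), "of", n * n, "nodes immersed")
```

In the actual method the distance function comes from a surface mesh rather than an analytic equation, but the tagging and velocity imposition follow the same pattern.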

  2. Modeling techniques for quantum cascade lasers

    NASA Astrophysics Data System (ADS)

    Jirauschek, Christian; Kubis, Tillmann

    2014-03-01

    Quantum cascade lasers are unipolar semiconductor lasers covering a wide range of the infrared and terahertz spectrum. Lasing action is achieved by using optical intersubband transitions between quantized states in specifically designed multiple-quantum-well heterostructures. A systematic improvement of quantum cascade lasers with respect to operating temperature, efficiency, and spectral range requires detailed modeling of the underlying physical processes in these structures. Moreover, the quantum cascade laser constitutes a versatile model device for the development and improvement of simulation techniques in nano- and optoelectronics. This review provides a comprehensive survey and discussion of the modeling techniques used for the simulation of quantum cascade lasers. The main focus is on the modeling of carrier transport in the nanostructured gain medium, while the simulation of the optical cavity is covered at a more basic level. Specifically, the transfer matrix and finite difference methods for solving the one-dimensional Schrödinger equation and Schrödinger-Poisson system are discussed, providing the quantized states in the multiple-quantum-well active region. The modeling of the optical cavity is covered with a focus on basic waveguide resonator structures. Furthermore, various carrier transport simulation methods are discussed, ranging from basic empirical approaches to advanced self-consistent techniques. The methods include empirical rate equation and related Maxwell-Bloch equation approaches, self-consistent rate equation and ensemble Monte Carlo methods, as well as quantum transport approaches, in particular the density matrix and non-equilibrium Green's function formalism. The derived scattering rates and self-energies are generally valid for n-type devices based on one-dimensional quantum confinement, such as quantum well structures.
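    The finite-difference route to the quantized states can be sketched for the simplest case, an infinite square well (hbar = m = 1), where the computed eigenvalues can be checked against the analytic spectrum; the grid size is an arbitrary choice.

```python
import numpy as np

# Finite-difference discretization of the 1D time-independent Schrödinger
# equation, -(1/2) psi'' = E psi, with psi = 0 at the well walls.
N, L = 500, 1.0
x = np.linspace(0.0, L, N + 2)[1:-1]      # interior grid points
h = x[1] - x[0]
H = (np.diag(np.full(N, 1.0 / h**2))
     + np.diag(np.full(N - 1, -0.5 / h**2), 1)
     + np.diag(np.full(N - 1, -0.5 / h**2), -1))
E = np.linalg.eigvalsh(H)[:3]             # three lowest eigenvalues
exact = (np.arange(1, 4) * np.pi) ** 2 / 2.0
print(E)                                   # close to pi^2/2 * [1, 4, 9]
```

Real quantum-cascade-laser calculations add a position-dependent potential and effective mass and couple this solver to Poisson's equation, but the tridiagonal eigenvalue structure is the same.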

  3. Modeling techniques for quantum cascade lasers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jirauschek, Christian; Kubis, Tillmann

    2014-03-15

    Quantum cascade lasers are unipolar semiconductor lasers covering a wide range of the infrared and terahertz spectrum. Lasing action is achieved by using optical intersubband transitions between quantized states in specifically designed multiple-quantum-well heterostructures. A systematic improvement of quantum cascade lasers with respect to operating temperature, efficiency, and spectral range requires detailed modeling of the underlying physical processes in these structures. Moreover, the quantum cascade laser constitutes a versatile model device for the development and improvement of simulation techniques in nano- and optoelectronics. This review provides a comprehensive survey and discussion of the modeling techniques used for the simulation of quantum cascade lasers. The main focus is on the modeling of carrier transport in the nanostructured gain medium, while the simulation of the optical cavity is covered at a more basic level. Specifically, the transfer matrix and finite difference methods for solving the one-dimensional Schrödinger equation and Schrödinger-Poisson system are discussed, providing the quantized states in the multiple-quantum-well active region. The modeling of the optical cavity is covered with a focus on basic waveguide resonator structures. Furthermore, various carrier transport simulation methods are discussed, ranging from basic empirical approaches to advanced self-consistent techniques. The methods include empirical rate equation and related Maxwell-Bloch equation approaches, self-consistent rate equation and ensemble Monte Carlo methods, as well as quantum transport approaches, in particular the density matrix and non-equilibrium Green's function formalism. The derived scattering rates and self-energies are generally valid for n-type devices based on one-dimensional quantum confinement, such as quantum well structures.

  4. Massively Parallel Simulations of Diffusion in Dense Polymeric Structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faulon, Jean-Loup; Wilcox, R. T.

    1997-11-01

    An original computational technique to generate close-to-equilibrium dense polymeric structures is proposed. Diffusion of small gases is studied on the equilibrated structures using massively parallel molecular dynamics simulations running on the Intel Teraflops (9216 Pentium Pro processors) and Intel Paragon (1840 processors). Compared to the current state-of-the-art equilibration methods, this new technique appears to be faster by some orders of magnitude. The main advantage of the technique is that one can circumvent the bottlenecks in configuration space that inhibit relaxation in molecular dynamics simulations. The technique is based on the fact that tetravalent atoms (such as carbon and silicon) fit in the center of a regular tetrahedron and that regular tetrahedrons can be used to mesh three-dimensional space. Thus, the problem of polymer equilibration described by continuous equations in molecular dynamics is reduced to a discrete problem where solutions are approximated by simple algorithms. Practical modeling applications include the construction of butyl rubber and ethylene-propylene-diene-monomer (EPDM) models for oxygen and water diffusion calculations. Butyl and EPDM are used in O-ring systems and serve as sealing joints in many manufactured objects. Diffusion coefficients of small gases have been measured experimentally on both polymeric systems, and in general the diffusion coefficients in EPDM are an order of magnitude larger than in butyl. In order to better understand the diffusion phenomena, 10,000-atom models were generated and equilibrated for butyl and EPDM. The models were submitted to a massively parallel molecular dynamics simulation to monitor the trajectories of the diffusing species.
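    Diffusion coefficients such as those discussed above are typically extracted from simulation trajectories via the Einstein relation, D = <|r(t) - r(0)|^2> / (6t). A toy 3D lattice random walk (all parameters invented) shows the bookkeeping.

```python
import random

random.seed(3)
n_walkers, n_steps, dt, step = 2000, 200, 1.0, 1.0   # illustrative units
msd = 0.0
for _ in range(n_walkers):
    pos = [0.0, 0.0, 0.0]
    for _ in range(n_steps):
        for k in range(3):                 # independent +/- step on each axis
            pos[k] += random.choice((-step, step))
    msd += sum(c * c for c in pos)         # squared displacement of this walker
msd /= n_walkers                           # mean-squared displacement at time t
D = msd / (6.0 * n_steps * dt)             # Einstein relation in 3D
print(f"D = {D:.3f}")                      # analytic value for this walk: 0.5
```

In production molecular dynamics the same average is taken over the trajectories of the diffusing gas molecules, usually from the slope of MSD versus time rather than a single time point.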

  5. MCNP6 simulation of radiographs generated from megaelectron volt X-rays for characterizing a computed tomography system

    NASA Astrophysics Data System (ADS)

    Dooraghi, Alex A.; Tringe, Joseph W.

    2018-04-01

    To evaluate conventional munitions, we simulated an x-ray computed tomography (CT) system for generating radiographs from nominal x-ray energies of 6 or 9 megaelectron volts (MeV). CT simulations, informed by measured data, allow for optimization of both the system design and the acquisition techniques necessary to enhance image quality. MCNP6 radiographic simulation tools were used to model ideal detector responses that assume either (1) a detector response proportional to photon flux (N) or (2) a detector response proportional to energy flux (E). As scatter may become significant with MeV x-ray systems, simulations were performed with and without the inclusion of object scatter. Simulations were compared against measurements of a cylindrical munition component principally composed of HMX, tungsten, and aluminum encased in carbon fiber. Simulations and measurements used a 6 MeV peak energy x-ray spectrum filtered with 3.175 mm of tantalum. A detector response proportional to energy flux which includes object scatter agrees to within 0.6% of the measured line integral of the linear attenuation coefficient. Exclusion of scatter increases the difference between measurement and simulation to 5%. A detector response proportional to photon flux agrees to within 20% when object scatter is included in the simulation and 27% when object scatter is excluded.
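    The distinction between the two ideal detector responses can be illustrated with a toy polychromatic beam under the Beer-Lambert law: weighting the per-bin signal by photon count versus photon energy yields different effective line integrals of the attenuation coefficient. The spectrum and attenuation values below are invented, not MCNP data.

```python
import math

energies = [2.0, 4.0, 6.0]       # MeV spectral bins (assumed)
weights  = [0.5, 0.3, 0.2]       # relative photon fluence per bin (assumed)
mu       = [0.10, 0.06, 0.05]    # linear attenuation coefficient /cm (assumed)
L = 10.0                          # path length through the object, cm

def line_integral(det_weights):
    """Effective -ln(I/I0) seen by a detector with the given per-bin weighting."""
    open_sig = sum(w * d for w, d in zip(weights, det_weights))
    att_sig = sum(w * d * math.exp(-m * L)
                  for w, d, m in zip(weights, det_weights, mu))
    return -math.log(att_sig / open_sig)

muL_N = line_integral([1.0, 1.0, 1.0])   # response proportional to photon flux
muL_E = line_integral(energies)          # response proportional to energy flux
print(muL_N, muL_E)
```

Energy weighting emphasizes the harder, less-attenuated bins, so the energy-flux response reports a smaller effective line integral than the photon-flux response, which is why the two idealized models bracket a real detector differently.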

  6. Flight simulation software at NASA Dryden Flight Research Center

    NASA Technical Reports Server (NTRS)

    Norlin, Ken A.

    1995-01-01

    The NASA Dryden Flight Research Center has developed a versatile simulation software package that is applicable to a broad range of fixed-wing aircraft. This package has evolved in support of a variety of flight research programs. The structure is designed to be flexible enough for use in batch-mode, real-time pilot-in-the-loop, and flight hardware-in-the-loop simulation. Current simulations operate on UNIX-based platforms and are coded with a FORTRAN shell and C support routines. This paper discusses the features of the simulation software design and some basic model development techniques. The key capabilities that have been included in the simulation are described. The NASA Dryden simulation software is in use at other NASA centers, within industry, and at several universities. The straightforward but flexible design of this well-validated package makes it especially useful in an engineering environment.

  7. Communication: Multiple atomistic force fields in a single enhanced sampling simulation

    NASA Astrophysics Data System (ADS)

    Hoang Viet, Man; Derreumaux, Philippe; Nguyen, Phuong H.

    2015-07-01

    The main concerns of biomolecular dynamics simulations are the convergence of the conformational sampling and the dependence of the results on the force fields. While the first issue can be addressed by employing enhanced sampling techniques such as simulated tempering or replica exchange molecular dynamics, repeating these simulations with different force fields is very time consuming. Here, we propose an automatic method that includes different force fields in a single advanced sampling simulation. Conformational sampling using three all-atom force fields is enhanced by simulated tempering, and by formulating the weight parameters of the simulated tempering method in terms of the energy fluctuations, the system is able to perform a random walk in both temperature and force-field space. The method is first demonstrated on a 1D system and then validated by the folding of the 10-residue chignolin peptide in explicit water.
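    The idea of weighting temperatures so that the system random-walks among them can be sketched on a 1D toy system where the free energies, and hence the ideal simulated-tempering weights, are known analytically. All numerical choices here are illustrative, not the authors' protocol.

```python
import math
import random

random.seed(2)
# Particle in a 1D harmonic well, E(x) = x^2 / 2, at three inverse temperatures.
betas = [2.0, 1.0, 0.5]
# Z(beta) ~ beta**-0.5, so the weights a_m = -ln Z_m = 0.5 * ln(beta_m)
# (up to a constant) make every temperature equally probable.
a = [0.5 * math.log(b) for b in betas]

x, m = 0.0, 0
visits = [0] * len(betas)
for _ in range(60000):
    # Metropolis move in x at the current temperature
    xn = x + random.uniform(-1.5, 1.5)
    dE = (xn * xn - x * x) / 2.0
    if random.random() < math.exp(min(0.0, -betas[m] * dE)):
        x = xn
    # attempted jump to a neighbouring temperature at fixed x
    n = m + random.choice((-1, 1))
    if 0 <= n < len(betas):
        dlnp = -(betas[n] - betas[m]) * x * x / 2.0 + (a[n] - a[m])
        if random.random() < math.exp(min(0.0, dlnp)):
            m = n
    visits[m] += 1
print(visits)   # roughly even occupation of the three temperatures
```

In realistic systems the free energies are unknown, which is why the paper estimates the weights from energy fluctuations; the same machinery extends the random walk to a force-field index.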

  8. Comparison of OH reactivity instruments in the atmosphere simulation chamber SAPHIR

    NASA Astrophysics Data System (ADS)

    Fuchs, Hendrik

    2016-04-01

    OH reactivity measurement has become an important tool for constraining the total OH loss frequency in field experiments. Different techniques have been developed by various groups. They can be based on flow-tube or pump-and-probe techniques, which include direct OH detection by fluorescence, or on a comparative method, in which the OH loss of a reference species competes with the OH loss of trace gases in the sampled air. In order to ensure that these techniques deliver equivalent results, a comparison exercise was performed under controlled conditions. Nine OH reactivity instruments measured together in the atmosphere simulation chamber SAPHIR (volume 270 m3) during ten day-long experiments in October 2015 at ambient temperature (5 to 10 °C) and pressure (990-1010 hPa). The chemical complexity of the air mixtures in these experiments varied from CO in pure synthetic air to emissions from real plants and VOC/NOx mixtures representative of urban atmospheres. Potential differences between measurements were systematically investigated by changing the amounts of reactants (including isoprene, monoterpenes, and sesquiterpenes), water vapour, and nitrogen oxides. Some of the experiments also included the oxidation of reactants with ozone or hydroxyl radicals, in order to determine whether the presence of oxidation products leads to systematic differences between the measurements of different instruments. Here we present first results of this comparison exercise.
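    The quantity all of these instruments target is the total OH loss frequency, k_OH = sum_i k_i [X_i]. A minimal bookkeeping example, using rate coefficients of typical literature magnitude and invented concentrations:

```python
# name: (OH rate coefficient in cm^3 s^-1, concentration in cm^-3)
# Rate coefficients are typical magnitudes for illustration only.
species = {
    "CO":       (2.4e-13, 2.5e12),
    "isoprene": (1.0e-10, 2.5e10),
    "NO2":      (1.1e-11, 2.5e10),
}
k_oh = sum(k * c for k, c in species.values())
print(f"total OH reactivity = {k_oh:.3f} s^-1")   # 0.6 + 2.5 + 0.275 = 3.375 s^-1
```

Comparing this bottom-up sum against a directly measured k_OH is exactly how "missing reactivity" from unmeasured species is diagnosed in field campaigns.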

  9. Chronology of DIC technique based on the fundamental mathematical modeling and dehydration impact.

    PubMed

    Alias, Norma; Saipol, Hafizah Farhah Saipan; Ghani, Asnida Che Abd

    2014-12-01

    A chronology of mathematical models for the heat and mass transfer equations is proposed for the prediction of moisture and temperature behavior during drying using the DIC (Détente Instantanée Contrôlée), or instant controlled pressure drop, technique. The DIC technique has potential as a widely used dehydration method for high-value foods, maintaining nutrition and the best possible quality for food storage. The model is governed by a regression model, followed by 2D Fick's and Fourier's parabolic equations and a 2D elliptic-parabolic equation in a rectangular slice. The models neglect shrinkage and radiation effects. Simulations of the heat and mass transfer equations of parabolic and elliptic-parabolic types using numerical methods based on the finite difference method (FDM) are illustrated. Intel® Core™2 Duo processors with a Linux operating system and the C programming language were the computational platform for the simulation. Qualitative and quantitative differences between the DIC technique and conventional drying methods are presented as a comparison.
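    The explicit finite-difference scheme for the 2D Fourier (heat conduction) equation underlying such models can be sketched as follows. The grid, diffusivity, and temperatures are illustrative values, not parameters from the study.

```python
nx = ny = 21
alpha, dx, dt = 1.0e-7, 1.0e-3, 1.0    # thermal diffusivity m^2/s, grid m, step s
r = alpha * dt / dx**2                  # explicit-scheme stability number
assert r <= 0.25                        # 2D explicit stability limit

T_hot, T0 = 60.0, 20.0                  # boundary and initial temperatures, deg C
T = [[T0] * ny for _ in range(nx)]
for i in range(nx):                     # fixed (Dirichlet) hot boundary
    T[i][0] = T[i][ny - 1] = T_hot
for j in range(ny):
    T[0][j] = T[nx - 1][j] = T_hot

for _ in range(500):                    # march 500 time steps
    Tn = [row[:] for row in T]
    for i in range(1, nx - 1):
        for j in range(1, ny - 1):
            Tn[i][j] = T[i][j] + r * (T[i + 1][j] + T[i - 1][j]
                                      + T[i][j + 1] + T[i][j - 1] - 4 * T[i][j])
    T = Tn
print(T[nx // 2][ny // 2])              # centre warms from 20 toward 60 deg C
```

The moisture (Fick) equation has the identical discrete form with concentration in place of temperature and mass diffusivity in place of alpha, which is why the two are solved with the same FDM machinery.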

  10. Real-time holographic deconvolution techniques for one-way image transmission through an aberrating medium: characterization, modeling, and measurements.

    PubMed

    Haji-Saeed, B; Sengupta, S K; Testorf, M; Goodhue, W; Khoury, J; Woods, C L; Kierstead, J

    2006-05-10

    We propose and demonstrate a new photorefractive real-time holographic deconvolution technique for adaptive one-way image transmission through aberrating media by means of four-wave mixing. In contrast with earlier methods, which typically required various codings of the exact phase or two-way image transmission for correcting phase distortion, our technique relies on one-way image transmission through the use of exact phase information. Our technique can simultaneously correct both amplitude and phase distortions. We include several forms of image degradation, various test cases, and experimental results. We characterize the performance as a function of the input beam ratios for four metrics: signal-to-noise ratio, normalized root-mean-square error, edge restoration, and peak-to-total energy ratio. In our characterization we use false-color graphic images to display the best beam-intensity ratio two-dimensional region(s) for each of these metrics. Test cases are simulated at the optimal values of the beam-intensity ratios. We demonstrate our results through both experiment and computer simulation.
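    Two of the four performance metrics named above are straightforward to state in code; the toy 1-D signals below are invented solely to show the definitions.

```python
import math

def nrmse(ref, est):
    """Root-mean-square error normalized by the reference signal energy."""
    err = sum((r - e) ** 2 for r, e in zip(ref, est))
    return math.sqrt(err / sum(r * r for r in ref))

def peak_to_total(signal):
    """Fraction of total signal energy carried by the strongest sample."""
    energy = [s * s for s in signal]
    return max(energy) / sum(energy)

ref = [0.0, 1.0, 4.0, 1.0, 0.0]          # undistorted "image" (toy data)
est = [0.1, 0.9, 3.8, 1.2, 0.1]          # restored "image" (toy data)
e_val = nrmse(ref, est)
p_val = peak_to_total(est)
print(e_val, p_val)
```

In the characterization described above, such metrics are evaluated over a 2-D grid of input beam-intensity ratios to map out the best operating region.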

  11. INTEGRITY - Integrated Human Exploration Mission Simulation Facility

    NASA Technical Reports Server (NTRS)

    Henninger, Donald L.

    2002-01-01

    It is proposed to develop a high-fidelity ground facility to carry out long-duration human exploration mission simulations. These would not be merely computer simulations - they would in fact comprise a series of actual missions that just happen to stay on Earth. These missions would include all elements of an actual mission, using actual technologies that would be used for the real mission. These missions would also include such elements as extravehicular activities, robotic systems, telepresence and teleoperation, surface drilling technology-all using a simulated planetary landscape. A sequence of missions would be defined that get progressively longer and more robust, perhaps a series of five or six missions over a span of 10 to 15 years ranging in duration from 180 days up to 1000 days. This high-fidelity ground facility would operate hand-in-hand with a host of other terrestrial analog sites such as the Antarctic, Haughton Crater, and the Arizona desert. Of course, all of these analog mission simulations will be conducted here on Earth in 1-g, and NASA will still need the Shuttle and ISS to carry out all the microgravity and hypogravity science experiments and technology validations. The proposed missions would have sufficient definition such that definitive requirements could be derived from them to serve as direction for all the program elements of the mission. Additionally, specific milestones would be established for the "launch" date of each mission so that R&D programs would have both good requirements and solid milestones from which to build their implementation plans. Mission aspects that could not be directly incorporated into the ground facility would be simulated via software. New management techniques would be developed for evaluation in this ground test facility program.
These new techniques would have embedded metrics which would allow them to be continuously evaluated and adjusted so that by the time the sequence of missions is completed, the best management techniques will have been developed, implemented, and validated. A trained cadre of managers experienced with a large, complex program would then be available.

  12. A novel biomechanical model assessing continuous orthodontic archwire activation

    PubMed Central

    Canales, Christopher; Larson, Matthew; Grauer, Dan; Sheats, Rose; Stevens, Clarke; Ko, Ching-Chang

    2013-01-01

    Objective The biomechanics of a continuous archwire inserted into multiple orthodontic brackets is poorly understood. The purpose of this research was to apply the birth-death technique to simulate insertion of an orthodontic wire and consequent transfer of forces to the dentition in an anatomically accurate model. Methods A digital model containing the maxillary dentition, periodontal ligament (PDL), and surrounding bone was constructed from human computerized tomography data. Virtual brackets were placed on four teeth (central and lateral incisors, canine and first premolar), and a steel archwire (0.019″ × 0.025″) with a 0.5 mm step bend to intrude the lateral incisor was virtually inserted into the bracket slots. Forces applied to the dentition and surrounding structures were simulated utilizing the birth-death technique. Results The goal of simulating a complete bracket-wire system on accurate anatomy including multiple teeth was achieved. Orthodontic force delivered by the wire-bracket interaction was: central incisor 19.1 N, lateral incisor 21.9 N, and canine 19.9 N. Loading the model with equivalent point forces showed a different stress distribution in the PDL. Conclusions The birth-death technique proved to be a useful biomechanical simulation method for placement of a continuous archwire in orthodontic brackets. The ability to view the stress distribution throughout proper anatomy and appliances advances understanding of orthodontic biomechanics. PMID:23374936

  13. COMPUTATIONAL MITRAL VALVE EVALUATION AND POTENTIAL CLINICAL APPLICATIONS

    PubMed Central

    Chandran, Krishnan B.; Kim, Hyunggun

    2014-01-01

    The mitral valve (MV) apparatus consists of the two asymmetric leaflets, the saddle-shaped annulus, the chordae tendineae, and the papillary muscles. MV function over the cardiac cycle involves complex interaction between the MV apparatus components for efficient blood circulation. Common diseases of the MV include valvular stenosis, regurgitation, and prolapse. MV repair is the most popular and most reliable surgical treatment for early MV pathology. One of the unsolved problems in MV repair is to predict the optimal repair strategy for each patient. Although experimental studies have provided valuable information to improve repair techniques, computational simulations are increasingly playing an important role in understanding the complex MV dynamics, particularly with the availability of patient-specific real-time imaging modalities. This work presents a review of computational simulation studies of MV function employing finite element (FE) structural analysis and fluid-structure interaction (FSI) approach reported in the literature to date. More recent studies towards potential applications of computational simulation approaches in the assessment of valvular repair techniques and potential pre-surgical planning of repair strategies are also discussed. It is anticipated that further advancements in computational techniques combined with the next generations of clinical imaging modalities will enable physiologically more realistic simulations. Such advancement in imaging and computation will allow for patient-specific, disease-specific, and case-specific MV evaluation and virtual prediction of MV repair. PMID:25134487

  14. Ground-to-Flight Handling Qualities Comparisons for a High Performance Airplane

    NASA Technical Reports Server (NTRS)

    Brandon, Jay M.; Glaab, Louis J.; Brown, Philip W.; Phillips, Michael R.

    1995-01-01

    A flight test program was conducted in conjunction with a ground-based piloted simulation study to enable a comparison of handling qualities ratings for a variety of maneuvers between flight and simulation of a modern high performance airplane. Specific objectives included an evaluation of pilot-induced oscillation (PIO) tendencies and a determination of maneuver types which result in either good or poor ground-to-flight pilot handling qualities ratings. A General Dynamics F-16XL aircraft was used for the flight evaluations, and the NASA Langley Differential Maneuvering Simulator was employed for the ground-based evaluations. Two NASA research pilots evaluated both the airplane and simulator characteristics using tasks developed in the simulator. Simulator and flight tests were all conducted within approximately a one month time frame. Maneuvers included numerous fine tracking evaluations at various angles of attack, load factors and speed ranges, gross acquisitions involving longitudinal and lateral maneuvering, roll angle captures, and an ILS task with a sidestep to landing. Overall results showed generally good correlation between ground and flight for PIO tendencies and general handling qualities comments. Differences in pilot technique used in simulator evaluations and effects of airplane accelerations and motions are illustrated.

  15. POD/MAC-Based Modal Basis Selection for a Reduced Order Nonlinear Response Analysis

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Przekop, Adam

    2007-01-01

    A feasibility study was conducted to explore the applicability of a POD/MAC basis selection technique to a nonlinear structural response analysis. For the case studied the application of the POD/MAC technique resulted in a substantial improvement of the reduced order simulation when compared to a classic approach utilizing only low frequency modes present in the excitation bandwidth. Further studies are aimed to expand application of the presented technique to more complex structures including non-planar and two-dimensional configurations. For non-planar structures the separation of different displacement components may not be necessary or desirable.

  16. Concepts and algorithms for terminal-area traffic management

    NASA Technical Reports Server (NTRS)

    Erzberger, H.; Chapel, J. D.

    1984-01-01

    The nation's air-traffic-control system is the subject of an extensive modernization program, including the planned introduction of advanced automation techniques. This paper gives an overview of a concept for automating terminal-area traffic management. Four-dimensional (4D) guidance techniques, which play an essential role in the automated system, are reviewed. One technique, intended for on-board computer implementation, is based on application of optimal control theory. The second technique is a simplified approach to 4D guidance intended for ground computer implementation. It generates advisory messages to help the controller maintain scheduled landing times of aircraft not equipped with on-board 4D guidance systems. An operational system for the second technique, recently evaluated in a simulation, is also described.

  17. Tutorial in medical decision modeling incorporating waiting lines and queues using discrete event simulation.

    PubMed

    Jahn, Beate; Theurl, Engelbert; Siebert, Uwe; Pfeiffer, Karl-Peter

    2010-01-01

    In most decision-analytic models in health care, it is assumed that there is treatment without delay and availability of all required resources. Therefore, waiting times caused by limited resources and their impact on treatment effects and costs often remain unconsidered. Queuing theory enables mathematical analysis and the derivation of several performance measures of queuing systems. Nevertheless, an analytical approach with closed formulas is not always possible. Therefore, simulation techniques such as discrete event simulation are used to evaluate systems that include queuing or waiting. Including queuing in decision-analytic models requires a basic knowledge of queuing theory and of the underlying interrelationships. This tutorial introduces queuing theory, giving analysts and decision-makers an understanding of queue characteristics, modeling features, and their strengths. Conceptual issues are covered, but the emphasis is on practical issues like modeling the arrival of patients. The treatment of coronary artery disease with percutaneous coronary intervention including stent placement serves as an illustrative queuing example. Discrete event simulation is applied to explicitly model resource capacities and to incorporate waiting lines and queues in the decision-analytic modeling example.
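
    As the tutorial notes, simple queuing systems admit closed formulas that simulation then generalizes. A minimal sketch of the classic M/M/1 performance measures (illustrative only; the function name and the patient/service rates are assumptions, not taken from the tutorial):

```python
# Illustrative sketch: closed-form performance measures of an M/M/1 queue,
# the simplest case where analytical formulas exist.
def mm1_metrics(arrival_rate, service_rate):
    """Return utilization and mean queue-length / waiting-time measures."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: arrival rate >= service rate")
    rho = arrival_rate / service_rate            # server utilization
    L = rho / (1.0 - rho)                        # mean number in system
    W = 1.0 / (service_rate - arrival_rate)      # mean time in system
    Lq = L - rho                                 # mean number waiting
    Wq = W - 1.0 / service_rate                  # mean waiting time
    return {"rho": rho, "L": L, "W": W, "Lq": Lq, "Wq": Wq}

# Assumed example: 8 patients/hour arrive, the facility serves 10/hour.
m = mm1_metrics(8.0, 10.0)
```

    For anything beyond such textbook cases (priority rules, balking, finite capacity), the closed formulas break down and discrete event simulation takes over.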

  18. Reconstructing gravitational wave source parameters via direct comparisons to numerical relativity I: Method

    NASA Astrophysics Data System (ADS)

    Lange, Jacob; O'Shaughnessy, Richard; Healy, James; Lousto, Carlos; Shoemaker, Deirdre; Lovelace, Geoffrey; Scheel, Mark; Ossokine, Serguei

    2016-03-01

    In this talk, we describe a procedure to reconstruct the parameters of sufficiently massive coalescing compact binaries via direct comparison with numerical relativity simulations. For sufficiently massive sources, existing numerical relativity simulations are long enough to cover the observationally accessible part of the signal. Due to the signal's brevity, the posterior parameter distribution it implies is broad, simple, and easily reconstructed from information gained by comparing to only the sparse sample of existing numerical relativity simulations. We describe how followup simulations can corroborate and improve our understanding of a detected source. Since our method can include all physics provided by full numerical relativity simulations of coalescing binaries, it provides a valuable complement to alternative techniques which employ approximations to reconstruct source parameters. Supported by NSF Grant PHY-1505629.

  19. The N/Rev phenomenon in simulating a blade-element rotor system

    NASA Technical Reports Server (NTRS)

    Mcfarland, R. E.

    1983-01-01

    When a simulation model produces frequencies that are beyond the bandwidth of a discrete implementation, anomalous frequencies appear within the bandwidth. Such is the case with blade-element models of rotor systems, which are used in the real-time, man-in-the-loop simulation environment. Steady-state, high-frequency harmonics generated by these models, whether aliased or not, obscure piloted helicopter simulation responses. Since these harmonics are attenuated in actual rotorcraft (e.g., because of structural damping), a faithful environment representation for handling qualities purposes may be created from the original model by using certain filtering techniques, as outlined here. These include harmonic consideration, conventional filtering, and decontamination. The process of decontamination is of special interest because frequencies of importance to simulation operation are not attenuated, whereas superimposed aliased harmonics are.
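
    The folding of high-frequency harmonics into the simulation bandwidth can be sketched with the standard aliasing relation; this is an illustrative example of the phenomenon, not the paper's decontamination procedure, and the rotor figures are assumed:

```python
# Illustrative sketch: where a harmonic above the Nyquist rate appears
# ("aliases") in a discretely implemented simulation.
def aliased_frequency(f_hz, sample_rate_hz):
    """Apparent frequency of a tone of f_hz when sampled at sample_rate_hz."""
    return abs(f_hz - sample_rate_hz * round(f_hz / sample_rate_hz))

# Assumed example: a 4-bladed rotor at 5 rev/s has its 4/rev harmonic at
# 20 Hz; a 30 Hz frame rate puts Nyquist at 15 Hz, so 20 Hz folds to 10 Hz.
f_alias = aliased_frequency(20.0, 30.0)
```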

  20. Pattern-recognition techniques applied to performance monitoring of the DSS 13 34-meter antenna control assembly

    NASA Technical Reports Server (NTRS)

    Mellstrom, J. A.; Smyth, P.

    1991-01-01

    The results of applying pattern recognition techniques to diagnose fault conditions in the pointing system of one of the Deep Space Network's large antennas, the DSS 13 34-meter structure, are discussed. A previous article described an experiment whereby a neural network technique was used to identify fault classes by using data obtained from a simulation model of the Deep Space Network (DSN) 70-meter antenna system. Described here is the extension of these classification techniques to the analysis of real data from the field. The general architecture and philosophy of an autonomous monitoring paradigm are described, and classification results are discussed and analyzed in this context. Key features of this approach include a probabilistic time-varying context model, the effective integration of signal processing and system identification techniques with pattern recognition algorithms, and the ability to calibrate the system given limited amounts of training data. Reported here are recognition accuracies in the 97 to 98 percent range for the particular fault classes included in the experiments.

  1. Plane-Wave DFT Methods for Chemistry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bylaska, Eric J.

    Modern plane-wave DFT methods and software (contained in the NWChem package) are described in detail, allowing for both geometry optimization and ab initio molecular dynamics simulations. Significant emphasis is placed on aspects of these methods that are of interest to computational chemists and useful for simulating chemistry, including techniques for calculating charged systems, exact exchange (i.e. hybrid DFT methods), and highly efficient AIMD/MM methods. Sample applications on the structure of the goethite+water interface and the hydrolysis of nitroaromatic molecules are described.

  2. Simulation analysis of the transparency of cornea and sclera

    NASA Astrophysics Data System (ADS)

    Yang, Chih-Yao; Tseng, Snow H.

    2017-02-01

    Although both consist of collagen fibrils, the sclera is opaque whereas the cornea is transparent at optical wavelengths. By employing the pseudospectral time-domain (PSTD) simulation technique, we model light impinging upon the cornea and sclera, respectively. To analyze the scattering characteristics of light, the cornea and sclera are modeled by different sizes and arrangements of the non-absorbing collagen fibrils. Various factors are analyzed, including the wavelength of the incident light, the thickness of the scattering media, the position of the collagen fibrils, and the size distribution of the fibrils.

  3. Wind Factor Simulation Model: User’s Manual.

    DTIC Science & Technology

    1980-04-01

    Keywords: computer program documentation; computerized simulation; equivalent headwind technique; great circle; great circle distance; great circle equation; equation of a great circle. A program listing and flow chart are included.

  4. Particle identification using the time-over-threshold measurements in straw tube detectors

    NASA Astrophysics Data System (ADS)

    Jowzaee, S.; Fioravanti, E.; Gianotti, P.; Idzik, M.; Korcyl, G.; Palka, M.; Przyborowski, D.; Pysz, K.; Ritman, J.; Salabura, P.; Savrie, M.; Smyrski, J.; Strzempek, P.; Wintz, P.

    2013-08-01

    The identification of charged particles based on energy losses in straw tube detectors has been simulated. The response of a new front-end chip developed for the PANDA straw tube tracker was implemented in the simulations and corrections for track distance to sense wire were included. Separation power for p - K, p - π and K - π pairs obtained using the time-over-threshold technique was compared with the one based on the measurement of collected charge.

  5. Space construction base control system

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Aspects of an attitude control system were studied and developed for a large space base that is structurally flexible and whose mass properties change rather dramatically during its orbital lifetime. Topics of discussion include the following: (1) space base orbital pointing and maneuvering; (2) angular momentum sizing of actuators; (3) momentum desaturation selection and sizing; (4) multilevel control technique applied to configuration one; (5) one-dimensional model simulation; (6) N-body discrete coordinate simulation; (7) structural analysis math model formulation; and (8) discussion of control problems and control methods.

  6. Current topics in shock waves; Proceedings of the International Symposium on Shock Waves and Shock Tubes, 17th, Lehigh University, Bethlehem, PA, July 17-21, 1989

    NASA Astrophysics Data System (ADS)

    Kim, Yong W.

    Various papers on shock waves are presented. The general topics addressed include: shock formation, focusing, and implosion; shock reflection and diffraction; turbulence; laser-produced plasmas and waves; ionization and shock-plasma interaction; chemical kinetics, pyrolysis, and soot formation; experimental facilities, techniques, and applications; ignition of detonation and combustion; particle entrainment and shock propagation through particle suspension; boundary layers and blast simulation; computational methods and numerical simulation.

  7. Graphene symmetry-breaking with molecular adsorbates: modeling and experiment

    NASA Astrophysics Data System (ADS)

    Groce, M. A.; Hawkins, M. K.; Wang, Y. L.; Cullen, W. G.; Einstein, T. L.

    2012-02-01

    Graphene's structure and electronic properties provide a framework for understanding molecule-substrate interactions and developing techniques for band gap engineering. Controlled deposition of molecular adsorbates can create superlattices which break the degeneracy of graphene's two-atom unit cell, opening a band gap. We simulate scanning tunneling microscopy and spectroscopy measurements for a variety of organic molecule/graphene systems, including pyridine, trimesic acid, and isonicotinic acid, based on density functional theory calculations using VASP. We also compare our simulations to ultra-high vacuum STM and STS results.

  8. Techniques and resources for storm-scale numerical weather prediction

    NASA Technical Reports Server (NTRS)

    Droegemeier, Kelvin; Grell, Georg; Doyle, James; Soong, Su-Tzai; Skamarock, William; Bacon, David; Staniforth, Andrew; Crook, Andrew; Wilhelmson, Robert

    1993-01-01

    The topics discussed include the following: multiscale application of the 5th-generation PSU/NCAR mesoscale model, the coupling of nonhydrostatic atmospheric and hydrostatic ocean models for air-sea interaction studies; a numerical simulation of cloud formation over complex topography; adaptive grid simulations of convection; an unstructured grid, nonhydrostatic meso/cloud scale model; efficient mesoscale modeling for multiple scales using variable resolution; initialization of cloud-scale models with Doppler radar data; and making effective use of future computing architectures, networks, and visualization software.

  9. NASTRAN thermal analyzer: Theory and application including a guide to modeling engineering problems, volume 2. [sample problem library guide

    NASA Technical Reports Server (NTRS)

    Jackson, C. E., Jr.

    1977-01-01

    A sample problem library containing 20 problems covering most facets of Nastran Thermal Analyzer modeling is presented. Areas discussed include radiative interchange, arbitrary nonlinear loads, transient temperature and steady-state structural plots, temperature-dependent conductivities, simulated multi-layer insulation, and constraint techniques. The use of the major control options and important DMAP alters is demonstrated.

  10. Establishment of a Vaporous Hydrogen Peroxide Bio-Decontamination Capability

    DTIC Science & Technology

    2007-02-01

    of Colorado at Denver and Health Sciences Center. There he utilised mass spectrometry to investigate the biochemical pathways involved in lipid... techniques (NMR, GC). Since then she has worked in a variety of areas including: (a) computer simulation of vapour dispersion for early warning to...to inactivate biological agents such as B. anthracis and these include beta-propiolactone, chlorine dioxide, ethylene oxide, propylene oxide, ozone

  11. Instability in Rotating Machinery

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The proceedings contain 45 papers on a wide range of subjects including flow generated instabilities in fluid flow machines, cracked shaft detection, case histories of instability phenomena in compressors, turbines, and pumps, vibration control in turbomachinery (including antiswirl techniques), and the simulation and estimation of destabilizing forces in rotating machines. The symposium was held to serve as an update on the understanding and control of rotating machinery instability problems.

  12. A Limited-Vocabulary, Multi-Speaker Automatic Isolated Word Recognition System.

    ERIC Educational Resources Information Center

    Paul, James E., Jr.

    Techniques for automatic recognition of isolated words are investigated, and a computer simulation of a word recognition system is effected. Considered in detail are data acquisition and digitizing, word detection, amplitude and time normalization, short-time spectral estimation including spectral windowing, spectral envelope approximation,…

  13. GPS Based Spacecraft Attitude Determination

    DTIC Science & Technology

    1993-09-30

    AD-A271 734. GPS Based Spacecraft Attitude Determination: Final Report for October 1992 - September 1993, prepared for the Naval Research Laboratory. The report applies ... attitude determination techniques to near-Earth spacecraft. The areas addressed include solution algorithms, simulation of the spacecraft and ...

  14. Inverted File Compression through Document Identifier Reassignment.

    ERIC Educational Resources Information Center

    Shieh, Wann-Yun; Chen, Tien-Fu; Shann, Jean Jyh-Jiun; Chung, Chung-Ping

    2003-01-01

    Discusses the use of inverted files in information retrieval systems and proposes a document identifier reassignment method to reduce the average gap values in an inverted file. Highlights include the d-gap technique; document similarity; heuristic algorithms; file compression; and performance evaluation from a simulation environment. (LRW)

  15. Conceptual Design and Neutronics Analyses of a Fusion Reactor Blanket Simulation Facility

    DTIC Science & Technology

    1986-01-01

    Acronyms defined in the report include LLL (Lawrence Livermore Laboratory), ORNL (Oak Ridge National Laboratory), PPPL (Princeton Plasma Physics Laboratory), and RSIC (Reactor Shielding Information Center, at ORNL). The report covers a ... Module (LBM) to be placed in the TFTR at PPPL. Jassby et al. describe the program, including design, manufacturing techniques, neutronics analyses, and ...

  16. Paper simulation techniques in user requirements analysis for interactive computer systems

    NASA Technical Reports Server (NTRS)

    Ramsey, H. R.; Atwood, M. E.; Willoughby, J. K.

    1979-01-01

    This paper describes the use of a technique called 'paper simulation' in the analysis of user requirements for interactive computer systems. In a paper simulation, the user solves problems with the aid of a 'computer', as in normal man-in-the-loop simulation. In this procedure, though, the computer does not exist, but is simulated by the experimenters. This allows simulated problem solving early in the design effort, and allows the properties and degree of structure of the system and its dialogue to be varied. The technique, and a method of analyzing the results, are illustrated with examples from a recent paper simulation exercise involving a Space Shuttle flight design task.

  17. Agent-Based Simulations for Project Management

    NASA Technical Reports Server (NTRS)

    White, J. Chris; Sholtes, Robert M.

    2011-01-01

    Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique being used at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.
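
    As a point of reference for the duration-based CPM approach criticized above, here is a minimal forward-pass sketch (a hypothetical four-task network; this is the textbook algorithm, not the authors' resource-based simulation tool):

```python
# Illustrative sketch of the classic CPM forward pass, where each task
# duration is a fixed input -- the assumption the resource-based model drops.
def cpm_forward_pass(tasks):
    """tasks: {name: (duration, [predecessors])} -> {name: (early_start, early_finish)}."""
    schedule = {}

    def finish(name):
        if name in schedule:
            return schedule[name][1]
        duration, preds = tasks[name]
        early_start = max((finish(p) for p in preds), default=0)
        schedule[name] = (early_start, early_start + duration)
        return schedule[name][1]

    for name in tasks:
        finish(name)
    return schedule

# Hypothetical network: A feeds B and C; D needs both B and C.
plan = cpm_forward_pass({
    "A": (3, []),
    "B": (2, ["A"]),
    "C": (4, ["A"]),
    "D": (1, ["B", "C"]),
})
```

    Here the critical path runs A -> C -> D, finishing at time 8; the durations are inputs, which is exactly the first flaw the abstract identifies.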

  18. Sensing Methods for Detecting Analog Television Signals

    NASA Astrophysics Data System (ADS)

    Rahman, Mohammad Azizur; Song, Chunyi; Harada, Hiroshi

    This paper introduces a unified method of spectrum sensing for all existing analog television (TV) signals, including NTSC, PAL and SECAM. We propose a correlation based method (CBM) with a single reference signal for sensing any analog TV signals. In addition, we propose an improved energy detection method. The CBM approach has been implemented in a hardware prototype specially designed for participating in the Singapore TV white space (WS) test trial conducted by the Infocomm Development Authority (IDA) of the Singapore government. Analytical and simulation results of the CBM method are presented in the paper, as well as hardware testing results for sensing various analog TV signals. Both AWGN and fading channels are considered. It is shown that the theoretical results closely match those from simulations. Sensing performance of the hardware prototype is also presented in a fading environment by using a fading simulator. We present performance of the proposed techniques in terms of probability of false alarm, probability of detection, sensing time, etc. We also present a comparative study of the various techniques.
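
    The energy detection idea mentioned above can be sketched with a generic textbook detector (an assumption for illustration; this is not the authors' improved method, and the threshold choice is arbitrary):

```python
import numpy as np

# Illustrative sketch: declare "signal present" when the average received
# power exceeds a threshold calibrated to the known noise power.
def energy_detect(samples, noise_power, threshold_factor=2.0):
    """Return True if the energy statistic exceeds threshold_factor * noise_power."""
    statistic = np.mean(np.abs(samples) ** 2)
    return bool(statistic > threshold_factor * noise_power)

rng = np.random.default_rng(0)
noise = rng.normal(0.0, 1.0, 4096)                       # noise only, power ~1
tone = 2.0 * np.sin(2 * np.pi * 0.1 * np.arange(4096))   # adds ~2 units of power
signal = noise + tone
```

    Raising threshold_factor lowers the probability of false alarm at the cost of the probability of detection, which is the trade-off the paper quantifies.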

  19. Cluster analysis of accelerated molecular dynamics simulations: A case study of the decahedron to icosahedron transition in Pt nanoparticles.

    PubMed

    Huang, Rao; Lo, Li-Ta; Wen, Yuhua; Voter, Arthur F; Perez, Danny

    2017-10-21

    Modern molecular-dynamics-based techniques are extremely powerful tools for investigating the dynamical evolution of materials. With the increase in sophistication of the simulation techniques and the ubiquity of massively parallel computing platforms, atomistic simulations now generate very large amounts of data, which have to be carefully analyzed in order to reveal key features of the underlying trajectories, including the nature and characteristics of the relevant reaction pathways. We show that clustering algorithms, such as the Perron Cluster Cluster Analysis, can provide reduced representations that greatly facilitate the interpretation of complex trajectories. To illustrate this point, clustering tools are used to identify the key kinetic steps in complex accelerated molecular dynamics trajectories exhibiting shape fluctuations in Pt nanoclusters. This analysis provides an easily interpretable coarse representation of the reaction pathways in terms of a handful of clusters, in contrast to the raw trajectory that contains thousands of unique states and tens of thousands of transitions.
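
    The idea behind such spectral clustering of trajectories can be sketched on a toy transition matrix: metastable groups of states show up in the sign structure of the slow eigenvectors. (Illustrative only; this is not the authors' pipeline, and the matrix below is invented.)

```python
import numpy as np

# Illustrative sketch: split states of a row-stochastic transition matrix P
# into two metastable groups using the eigenvector associated with the
# second-largest eigenvalue (the slowest relaxation mode).
def two_cluster_split(P):
    eigvals, eigvecs = np.linalg.eig(P)
    order = np.argsort(-eigvals.real)       # sort eigenvalues descending
    second = eigvecs[:, order[1]].real      # slowest non-stationary mode
    return second >= 0                      # sign structure labels the clusters

# Invented example: two basins {0, 1} and {2, 3} with rare cross-transitions.
P = np.array([
    [0.90, 0.09, 0.01, 0.00],
    [0.09, 0.90, 0.00, 0.01],
    [0.01, 0.00, 0.90, 0.09],
    [0.00, 0.01, 0.09, 0.90],
])
labels = two_cluster_split(P)
```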

  20. Cluster analysis of accelerated molecular dynamics simulations: A case study of the decahedron to icosahedron transition in Pt nanoparticles

    NASA Astrophysics Data System (ADS)

    Huang, Rao; Lo, Li-Ta; Wen, Yuhua; Voter, Arthur F.; Perez, Danny

    2017-10-01

    Modern molecular-dynamics-based techniques are extremely powerful tools for investigating the dynamical evolution of materials. With the increase in sophistication of the simulation techniques and the ubiquity of massively parallel computing platforms, atomistic simulations now generate very large amounts of data, which have to be carefully analyzed in order to reveal key features of the underlying trajectories, including the nature and characteristics of the relevant reaction pathways. We show that clustering algorithms, such as the Perron Cluster Cluster Analysis, can provide reduced representations that greatly facilitate the interpretation of complex trajectories. To illustrate this point, clustering tools are used to identify the key kinetic steps in complex accelerated molecular dynamics trajectories exhibiting shape fluctuations in Pt nanoclusters. This analysis provides an easily interpretable coarse representation of the reaction pathways in terms of a handful of clusters, in contrast to the raw trajectory that contains thousands of unique states and tens of thousands of transitions.

  1. Rapid Harmonic Analysis of Piezoelectric MEMS Resonators.

    PubMed

    Puder, Jonathan M; Pulskamp, Jeffrey S; Rudy, Ryan Q; Cassella, Cristian; Rinaldi, Matteo; Chen, Guofeng; Bhave, Sunil A; Polcawich, Ronald G

    2018-06-01

    This paper reports on a novel simulation method combining the speed of analytical evaluation with the accuracy of finite-element analysis (FEA). This method is known as the rapid analytical-FEA technique (RAFT). The ability of the RAFT to accurately predict frequency response orders of magnitude faster than conventional simulation methods while providing deeper insights into device design not possible with other types of analysis is detailed. Simulation results from the RAFT across wide bandwidths are compared to measured results of resonators fabricated with various materials, frequencies, and topologies with good agreement. These include resonators targeting beam extension, disk flexure, and Lamé beam modes. An example scaling analysis is presented and other applications enabled are discussed as well. The supplemental material includes example code for implementation in ANSYS, although any commonly employed FEA package may be used.

  2. Computational Analysis and Simulation of Empathic Behaviors: a Survey of Empathy Modeling with Behavioral Signal Processing Framework.

    PubMed

    Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S

    2016-05-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research.

  3. Research on a simulation-based ship production support system for middle-sized shipbuilding companies

    NASA Astrophysics Data System (ADS)

    Song, Young Joo; Woo, Jong Hun; Shin, Jong Gye

    2009-12-01

    Today, many middle-sized shipbuilding companies in Korea are experiencing strong competition from shipbuilding companies in other nations. This competition is particularly affecting small- and middle-sized shipyards, rather than the major shipyards that have their own support systems and development capabilities. The acquisition of techniques that would enable maximization of production efficiency and minimization of the gap between planning and execution would increase the competitiveness of small- and middle-sized Korean shipyards. In this paper, research on a simulation-based support system for ship production management, which can be applied to the shipbuilding processes of middle-sized shipbuilding companies, is presented. The simulation research includes layout optimization, load balancing, work stage operation planning, block logistics, and integrated material management. Each item is integrated into a network system with a value chain that includes all shipbuilding processes.

  4. Solving search problems by strongly simulating quantum circuits

    PubMed Central

    Johnson, T. H.; Biamonte, J. D.; Clark, S. R.; Jaksch, D.

    2013-01-01

    Simulating quantum circuits using classical computers lets us analyse the inner workings of quantum algorithms. The most complete type of simulation, strong simulation, is believed to be generally inefficient. Nevertheless, several efficient strong simulation techniques are known for restricted families of quantum circuits and we develop an additional technique in this article. Further, we show that strong simulation algorithms perform another fundamental task: solving search problems. Efficient strong simulation techniques allow solutions to a class of search problems to be counted and found efficiently. This enhances the utility of strong simulation methods, known or yet to be discovered, and extends the class of search problems known to be efficiently simulable. Relating strong simulation to search problems also bounds the computational power of efficiently strongly simulable circuits; if they could solve all problems in P this would imply that all problems in NP and #P could be solved in polynomial time. PMID:23390585
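
    What "strong" simulation means can be illustrated on a toy circuit: every output amplitude is computed exactly rather than sampled. (A generic state-vector sketch under standard conventions, not the techniques developed in the paper.)

```python
import numpy as np

# Illustrative sketch: strongly simulate a 2-qubit circuit that prepares a
# Bell state, i.e. compute all output amplitudes exactly.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)    # Hadamard gate
I2 = np.eye(2)
CNOT = np.array([[1.0, 0.0, 0.0, 0.0],                  # control = first qubit
                 [0.0, 1.0, 0.0, 0.0],
                 [0.0, 0.0, 0.0, 1.0],
                 [0.0, 0.0, 1.0, 0.0]])

state = np.zeros(4)
state[0] = 1.0                                          # start in |00>
state = CNOT @ np.kron(H, I2) @ state                   # H on qubit 0, then CNOT
amplitudes = {f"{i:02b}": state[i] for i in range(4)}   # e.g. amplitudes["11"]
```

    Because the full amplitude vector is available, outcome probabilities can be summed over any set of bit strings, which is the counting capability the article connects to search problems.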

  5. Some aspects of robotics calibration, design and control

    NASA Technical Reports Server (NTRS)

    Tawfik, Hazem

    1990-01-01

    The main objective is to introduce techniques in the areas of testing and calibration, design, and control of robotic systems. A statistical technique is described that analyzes a robot's performance and provides quantitative three-dimensional evaluation of its repeatability, accuracy, and linearity. Based on this analysis, a corrective action should be taken to compensate for any existing errors and enhance the robot's overall accuracy and performance. A comparison between robotics simulation software packages that were commercially available (SILMA, IGRIP) and that of Kennedy Space Center (ROBSIM) is also included. These computer codes simulate the kinematics and dynamics patterns of various robot arm geometries to help the design engineer in sizing and building the robot manipulator and control system. A brief discussion on an adaptive control algorithm is provided.

  6. Virtual reality technique to assist measurement of degree of shaking of two minarets of an ancient building

    NASA Astrophysics Data System (ADS)

    Homainejad, Amir S.; Satari, Mehran

    2000-05-01

    Virtual reality (VR) brings users closer to reality by computer, while a virtual environment (VE) is a simulated world that can take users to any point and direction of the object. VR and VE can be very useful if accurate and precise data are used, allowing users to work with a realistic model. Photogrammetry is a technique that can collect and provide accurate and precise data for building a 3D model in a computer. Data can be collected from various sensors and cameras, and methods of data collection vary based on the method of image acquisition. VR includes real-time graphics, 3D models, and displays, and it has applications in the entertainment industry, flight simulators, and industrial design.

  7. Efficient techniques for wave-based sound propagation in interactive applications

    NASA Astrophysics Data System (ADS)

    Mehra, Ravish

    Sound propagation techniques model the effect of the environment on sound waves and predict their behavior from point of emission at the source to the final point of arrival at the listener. Sound is a pressure wave produced by mechanical vibration of a surface that propagates through a medium such as air or water, and the problem of sound propagation can be formulated mathematically as a second-order partial differential equation called the wave equation. Accurate techniques based on solving the wave equation, also called the wave-based techniques, are too expensive computationally and memory-wise. Therefore, these techniques face many challenges in terms of their applicability in interactive applications including sound propagation in large environments, time-varying source and listener directivity, and high simulation cost for mid-frequencies. In this dissertation, we propose a set of efficient wave-based sound propagation techniques that solve these three challenges and enable the use of wave-based sound propagation in interactive applications. Firstly, we propose a novel equivalent source technique for interactive wave-based sound propagation in large scenes spanning hundreds of meters. It is based on the equivalent source theory used for solving radiation and scattering problems in acoustics and electromagnetics. Instead of using a volumetric or surface-based approach, this technique takes an object-centric approach to sound propagation. The proposed equivalent source technique generates realistic acoustic effects and takes orders of magnitude less runtime memory compared to prior wave-based techniques. Secondly, we present an efficient framework for handling time-varying source and listener directivity for interactive wave-based sound propagation. The source directivity is represented as a linear combination of elementary spherical harmonic sources. 
This spherical harmonic-based representation of source directivity can support analytical, data-driven, rotating, or time-varying directivity functions at runtime. Unlike previous approaches, the listener directivity approach can be used to compute spatial audio (3D audio) for a moving, rotating listener at interactive rates. Finally, we propose an efficient GPU-based time-domain solver for the wave equation that enables wave simulation up to the mid-frequency range in tens of minutes on a desktop computer. We demonstrate that by carefully mapping all components of the wave simulator to the parallel processing capabilities of graphics processors, significant performance improvements can be achieved over CPU-based simulators while maintaining numerical accuracy. We validate these techniques with offline numerical simulations and measured data recorded in an outdoor scene. We present results of preliminary user evaluations conducted to study the impact of these techniques on users' immersion in virtual environments. We have integrated these techniques with the Half-Life 2 game engine, Oculus Rift head-mounted display, and Xbox game controller to enable users to experience high-quality acoustic effects and spatial audio in virtual environments.
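    At the heart of these methods is the wave equation itself. As a hedged illustration (a minimal 1D leapfrog finite-difference time-domain sketch, not the dissertation's 3D GPU solver), a time-domain wave solve looks like:

```python
import math

def simulate_wave(nx=200, nt=300, c=340.0, dx=0.1, cfl=0.9):
    """Leapfrog FDTD scheme for the 1D wave equation u_tt = c^2 u_xx.
    Illustrative sketch only; real wave-based solvers work in 3D."""
    dt = cfl * dx / c                      # CFL-stable time step
    r2 = (c * dt / dx) ** 2
    # Gaussian pressure pulse, initially at rest (u_prev = u)
    u = [math.exp(-0.5 * ((i - nx // 2) / 5.0) ** 2) for i in range(nx)]
    u_prev = u[:]
    for _ in range(nt):
        u_next = [0.0] * nx                # rigid (u = 0) boundaries
        for i in range(1, nx - 1):
            u_next[i] = (2.0 * u[i] - u_prev[i]
                         + r2 * (u[i + 1] - 2.0 * u[i] + u[i - 1]))
        u_prev, u = u, u_next
    return u

field = simulate_wave()
```

    With the Courant number below one the scheme is stable: the initial pulse splits into two half-amplitude pulses traveling in opposite directions.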

  8. Nonholonomic Hamiltonian Method for Meso-macroscale Simulations of Reacting Shocks

    NASA Astrophysics Data System (ADS)

    Fahrenthold, Eric; Lee, Sangyup

    2015-06-01

    The seamless integration of macroscale, mesoscale, and molecular scale models of reacting shock physics has been hindered by dramatic differences in the model formulation techniques normally used at different scales. In recent research the authors have developed the first unified discrete Hamiltonian approach to multiscale simulation of reacting shock physics. Unlike previous work, the formulation employs reacting thermomechanical Hamiltonian formulations at all scales, including the continuum. Unlike previous work, the formulation employs a nonholonomic modeling approach to systematically couple the models developed at all scales. Example applications of the method show meso-macroscale shock to detonation simulations in nitromethane and RDX. Research supported by the Defense Threat Reduction Agency.

  9. Formulation of consumables management models: Consumables analysis/crew simulator interface requirements

    NASA Technical Reports Server (NTRS)

    Zamora, M. A.

    1977-01-01

    Consumables analysis/crew training simulator interface requirements were defined. Two aspects were investigated: techniques for providing consumables analysis support to crew training simulators for advanced spacecraft programs, and the applicability of these techniques to the crew training simulator for the space shuttle program in particular.

  10. Moving magnets in a micromagnetic finite-difference framework

    NASA Astrophysics Data System (ADS)

    Rissanen, Ilari; Laurson, Lasse

    2018-05-01

    We present a method and an implementation for smooth linear motion in a finite-difference-based micromagnetic simulation code, to be used in simulating magnetic friction and other phenomena involving moving microscale magnets. Our aim is to accurately simulate the magnetization dynamics and relative motion of magnets while retaining high computational speed. To this end, we combine techniques for fast scalar potential calculation and cubic B-spline interpolation, parallelizing them on a graphics processing unit (GPU). The implementation also includes the possibility of explicitly simulating eddy currents in the case of conducting magnets. We test our implementation by providing numerical examples of stick-slip motion of thin films pulled by a spring and the effect of eddy currents on the switching time of magnetic nanocubes.
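    One ingredient named above, cubic B-spline interpolation, can be sketched in one dimension as follows (a generic CPU version; the implementation in the paper is GPU-parallelized and operates on field data):

```python
import math

def bspline_eval(c, x):
    """Evaluate a uniform cubic B-spline with control points c at position x
    (in grid units). Generic sketch; reproduces linear data exactly."""
    i = int(math.floor(x))
    t = x - i
    # cubic B-spline basis weights for the four surrounding control points
    w0 = (1.0 - t) ** 3 / 6.0
    w1 = (3.0 * t ** 3 - 6.0 * t ** 2 + 4.0) / 6.0
    w2 = (-3.0 * t ** 3 + 3.0 * t ** 2 + 3.0 * t + 1.0) / 6.0
    w3 = t ** 3 / 6.0
    return w0 * c[i - 1] + w1 * c[i] + w2 * c[i + 1] + w3 * c[i + 2]
```

    The four weights sum to one at any t, which gives the spline its linear-precision property.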

  11. Seafloor identification in sonar imagery via simulations of Helmholtz equations and discrete optimization

    NASA Astrophysics Data System (ADS)

    Engquist, Björn; Frederick, Christina; Huynh, Quyen; Zhou, Haomin

    2017-06-01

    We present a multiscale approach for identifying features in ocean beds by solving inverse problems in high frequency seafloor acoustics. The setting is based on Sound Navigation And Ranging (SONAR) imaging used in scientific, commercial, and military applications. The forward model incorporates multiscale simulations, by coupling Helmholtz equations and geometrical optics for a wide range of spatial scales in the seafloor geometry. This allows for detailed recovery of seafloor parameters including material type. Simulated backscattered data is generated using numerical microlocal analysis techniques. In order to lower the computational cost of the large-scale simulations in the inversion process, we take advantage of a pre-computed library of representative acoustic responses from various seafloor parameterizations.

  12. Modeling the human mental lexicon with self-organizing feature maps

    NASA Astrophysics Data System (ADS)

    Wittenburg, Peter; Frauenfelder, Uli H.

    1992-10-01

    Recent efforts to model the remarkable ability of humans to recognize speech and words are described. Different techniques are discussed, including the use of neural nets with self-organizing algorithms to represent phonological similarity between words in the lexicon. Simulations using the standard Kohonen algorithm are presented to illustrate some problems encountered with this technique in modeling similarity relations of form in the human mental lexicon. Alternative approaches that can potentially deal with some of these limitations are sketched.
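    For reference, the standard Kohonen algorithm mentioned above can be sketched minimally as follows (a generic 1D scalar-data version, not the authors' lexicon model):

```python
import math, random

def train_som(data, n_units=10, epochs=50, lr0=0.5, sigma0=3.0, seed=1):
    """Train a 1D Kohonen self-organizing map on scalar data.
    Generic sketch, not the paper's phonological-similarity model."""
    rng = random.Random(seed)
    lo, hi = min(data), max(data)
    w = [rng.uniform(lo, hi) for _ in range(n_units)]
    for epoch in range(epochs):
        lr = lr0 * (1.0 - epoch / epochs)            # decaying learning rate
        sigma = max(sigma0 * (1.0 - epoch / epochs), 0.5)
        for x in data:
            # best-matching unit, then Gaussian neighborhood update
            bmu = min(range(n_units), key=lambda i: abs(w[i] - x))
            for i in range(n_units):
                h = math.exp(-((i - bmu) ** 2) / (2.0 * sigma ** 2))
                w[i] += lr * h * (x - w[i])
    return w

weights = train_som([i / 100.0 for i in range(100)])
```

    Each update pulls the winning unit and its map neighbors toward the input, so nearby units come to represent similar inputs — the topology-preserving property the abstract relies on.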

  13. The Performance of A Sampled Data Delay Lock Loop Implemented with a Kalman Loop Filter.

    DTIC Science & Technology

    1980-01-01

    ...technique for analysis is computer simulation. Other techniques include state-variable techniques and z-transform methods. Since the Kalman filter is linear... (Figure 2: block diagram of the sampled data delay lock loop (SDDLL). Figure 3: sampled error voltage (Es) as a function of ...) ...formed from a sum of two components. The first component is the previous filtered estimate advanced one step forward by the state transition matrix.

  14. Rocket engine diagnostics using qualitative modeling techniques

    NASA Technical Reports Server (NTRS)

    Binder, Michael; Maul, William; Meyer, Claudia; Sovie, Amy

    1992-01-01

    Researchers at NASA Lewis Research Center are presently developing qualitative modeling techniques for automated rocket engine diagnostics. A qualitative model of a turbopump interpropellant seal system has been created. The qualitative model describes the effects of seal failures on the system steady-state behavior. This model is able to diagnose the failure of particular seals in the system based on anomalous temperature and pressure values. The anomalous values input to the qualitative model are generated using numerical simulations. Diagnostic test cases include both single and multiple seal failures.

  16. Advances in parameter estimation techniques applied to flexible structures

    NASA Technical Reports Server (NTRS)

    Maben, Egbert; Zimmerman, David C.

    1994-01-01

    In this work, various parameter estimation techniques are investigated in the context of structural system identification utilizing distributed parameter models and 'measured' time-domain data. Distributed parameter models are formulated using the PDEMOD software developed by Taylor. Enhancements made to PDEMOD for this work include the following: (1) a Wittrick-Williams based root solving algorithm; (2) a time simulation capability; and (3) various parameter estimation algorithms. The parameter estimation schemes will be contrasted using the NASA Mini-Mast as the focus structure.

  17. Applications of Computational Methods for Dynamic Stability and Control Derivatives

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Spence, Angela M.

    2004-01-01

    Initial steps in the application of a low-order panel method computational fluid dynamic (CFD) code to the calculation of aircraft dynamic stability and control (S&C) derivatives are documented. Several capabilities, unique to CFD but not unique to this particular demonstration, are identified and demonstrated in this paper. These unique capabilities complement conventional S&C techniques and they include the ability to: 1) perform maneuvers without the flow-kinematic restrictions and support interference commonly associated with experimental S&C facilities, 2) easily simulate advanced S&C testing techniques, 3) compute exact S&C derivatives with uncertainty propagation bounds, and 4) alter the flow physics associated with a particular testing technique from those observed in a wind or water tunnel test in order to isolate effects. Also presented are discussions about some computational issues associated with the simulation of S&C tests and selected results from numerous surface grid resolution studies performed during the course of the study.
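    As a hedged illustration of derivative extraction (not the paper's panel-method code), an S&C derivative such as the pitching-moment slope can be estimated by differencing coefficients across perturbed states; `cm` below is a made-up linear model standing in for one CFD evaluation:

```python
def central_diff(f, x, h=1e-4):
    """Central-difference derivative estimate, the basic operation behind
    extracting a stability derivative from repeated simulations."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

def cm(alpha, cm0=0.02, cm_alpha=-1.2):
    """Hypothetical linear pitching-moment model Cm(alpha) = Cm0 + Cm_alpha * alpha,
    a stand-in for a CFD run at angle of attack alpha (radians)."""
    return cm0 + cm_alpha * alpha

cm_alpha_est = central_diff(cm, 0.05)   # estimate of dCm/dalpha at alpha = 0.05
```

    For this linear toy model the central difference recovers the slope essentially exactly; for a real CFD code the step size h trades truncation error against solver noise.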

  18. Numerical aerodynamic simulation facility. [for flows about three-dimensional configurations

    NASA Technical Reports Server (NTRS)

    Bailey, F. R.; Hathaway, A. W.

    1978-01-01

    Critical to the advancement of computational aerodynamics capability is the ability to simulate flows about three-dimensional configurations that contain both compressible and viscous effects, including turbulence and flow separation at high Reynolds numbers. Analyses were conducted of two solution techniques for solving the Reynolds averaged Navier-Stokes equations describing the mean motion of a turbulent flow with certain terms involving the transport of turbulent momentum and energy modeled by auxiliary equations. The first solution technique is an implicit approximate factorization finite-difference scheme applied to three-dimensional flows that avoids the restrictive stability conditions when small grid spacing is used. The approximate factorization reduces the solution process to a sequence of three one-dimensional problems with easily inverted matrices. The second technique is a hybrid explicit/implicit finite-difference scheme which is also factored and applied to three-dimensional flows. Both methods are applicable to problems with highly distorted grids and a variety of boundary conditions and turbulence models.
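    The appeal of approximate factorization is that each implicit step reduces to one-dimensional problems with easily inverted tridiagonal matrices; a generic Thomas-algorithm sketch of such a solve (illustrative only, not the facility's code):

```python
def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system (a: sub-, b: main, c: super-diagonal, d: RHS)
    in O(n). Each factored one-dimensional sweep reduces to solves like this.
    Generic sketch."""
    n = len(b)
    cp = [0.0] * n
    dp = [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n                              # back substitution
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# A = tridiag(-1, 2, -1) with known solution x = [1, 2, 3, 4]
x = thomas_solve([0, -1, -1, -1], [2, 2, 2, 2], [-1, -1, -1, 0], [0, 0, 0, 5])
```

    Because each 1D sweep costs O(n) rather than the cost of a full 3D matrix inversion, the factored scheme avoids the restrictive stability conditions of explicit methods at little extra expense.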

  19. A study of two statistical methods as applied to shuttle solid rocket booster expenditures

    NASA Technical Reports Server (NTRS)

    Perlmutter, M.; Huang, Y.; Graves, M.

    1974-01-01

    The state probability technique and the Monte Carlo technique are applied to finding shuttle solid rocket booster expenditure statistics. For a given attrition rate per launch, the probable number of boosters needed for a given mission of 440 launches is calculated. Several cases are considered, including the elimination of the booster after a maximum of 20 consecutive launches. Also considered is the case where the booster is composed of replaceable components with independent attrition rates. A simple cost analysis is carried out to indicate the number of boosters to build initially, depending on booster costs. Two statistical methods were applied in the analysis: (1) state probability method which consists of defining an appropriate state space for the outcome of the random trials, and (2) model simulation method or the Monte Carlo technique. It was found that the model simulation method was easier to formulate while the state probability method required less computing time and was more accurate.
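    The Monte Carlo (model simulation) method described can be sketched as follows. The 440-launch mission and 20-launch reuse limit follow the abstract; the 5% attrition rate is an assumed placeholder, not the study's value:

```python
import random

def boosters_needed(n_launches=440, attrition=0.05, max_uses=20, rng=None):
    """One Monte Carlo trial of booster expenditure: each launch risks losing
    the booster with probability `attrition`, and a booster is retired after
    `max_uses` consecutive launches."""
    rng = rng or random.Random()
    boosters = 1               # the booster currently in service
    uses = 0
    for _ in range(n_launches):
        uses += 1
        if rng.random() < attrition or uses >= max_uses:
            boosters += 1      # lost or retired; bring a new booster into service
            uses = 0
    return boosters

rng = random.Random(42)
mean = sum(boosters_needed(rng=rng) for _ in range(2000)) / 2000.0
```

    Averaging many trials estimates the expected expenditure; the state probability method would instead propagate the exact distribution, which is why it is more accurate but harder to formulate.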

  20. Propulsion simulation for magnetically suspended wind tunnel models

    NASA Technical Reports Server (NTRS)

    Joshi, Prakash B.; Beerman, Henry P.; Chen, James; Krech, Robert H.; Lintz, Andrew L.; Rosen, David I.

    1990-01-01

    The feasibility of simulating propulsion-induced aerodynamic effects on scaled aircraft models in wind tunnels employing Magnetic Suspension and Balance Systems (MSBS) was investigated. The investigation concerned itself with techniques of generating exhaust jets of appropriate characteristics. The objectives were to: (1) define thrust and mass flow requirements of jets; (2) evaluate techniques for generating propulsive gas within volume limitations imposed by magnetically-suspended models; (3) conduct simple diagnostic experiments for techniques involving new concepts; and (4) recommend experiments for demonstration of propulsion simulation techniques. Four concepts of remotely-operated propulsion simulators were examined. Three conceptual designs involving innovative adaptation of convenient technologies (compressed gas cylinders, liquid, and solid propellants) were developed. The fourth concept, the laser-assisted thruster, which can potentially simulate both inlet and exhaust flows, was found to require very high power levels for small thrust levels.

  1. Estimation variance bounds of importance sampling simulations in digital communication systems

    NASA Technical Reports Server (NTRS)

    Lu, D.; Yao, K.

    1991-01-01

    In practical applications of importance sampling (IS) simulation, two basic problems are encountered, that of determining the estimation variance and that of evaluating the proper IS parameters needed in the simulations. The authors derive new upper and lower bounds on the estimation variance which are applicable to IS techniques. The upper bound is simple to evaluate and may be minimized by the proper selection of the IS parameter. Thus, lower and upper bounds on the improvement ratio of various IS techniques relative to the direct Monte Carlo simulation are also available. These bounds are shown to be useful and computationally simple to obtain. Based on the proposed technique, one can readily find practical suboptimum IS parameters. Numerical results indicate that these bounding techniques are useful for IS simulations of linear and nonlinear communication systems with intersymbol interference in which bit error rate and IS estimation variances cannot be obtained readily using prior techniques.
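    As a generic illustration of the IS idea (a mean-translation estimator for a Gaussian tail probability, standing in for the bit-error-rate setting; not the authors' bounding technique):

```python
import math, random

def tail_prob_is(t=4.0, n=20000, seed=7):
    """Estimate P(X > t) for X ~ N(0, 1) by importance sampling from the
    mean-shifted density q = N(t, 1); weight w(x) = p(x)/q(x) = exp(t^2/2 - t*x).
    Generic sketch of the technique, with assumed parameters."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(t, 1.0)                  # sample from the biased density
        if x > t:
            total += math.exp(t * t / 2.0 - t * x)
    return total / n

estimate = tail_prob_is()
exact = 0.5 * math.erfc(4.0 / math.sqrt(2.0))  # true Gaussian tail probability
```

    Shifting the sampling density toward the rare-event region makes almost every sample informative, so the estimator reaches a few-percent relative error where direct Monte Carlo at the same sample size would see essentially no hits; the choice of the shift is exactly the "IS parameter" question the bounds address.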

  2. Novel Airway Training Tool that Simulates Vomiting: Suction-Assisted Laryngoscopy Assisted Decontamination (SALAD) System.

    PubMed

    DuCanto, James; Serrano, Karen D; Thompson, Ryan J

    2017-01-01

    We present a novel airway simulation tool that recreates the dynamic challenges associated with emergency airways. The Suction-Assisted Laryngoscopy Assisted Decontamination (SALAD) simulation system trains providers to use suction to manage emesis and bleeding complicating intubation. We modified a standard difficult-airway mannequin head (Nasco, Ft. Atkinson, WI) with hardware-store equipment to enable simulation of vomiting or hemorrhage during intubation. A pre- and post-survey was used to assess the effectiveness of the SALAD simulator. We used a 1-5 Likert scale to assess confidence in managing the airway of a vomiting patient and comfort with suction techniques before and after the training exercise. Forty learners participated in the simulation, including emergency physicians, anesthesiologists, paramedics, respiratory therapists, and registered nurses. The average Likert score of confidence in managing the airway of a vomiting or hemorrhaging patient pre-session was 3.10±0.49, and post-session 4.13±0.22. The average score of self-perceived skill with suction techniques in the airway scenario pre-session was 3.30±0.43, and post-session 4.03±0.26. The average score for usefulness of the session was 4.68±0.15, and the score for realism of the simulator was 4.65±0.17. A training session with the SALAD simulator improved trainees' confidence in managing the airway of a vomiting or hemorrhaging patient. The SALAD simulation system recreates the dynamic challenges associated with emergency airways and holds promise as an airway training tool.

  3. Wavelet-Bayesian inference of cosmic strings embedded in the cosmic microwave background

    NASA Astrophysics Data System (ADS)

    McEwen, J. D.; Feeney, S. M.; Peiris, H. V.; Wiaux, Y.; Ringeval, C.; Bouchet, F. R.

    2017-12-01

    Cosmic strings are a well-motivated extension to the standard cosmological model and could induce a subdominant component in the anisotropies of the cosmic microwave background (CMB), in addition to the standard inflationary component. The detection of strings, while observationally challenging, would provide a direct probe of physics at very high-energy scales. We develop a framework for cosmic string inference from observations of the CMB made over the celestial sphere, performing a Bayesian analysis in wavelet space where the string-induced CMB component has distinct statistical properties to the standard inflationary component. Our wavelet-Bayesian framework provides a principled approach to compute the posterior distribution of the string tension Gμ and the Bayesian evidence ratio comparing the string model to the standard inflationary model. Furthermore, we present a technique to recover an estimate of any string-induced CMB map embedded in observational data. Using Planck-like simulations, we demonstrate the application of our framework and evaluate its performance. The method is sensitive to Gμ ∼ 5 × 10⁻⁷ for Nambu-Goto string simulations that include an integrated Sachs-Wolfe contribution only and do not include any recombination effects, before any parameters of the analysis are optimized. The sensitivity of the method compares favourably with other techniques applied to the same simulations.

  4. Outdoor Biology Instructional Strategies Trial Edition. Set I.

    ERIC Educational Resources Information Center

    Fairwell, Kay, Ed.; And Others

    The Outdoor Biology Instructional Strategies (OBIS) Trial Edition Set I contains 24 varied activities which make use of crafts, simulations, and basic investigative techniques to provide introductory learning experiences in outdoor biology for children aged 10 to 15. The individual water-resistant folio for each activity includes biological…

  5. APPLICATION AND EVALUATION OF CMAQ IN THE UNITED STATES: AIR QUALITY FORECASTING AND RETROSPECTIVE MODELING

    EPA Science Inventory

    Presentation slides provide background on model evaluation techniques. Also included in the presentation is an operational evaluation of 2001 Community Multiscale Air Quality (CMAQ) annual simulation, and an evaluation of PM2.5 for the CMAQ air quality forecast (AQF) ...

  6. Microwave Diffraction Techniques from Macroscopic Crystal Models

    ERIC Educational Resources Information Center

    Murray, William Henry

    1974-01-01

    Discusses the construction of a diffractometer table and four microwave models which are built of styrofoam balls with implanted metallic reflecting spheres and designed to simulate the structures of carbon (graphite structure), sodium chloride, tin oxide, and palladium oxide. Included are samples of Bragg patterns and computer-analysis results.…

  7. An overview of the essential differences and similarities of system identification techniques

    NASA Technical Reports Server (NTRS)

    Mehra, Raman K.

    1991-01-01

    Information is given in the form of outlines, graphs, tables and charts. Topics include system identification, Bayesian statistical decision theory, Maximum Likelihood Estimation, identification methods, structural mode identification using a stochastic realization algorithm, and identification results regarding membrane simulations and X-29 flutter flight test data.

  8. The Geography of Connection: Bringing the World to Students.

    ERIC Educational Resources Information Center

    Black, Mary S.

    2000-01-01

    Discusses strategies used by two teachers for teaching geography to at-risk students to connect the subject matter to the student's lives. Includes techniques such as integrating music, art, language, employing simulations when teaching, using current events to improve students' reading skills, and utilizing computer technology. (CMK)

  9. Workshop on Cosmogenic Nuclides

    NASA Technical Reports Server (NTRS)

    Reedy, R. C. (Editor); Englert, P. (Editor)

    1986-01-01

    Abstracts of papers presented at the Workshop on Cosmogenic Nuclides are compiled. The major topic areas covered include: new techniques for measuring nuclides such as tandem accelerator and resonance mass spectrometry; solar modulation of cosmic rays; pre-irradiation histories of extraterrestrial materials; terrestrial studies; simulations and cross sections; nuclide production rate calculations; and meteoritic nuclides.

  10. Research in Distance Education: A System Modeling Approach.

    ERIC Educational Resources Information Center

    Saba, Farhad; Twitchell, David

    This demonstration of the use of a computer simulation research method based on the System Dynamics modeling technique for studying distance education reviews research methods in distance education, including the broad categories of conceptual and case studies, and presents a rationale for the application of systems research in this area. The…

  11. From force-fields to photons: MD simulations of dye-labeled nucleic acids and Monte Carlo modeling of FRET

    NASA Astrophysics Data System (ADS)

    Goldner, Lori

    2012-02-01

    Fluorescence resonance energy transfer (FRET) is a powerful technique for understanding the structural fluctuations and transformations of RNA, DNA and proteins. Molecular dynamics (MD) simulations provide a window into the nature of these fluctuations on a different, faster, time scale. We use Monte Carlo methods to model and compare FRET data from dye-labeled RNA with what might be predicted from the MD simulation. With a few notable exceptions, the contribution of fluorophore and linker dynamics to these FRET measurements has not been investigated. We include the dynamics of the ground state dyes and linkers in our study of a 16mer double-stranded RNA. Water is included explicitly in the simulation. Cyanine dyes are attached at either the 3' or 5' ends with a 3 carbon linker, and differences in labeling schemes are discussed. Work done in collaboration with Peker Milas, Benjamin D. Gamari, and Louis Parrot.

  12. Human Simulators and Standardized Patients to Teach Difficult Conversations to Interprofessional Health Care Teams

    PubMed Central

    Zimmerman, Christine; Kennedy, Christopher; Schremmer, Robert; Smith, Katharine V.

    2010-01-01

    Objective To design and implement a demonstration project to teach interprofessional teams how to recognize and engage in difficult conversations with patients. Design Interdisciplinary teams consisting of pharmacy students and residents, student nurses, and medical residents responded to preliminary questions regarding difficult conversations, listened to a brief discussion on difficult conversations; formed ad hoc teams and interacted with a standardized patient (mother) and a human simulator (child), discussing the infant's health issues, intimate partner violence, and suicidal thinking; and underwent debriefing. Assessment Participants evaluated the learning methods positively and a majority demonstrated knowledge gains. The project team also learned lessons that will help better design future programs, including an emphasis on simulations over lecture and the importance of debriefing on student learning. Drawbacks included the major time commitment for design and implementation, sustainability, and the lack of resources to replicate the program for all students. Conclusion Simulation is an effective technique to teach interprofessional teams how to engage in difficult conversations with patients. PMID:21088725

  13. Cart3D Simulations for the First AIAA Sonic Boom Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Aftosmis, Michael J.; Nemec, Marian

    2014-01-01

    Simulation results for the First AIAA Sonic Boom Prediction Workshop (LBW1) are presented using an inviscid, embedded-boundary Cartesian mesh method. The method employs adjoint-based error estimation and adaptive meshing to automatically determine resolution requirements of the computational domain. Results are presented for both mandatory and optional test cases. These include an axisymmetric body of revolution, a 69° delta wing model and a complete model of the Lockheed N+2 supersonic tri-jet with V-tail and flow-through nacelles. In addition to formal mesh refinement studies and examination of the adjoint-based error estimates, mesh convergence is assessed by presenting simulation results for meshes at several resolutions which are comparable in size to the unstructured grids distributed by the workshop organizers. Data provided includes both the pressure signals required by the workshop and information on code performance in both memory and processing time. Various enhanced techniques offering improved simulation efficiency will be demonstrated and discussed.

  14. Teaching sexual history-taking skills using the Sexual Events Classification System.

    PubMed

    Fidler, Donald C; Petri, Justin Daniel; Chapman, Mark

    2010-01-01

    The authors review the literature on educational programs for teaching sexual history-taking skills and describe novel techniques for teaching these skills. Psychiatric residents enrolled in a brief sexual history-taking course that included instruction on the Sexual Events Classification System; feedback on residents' video-recorded interviews with simulated patients; discussion of videos that simulated bad interviews; and a competency scoring form used to score a video of a simulated interview. After the course, residents completed an anonymous survey to assess the usefulness of the experience; most felt more comfortable taking sexual histories. They described the Sexual Events Classification System and simulated interviews as practical methods for teaching sexual history-taking skills. The Sexual Events Classification System and simulated patient experiences may serve as a practical model for teaching sexual history-taking skills to general psychiatric residents.

  15. Self-Organization of Metal Nanoparticles in Light: Electrodynamics-Molecular Dynamics Simulations and Optical Binding Experiments.

    PubMed

    McCormack, Patrick; Han, Fei; Yan, Zijie

    2018-02-01

    Light-driven self-organization of metal nanoparticles (NPs) can lead to unique optical matter systems, yet simulation of such self-organization (i.e., optical binding) is a complex computational problem that increases nonlinearly with system size. Here we show that a combined electrodynamics-molecular dynamics simulation technique can simulate the trajectories and predict stable configurations of silver NPs in optical fields. The simulated dynamic equilibrium of a two-NP system matches the probability density of oscillations for two optically bound NPs obtained experimentally. The predicted stable configurations for up to eight NPs are further compared to experimental observations of silver NP clusters formed by optical binding in a Bessel beam. All configurations are confirmed to form in real systems, including pentagonal clusters with five-fold symmetry. Our combined simulations and experiments have revealed a diverse optical matter system formed by anisotropic optical binding interactions, providing a new strategy to discover artificial materials.

  16. Physics and performances of III-V nanowire broken-gap heterojunction TFETs using an efficient tight-binding mode-space NEGF model enabling million-atom nanowire simulations.

    PubMed

    Afzalian, A; Vasen, T; Ramvall, P; Shen, T-M; Wu, J; Passlack, M

    2018-06-27

    We report the capability to simulate in a quantum-mechanical atomistic fashion record-large nanowire devices, featuring several hundred to millions of atoms and a diameter up to 18.2 nm. We have employed a tight-binding mode-space NEGF technique demonstrating by far the fastest (up to 10,000× faster) yet accurate (error < 1%) atomistic simulations to date. This capability opens new avenues to explore and understand the physics of nanoscale and mesoscopic devices dominated by quantum effects. In particular, our method addresses in an unprecedented way the technologically-relevant case of band-to-band tunneling (BTBT) in III-V nanowire broken-gap heterojunction tunnel-FETs (HTFETs). We demonstrate an accurate match of simulated BTBT currents to experimental measurements in a 12 nm diameter InAs NW and in an InAs/GaSb Esaki tunneling diode. We apply our TB MS simulations and report the first in-depth atomistic study of the scaling potential of III-V GAA nanowire HTFETs including the effect of electron-phonon scattering and discrete dopant impurity band tails, quantifying the benefits of this technology for low-power low-voltage CMOS applications.

  17. Development of a Standardized Cranial Phantom for Training and Optimization of Functional Stereotactic Operations.

    PubMed

    Krüger, Marie T; Coenen, Volker A; Egger, Karl; Shah, Mukesch; Reinacher, Peter C

    2018-06-13

    In recent years, simulations based on phantom models have become increasingly popular in the medical field. In the field of functional and stereotactic neurosurgery, a cranial phantom would be useful to train operative techniques such as stereo-electroencephalography (SEEG), to establish new methods, and to develop and modify radiological techniques. In this study, we describe the construction of a cranial phantom, show example applications in stereotactic and functional neurosurgery, and demonstrate its compatibility with different radiological modalities. We prepared a plaster skull filled with agar. A complete operation for deep brain stimulation (DBS) was simulated using directional leads. Moreover, a complete SEEG operation including planning, implantation of the electrodes, and intraoperative and postoperative imaging was simulated. The optimal fill for the cranial phantom proved to be 10% agar; at 7°C, the phantom can be stored for approximately 4 months. Both the DBS and SEEG procedures could be realistically simulated. Lead artifacts can be studied in CT, X-ray, rotational fluoroscopy, and MRI. This cranial phantom is a simple and effective model for simulating functional and stereotactic neurosurgical operations. It may be useful for teaching and training neurosurgeons, establishing these operations in a new center, and optimizing radiological examinations. © 2018 S. Karger AG, Basel.

  18. Characterization of cardiac flow in heart disease patients by computational fluid dynamics and 4D flow MRI

    NASA Astrophysics Data System (ADS)

    Lantz, Jonas; Gupta, Vikas; Henriksson, Lilian; Karlsson, Matts; Persson, Ander; Carhall, Carljohan; Ebbers, Tino

    2017-11-01

    In this study, cardiac blood flow was simulated using Computational Fluid Dynamics and compared to in vivo flow measurements by 4D Flow MRI. In total, nine patients with various heart diseases were studied. Geometry and heart wall motion for the simulations were obtained from clinical CT measurements, with 0.3x0.3x0.3 mm spatial resolution and 20 time frames covering one heartbeat. The CFD simulations included pulmonary veins, left atrium and ventricle, mitral and aortic valve, and ascending aorta. Mesh sizes were on the order of 6-16 million cells, depending on the size of the heart, in order to resolve both papillary muscles and trabeculae. The computed flow field agreed visually very well with 4D Flow MRI, with characteristic vortices and flow structures seen in both techniques. Regression analysis showed that peak flow rate as well as stroke volume had an excellent agreement for the two techniques. We demonstrated the feasibility, and more importantly, fidelity of cardiac flow simulations by comparing CFD results to in vivo measurements. Both qualitative and quantitative results agreed well with the 4D Flow MRI measurements. Also, the developed simulation methodology enables "what if" scenarios, such as optimization of valve replacement and other surgical procedures. Funded by the Wallenberg Foundation.
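    The agreement analysis mentioned is ordinary least-squares regression on paired measurements; a minimal sketch with made-up paired stroke volumes (not the study's data):

```python
def linregress(xs, ys):
    """Ordinary least-squares slope and intercept for paired measurements.
    Generic sketch of the agreement analysis; the data below are invented."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# hypothetical paired stroke volumes (mL) from CFD and 4D Flow MRI
cfd = [60.0, 75.0, 82.0, 90.0]
mri = [61.0, 74.0, 83.0, 89.0]
slope, intercept = linregress(cfd, mri)
```

    A slope near one with a small intercept is what "excellent agreement" between the two modalities means quantitatively.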

  19. Physics and performances of III–V nanowire broken-gap heterojunction TFETs using an efficient tight-binding mode-space NEGF model enabling million-atom nanowire simulations

    NASA Astrophysics Data System (ADS)

    Afzalian, A.; Vasen, T.; Ramvall, P.; Shen, T.-M.; Wu, J.; Passlack, M.

    2018-06-01

    We report the capability to simulate, in a quantum-mechanical atomistic fashion, record-large nanowire devices featuring several hundred to millions of atoms and diameters up to 18.2 nm. We employed a tight-binding mode-space NEGF technique, demonstrating by far the fastest (up to 10,000× faster) yet accurate (error < 1%) atomistic simulations to date. This technique and capability open new avenues to explore and understand the physics of nanoscale and mesoscopic devices dominated by quantum effects. In particular, our method addresses in an unprecedented way the technologically relevant case of band-to-band tunneling (BTBT) in III–V nanowire broken-gap heterojunction tunnel FETs (HTFETs). We demonstrate an accurate match of simulated BTBT currents to experimental measurements in a 12 nm diameter InAs nanowire and in an InAs/GaSb Esaki tunneling diode. We apply our TB MS simulations and report the first in-depth atomistic study of the scaling potential of III–V GAA nanowire HTFETs, including the effects of electron–phonon scattering and discrete dopant impurity band tails, quantifying the benefits of this technology for low-power, low-voltage CMOS applications.

  20. Improved representation of situational awareness within a dismounted small combat unit constructive simulation

    NASA Astrophysics Data System (ADS)

    Lee, K. David; Colony, Mike

    2011-06-01

    Modeling and simulation has been established as a cost-effective means of supporting the development of requirements, exploring doctrinal alternatives, assessing system performance, and performing design trade-off analysis. The Army's constructive simulation for the evaluation of equipment effectiveness in small combat unit operations is currently limited to a representation of situation awareness that excludes the many uncertainties associated with real-world combat environments. The goal of this research is to provide the ability to model situation awareness and decision process uncertainties in order to better evaluate the impact of battlefield equipment on ground soldier and small combat unit decision processes. Our Army Probabilistic Inference and Decision Engine (Army-PRIDE) system provides this required uncertainty modeling through two critical techniques that allow Bayesian network technology to be applied to real-time applications: an Object-Oriented Bayesian Network methodology and an Object-Oriented Inference technique. In this research, we implement decision process and situation awareness models for a reference scenario using Army-PRIDE and demonstrate its ability to model a variety of uncertainty elements, including confidence of source, information completeness, and information loss. We also demonstrate, through Monte Carlo simulation, that Army-PRIDE improves the realism of the current constructive simulation's decision processes.
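    Army-PRIDE's internals are not given here, but the "confidence of source" idea can be illustrated with a minimal two-node Bayesian update in which an unreliable report is discounted toward an uninformative likelihood; all probabilities below are hypothetical:

```python
# Minimal sketch (hypothetical numbers): discounting evidence from an
# unreliable source in a two-state Bayesian update, in the spirit of the
# "confidence of source" uncertainty element described above.

def posterior_threat(p_threat, p_report_given_threat, p_report_given_clear, reliability):
    """P(threat | report) when the reporting source is only
    `reliability`-trustworthy; at reliability 0 the report is ignored
    and the prior is returned unchanged."""
    # Blend the informative likelihoods toward an uninformative 0.5
    # as reliability drops (a simple linear opinion pool).
    l_t = reliability * p_report_given_threat + (1 - reliability) * 0.5
    l_c = reliability * p_report_given_clear + (1 - reliability) * 0.5
    num = l_t * p_threat
    return num / (num + l_c * (1 - p_threat))

prior = 0.10
trusted = posterior_threat(prior, 0.9, 0.2, reliability=1.0)
dubious = posterior_threat(prior, 0.9, 0.2, reliability=0.3)
print(round(trusted, 3), round(dubious, 3))
```

    The unreliable report moves the posterior far less than the trusted one, which is exactly the behavior needed to keep simulated decision processes from overreacting to poor sources.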

  1. Probabilistic bias analysis in pharmacoepidemiology and comparative effectiveness research: a systematic review.

    PubMed

    Hunnicutt, Jacob N; Ulbricht, Christine M; Chrysanthopoulou, Stavroula A; Lapane, Kate L

    2016-12-01

    We systematically reviewed pharmacoepidemiologic and comparative effectiveness studies that use probabilistic bias analysis to quantify the effects of systematic error, including confounding, misclassification, and selection bias, on study results. We found articles published between 2010 and October 2015 through a citation search using Web of Science and Google Scholar and a keyword search using PubMed and Scopus. Eligibility of studies was assessed by one reviewer. Three reviewers independently abstracted data from eligible studies. Fifteen studies used probabilistic bias analysis and were eligible for data abstraction: nine simulated an unmeasured confounder and six simulated misclassification. The majority of studies simulating an unmeasured confounder did not specify the range of plausible estimates for the bias parameters. Studies simulating misclassification were in general clearer when reporting the plausible distribution of bias parameters. Regardless of the bias simulated, the probability distributions assigned to bias parameters, the number of simulated iterations, sensitivity analyses, and diagnostics were not discussed in the majority of studies. Despite the prevalence of, and concern about, bias in pharmacoepidemiologic and comparative effectiveness studies, probabilistic bias analysis to quantitatively model the effect of bias was not widely used. The quality of reporting and use of this technique varied and was often unclear. Further discussion and dissemination of the technique are warranted. Copyright © 2016 John Wiley & Sons, Ltd.
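    As a concrete illustration of the technique the review examines, a probabilistic bias analysis for an unmeasured binary confounder draws bias parameters from explicitly stated distributions and propagates them through the standard external-adjustment bias factor; the numbers below are hypothetical and not drawn from any reviewed study:

```python
# Illustrative Monte Carlo bias analysis for an unmeasured binary confounder.
# Bias parameters get explicit plausible distributions (the reporting step the
# review found often missing), and the observed risk ratio is divided by the
# simulated bias factor. All values are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
rr_observed = 1.8            # observed exposure-outcome risk ratio
n = 100_000                  # number of simulated iterations

p_conf_exposed   = rng.uniform(0.4, 0.6, n)          # confounder prevalence, exposed
p_conf_unexposed = rng.uniform(0.1, 0.3, n)          # confounder prevalence, unexposed
rr_conf_outcome  = rng.triangular(1.5, 2.0, 3.0, n)  # confounder-outcome risk ratio

# External-adjustment bias factor, then bias-adjusted risk ratio per iteration.
bias = (p_conf_exposed * (rr_conf_outcome - 1) + 1) / \
       (p_conf_unexposed * (rr_conf_outcome - 1) + 1)
rr_adjusted = rr_observed / bias

lo, med, hi = np.percentile(rr_adjusted, [2.5, 50, 97.5])
print(f"adjusted RR median {med:.2f}, 95% simulation interval ({lo:.2f}, {hi:.2f})")
```

    The simulation interval summarizes how much of the observed association plausibly survives the confounder, which is the output such analyses should report alongside the distributions and iteration count.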

  2. Missile airframe simulation testbed: MANPADS (MAST-M) for test and evaluation of aircraft survivability equipment

    NASA Astrophysics Data System (ADS)

    Clements, Jim; Robinson, Richard; Bunt, Leslie; Robinson, Joe

    2011-06-01

    A number of techniques have been utilized to evaluate the performance of Aircraft Survivability Equipment (ASE) against threat Man-Portable Air Defense Systems (MANPADS). These techniques include flying actual threat MANPADS against stationary ASE with simulated aircraft signatures, testing installed ASE systems against simulated threat signatures, and laboratory hardware-in-the-loop (HWIL) testing with simulated aircraft and simulated missile signatures. All of these tests lack the realism of evaluating installed ASE against in-flight MANPADS on a terminal homing intercept path toward the actual ASE-equipped aircraft. This limitation is due primarily to the current inability to perform non-destructive MANPADS/aircraft flight testing. The U.S. Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) is working to overcome this limitation with the development of a recoverable surrogate MANPADS missile system capable of engaging aircraft equipped with ASE while guaranteeing collision avoidance with the test aircraft. Under its Missile Airframe Simulation Testbed - MANPADS (MAST-M) program, the AMRDEC is developing a surrogate missile system which will utilize actual threat MANPADS seeker/guidance sections to control the flight of a surrogate missile which will perform a collision-avoidance and recovery maneuver prior to intercept to ensure non-destructive test and evaluation of the ASE and reuse of the MANPADS seeker/guidance section. The remainder of this paper provides an overview of this development program and its intended use.

  3. SSAGES: Software Suite for Advanced General Ensemble Simulations

    NASA Astrophysics Data System (ADS)

    Sidky, Hythem; Colón, Yamil J.; Helfferich, Julian; Sikora, Benjamin J.; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z.; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J.; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S.; Reid, Daniel R.; Sevgen, Emre; Thapar, Vikram; Webb, Michael A.; Whitmer, Jonathan K.; de Pablo, Juan J.

    2018-01-01

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulation packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.
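    To make the enhanced-sampling idea concrete, the toy sketch below applies a metadynamics-style history-dependent bias to overdamped Langevin dynamics in a double well; it is a self-contained illustration of the class of methods SSAGES provides, not the SSAGES API:

```python
# Toy metadynamics-style enhanced sampling: Gaussian "hills" deposited along
# the trajectory flood the starting well of a double-well potential so the
# walker crosses a barrier it would rarely cross unbiased. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)
beta, dt = 4.0, 1e-3               # inverse temperature, time step
x = -1.0                           # start in the left well
centers, w, sigma = [], 0.3, 0.2   # hill positions, height, width

def grad_potential(x):
    # Double well U(x) = (x^2 - 1)^2, barrier height 1 at x = 0.
    return 4.0 * x * (x * x - 1.0)

def grad_bias(x):
    # Gradient of the deposited bias V(x) = sum_i w * exp(-(x - c_i)^2 / (2 sigma^2)).
    if not centers:
        return 0.0
    c = np.asarray(centers)
    return float(np.sum(-w * (x - c) / sigma**2 * np.exp(-((x - c) ** 2) / (2 * sigma**2))))

visited_right = False
for step in range(30_000):
    force = -grad_potential(x) - grad_bias(x)
    x += force * dt + np.sqrt(2.0 * dt / beta) * rng.normal()  # Euler-Maruyama step
    if step % 200 == 0:
        centers.append(x)          # deposit a hill at the current position
    visited_right = visited_right or x > 0.8

print("crossed the barrier:", visited_right)
```

    Without the bias (an empty `centers` list), crossings of this 4 kT barrier are infrequent on this timescale; the accumulating hills drive the walker over, which is the essence of history-dependent enhanced sampling.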

  4. CHOLLA: A New Massively Parallel Hydrodynamics Code for Astrophysical Simulation

    NASA Astrophysics Data System (ADS)

    Schneider, Evan E.; Robertson, Brant E.

    2015-04-01

    We present Computational Hydrodynamics On ParaLLel Architectures (Cholla), a new three-dimensional hydrodynamics code that harnesses the power of graphics processing units (GPUs) to accelerate astrophysical simulations. Cholla models the Euler equations on a static mesh using state-of-the-art techniques, including the unsplit Corner Transport Upwind algorithm, a variety of exact and approximate Riemann solvers, and multiple spatial reconstruction techniques including the piecewise parabolic method (PPM). Using GPUs, Cholla evolves the fluid properties of thousands of cells simultaneously and can update over 10 million cells per GPU-second while using an exact Riemann solver and PPM reconstruction. Owing to the massively parallel architecture of GPUs and the design of the Cholla code, astrophysical simulations with physically interesting grid resolutions (≳256³) can easily be computed on a single device. We use the Message Passing Interface library to extend calculations onto multiple devices and demonstrate nearly ideal scaling beyond 64 GPUs. A suite of test problems highlights the physical accuracy of our modeling and provides a useful comparison to other codes. We then use Cholla to simulate the interaction of a shock wave with a gas cloud in the interstellar medium, showing that the evolution of the cloud is highly dependent on its density structure. We reconcile the computed mixing time of a turbulent cloud with a realistic density distribution destroyed by a strong shock with the existing analytic theory for spherical cloud destruction by describing the system in terms of its median gas density.
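    The finite-volume structure underlying codes like Cholla can be illustrated with a much simpler relative: a first-order 1D Euler solver using a Rusanov (local Lax-Friedrichs) flux in place of Cholla's exact and approximate Riemann solvers and PPM reconstruction:

```python
# First-order finite-volume solver for the 1D Euler equations on a Sod shock
# tube, with a Rusanov (local Lax-Friedrichs) interface flux. A deliberately
# simplified sketch of the Godunov-type update used by codes like Cholla.
import numpy as np

gamma, nx = 1.4, 200
dx = 1.0 / nx
x = (np.arange(nx) + 0.5) * dx
# Conserved variables U = (rho, rho*u, E); Sod initial data.
rho0 = np.where(x < 0.5, 1.0, 0.125)
p0   = np.where(x < 0.5, 1.0, 0.1)
U = np.stack([rho0, np.zeros(nx), p0 / (gamma - 1)])

def flux(U):
    rho, mom, E = U
    u = mom / rho
    p = (gamma - 1) * (E - 0.5 * rho * u**2)
    F = np.stack([mom, mom * u + p, (E + p) * u])
    return F, np.abs(u) + np.sqrt(gamma * p / rho)   # flux and max wave speed

t = 0.0
while t < 0.2:
    F, c = flux(U)
    dt = 0.4 * dx / c.max()                          # CFL-limited time step
    a = np.maximum(c[:-1], c[1:])                    # local wave-speed bound
    # Rusanov flux at each interface between cells i and i+1.
    Fhat = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * a * (U[:, 1:] - U[:, :-1])
    U[:, 1:-1] -= dt / dx * (Fhat[:, 1:] - Fhat[:, :-1])  # boundary cells held fixed
    t += dt

rho_final = U[0]
print(rho_final.min(), rho_final.max())
```

    Higher-order reconstruction (PPM) and better Riemann solvers sharpen the smeared shock and contact that this diffusive first-order scheme produces, which is precisely the motivation for the techniques the abstract lists.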

  5. Monitoring the distribution of prompt gamma rays in boron neutron capture therapy using a multiple-scattering Compton camera: A Monte Carlo simulation study

    NASA Astrophysics Data System (ADS)

    Lee, Taewoong; Lee, Hyounggun; Lee, Wonho

    2015-10-01

    This study evaluated the use of Compton imaging technology to monitor prompt gamma rays emitted by 10B in boron neutron capture therapy (BNCT) applied to a computerized human phantom. The Monte Carlo method, including particle-tracking techniques, was used for the simulation. The distribution of prompt gamma rays emitted by the phantom during irradiation with neutron beams is closely associated with the distribution of the boron in the phantom. The maximum likelihood expectation maximization (MLEM) method was applied to the information obtained from the detected prompt gamma rays to reconstruct the distribution of the tumor, including the boron uptake regions (BURs). The reconstructed Compton images of the prompt gamma rays were combined with the cross-sectional images of the human phantom. Quantitative analysis of the intensity curves showed that all combined images matched the predetermined conditions of the simulation. Tumors, including the BURs, were distinguishable if they were more than 2 cm apart.
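    The MLEM update used for the reconstruction can be sketched on a toy system; the Compton-camera system matrix in the study is far larger, but the multiplicative update is the same, and the matrix and intensities below are invented:

```python
# Minimal MLEM sketch: iteratively recover emitter intensities x from
# detected counts y = A @ x_true. The toy system matrix and intensities
# are hypothetical; real Compton-camera matrices are vastly larger.
import numpy as np

A = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8],
              [0.3, 0.4, 0.3]])      # P(count in detector bin | emission in voxel)
x_true = np.array([5.0, 0.5, 3.0])   # invented emitter intensities (e.g., boron uptake)
y = A @ x_true                       # noise-free "measured" counts

x = np.ones(3)                       # flat initial estimate
sens = A.sum(axis=0)                 # sensitivity: back-projection of ones
for _ in range(2000):
    x *= (A.T @ (y / (A @ x))) / sens   # multiplicative MLEM update

print(np.round(x, 3))
```

    The update preserves positivity and converges toward the maximum-likelihood intensities; with noisy data, iteration count and stopping rules become the regularization knobs.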

  6. Operational NDT simulator, towards human factors integration in simulated probability of detection

    NASA Astrophysics Data System (ADS)

    Rodat, Damien; Guibert, Frank; Dominguez, Nicolas; Calmon, Pierre

    2017-02-01

    In the aeronautic industry, the performance demonstration of Non-Destructive Testing (NDT) procedures relies on Probability Of Detection (POD) analyses. This statistical approach measures the ability of the procedure to detect a flaw with regard to one of its characteristic dimensions. The inspection chain is evaluated as a whole, including equipment configuration, probe efficiency, and operator manipulations. Traditionally, a POD study requires an expensive campaign during which several operators apply the procedure on a large set of representative samples. Recently, new perspectives for POD estimation have been introduced using NDT simulation to generate data. However, these approaches do not offer straightforward solutions to take the operator into account. The simulation of human factors, including cognitive aspects, often raises questions. To address these difficulties, we propose the concept of an operational NDT simulator [1]. This work presents the first steps in the implementation of such a simulator for ultrasound phased array inspection of composite parts containing Flat Bottom Holes (FBHs). The final system will look like classical ultrasound testing equipment with a single exception: the displayed signals will be synthesized. Our hardware (ultrasound acquisition card, 3D position tracker) and software (position analysis, inspection scenario, synchronization, simulations) environments are developed as a bench to test the meta-modeling techniques able to provide fast, realistic simulated ultrasound signals. The results presented here are obtained by on-the-fly merging of real and simulated signals. They confirm the feasibility of our approach: the replacement of real signals by purely simulated ones has gone unnoticed by operators. We believe this simulator is a promising prospect for POD evaluation including human factors, and it may also find applications for training or procedure set-up.

  7. Proposal of a micromagnetic standard problem for ferromagnetic resonance simulations

    NASA Astrophysics Data System (ADS)

    Baker, Alexander; Beg, Marijan; Ashton, Gregory; Albert, Maximilian; Chernyshenko, Dmitri; Wang, Weiwei; Zhang, Shilei; Bisotti, Marc-Antonio; Franchin, Matteo; Hu, Chun Lian; Stamps, Robert; Hesjedal, Thorsten; Fangohr, Hans

    2017-01-01

    Nowadays, micromagnetic simulations are a common tool for studying a wide range of magnetic phenomena, including ferromagnetic resonance. A technique for evaluating the reliability and validity of different micromagnetic simulation tools is the simulation of proposed standard problems. We propose a new standard problem by providing a detailed specification and analysis of a sufficiently simple problem. By analyzing the magnetization dynamics in a thin permalloy square sample, triggered by a well-defined excitation, we obtain the ferromagnetic resonance spectrum and identify the resonance modes via Fourier transform. Simulations are performed using both finite difference and finite element numerical methods, with the OOMMF and Nmag simulators, respectively. We report the effects of initial conditions and simulation parameters on the character of the observed resonance modes for this standard problem. We provide detailed instructions and code to assist in using the results for evaluation of new simulator tools, and to help with numerical calculation of ferromagnetic resonance spectra and modes in general.
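    The spectrum-extraction step, Fourier transforming the ring-down of the spatially averaged magnetization to locate resonance peaks, can be sketched on a synthetic two-mode signal (all frequencies and decay times below are hypothetical):

```python
# Sketch of ferromagnetic-resonance spectrum extraction: after the excitation,
# the averaged magnetization "rings down"; the power spectrum of that signal
# reveals the mode frequencies. The two-mode signal here is synthetic.
import numpy as np

dt, n = 5e-12, 4096                  # 5 ps sampling, ~20 ns window
t = np.arange(n) * dt
f1, f2 = 8.25e9, 11.25e9             # hypothetical mode frequencies (Hz)
m = (1.0 * np.exp(-t / 3e-9) * np.cos(2 * np.pi * f1 * t)
     + 0.4 * np.exp(-t / 3e-9) * np.cos(2 * np.pi * f2 * t))   # ring-down signal

spectrum = np.abs(np.fft.rfft(m * np.hanning(n))) ** 2          # windowed power spectrum
freqs = np.fft.rfftfreq(n, dt)
peak = freqs[np.argmax(spectrum)]
print(f"dominant resonance: {peak / 1e9:.2f} GHz")
```

    The frequency resolution is 1/(n·dt), roughly 50 MHz here, which is why the window length and simulation parameters reported in the abstract matter for resolving closely spaced modes.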

  8. Gas-grain simulation experiment module conceptual design and gas-grain simulation facility breadboard development

    NASA Technical Reports Server (NTRS)

    Zamel, James M.; Petach, Michael; Gat, Nahum; Kropp, Jack; Luong, Christina; Wolff, Michael

    1993-01-01

    This report delineates the Option portion of the Phase A Gas-Grain Simulation Facility study. The conceptual design of a Gas-Grain Simulation Experiment Module (GGSEM) for Space Shuttle Middeck is discussed. In addition, a laboratory breadboard was developed during this study to develop a key function for the GGSEM and the GGSF, specifically, a solid particle cloud generating device. The breadboard design and test results are discussed and recommendations for further studies are included. The GGSEM is intended to fly on board a low earth orbit (LEO), manned platform. It will be used to perform a subset of the experiments planned for the GGSF for Space Station Freedom, as it can partially accommodate a number of the science experiments. The outcome of the experiments performed will provide an increased understanding of the operational requirements for the GGSF. The GGSEM will also act as a platform to accomplish technology development and proof-of-principle experiments for GGSF hardware, and to verify concepts and designs of hardware for GGSF. The GGSEM will allow assembled subsystems to be tested to verify facility level operation. The technology development that can be accommodated by the GGSEM includes: GGSF sample generation techniques, GGSF on-line diagnostics techniques, sample collection techniques, performance of various types of sensors for environmental monitoring, and some off-line diagnostics. Advantages and disadvantages of several LEO platforms available for GGSEM applications are identified and discussed. Several of the anticipated GGSF experiments require the deagglomeration and dispensing of dry solid particles into an experiment chamber. During the GGSF Phase A study, various techniques and devices available for the solid particle aerosol generator were reviewed. As a result of this review, solid particle deagglomeration and dispensing were identified as key undeveloped technologies in the GGSF design. 
A laboratory breadboard version of a solid particle generation system was developed and characterization tests performed. The breadboard hardware emulates the functions of the GGSF solid particle cloud generator in a ground laboratory environment, but with some modifications, can be used on other platforms.

  9. Gas-grain simulation experiment module conceptual design and gas-grain simulation facility breadboard development

    NASA Astrophysics Data System (ADS)

    Zamel, James M.; Petach, Michael; Gat, Nahum; Kropp, Jack; Luong, Christina; Wolff, Michael

    1993-12-01

    This report delineates the Option portion of the Phase A Gas-Grain Simulation Facility study. The conceptual design of a Gas-Grain Simulation Experiment Module (GGSEM) for Space Shuttle Middeck is discussed. In addition, a laboratory breadboard was developed during this study to develop a key function for the GGSEM and the GGSF, specifically, a solid particle cloud generating device. The breadboard design and test results are discussed and recommendations for further studies are included. The GGSEM is intended to fly on board a low earth orbit (LEO), manned platform. It will be used to perform a subset of the experiments planned for the GGSF for Space Station Freedom, as it can partially accommodate a number of the science experiments. The outcome of the experiments performed will provide an increased understanding of the operational requirements for the GGSF. The GGSEM will also act as a platform to accomplish technology development and proof-of-principle experiments for GGSF hardware, and to verify concepts and designs of hardware for GGSF. The GGSEM will allow assembled subsystems to be tested to verify facility level operation. The technology development that can be accommodated by the GGSEM includes: GGSF sample generation techniques, GGSF on-line diagnostics techniques, sample collection techniques, performance of various types of sensors for environmental monitoring, and some off-line diagnostics. Advantages and disadvantages of several LEO platforms available for GGSEM applications are identified and discussed. Several of the anticipated GGSF experiments require the de-agglomeration and dispensing of dry solid particles into an experiment chamber. During the GGSF Phase A study, various techniques and devices available for the solid particle aerosol generator were reviewed. As a result of this review, solid particle de-agglomeration and dispensing were identified as key undeveloped technologies in the GGSF design. 
A laboratory breadboard version of a solid particle generation system was developed and characterization tests performed. The breadboard hardware emulates the functions of the GGSF solid particle cloud generator in a ground laboratory environment, but with some modifications, can be used on other platforms.

  10. Assessment of metal artifact reduction methods in pelvic CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdoli, Mehrsima; Mehranian, Abolfazl; Ailianou, Angeliki

    2016-04-15

    Purpose: Metal artifact reduction (MAR) produces images with improved quality, potentially leading to confident and reliable clinical diagnosis and therapy planning. In this work, the authors evaluate the performance of five MAR techniques for the assessment of computed tomography images of patients with hip prostheses. Methods: Five MAR algorithms were evaluated using simulation and clinical studies. The algorithms included one-dimensional linear interpolation (LI) of the corrupted projection bins in the sinogram, two-dimensional interpolation (2D), a normalized metal artifact reduction (NMAR) technique, a metal deletion technique, and a maximum a posteriori completion (MAPC) approach. The algorithms were applied to ten simulated datasets as well as 30 clinical studies of patients with metallic hip implants. Qualitative evaluations were performed by two blinded experienced radiologists who ranked overall artifact severity and pelvic organ recognition for each algorithm by assigning scores from zero to five (zero indicating totally obscured organs with no structures identifiable and five indicating recognition with high confidence). Results: Simulation studies revealed that the 2D, NMAR, and MAPC techniques performed almost equally well in all regions. LI falls behind the other approaches in terms of reducing dark streaking artifacts as well as preserving unaffected regions (p < 0.05). Visual assessment of clinical datasets revealed the superiority of NMAR and MAPC in the evaluated pelvic organs and in terms of overall image quality. Conclusions: Overall, all methods except LI performed equally well in artifact-free regions. Considering both clinical and simulation studies, 2D, NMAR, and MAPC seem to outperform the other techniques.
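    The simplest of the five methods, one-dimensional linear interpolation (LI), can be sketched directly: bins on the metal trace in each projection are replaced by linear interpolation from uncorrupted neighbors. The sinogram below is synthetic, and the metal trace is simplified to a fixed band:

```python
# Sketch of sinogram-domain linear interpolation (LI) for metal artifact
# reduction: corrupted bins on the metal trace are filled in per projection
# from their clean neighbors. Synthetic data; real traces vary per view.
import numpy as np

n_views, n_bins = 180, 128
bins = np.arange(n_bins)
# Smooth synthetic "object" sinogram, identical in every view for simplicity.
sino = np.sin(np.pi * bins / n_bins)[None, :] * np.ones((n_views, 1))
metal = np.zeros((n_views, n_bins), bool)
metal[:, 60:68] = True             # simplified fixed-band metal trace
corrupted = sino.copy()
corrupted[metal] = 50.0            # metal-saturated projection values

repaired = corrupted.copy()
for v in range(n_views):
    bad = metal[v]
    # Linear interpolation across the corrupted bins of this projection row.
    repaired[v, bad] = np.interp(bins[bad], bins[~bad], corrupted[v, ~bad])

err = np.abs(repaired - sino).max()
print(f"max residual after LI: {err:.4f}")
```

    LI discards all information under the trace, which is why the study finds it inferior to methods like NMAR and MAPC that exploit prior or neighboring-row structure.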

  11. A novel process for introducing a new intraoperative program: a multidisciplinary paradigm for mitigating hazards and improving patient safety.

    PubMed

    Rodriguez-Paz, Jose M; Mark, Lynette J; Herzer, Kurt R; Michelson, James D; Grogan, Kelly L; Herman, Joseph; Hunt, David; Wardlow, Linda; Armour, Elwood P; Pronovost, Peter J

    2009-01-01

    Since the Institute of Medicine's report, To Err is Human, was published, numerous interventions have been designed and implemented to correct the defects that lead to medical errors and adverse events; however, most efforts have been largely reactive. Safety, communication, team performance, and efficiency are areas of care that attract a great deal of attention, especially regarding the introduction of new technologies, techniques, and procedures. We describe a multidisciplinary process that was implemented at our hospital to identify and mitigate hazards before the introduction of a new technique: high-dose-rate intraoperative radiation therapy (HDR-IORT). A multidisciplinary team of surgeons, anesthesiologists, radiation oncologists, physicists, nurses, hospital risk managers, and equipment specialists used a structured process that included in situ clinical simulation to uncover concerns among care providers and to prospectively identify and mitigate defects for patients who would undergo surgery using the HDR-IORT technique. We identified and corrected 20 defects in the simulated patient care process before application to actual patients. Subsequently, eight patients underwent surgery using the HDR-IORT technique with no recurrence of simulation-identified or unanticipated defects. Multiple benefits were derived from the use of this systematic process to introduce the HDR-IORT technique; namely, the safety and efficiency of care for this select patient population was optimized, and harmful or adverse events were mitigated before the inclusion of actual patients. Further work is needed, but the process outlined in this paper can be universally applied to the introduction of any new technologies, treatments, or procedures.

  12. Radiation dose reduction using a neck detection algorithm for single spiral brain and cervical spine CT acquisition in the trauma setting.

    PubMed

    Ardley, Nicholas D; Lau, Ken K; Buchan, Kevin

    2013-12-01

    Cervical spine injuries occur in 4-8% of adults with head trauma. A dual-acquisition technique has traditionally been used for CT scanning of the brain and cervical spine. The purpose of this study was to determine the efficacy of radiation dose reduction using a single-acquisition technique that incorporated both anatomical regions with a dedicated neck-detection algorithm. Thirty trauma patients referred for brain and cervical spine CT were included and were scanned with the single-acquisition technique. The radiation doses from the single CT acquisition with the neck-detection algorithm, which allowed appropriate independent dose administration for the brain and cervical spine regions, were recorded. Comparison was made both to the doses calculated from a simulation of the traditional dual acquisitions with matching parameters, and to the doses of the retrospective dual-acquisition legacy technique with the same sample size. The mean simulated dose for the traditional dual-acquisition technique was 3.99 mSv, comparable to the average dose of 4.2 mSv from 30 previous patients who had CT of the brain and cervical spine as dual acquisitions. The mean dose from the single-acquisition technique was 3.35 mSv, a 16% overall dose reduction. The images from the single-acquisition technique were of excellent diagnostic quality. The new single-acquisition CT technique incorporating the neck-detection algorithm for the brain and cervical spine significantly reduces the overall radiation dose by eliminating the unavoidable overlap between the two anatomical regions that occurs with the traditional dual-acquisition technique.

  13. Summary of CPAS EDU Testing Analysis Results

    NASA Technical Reports Server (NTRS)

    Romero, Leah M.; Bledsoe, Kristin J.; Davidson, John.; Engert, Meagan E.; Fraire, Usbaldo, Jr.; Galaviz, Fernando S.; Galvin, Patrick J.; Ray, Eric S.; Varela, Jose

    2015-01-01

    The Orion program's Capsule Parachute Assembly System (CPAS) project is currently conducting its third generation of testing, the Engineering Development Unit (EDU) series. This series utilizes two test articles, a dart-shaped Parachute Compartment Drop Test Vehicle (PCDTV) and a capsule-shaped Parachute Test Vehicle (PTV), both of which include a full-size, flight-like parachute system and require a pallet delivery system for aircraft extraction. To date, 15 tests have been completed, including six with PCDTVs and nine with PTVs. Two of the PTV tests included the Forward Bay Cover (FBC) provided by Lockheed Martin. Advancements in modeling techniques applicable to parachute fly-out, vehicle rate of descent, torque, and load train also occurred during the EDU testing series. An upgrade from a composite to an independent parachute simulation allowed parachute modeling at a higher level of fidelity than during previous generations. The complexity of separating the test vehicles from their pallet delivery systems necessitated the use of the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulator for modeling mated-vehicle aircraft extraction and separation. This paper gives an overview of each EDU test and summarizes the development of CPAS analysis tools and techniques during EDU testing.

  14. Copper Doping Improves Hydroxyapatite Sorption for Arsenate in Simulated Groundwaters

    DTIC Science & Technology

    2010-02-15

    Sciences, Notre Dame, Indiana 46556; Department of Environmental and Civil Engineering, Dallas, Texas 75205; and U.S. Army Engineer Research and...widely used to immobilize a wide range of heavy metals in water and soils, including lead, cadmium, zinc, uranium, copper, and nickel (6-9). The...the copper doping technique also has the potential to promote the sorption of heavy metals including cadmium, zinc, lead, and uranium, whose

  15. Multislice spiral CT simulator for dynamic cardiopulmonary studies

    NASA Astrophysics Data System (ADS)

    De Francesco, Silvia; Ferreira da Silva, Augusto M.

    2002-04-01

    We've developed a multi-slice spiral CT simulator modeling the acquisition process of a real tomograph over a 4-dimensional phantom (4D MCAT) of the human thorax. The simulator allows us to visually characterize artifacts due to insufficient temporal sampling and to evaluate a priori the quality of the images obtained in cardiopulmonary studies (with both single-/multi-slice and ECG-gated acquisition processes). The simulating environment allows for both conventional and spiral scanning modes and includes a model of noise in the acquisition process. In the case of spiral scanning, reconstruction facilities include longitudinal interpolation methods (360LI and 180LI, for both single and multi-slice). The reconstruction of the section is then performed through filtered back-projection (FBP). The reconstructed images/volumes are affected by distortion due to insufficient temporal sampling of the moving object. The developed simulating environment allows us to investigate the nature of this distortion, characterizing it qualitatively and quantitatively (using, for example, Herman's measures). Much of our work is focused on the determination of adequate temporal sampling and sinogram regularization techniques. At the moment, the simulator is limited to multi-slice tomographs; extension to cone-beam or area detectors is planned as the next development step.

  16. Simulation of Neural Firing Dynamics: A Student Project.

    ERIC Educational Resources Information Center

    Kletsky, E. J.

    This paper describes a student project in digital simulation techniques that is part of a graduate systems analysis course entitled Biosimulation. The students chose different simulation techniques to solve a problem related to the neuron model. (MLH)

  17. Prototype Common Bus Spacecraft: Hover Test Implementation and Results. Revision, Feb. 26, 2009

    NASA Technical Reports Server (NTRS)

    Hine, Butler Preston; Turner, Mark; Marshall, William S.

    2009-01-01

    In order to develop the capability to evaluate control system technologies, NASA Ames Research Center (Ames) began a test program to build a Hover Test Vehicle (HTV) - a ground-based simulated flight vehicle. The HTV would integrate simulated propulsion, avionics, and sensors into a simulated flight structure, and fly that test vehicle in terrestrial conditions intended to simulate a flight environment, in particular for attitude control. The ultimate purpose of the effort at Ames is to determine whether the low-cost hardware and flight software techniques are viable for future low cost missions. To enable these engineering goals, the project sought to develop a team, processes and procedures capable of developing, building and operating a fully functioning vehicle including propulsion, GN&C, structure, power and diagnostic sub-systems, through the development of the simulated vehicle.

  18. CHARMM-GUI PDB manipulator for advanced modeling and simulations of proteins containing nonstandard residues.

    PubMed

    Jo, Sunhwan; Cheng, Xi; Islam, Shahidul M; Huang, Lei; Rui, Huan; Zhu, Allen; Lee, Hui Sun; Qi, Yifei; Han, Wei; Vanommeslaeghe, Kenno; MacKerell, Alexander D; Roux, Benoît; Im, Wonpil

    2014-01-01

    CHARMM-GUI, http://www.charmm-gui.org, is a web-based graphical user interface to prepare molecular simulation systems and input files to facilitate the usage of common and advanced simulation techniques. Since it was originally developed in 2006, CHARMM-GUI has been widely adopted for various purposes and now contains a number of different modules designed to set up a broad range of simulations, including free energy calculations and large-scale coarse-grained representations. Here, we describe functionalities that have recently been integrated into CHARMM-GUI PDB Manipulator, such as ligand force field generation, incorporation of methanethiosulfonate spin labels and chemical modifiers, and substitution of amino acids with unnatural amino acids. These new features are expected to be useful in advanced biomolecular modeling and simulation of proteins. © 2014 Elsevier Inc. All rights reserved.

  19. Visualization of AMR data with multi-level dual-mesh interpolation.

    PubMed

    Moran, Patrick J; Ellsworth, David

    2011-12-01

    We present a new technique for providing interpolation within cell-centered Adaptive Mesh Refinement (AMR) data that achieves C⁰ continuity throughout the 3D domain. Our technique improves on earlier work in that it does not require that adjacent patches differ by at most one refinement level. Our approach takes the dual of each mesh patch and generates "stitching cells" on the fly to fill the gaps between dual meshes. We demonstrate applications of our technique with data from Enzo, an AMR cosmological structure formation simulation code. We show ray-cast visualizations that include contributions from particle data (dark matter and stars, also output by Enzo) and gridded hydrodynamic data. We also show results from isosurface studies, including surfaces in regions where adjacent patches differ by more than one refinement level. © 2011 IEEE
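    The dual-of-a-patch idea can be illustrated in 2D: taking the dual treats cell centers as vertices of a new mesh, so standard bilinear interpolation on the dual is continuous inside the patch interior. A minimal sketch for a single uniform patch (names and layout are assumptions for illustration, not the Enzo data model):

```python
import numpy as np

def dual_interp(patch, origin, dx, x, y):
    """Bilinear interpolation on the dual of a cell-centered 2D patch.

    patch[i, j] holds the value at the cell center
    (origin[0] + (i + 0.5) * dx, origin[1] + (j + 0.5) * dx).
    """
    # fractional index of (x, y) in dual (cell-center) coordinates
    fi = (x - origin[0]) / dx - 0.5
    fj = (y - origin[1]) / dx - 0.5
    i, j = int(np.floor(fi)), int(np.floor(fj))
    ti, tj = fi - i, fj - j
    return ((1 - ti) * (1 - tj) * patch[i, j]
            + ti * (1 - tj) * patch[i + 1, j]
            + (1 - ti) * tj * patch[i, j + 1]
            + ti * tj * patch[i + 1, j + 1])
```

The hard part of the paper's technique, which this sketch omits, is the "stitching cells" generated on the fly to connect the duals of patches at different refinement levels.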

  20. Using cognitive architectures to study issues in team cognition in a complex task environment

    NASA Astrophysics Data System (ADS)

    Smart, Paul R.; Sycara, Katia; Tang, Yuqing

    2014-05-01

    Cognitive social simulation is a computer simulation technique that aims to improve our understanding of the dynamics of socially-situated and socially-distributed cognition. This makes cognitive social simulation techniques particularly appealing as a means to undertake experiments into team cognition. The current paper reports on the results of an ongoing effort to develop a cognitive social simulation capability that can be used to undertake studies into team cognition using the ACT-R cognitive architecture. This capability is intended to support simulation experiments using a team-based problem solving task, which has been used to explore the effect of different organizational environments on collective problem solving performance. The functionality of the ACT-R-based cognitive social simulation capability is presented and a number of areas of future development work are outlined. The paper also describes the motivation for adopting cognitive architectures in the context of social simulation experiments and presents a number of research areas where cognitive social simulation may be useful in developing a better understanding of the dynamics of team cognition. These include the use of cognitive social simulation to study the role of cognitive processes in determining aspects of communicative behavior, as well as the impact of communicative behavior on the shaping of task-relevant cognitive processes (e.g., the social shaping of individual and collective memory as a result of communicative exchanges). We suggest that the ability to perform cognitive social simulation experiments in these areas will help to elucidate some of the complex interactions that exist between cognitive, social, technological and informational factors in the context of team-based problem-solving activities.

  1. Chaos in plasma simulation and experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watts, C.; Newman, D.E.; Sprott, J.C.

    1993-09-01

    We investigate the possibility that chaos and simple determinism are governing the dynamics of reversed field pinch (RFP) plasmas using data from both numerical simulations and experiment. A large repertoire of nonlinear analysis techniques is used to identify low-dimensional chaos. These tools include phase portraits and Poincaré sections, correlation dimension, the spectrum of Lyapunov exponents, and short-term predictability. In addition, nonlinear noise reduction techniques are applied to the experimental data in an attempt to extract any underlying deterministic dynamics. Two model systems are used to simulate the plasma dynamics. These are the DEBS code, which models global RFP dynamics, and the dissipative trapped electron mode (DTEM) model, which models drift wave turbulence. Data from both simulations show strong indications of low-dimensional chaos and simple determinism. Experimental data were obtained from the Madison Symmetric Torus RFP and consist of a wide array of both global and local diagnostic signals. None of the signals shows any indication of low-dimensional chaos or other simple determinism. Moreover, most of the analysis tools indicate the experimental system is very high dimensional with properties similar to noise. Nonlinear noise reduction is unsuccessful at extracting an underlying deterministic system.
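    One of the tools mentioned, the correlation dimension, is commonly estimated from the Grassberger-Procaccia correlation sum: the slope of log C(r) versus log r. A minimal sketch, using the chaotic logistic map as a synthetic stand-in for a plasma signal (the parameters and scaling range are illustrative):

```python
import numpy as np

def correlation_sum(x, r, dim=2, lag=1):
    """C(r): fraction of delay-embedded point pairs closer than r."""
    n = len(x) - (dim - 1) * lag
    emb = np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    iu = np.triu_indices(n, k=1)
    return (d[iu] < r).mean()

# chaotic logistic map as a stand-in for a measured time series
x = np.empty(1000)
x[0] = 0.4
for i in range(999):
    x[i + 1] = 4.0 * x[i] * (1.0 - x[i])

# slope of log C(r) vs log r over a scaling range estimates the dimension
r1, r2 = 0.05, 0.2
dim_est = np.log(correlation_sum(x, r2) / correlation_sum(x, r1)) / np.log(r2 / r1)
```

A low, nearly embedding-independent slope signals low-dimensional determinism; for noise the estimate keeps growing with the embedding dimension, which is the behavior the experimental RFP signals showed.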

  2. High-performance computational fluid dynamics: a custom-code approach

    NASA Astrophysics Data System (ADS)

    Fannon, James; Loiseau, Jean-Christophe; Valluri, Prashant; Bethune, Iain; Náraigh, Lennon Ó.

    2016-07-01

    We introduce a modified and simplified version of the pre-existing fully parallelized three-dimensional Navier-Stokes flow solver known as TPLS. We demonstrate how the simplified version can be used as a pedagogical tool for the study of computational fluid dynamics (CFD) and parallel computing. TPLS is at its heart a two-phase flow solver, and uses calls to a range of external libraries to accelerate its performance. However, in the present context we narrow the focus of the study to basic hydrodynamics and parallel computing techniques, and the code is therefore simplified and modified to simulate pressure-driven single-phase flow in a channel, using only relatively simple Fortran 90 code with MPI parallelization, but no calls to any other external libraries. The modified code is analysed in order to both validate its accuracy and investigate its scalability up to 1000 CPU cores. Simulations are performed for several benchmark cases in pressure-driven channel flow, including a turbulent simulation, wherein the turbulence is incorporated via the large-eddy simulation technique. The work may be of use to advanced undergraduate and graduate students as an introductory study in CFD, while also providing insight for those interested in more general aspects of high-performance computing.
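    For steady laminar conditions, the pressure-driven single-phase channel case reduces to a 1D Poisson problem with an analytic parabolic solution, which makes a convenient validation benchmark. A small numpy sketch of that check (not the TPLS Fortran/MPI code itself; grid size and parameters are illustrative):

```python
import numpy as np

# steady laminar flow between parallel plates: mu * u''(y) = dp/dx
n, h, mu, dpdx = 101, 1.0, 1.0, -2.0
y = np.linspace(0.0, h, n)
dy = y[1] - y[0]

A = np.zeros((n, n))
b = np.full(n, dpdx / mu)
for i in range(1, n - 1):                      # interior: central second difference
    A[i, i - 1] = A[i, i + 1] = 1.0 / dy**2
    A[i, i] = -2.0 / dy**2
A[0, 0] = A[-1, -1] = 1.0                      # no-slip walls
b[0] = b[-1] = 0.0

u = np.linalg.solve(A, b)
u_exact = dpdx / (2.0 * mu) * (y**2 - h * y)   # analytic Poiseuille profile
```

Because the central difference is exact for a quadratic, the discrete solution matches the Poiseuille profile to solver roundoff, a useful sanity check before moving to the turbulent LES cases.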

  3. Mathematical and Numerical Techniques in Energy and Environmental Modeling

    NASA Astrophysics Data System (ADS)

    Chen, Z.; Ewing, R. E.

    Mathematical models have been widely used to predict, understand, and optimize many complex physical processes, ranging from semiconductor and pharmaceutical design to large-scale applications such as global weather modeling and astrophysics. In particular, simulation of the environmental effects of air pollution is extensive. Here we address the need for using similar models to understand the fate and transport of groundwater contaminants and to design in situ remediation strategies. Three basic problem areas need to be addressed in the modeling and simulation of the flow of groundwater contamination. First, one obtains an effective model to describe the complex fluid/fluid and fluid/rock interactions that control the transport of contaminants in groundwater. This includes the problem of obtaining accurate reservoir descriptions at various length scales and modeling the effects of this heterogeneity in the reservoir simulators. Next, one develops accurate discretization techniques that retain the important physical properties of the continuous models. Finally, one develops efficient numerical solution algorithms that utilize the potential of the emerging computing architectures. We will discuss recent advances and describe the contribution of each of the papers in this book in these three areas. Keywords: reservoir simulation, mathematical models, partial differential equations, numerical algorithms
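    A discretization that "retains the important physical properties of the continuous model" can be as simple as a conservative finite-volume update for 1D contaminant transport, which conserves mass to machine precision by construction. A toy sketch (periodic domain and all parameter values are illustrative, not from any model in the book):

```python
import numpy as np

# explicit finite-volume step for 1D advection-diffusion: c_t + v c_x = D c_xx
n, L, v, D = 200, 1.0, 0.5, 1e-3
dx = L / n
dt = 0.4 * min(dx / v, dx * dx / (2 * D))      # respect both stability limits

c = np.zeros(n)
c[:20] = 1.0                                   # initial contaminant slug

def step(c):
    adv = -v * (c - np.roll(c, 1)) / dx        # first-order upwind (v > 0)
    dif = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
    return c + dt * (adv + dif)

mass0 = c.sum() * dx
for _ in range(300):
    c = step(c)
```

The flux-difference form guarantees discrete mass conservation, and under the combined CFL limit the scheme is monotone, so concentrations stay within their physical bounds.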

  4. Simulation and statistics: Like rhythm and song

    NASA Astrophysics Data System (ADS)

    Othman, Abdul Rahman

    2013-04-01

    Simulation has been introduced to solve problems in the form of systems. By using this technique, the following two problems can be overcome. First, a problem that has an analytical solution, but for which the cost of running a physical experiment is high in terms of money and lives. Second, a problem that exists but has no analytical solution. In the field of statistical inference the second problem is often encountered. With the advent of high-speed computing devices, a statistician can now use resampling techniques such as the bootstrap and permutations to form pseudo sampling distributions that lead to the solution of a problem that cannot be solved analytically. This paper discusses how Monte Carlo simulation was, and still is, used to verify analytical solutions in inference. This paper also discusses resampling techniques as simulation techniques. Common misunderstandings about these two techniques are examined, and successful usages of both are explained.
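    A pseudo sampling distribution of the kind described is straightforward to build with the bootstrap. A minimal sketch of a 95% percentile interval for a mean (the data here are synthetic; for skewed data like this, the analytic small-sample distribution of the mean is awkward):

```python
import numpy as np

rng = np.random.default_rng(0)
sample = rng.exponential(scale=2.0, size=200)   # observed (skewed) data

# resample with replacement to form a pseudo sampling distribution of the mean
boot_means = rng.choice(sample, size=(5000, sample.size), replace=True).mean(axis=1)
lo, hi = np.percentile(boot_means, [2.5, 97.5])
```

A permutation test follows the same pattern, except labels are shuffled without replacement to build the null distribution of a test statistic.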

  5. Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.

    2015-01-01

    The rise of innovative unmanned aeronautical systems and the emergence of commercial space activities have resulted in a number of relatively new aerospace organizations that are designing innovative systems and solutions. These organizations use a variety of commercial off-the-shelf and in-house-developed simulation and analysis tools including 6-degree-of-freedom (6-DOF) flight simulation tools. The increased affordability of computing capability has made high-fidelity flight simulation practical for all participants. Verification of the tools' equations-of-motion and environment models (e.g., atmosphere, gravitation, and geodesy) is desirable to assure accuracy of results. However, aside from simple textbook examples, minimal verification data exists in open literature for 6-DOF flight simulation problems. This assessment compared multiple solution trajectories to a set of verification check-cases that covered atmospheric and exo-atmospheric (i.e., orbital) flight. Each scenario consisted of predefined flight vehicles, initial conditions, and maneuvers. These scenarios were implemented and executed in a variety of analytical and real-time simulation tools. This tool-set included simulation tools in a variety of programming languages based on modified flat-Earth, round-Earth, and rotating oblate spheroidal Earth geodesy and gravitation models, and independently derived equations-of-motion and propagation techniques. The resulting simulated parameter trajectories were compared by over-plotting and difference-plotting to yield a family of solutions. In total, seven simulation tools were exercised.
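    The flavor of such a check-case can be shown with a trivial flat-Earth, drag-free point mass, whose propagated trajectory has an exact analytic solution to compare against. This is an illustrative case only, not one of the report's actual check-cases:

```python
import numpy as np

def rk4_step(f, t, y, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(t, y)
    k2 = f(t + dt / 2, y + dt / 2 * k1)
    k3 = f(t + dt / 2, y + dt / 2 * k2)
    k4 = f(t + dt, y + dt * k3)
    return y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

G0 = 9.80665                       # standard gravity, m/s^2

def deriv(t, s):                   # s = [x, z, vx, vz]: flat Earth, no drag
    return np.array([s[2], s[3], 0.0, -G0])

s = np.array([0.0, 0.0, 50.0, 50.0])
t, dt = 0.0, 0.01
for _ in range(500):               # propagate 5 s
    s = rk4_step(deriv, t, s, dt)
    t += dt

x_exact = 50.0 * t                 # analytic reference trajectory
z_exact = 50.0 * t - 0.5 * G0 * t * t
```

Since the true trajectory is a polynomial of low degree, any reasonable integrator should reproduce it to roundoff; disagreement beyond that points at the equations of motion or environment models, which is exactly what the check-case comparison is designed to expose.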

  6. Advanced Navigation Strategies For Asteroid Sample Return Missions

    NASA Technical Reports Server (NTRS)

    Getzandanner, K.; Bauman, J.; Williams, B.; Carpenter, J.

    2010-01-01

    Flyby and rendezvous missions to asteroids have been accomplished using navigation techniques derived from experience gained in planetary exploration. This paper presents analysis of advanced navigation techniques required to meet unique challenges for precision navigation to acquire a sample from an asteroid and return it to Earth. These techniques rely on tracking data types such as spacecraft-based laser ranging and optical landmark tracking in addition to the traditional Earth-based Deep Space Network radio metric tracking. A systematic study of navigation strategy, including the navigation event timeline and reduction in spacecraft-asteroid relative errors, has been performed using simulation and covariance analysis on a representative mission.

  7. Open surgical simulation--a review.

    PubMed

    Davies, Jennifer; Khatib, Manaf; Bello, Fernando

    2013-01-01

    Surgical simulation has benefited from a surge in interest over the last decade as a result of the increasing need for a change in the traditional apprentice model of teaching surgery. However, despite the recent interest in surgical simulation as an adjunct to surgical training, most of the literature focuses on laparoscopic, endovascular, and endoscopic surgical simulation with very few studies scrutinizing open surgical simulation and its benefit to surgical trainees. The aim of this review is to summarize the current standard of available open surgical simulators and to review the literature on the benefits of open surgical simulation. Open surgical simulators currently used include live animals, cadavers, bench models, virtual reality, and software-based computer simulators. In the current literature, there are 18 different studies (including 6 randomized controlled trials and 12 cohort studies) investigating the efficacy of open surgical simulation using live animal, bench, and cadaveric models in many surgical specialties including general, cardiac, trauma, vascular, urologic, and gynecologic surgery. The current open surgical simulation studies show, in general, a significant benefit of open surgical simulation in developing the surgical skills of surgical trainees. However, these studies have their limitations including a low number of participants, variable assessment standards, and a focus on short-term results often with no follow-up assessment. The skills needed for open surgical procedures are the essential basis that a surgical trainee needs to grasp before attempting more technical procedures such as laparoscopic procedures. 
In this current climate of medical practice with reduced hours of surgical exposure for trainees and where the patient's safety and outcome is key, open surgical simulation is a promising adjunct to modern surgical training, filling the void between surgeons being trained in a technique and a surgeon achieving fluency in that open surgical procedure. Better quality research is needed into the benefits of open surgical simulation, and this would hopefully stimulate further development of simulators with more accurate and objective assessment tools. © 2013 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  8. Technique for simulating peak-flow hydrographs in Maryland

    USGS Publications Warehouse

    Dillow, Jonathan J.A.

    1998-01-01

    The efficient design and management of many bridges, culverts, embankments, and flood-protection structures may require the estimation of time-of-inundation and (or) storage of floodwater relating to such structures. These estimates can be made on the basis of information derived from the peak-flow hydrograph. Average peak-flow hydrographs corresponding to a peak discharge of specific recurrence interval can be simulated for drainage basins having drainage areas less than 500 square miles in Maryland, using a direct technique of known accuracy. The technique uses dimensionless hydrographs in conjunction with estimates of basin lagtime and instantaneous peak flow. Ordinary least-squares regression analysis was used to develop an equation for estimating basin lagtime in Maryland. Drainage area, main channel slope, forest cover, and impervious area were determined to be the significant explanatory variables necessary to estimate average basin lagtime at the 95-percent confidence interval. Qualitative variables included in the equation adequately correct for geographic bias across the State. The average standard error of prediction associated with the equation is approximated as plus or minus (+/-) 37.6 percent. Volume correction factors may be applied to the basin lagtime on the basis of a comparison between actual and estimated hydrograph volumes prior to hydrograph simulation. Three dimensionless hydrographs were developed and tested using data collected during 278 significant rainfall-runoff events at 81 stream-gaging stations distributed throughout Maryland and Delaware. The data represent a range of drainage area sizes and basin conditions. The technique was verified by applying it to the simulation of 20 peak-flow events and comparing actual and simulated hydrograph widths at 50 and 75 percent of the observed peak-flow levels. The events chosen are considered extreme in that the average recurrence interval of the selected peak flows is 130 years. 
The average standard errors of prediction were +/- 61 and +/- 56 percent at the 50 and 75 percent of peak-flow hydrograph widths, respectively.
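    The scaling step of the technique is simple: the dimensionless ordinates are stretched by the basin lagtime on the time axis and by the peak discharge on the flow axis. A sketch with an illustrative dimensionless hydrograph (the ordinates below are made up for demonstration, not the published Maryland curves):

```python
import numpy as np

# hypothetical dimensionless hydrograph: Q/Qp versus t/LT (LT = basin lagtime)
t_ratio = np.array([0.0, 0.25, 0.5, 0.75, 1.0, 1.5, 2.0, 3.0])
q_ratio = np.array([0.0, 0.12, 0.45, 0.85, 1.0, 0.55, 0.25, 0.05])

def simulate_hydrograph(lagtime_hr, peak_cfs):
    """Scale the dimensionless hydrograph by lagtime and peak discharge."""
    return t_ratio * lagtime_hr, q_ratio * peak_cfs
```

In the actual technique, the lagtime comes from the regression equation (drainage area, channel slope, forest cover, impervious area) and the peak discharge from a flood-frequency estimate; a volume correction factor may adjust the lagtime before scaling.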

  9. Nonlinear relaxation algorithms for circuit simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saleh, R.A.

    Circuit simulation is an important Computer-Aided Design (CAD) tool in the design of Integrated Circuits (IC). However, the standard techniques used in programs such as SPICE result in very long computer-run times when applied to large problems. In order to reduce the overall run time, a number of new approaches to circuit simulation were developed and are described. These methods are based on nonlinear relaxation techniques and exploit the relative inactivity of large circuits. Simple waveform-processing techniques are described to determine the maximum possible speed improvement that can be obtained by exploiting this property of large circuits. Three simulation algorithms are described, two of which are based on the Iterated Timing Analysis (ITA) method and a third based on the Waveform-Relaxation Newton (WRN) method. New programs that incorporate these techniques were developed and used to simulate a variety of industrial circuits. The results from these simulations are provided. The techniques are shown to be much faster than the standard approach. In addition, a number of parallel aspects of these algorithms are described, and a general space-time model of parallel-task scheduling is developed.
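    The nonlinear relaxation idea can be sketched on a two-node resistor-diode circuit: sweep the nodes, solving each node's scalar KCL equation by damped Newton while holding the other node voltages fixed, and repeat until the sweeps converge. The circuit, component values, and damping limit below are illustrative, not from the thesis:

```python
import math

R1 = R2 = 1e3                      # ohms: VS -> R1 -> node 1 -> R2 -> node 2 -> diode -> gnd
VS = 5.0                           # source voltage
IS, VT = 1e-12, 0.025              # diode saturation current and thermal voltage

def f2(v1, v2):                    # KCL residual at the diode node
    return (v2 - v1) / R2 + IS * (math.exp(v2 / VT) - 1.0)

def df2(v2):
    return 1.0 / R2 + IS / VT * math.exp(v2 / VT)

v1 = v2 = 0.0
for sweep in range(100):                       # nonlinear Gauss-Seidel sweeps
    # node 1's KCL equation is linear in v1, so solve it exactly
    v1 = (VS / R1 + v2 / R2) / (1.0 / R1 + 1.0 / R2)
    for _ in range(50):                        # damped scalar Newton at node 2
        step = f2(v1, v2) / df2(v2)
        step = max(-0.1, min(0.1, step))       # voltage limiting for the diode
        v2 -= step
        if abs(step) < 1e-12:
            break
```

The payoff exploited by ITA-style methods is latency: nodes whose voltages are not changing can be skipped entirely in a sweep, which is where the large speedups on big, mostly inactive circuits come from.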

  10. Control Performance, Aerodynamic Modeling, and Validation of Coupled Simulation Techniques for Guided Projectile Roll Dynamics

    DTIC Science & Technology

    2014-11-01

    Sahu, Jubaraj; Fresconi, Frank; Heavey, Karen R. (Weapons and Materials Research). Record excerpt: "…has been explored in depth in the literature. Of particular interest for this study are investigations into roll control. Isolating the…"

  11. Effect of Simulation Techniques and Lecture Method on Students' Academic Performance in Mafoni Day Secondary School Maiduguri, Borno State, Nigeria

    ERIC Educational Resources Information Center

    Bello, Sulaiman; Ibi, Mustapha Baba; Bukar, Ibrahim Bulama

    2016-01-01

    The study examined the effect of simulation technique and lecture method on students' academic performance in Mafoni Day Secondary School, Maiduguri. The study used both simulation technique and lecture methods of teaching at the basic level of education in the teaching/learning environment. The study aimed at determining the best predictor among…

  12. Analysis of simulated advanced spaceborne thermal emission and reflection (ASTER) radiometer data of the Iron Hill, Colorado, study area for mapping lithologies

    USGS Publications Warehouse

    Rowan, L.C.

    1998-01-01

    The advanced spaceborne thermal emission and reflection (ASTER) radiometer was designed to record reflected energy in nine channels with 15 or 30 m resolution, including stereoscopic images, and emitted energy in five channels with 90 m resolution from the NASA Earth Observing System AM1 platform. A simulated ASTER data set was produced for the Iron Hill, Colorado, study area by resampling calibrated, registered airborne visible/infrared imaging spectrometer (AVIRIS) data and thermal infrared multispectral scanner (TIMS) data to the appropriate spatial and spectral parameters. A digital elevation model was obtained to simulate ASTER-derived topographic data. The main lithologic units in the area are granitic rocks and felsite into which a carbonatite stock and associated alkalic igneous rocks were intruded; these rocks are locally covered by Jurassic sandstone, Tertiary rhyolitic tuff, and colluvial deposits. Several methods were evaluated for mapping the main lithologic units, including unsupervised classification and spectral curve-matching techniques. In the five thermal-infrared (TIR) channels, comparison of the results of linear spectral unmixing and unsupervised classification with published geologic maps showed that the main lithologic units were mapped, but large areas with moderate to dense tree cover were not mapped in the TIR data. Compared to TIMS data, simulated ASTER data permitted slightly less discrimination in the mafic alkalic rock series, and carbonatite was mapped neither in the TIMS nor in the simulated ASTER TIR data. 
In the nine visible and near-infrared channels, unsupervised classification did not yield useful results, but both the spectral linear unmixing and the matched filter techniques produced useful results, including mapping calcitic and dolomitic carbonatite exposures, travertine in hot spring deposits, kaolinite in argillized sandstone and tuff, and muscovite in sericitized granite and felsite, as well as commonly occurring illite/muscovite. However, the distinction made in AVIRIS data between calcite and dolomite was not consistently feasible in the simulated ASTER data. Comparison of the lithologic information produced by spectral analysis of the simulated ASTER data to a photogeologic interpretation of a simulated ASTER color image illustrates the high potential of spectral analysis of ASTER data for geologic interpretation. This paper is not subject to U.S. copyright. Published in 1998 by the American Geophysical Union.
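    Linear spectral unmixing models each pixel spectrum as a fractional combination of endmember spectra and inverts for the abundances by least squares. A toy sketch (the endmember matrix below is invented for illustration, not actual AVIRIS or ASTER spectra):

```python
import numpy as np

# hypothetical endmember spectra: rows are bands, columns are materials
E = np.array([[0.10, 0.60, 0.30],
              [0.20, 0.55, 0.45],
              [0.40, 0.30, 0.60],
              [0.55, 0.20, 0.50]])

true_frac = np.array([0.2, 0.5, 0.3])          # abundances summing to 1
pixel = E @ true_frac                           # noise-free mixed pixel

# unconstrained least-squares unmixing recovers the fractions
frac, *_ = np.linalg.lstsq(E, pixel, rcond=None)
```

Real unmixing usually adds non-negativity and sum-to-one constraints, and with noisy data the recovered fractions are only approximate; the matched-filter technique mentioned above instead scores each pixel against one target spectrum at a time.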

  13. High Performance Parallel Computational Nanotechnology

    NASA Technical Reports Server (NTRS)

    Saini, Subhash; Craw, James M. (Technical Monitor)

    1995-01-01

    At a recent press conference, NASA Administrator Dan Goldin encouraged NASA Ames Research Center to take a lead role in promoting research and development of advanced, high-performance computer technology, including nanotechnology. Manufacturers of leading-edge microprocessors currently perform large-scale simulations in the design and verification of semiconductor devices and microprocessors. Recently, the need for this intensive simulation and modeling analysis has greatly increased, due in part to the ever-increasing complexity of these devices, as well as the lessons of experiences such as the Pentium fiasco. Simulation, modeling, testing, and validation will be even more important for designing molecular computers because of the complex specification of millions of atoms, thousands of assembly steps, as well as the simulation and modeling needed to ensure reliable, robust and efficient fabrication of the molecular devices. The software for this capability does not exist today, but it can be extrapolated from the software currently used in molecular modeling for other applications: semi-empirical methods, ab initio methods, self-consistent field methods, Hartree-Fock methods, molecular mechanics; and simulation methods for diamondoid structures. Inasmuch as it seems clear that the application of such methods in nanotechnology will require powerful, highly parallel systems, this talk will discuss techniques and issues for performing these types of computations on parallel systems. 
We will describe system design issues (memory, I/O, mass storage, operating system requirements, special user interface issues, interconnects, bandwidths, and programming languages) involved in parallel methods for scalable classical, semiclassical, quantum, molecular mechanics, and continuum models; molecular nanotechnology computer-aided designs (NanoCAD) techniques; visualization using virtual reality techniques of structural models and assembly sequences; software required to control mini robotic manipulators for positional control; scalable numerical algorithms for reliability, verifications and testability. There appears no fundamental obstacle to simulating molecular compilers and molecular computers on high performance parallel computers, just as the Boeing 777 was simulated on a computer before manufacturing it.

  14. Mono and multi-objective optimization techniques applied to a large range of industrial test cases using Metamodel assisted Evolutionary Algorithms

    NASA Astrophysics Data System (ADS)

    Fourment, Lionel; Ducloux, Richard; Marie, Stéphane; Ejday, Mohsen; Monnereau, Dominique; Massé, Thomas; Montmitonnet, Pierre

    2010-06-01

    The use of numerical simulation of material processing allows a strategy of trial and error to improve virtual processes without incurring material costs or interrupting production, and therefore saves a great deal of money, but it requires user time to analyze the results, adjust the operating conditions, and restart the simulation. Automatic optimization is the perfect complement to simulation. An Evolutionary Algorithm coupled with metamodelling makes it possible to obtain industrially relevant results on a very large range of applications within a few tens of simulations and without any specific knowledge of automatic optimization techniques. Ten industrial partners have been selected to cover the different areas of the mechanical forging industry and provide different examples of the forming simulation tools; the project aims to demonstrate that such results are achievable across this whole range. The large computational time is handled by a metamodel approach. It allows interpolating the objective function on the entire parameter space by only knowing the exact function values at a reduced number of "master points". Two algorithms are used: an evolution strategy combined with a Kriging metamodel, and a genetic algorithm combined with a Meshless Finite Difference Method. The latter approach is extended to multi-objective optimization. The set of solutions, which corresponds to the best possible compromises between the different objectives, is then computed in the same way. The population-based approach allows using the parallel capabilities of the utilized computer with high efficiency. An optimization module, fully embedded within the Forge2009 IHM, makes it possible to cover all the defined examples, and the use of new multi-core hardware to compute several simulations at the same time reduces the needed time dramatically. 
The presented examples demonstrate the method versatility. They include billet shape optimization of a common rail, the cogging of a bar and a wire drawing problem.
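    The metamodel loop can be sketched in 1D: interpolate the objective through the current master points, locate the surrogate minimum, evaluate the true (expensive) function there, and enrich the metamodel. A toy sketch with a Gaussian RBF interpolant standing in for Kriging (the objective, shape parameter, and infill rule are all illustrative assumptions):

```python
import numpy as np

def rbf_fit(X, y, eps=1.0):
    """Interpolate the master points (X, y) with Gaussian RBFs."""
    K = np.exp(-eps * (X[:, None] - X[None, :])**2)
    return np.linalg.solve(K, y)

def rbf_eval(X, w, x, eps=1.0):
    return np.exp(-eps * (x[:, None] - X[None, :])**2) @ w

f = lambda x: (x - 1.3)**2 + 0.1 * np.sin(5 * x)   # stands in for one costly simulation
X = np.linspace(-2.0, 3.0, 6)                      # initial master points
y = f(X)
grid = np.linspace(-2.0, 3.0, 501)

for _ in range(8):                                 # infill loop
    w = rbf_fit(X, y)
    x_new = grid[np.argmin(rbf_eval(X, w, grid))]
    if np.min(np.abs(X - x_new)) < 1e-9:           # surrogate minimum already sampled
        break
    X = np.append(X, x_new)
    y = np.append(y, f(x_new))

best = X[np.argmin(y)]
```

The expensive function is called only a handful of times, which is the point of the approach; a production Kriging metamodel would also provide an uncertainty estimate to balance exploration against exploitation.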

  15. A review of mathematical modeling and simulation of controlled-release fertilizers.

    PubMed

    Irfan, Sayed Ameenuddin; Razali, Radzuan; KuShaari, KuZilati; Mansor, Nurlidia; Azeem, Babar; Ford Versypt, Ashlee N

    2018-02-10

    Nutrients released into soils from uncoated fertilizer granules are lost continuously due to volatilization, leaching, denitrification, and surface run-off. These issues have caused economic loss due to low nutrient absorption efficiency and environmental pollution due to hazardous emissions and water eutrophication. Controlled-release fertilizers (CRFs) can change the release kinetics of the fertilizer nutrients through an abatement strategy to offset these issues by providing the fertilizer content in synchrony with the metabolic needs of the plants. Parametric analysis of the release characteristics of CRFs is of paramount importance for the design and development of new CRFs. However, the experimental approaches are not only time-consuming, but also cumbersome and expensive. Scientists have introduced mathematical modeling techniques to predict the release of nutrients from CRFs, to elucidate a fundamental understanding of the dynamics of the release processes, and to design new CRFs in a shorter time and at relatively lower cost. This paper reviews and critically analyzes the latest developments in the mathematical modeling and simulation techniques that have been reported for the characteristics and mechanisms of nutrient release from CRFs. The scope of this review includes the modeling and simulation techniques used for coated, controlled-release fertilizers. Copyright © 2017 Elsevier B.V. All rights reserved.
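    A classic building block in such release models is Fickian diffusion out of a spherical granule, whose cumulative release fraction has a well-known series solution (Crank). The sketch below is that generic textbook model, not one of the CRF-specific models covered by the review; the parameter values in the usage note are arbitrary:

```python
import math

def release_fraction(t, D, r, terms=200):
    """Fraction of nutrient released at time t from a sphere of radius r
    with effective diffusivity D (Crank's series solution, truncated)."""
    s = sum(math.exp(-(n * math.pi / r)**2 * D * t) / n**2
            for n in range(1, terms + 1))
    return 1.0 - (6.0 / math.pi**2) * s
```

For example, `release_fraction(t, D=1e-7, r=0.1)` traces a curve that rises monotonically from near 0 toward 1; fitting such curves to measured release data is one of the parametric analyses the review describes.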

  16. JASMINE simulator

    NASA Astrophysics Data System (ADS)

    Yamada, Y.; Gouda, N.; Yano, T.; Sako, N.; Hatsutori, Y.; Tanaka, T.; Yamauchi, M.

    We explain simulation tools in the JASMINE project (the JASMINE simulator). The JASMINE project stands at the stage where its basic design will be determined in a few years. It is therefore very important to simulate the data stream generated by astrometric fields at JASMINE in order to support investigations of error budgets, sampling strategy, data compression, data analysis, scientific performance, etc. Of course, component simulations are needed, but total simulations which include all components from observation target to satellite system are also very important. We find that new software technologies, such as Object Oriented (OO) methodologies, are ideal tools for the simulation system of JASMINE (the JASMINE simulator). The simulation system should include all objects in JASMINE, such as observation techniques, models of instruments and bus design, orbit, data transfer, data analysis, etc., in order to resolve all issues which can be expected beforehand and to make it easy to cope with unexpected problems which might occur during the mission of JASMINE. The JASMINE simulator is therefore designed to handle events such as photons from astronomical objects, control signals for devices, and disturbances of the satellite attitude, by instruments such as mirrors and detectors, successively. The simulator is also applied to the technical demonstration "Nano-JASMINE". The accuracy of ordinary sensors is not sufficient for initial-phase attitude control, and the mission instruments may serve as a good sensor for this purpose. The problem of attitude control in the initial phase is a good example for this software, because the problem is closely related to both the mission instruments and the satellite bus systems.

  17. JASMINE Simulator

    NASA Astrophysics Data System (ADS)

    Yamada, Y.; Gouda, N.; Yano, T.; Kobayashi, Y.; Suganuma, M.; Tsujimoto, T.; Sako, N.; Hatsutori, Y.; Tanaka, T.

    2006-08-01

    We explain simulation tools in the JASMINE project (the JASMINE simulator). The JASMINE project stands at the stage where its basic design will be determined in a few years. It is therefore very important to simulate the data stream generated by astrometric fields at JASMINE in order to support investigations of error budgets, sampling strategy, data compression, data analysis, scientific performance, etc. Of course, component simulations are needed, but total simulations which include all components from observation target to satellite system are also very important. We find that new software technologies, such as Object Oriented (OO) methodologies, are ideal tools for the simulation system of JASMINE (the JASMINE simulator). The simulation system should include all objects in JASMINE, such as observation techniques, models of instruments and bus design, orbit, data transfer, data analysis, etc., in order to resolve all issues which can be expected beforehand and to make it easy to cope with unexpected problems which might occur during the mission of JASMINE. The JASMINE simulator is therefore designed to handle events such as photons from astronomical objects, control signals for devices, and disturbances of the satellite attitude, by instruments such as mirrors and detectors, successively. The simulator is also applied to the technical demonstration "Nano-JASMINE". The accuracy of ordinary sensors is not sufficient for initial-phase attitude control, and the mission instruments may serve as a good sensor for this purpose. The problem of attitude control in the initial phase is a good example for this software, because the problem is closely related to both the mission instruments and the satellite bus systems.

  18. Machine Learning Predictions of a Multiresolution Climate Model Ensemble

    NASA Astrophysics Data System (ADS)

    Anderson, Gemma J.; Lucas, Donald D.

    2018-05-01

    Statistical models of high-resolution climate models are useful for many purposes, including sensitivity and uncertainty analyses, but building them can be computationally prohibitive. We generated a unique multiresolution perturbed parameter ensemble of a global climate model. We use a novel application of a machine learning technique known as random forests to train a statistical model on the ensemble to make high-resolution model predictions of two important quantities: global mean top-of-atmosphere energy flux and precipitation. The random forests leverage cheaper low-resolution simulations, greatly reducing the number of high-resolution simulations required to train the statistical model. We demonstrate that high-resolution predictions of these quantities can be obtained by training on an ensemble that includes only a small number of high-resolution simulations. We also find that global annually averaged precipitation is more sensitive to resolution changes than to any of the model parameters considered.

  19. Computational Analysis and Simulation of Empathic Behaviors: A Survey of Empathy Modeling with Behavioral Signal Processing Framework

    PubMed Central

    Xiao, Bo; Imel, Zac E.; Georgiou, Panayiotis; Atkins, David C.; Narayanan, Shrikanth S.

    2017-01-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation, and offer a series of open problems for future research. PMID:27017830

  20. An Introduction to System-Level, Steady-State and Transient Modeling and Optimization of High-Power-Density Thermoelectric Generator Devices Made of Segmented Thermoelectric Elements

    NASA Astrophysics Data System (ADS)

    Crane, D. T.

    2011-05-01

    High-power-density, segmented, thermoelectric (TE) elements have been intimately integrated into heat exchangers, eliminating many of the loss mechanisms of conventional TE assemblies, including the ceramic electrical isolation layer. Numerical models comprising simultaneously solved, nonlinear, energy balance equations have been created to simulate these novel architectures. Both steady-state and transient models have been created in a MATLAB/Simulink environment. The models predict data from experiments in various configurations and applications over a broad range of temperature, flow, and current conditions for power produced, efficiency, and a variety of other important outputs. Using the validated models, devices and systems are optimized using advanced multiparameter optimization techniques. Devices optimized for particular steady-state operating conditions can then be dynamically simulated in a transient operating model. The transient model can simulate a variety of operating conditions including automotive and truck drive cycles.

  1. ASCAT soil moisture data assimilation through the Ensemble Kalman Filter for improving streamflow simulation in Mediterranean catchments

    NASA Astrophysics Data System (ADS)

    Loizu, Javier; Massari, Christian; Álvarez-Mozos, Jesús; Casalí, Javier; Goñi, Mikel

    2016-04-01

    Assimilation of surface soil moisture (SSM) observations obtained by remote sensing has been shown to improve streamflow prediction at different time scales of hydrological modeling. Different sensors and methods have been tested for SSM estimation, especially in the microwave region of the electromagnetic spectrum. Passive microwave sensors include the Advanced Microwave Scanning Radiometer - Earth Observing System (AMSR-E) onboard the Aqua satellite and the Soil Moisture and Ocean Salinity (SMOS) mission; active microwave systems include the scatterometers (SCAT) onboard the European Remote Sensing satellites (ERS-1/2) and the Advanced Scatterometer (ASCAT) onboard the MetOp-A satellite. Data assimilation (DA) encompasses a range of techniques that have been applied in hydrology and other fields for decades, including Kalman filtering (KF), variational assimilation, and particle filtering. From the initial KF method, different variants were developed to suit different systems. The Ensemble Kalman Filter (EnKF), extensively applied to improve hydrological modeling, has as its main advantage the ability to handle nonlinear model dynamics without linearizing the model equations. The objective of this study was to investigate whether assimilation of ASCAT SSM observations through the EnKF could improve streamflow simulation in Mediterranean catchments with the complex hydrological model TOPLATS. The DA technique was programmed in FORTRAN and applied to hourly simulations of the TOPLATS catchment model. TOPLATS (TOPMODEL-based Land-Atmosphere Transfer Scheme) was applied in its lumped version to two Mediterranean catchments of similar size, located in northern Spain (Arga, 741 km2) and central Italy (Nestore, 720 km2). The model computes the energy and water balances separately; in those balances, the soil is divided into two layers, the upper surface zone (SZ) and the deeper transmission zone (TZ). In this study, the SZ depth was fixed at 5 cm for adequate assimilation of the observed data. The available data were distributed as follows: the model was first calibrated for the 2001-2007 period; the 2007-2010 period was then used for satellite data rescaling; finally, data assimilation was applied during the validation period (2010-2013). Application of the EnKF required the following steps: 1) rescaling of the satellite data; 2) transformation of the rescaled data into the Soil Water Index (SWI) through a moving average filter, with a calibrated value of T = 9; 3) generation of a 50-member ensemble through perturbation of the inputs (rainfall and temperature) and three selected parameters; 4) validation of the ensemble against two criteria based on the ensemble's spread, mean square error, and skill; and 5) calculation of the Kalman gain. A comparison of three satellite data rescaling techniques was also performed: 1) cumulative distribution function (CDF) matching, 2) variance matching, and 3) linear least-squares regression. Results showed slight improvements of the hourly Nash-Sutcliffe Efficiency (NSE) in both catchments with the different rescaling methods evaluated; larger improvements were found in terms of reduced seasonal simulated volume error.
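    The EnKF analysis step outlined above can be sketched for a scalar state such as surface soil moisture. The function below is a generic illustration, not the authors' FORTRAN implementation; the observation operator, error level, and use of perturbed observations are assumptions of this sketch:

```python
import random
import statistics

def enkf_update(ensemble, obs, obs_err_std, h=lambda x: x):
    """One Ensemble Kalman Filter analysis step for a scalar state.

    ensemble    : list of model states (e.g. surface soil moisture)
    obs         : the (rescaled) satellite observation
    obs_err_std : observation error standard deviation
    h           : observation operator mapping state to observation space
    """
    hx = [h(x) for x in ensemble]
    x_mean, hx_mean = statistics.mean(ensemble), statistics.mean(hx)
    n = len(ensemble)
    # Sample covariances estimated directly from the ensemble
    cov_x_hx = sum((x - x_mean) * (y - hx_mean)
                   for x, y in zip(ensemble, hx)) / (n - 1)
    var_hx = sum((y - hx_mean) ** 2 for y in hx) / (n - 1)
    gain = cov_x_hx / (var_hx + obs_err_std ** 2)  # Kalman gain
    # Update each member against a perturbed observation
    return [x + gain * (obs + random.gauss(0.0, obs_err_std) - h(x))
            for x in ensemble]
```

    Because the gain is built from ensemble statistics, no linearization of the model equations is needed, which is the main advantage noted in the abstract.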

  2. Solid oxide fuel cell simulation and design optimization with numerical adjoint techniques

    NASA Astrophysics Data System (ADS)

    Elliott, Louie C.

    This dissertation reports on the application of numerical optimization techniques as applied to fuel cell simulation and design. Due to the "multi-physics" inherent in a fuel cell, which results in a highly coupled and non-linear behavior, an experimental program to analyze and improve the performance of fuel cells is extremely difficult. This program applies new optimization techniques with computational methods from the field of aerospace engineering to the fuel cell design problem. After an overview of fuel cell history, importance, and classification, a mathematical model of solid oxide fuel cells (SOFC) is presented. The governing equations are discretized and solved with computational fluid dynamics (CFD) techniques including unstructured meshes, non-linear solution methods, numerical derivatives with complex variables, and sensitivity analysis with adjoint methods. Following the validation of the fuel cell model in 2-D and 3-D, the results of the sensitivity analysis are presented. The sensitivity derivative for a cost function with respect to a design variable is found with three increasingly sophisticated techniques: finite difference, direct differentiation, and adjoint. A design cycle is performed using a simple optimization method to improve the value of the implemented cost function. The results from this program could improve fuel cell performance and lessen the world's dependence on fossil fuels.
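    The sensitivity techniques compared in the dissertation differ mainly in cost and accuracy. The complex-variable derivative mentioned above can be illustrated in a few lines; this is a generic sketch, not the author's CFD code:

```python
def complex_step_derivative(f, x, h=1e-20):
    """Estimate f'(x) via the complex-step method: Im(f(x + ih)) / h.

    Unlike a finite difference, there is no subtractive cancellation,
    so h can be made tiny and the result is accurate to near machine
    precision.
    """
    return f(complex(x, h)).imag / h

def forward_difference(f, x, h=1e-8):
    """Classical finite-difference estimate of f'(x), for comparison."""
    return (f(x + h) - f(x)) / h
```

    For a cost function evaluated through an entire CFD solve, the adjoint method goes further still, giving the derivative with respect to all design variables at roughly the cost of one extra solve.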

  3. Simulate what is measured: next steps towards predictive simulations (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Bussmann, Michael; Kluge, Thomas; Debus, Alexander; Hübl, Axel; Garten, Marco; Zacharias, Malte; Vorberger, Jan; Pausch, Richard; Widera, René; Schramm, Ulrich; Cowan, Thomas E.; Irman, Arie; Zeil, Karl; Kraus, Dominik

    2017-05-01

    Simulations of laser-matter interaction at extreme intensities that have predictive power are nowadays within reach for codes that make optimum use of high-performance compute architectures. Nevertheless, this holds mostly for very specific settings where the model parameters are well known from experiment and the underlying plasma dynamics is governed by Maxwell's equations alone. When including atomic effects, prepulse influences, radiation reaction, and other physical phenomena, things look different. Not only is it harder to evaluate the sensitivity of the simulation result to variations of the various model parameters, but the numerical models are less well tested, and their combination can lead to subtle side effects that influence the simulation outcome. We propose to make optimum use of future compute hardware to compute statistical and systematic errors rather than just finding the optimum set of parameters fitting an experiment. This requires including experimental uncertainties, which is a challenge for current state-of-the-art techniques. Moreover, it demands better comparison to experiments, as simulating the response of the diagnostics becomes important. We strongly advocate the use of open standards to achieve interoperability between codes for comparison studies, building complete tool chains for simulating laser-matter experiments from start to end.

  4. Molecular Modeling of Nucleic Acid Structure: Electrostatics and Solvation

    PubMed Central

    Bergonzo, Christina; Galindo-Murillo, Rodrigo; Cheatham, Thomas E.

    2014-01-01

    This unit presents an overview of computer simulation techniques as applied to nucleic acid systems, ranging from simple in vacuo molecular modeling techniques to more complete all-atom molecular dynamics treatments that include an explicit representation of the environment. The third in a series of four units, this unit focuses on critical issues in solvation and the treatment of electrostatics. UNITS 7.5 & 7.8 introduced the modeling of nucleic acid structure at the molecular level. This included a discussion of how to generate an initial model, how to evaluate the utility or reliability of a given model, and ultimately how to manipulate this model to better understand the structure, dynamics, and interactions. Subject to an appropriate representation of the energy, such as a specifically parameterized empirical force field, the techniques of minimization and Monte Carlo simulation, as well as molecular dynamics (MD) methods, were introduced as means to sample conformational space for a better understanding of the relevance of a given model. From this discussion, the major limitations with modeling, in general, were highlighted. These are the difficult issues in sampling conformational space effectively—the multiple minima or conformational sampling problems—and accurately representing the underlying energy of interaction. In order to provide a realistic model of the underlying energetics for nucleic acids in their native environments, it is crucial to include some representation of solvation (by water) and also to properly treat the electrostatic interactions. These are discussed in detail in this unit. PMID:18428877
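    As a minimal illustration of the electrostatics this unit discusses, the pairwise Coulomb energy under a uniform dielectric can be written directly. This is a didactic sketch only; production force fields add cutoffs, Ewald summation, or implicit-solvent (e.g. generalized Born) treatments:

```python
import math

def coulomb_energy(charges, positions, dielectric=1.0):
    """Pairwise Coulomb energy (kcal/mol) with a uniform dielectric,
    the simplest electrostatics treatment for a nucleic acid model.

    charges   : partial charges in units of e
    positions : coordinates in angstroms
    """
    K = 332.0636  # Coulomb constant in kcal*angstrom/(mol*e^2)
    energy = 0.0
    for i in range(len(charges)):
        for j in range(i + 1, len(charges)):
            r = math.dist(positions[i], positions[j])
            energy += K * charges[i] * charges[j] / (dielectric * r)
    return energy
```

    Raising the dielectric constant uniformly damps all interactions, which is why such a crude screening model is a poor stand-in for explicit water around a highly charged phosphate backbone.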

  5. Molecular modeling of nucleic Acid structure: electrostatics and solvation.

    PubMed

    Bergonzo, Christina; Galindo-Murillo, Rodrigo; Cheatham, Thomas E

    2014-12-19

    This unit presents an overview of computer simulation techniques as applied to nucleic acid systems, ranging from simple in vacuo molecular modeling techniques to more complete all-atom molecular dynamics treatments that include an explicit representation of the environment. The third in a series of four units, this unit focuses on critical issues in solvation and the treatment of electrostatics. UNITS 7.5 & 7.8 introduced the modeling of nucleic acid structure at the molecular level. This included a discussion of how to generate an initial model, how to evaluate the utility or reliability of a given model, and ultimately how to manipulate this model to better understand its structure, dynamics, and interactions. Subject to an appropriate representation of the energy, such as a specifically parameterized empirical force field, the techniques of minimization and Monte Carlo simulation, as well as molecular dynamics (MD) methods, were introduced as a way of sampling conformational space for a better understanding of the relevance of a given model. This discussion highlighted the major limitations with modeling in general. When sampling conformational space effectively, difficult issues are encountered, such as multiple minima or conformational sampling problems, and accurately representing the underlying energy of interaction. In order to provide a realistic model of the underlying energetics for nucleic acids in their native environments, it is crucial to include some representation of solvation (by water) and also to properly treat the electrostatic interactions. These subjects are discussed in detail in this unit. Copyright © 2014 John Wiley & Sons, Inc.

  6. Identification of Low Order Equivalent System Models From Flight Test Data

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    2000-01-01

    Identification of low order equivalent system dynamic models from flight test data was studied. Inputs were pilot control deflections, and outputs were aircraft responses, so the models characterized the total aircraft response including bare airframe and flight control system. Theoretical investigations were conducted and related to results found in the literature. Low order equivalent system modeling techniques using output error and equation error parameter estimation in the frequency domain were developed and validated on simulation data. It was found that some common difficulties encountered in identifying closed loop low order equivalent system models from flight test data could be overcome using the developed techniques. Implications for data requirements and experiment design were discussed. The developed methods were demonstrated using realistic simulation cases, then applied to closed loop flight test data from the NASA F-18 High Alpha Research Vehicle.

  7. Simulation of bio-locomotion by a momentum redistribution technique for self-propulsion

    NASA Astrophysics Data System (ADS)

    Curet, Oscar; Shirgaonkar, Anup; Patankar, Neelesh; Maciver, Malcolm

    2007-11-01

    We have developed a general-purpose computational approach for self-propulsion based on a momentum redistribution concept. In this poster, our primary goal is to show that the technique can simulate swimming of various organisms without using reduced-order models for the fluid dynamics. The approach fully resolves the motion of the organism and the surrounding fluid. Thus, it is an effective tool to obtain forces, flow fields, as well as the swimming velocity when the deformation kinematics of the organism are available from observational data. We will present images of computational flow fields for several examples, including the aquatic locomotion of sperm, jellyfish, eel, and black ghost knifefish. These examples span a range of body configurations, swimming gaits, and Reynolds numbers in their natural environments. Peculiarities of various modes of swimming will be highlighted.

  8. A Comparison of 3D3C Velocity Measurement Techniques

    NASA Astrophysics Data System (ADS)

    La Foy, Roderick; Vlachos, Pavlos

    2013-11-01

    The velocity measurement fidelity of several 3D3C PIV measurement techniques including tomographic PIV, synthetic aperture PIV, plenoptic PIV, defocusing PIV, and 3D PTV are compared in simulations. A physically realistic ray-tracing algorithm is used to generate synthetic images of a standard calibration grid and of illuminated particle fields advected by homogeneous isotropic turbulence. The simulated images for the tomographic, synthetic aperture, and plenoptic PIV cases are then used to create three-dimensional reconstructions upon which cross-correlations are performed to yield the measured velocity field. Particle tracking algorithms are applied to the images for the defocusing PIV and 3D PTV to directly yield the three-dimensional velocity field. In all cases the measured velocity fields are compared to one-another and to the true velocity field using several metrics.
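    The cross-correlation step shared by the tomographic, synthetic aperture, and plenoptic variants reduces, in one dimension, to finding the shift that maximizes the correlation between two intensity signals. The sketch below is a simplified integer-shift analogue; real PIV correlates 2-D interrogation windows and interpolates the correlation peak to sub-pixel accuracy:

```python
def best_shift(frame_a, frame_b, max_shift):
    """Return the integer displacement of frame_b relative to frame_a
    that maximizes their direct cross-correlation (a 1-D analogue of
    the PIV interrogation-window correlation)."""
    n = len(frame_a)
    best_s, best_score = 0, float("-inf")
    for s in range(-max_shift, max_shift + 1):
        score = sum(frame_a[i] * frame_b[i + s]
                    for i in range(n) if 0 <= i + s < n)
        if score > best_score:
            best_s, best_score = s, score
    return best_s
```

    Dividing the recovered displacement by the interframe time gives the local velocity estimate that the metrics in the paper compare against the true turbulent field.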

  9. A New Quantum Watermarking Based on Quantum Wavelet Transforms

    NASA Astrophysics Data System (ADS)

    Heidari, Shahrokh; Naseri, Mosayeb; Gheibi, Reza; Baghfalaki, Masoud; Rasoul Pourarian, Mohammad; Farouk, Ahmed

    2017-06-01

    Quantum watermarking is a technique for embedding specific information, usually the owner's identification, into quantum cover data, for purposes such as copyright protection. In this paper, a new scheme for quantum watermarking based on quantum wavelet transforms is proposed, which includes scrambling, embedding, and extracting procedures. The invisibility and robustness of the proposed watermarking method are confirmed by simulation. Invisibility is examined via the peak signal-to-noise ratio (PSNR) and histogram calculation, while robustness is analyzed via the bit error rate (BER) and the two-dimensional correlation (Corr 2-D). The simulation results indicate that the proposed watermarking scheme offers not only acceptable visual quality but also good resistance against different types of attack. Supported by Kermanshah Branch, Islamic Azad University, Kermanshah, Iran.
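    The two quality metrics used in the paper are standard image-processing measures. The following are hedged classical (non-quantum) reference implementations, not the authors' code:

```python
import math

def psnr(original, modified, max_val=255):
    """Peak signal-to-noise ratio (dB) between two equal-length
    pixel sequences; higher means the watermark is less visible."""
    mse = sum((a - b) ** 2 for a, b in zip(original, modified)) / len(original)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_val ** 2 / mse)

def bit_error_rate(sent_bits, received_bits):
    """Fraction of embedded watermark bits recovered incorrectly
    after an attack; lower means a more robust scheme."""
    errors = sum(1 for s, r in zip(sent_bits, received_bits) if s != r)
    return errors / len(sent_bits)
```

    A watermarking scheme is typically judged acceptable when the PSNR of the watermarked cover stays high while the BER of the extracted watermark stays near zero under attack.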

  10. Enhancing audiovisual experience with haptic feedback: a survey on HAV.

    PubMed

    Danieau, F; Lecuyer, A; Guillotel, P; Fleureau, J; Mollet, N; Christie, M

    2013-01-01

    Haptic technology has been widely employed in applications ranging from teleoperation and medical simulation to art and design, including entertainment, flight simulation, and virtual reality. Today there is a growing interest among researchers in integrating haptic feedback into audiovisual systems. A new medium emerges from this effort: haptic-audiovisual (HAV) content. This paper presents the techniques, formalisms, and key results pertinent to this medium. We first review the three main stages of the HAV workflow: the production, distribution, and rendering of haptic effects. We then highlight the pressing necessity for evaluation techniques in this context and discuss the key challenges in the field. By building on existing technologies and tackling the specific challenges of the enhancement of audiovisual experience with haptics, we believe the field presents exciting research perspectives whose financial and societal stakes are significant.

  11. Dataset for Testing Contamination Source Identification Methods for Water Distribution Networks

    EPA Pesticide Factsheets

    This dataset includes the results of a simulation study using the source inversion techniques available in the Water Security Toolkit. The data were created to test the different techniques for accuracy, specificity, false positive rate, and false negative rate. The tests examined different parameters including measurement error, modeling error, injection characteristics, time horizon, network size, and sensor placement. The water distribution system network models used in the study are also included in the dataset. This dataset is associated with the following publication: Seth, A., K. Klise, J. Siirola, T. Haxton, and C. Laird. Testing Contamination Source Identification Methods for Water Distribution Networks. Journal of Environmental Division, Proceedings of the American Society of Civil Engineers. American Society of Civil Engineers (ASCE), Reston, VA, USA, 2016.

  12. Telerobotic research at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Sliwa, Nancy E.

    1987-01-01

    An overview of Automation Technology Branch facilities and research is presented. Manipulator research includes dual-arm coordination studies, space manipulator dynamics, end-effector controller development, automatic space structure assembly, and the development of a dual-arm master-slave telerobotic manipulator system. Sensor research includes gravity-compensated force control, real-time monovision techniques, and laser ranging. Artificial intelligence techniques are being explored for supervisory task control, collision avoidance, and connectionist system architectures. A high-fidelity dynamic simulation of robotic systems, ROBSIM, is being supported and extended. Cooperative efforts with Oak Ridge National Laboratory have verified the ability of teleoperators to perform complex structural assembly tasks, and have resulted in the definition of a new dual-arm master-slave telerobotic manipulator. A bibliography of research results and a list of technical contacts are included.

  13. Large perturbation flow field analysis and simulation for supersonic inlets

    NASA Technical Reports Server (NTRS)

    Varner, M. O.; Martindale, W. R.; Phares, W. J.; Kneile, K. R.; Adams, J. C., Jr.

    1984-01-01

    An analysis technique for simulation of supersonic mixed compression inlets with large flow field perturbations is presented. The approach is based upon a quasi-one-dimensional inviscid unsteady formulation which includes engineering models of unstart/restart, bleed, bypass, and geometry effects. Numerical solution of the governing time dependent equations of motion is accomplished through a shock capturing finite difference algorithm, of which five separate approaches are evaluated. Comparison with experimental supersonic wind tunnel data is presented to verify the present approach for a wide range of transient inlet flow conditions.
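    A quasi-one-dimensional unsteady formulation of this kind is advanced in time with a shock-capturing finite-difference scheme. As a generic illustration only (not necessarily one of the five algorithms the report evaluates), here is a Lax-Friedrichs step for linear advection on a periodic grid:

```python
def lax_friedrichs_step(u, c, dx, dt):
    """One Lax-Friedrichs step for the linear advection equation
    u_t + c * u_x = 0 on a periodic grid. The scheme's built-in
    numerical dissipation lets it capture discontinuities without
    oscillating, at the cost of smearing them."""
    n = len(u)
    return [0.5 * (u[(i + 1) % n] + u[(i - 1) % n])
            - c * dt / (2.0 * dx) * (u[(i + 1) % n] - u[(i - 1) % n])
            for i in range(n)]
```

    The scheme is conservative, so the total of the advected quantity is preserved exactly on a periodic domain, a property that matters when tracking shock motion through an inlet during unstart/restart transients.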

  14. An Initial Multi-Domain Modeling of an Actively Cooled Structure

    NASA Technical Reports Server (NTRS)

    Steinthorsson, Erlendur

    1997-01-01

    A methodology for the simulation of turbine cooling flows is being developed. The methodology seeks to combine numerical techniques that optimize both accuracy and computational efficiency. Key components of the methodology include the use of multiblock grid systems for modeling complex geometries and multigrid convergence acceleration for enhancing computational efficiency in highly resolved fluid flow simulations. The use of the methodology has been demonstrated in several turbomachinery flow and heat transfer studies. Ongoing and future work involves implementing additional turbulence models, improving computational efficiency, and adding adaptive mesh refinement (AMR).

  15. Transient thermal modeling of the nonscanning ERBE detector

    NASA Technical Reports Server (NTRS)

    Mahan, J. R.

    1983-01-01

    A numerical model to predict the transient thermal response of the ERBE nonscanning wide field of view total radiometer channel was developed. The model, which uses Monte Carlo techniques to characterize the radiative component of heat transfer, is described and a listing of the computer program is provided. Application of the model to simulate the actual blackbody calibration procedure is discussed. The use of the model to establish a real time flight data interpretation strategy is recommended. Modification of the model to include a simulated Earth radiation source field and a filter dome is indicated.
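    The Monte Carlo treatment of radiative heat transfer mentioned above amounts to tracing statistically sampled rays between surfaces. The sketch below uses a hypothetical geometry, not the ERBE cavity model: it estimates the view factor from a small diffuse emitter to a coaxial parallel disk of radius R at height h, for which the analytic answer is R^2 / (R^2 + h^2):

```python
import math
import random

def view_factor_mc(radius, height, n_rays=100_000, seed=1):
    """Monte Carlo view factor from a differential diffuse emitter at
    the origin to a coaxial parallel disk (radius `radius`) in the
    plane z = `height`. Diffuse emission is sampled cosine-weighted."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_rays):
        # Cosine-weighted hemisphere sampling: sin^2(theta) ~ U(0, 1)
        sin2 = rng.random()
        sin_t = math.sqrt(sin2)
        cos_t = math.sqrt(1.0 - sin2)
        r_hit = height * sin_t / cos_t  # radius where the ray crosses z = height
        if r_hit <= radius:
            hits += 1
    return hits / n_rays
```

    The same counting-of-rays idea, applied inside the radiometer cavity geometry, yields the radiative exchange factors that feed the transient thermal model.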

  16. A mathematical simulation model of a 1985-era tilt-rotor passenger aircraft

    NASA Technical Reports Server (NTRS)

    Mcveigh, M. A.; Widdison, C. A.

    1976-01-01

    A mathematical model for use in real-time piloted simulation of a 1985-era tilt rotor passenger aircraft is presented. The model comprises the basic six degrees-of-freedom equations of motion, and a large angle of attack representation of the airframe and rotor aerodynamics, together with equations and functions used to model turbine engine performance, aircraft control system and stability augmentation system. A complete derivation of the primary equations is given together with a description of the modeling techniques used. Data for the model is included in an appendix.

  17. Structure identification methods for atomistic simulations of crystalline materials

    DOE PAGES

    Stukowski, Alexander

    2012-05-28

    Here, we discuss existing and new computational analysis techniques to classify local atomic arrangements in large-scale atomistic computer simulations of crystalline solids. This article includes a performance comparison of typical analysis algorithms such as common neighbor analysis (CNA), centrosymmetry analysis, bond angle analysis, bond order analysis and Voronoi analysis. In addition we propose a simple extension to the CNA method that makes it suitable for multi-phase systems. Finally, we introduce a new structure identification algorithm, the neighbor distance analysis, which is designed to identify atomic structure units in grain boundaries.
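    A step shared by several of these identification methods is building a cutoff neighbor list around each atom. The following small sketch (illustrative only, not the algorithms benchmarked in the article) counts nearest neighbors on an FCC lattice, where an interior atom has 12 neighbors at distance a/sqrt(2):

```python
import itertools
import math

def fcc_positions(a, n):
    """Atom positions of an n x n x n FCC lattice, lattice constant a."""
    basis = [(0.0, 0.0, 0.0), (0.5, 0.5, 0.0),
             (0.5, 0.0, 0.5), (0.0, 0.5, 0.5)]
    positions = []
    for i, j, k in itertools.product(range(n), repeat=3):
        for bx, by, bz in basis:
            positions.append(((i + bx) * a, (j + by) * a, (k + bz) * a))
    return positions

def coordination_number(positions, center, cutoff):
    """Count neighbors of `center` within `cutoff`, the raw input to
    structure classifiers such as common neighbor analysis."""
    return sum(1 for p in positions
               if 0.0 < math.dist(p, center) <= cutoff)
```

    Classifiers such as CNA then examine how those neighbors are bonded to each other, which is what distinguishes FCC from HCP even though both have 12 nearest neighbors.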

  18. Leveraging simulation to evaluate system performance in presence of fixed pattern noise

    NASA Astrophysics Data System (ADS)

    Teaney, Brian P.

    2017-05-01

    The development of image simulation techniques that map the effects of a notional, modeled sensor system onto an existing image can be used to evaluate the image quality of camera systems prior to the development of prototypes. In addition, image simulation or 'virtual prototyping' can be utilized to reduce the time and expense associated with conducting extensive field trials. In this paper we examine the development of a perception study designed to assess the performance of the NVESD imager performance metrics as a function of fixed pattern noise. This paper discusses the development of the model theory and the implementation and execution of the perception study. In addition, other applications of the image simulation component, including the evaluation of limiting resolution and other test targets, are described.

  19. a Virtual Trip to the Schwarzschild-De Sitter Black Hole

    NASA Astrophysics Data System (ADS)

    Bakala, Pavel; Hledík, Stanislav; Stuchlík, Zdenĕk; Truparová, Kamila; Čermák, Petr

    2008-09-01

    We developed a realistic, fully general relativistic computer code for simulating optical projection in a strong, spherically symmetric gravitational field. The standard theoretical analysis of optical projection for an observer in the vicinity of a Schwarzschild black hole is extended to black-hole spacetimes with a repulsive cosmological constant, i.e., Schwarzschild-de Sitter (SdS) spacetimes. The influence of the cosmological constant is investigated for static observers and for observers radially free-falling from the static radius. The simulation includes the effects of gravitational lensing, multiple images, Doppler and gravitational frequency shift, as well as the amplification of intensity. The code generates images of a static observer's sky and movie simulations for radially free-falling observers. Parallel programming techniques are applied to obtain high performance and fast runs of the simulation code.

  20. Generalized simulation technique for turbojet engine system analysis

    NASA Technical Reports Server (NTRS)

    Seldner, K.; Mihaloew, J. R.; Blaha, R. J.

    1972-01-01

    A nonlinear analog simulation of a turbojet engine was developed. The purpose of the study was to establish simulation techniques applicable to propulsion system dynamics and controls research. A schematic model was derived from a physical description of a J85-13 turbojet engine. Basic conservation equations were applied to each component along with their individual performance characteristics to derive a mathematical representation. The simulation was mechanized on an analog computer. The simulation was verified in both steady-state and dynamic modes by comparing analytical results with experimental data obtained from tests performed at the Lewis Research Center with a J85-13 engine. In addition, comparison was also made with performance data obtained from the engine manufacturer. The comparisons established the validity of the simulation technique.

  1. Lens implementation on the GATE Monte Carlo toolkit for optical imaging simulation

    NASA Astrophysics Data System (ADS)

    Kang, Han Gyu; Song, Seong Hyun; Han, Young Been; Kim, Kyeong Min; Hong, Seong Jong

    2018-02-01

    Optical imaging techniques are widely used for in vivo preclinical studies, and it is well known that the Geant4 Application for Emission Tomography (GATE) can be employed for Monte Carlo (MC) modeling of light transport inside heterogeneous tissues. However, the GATE MC toolkit does not yet include an optical lens implementation, even though this is required for more realistic optical imaging simulation. We describe our implementation of a biconvex lens in the GATE MC toolkit to improve both the sensitivity and the spatial resolution of optical imaging simulation. The lens implemented in GATE was validated against a ZEMAX optical simulation using a US Air Force 1951 resolution target. The ray diagrams and the charge-coupled device images of the GATE optical simulation agreed with the ZEMAX results. In conclusion, the use of a lens in the GATE optical simulation could significantly improve the image quality of bioluminescence and fluorescence imaging as compared with pinhole optics.
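    While the GATE implementation traces photons through an actual biconvex surface profile, the first-order behavior of any such lens follows the thin-lens equation 1/f = 1/d_obj + 1/d_img. A minimal sketch with illustrative numbers (not taken from the paper):

```python
def thin_lens_image_distance(f, d_obj):
    """Image distance from the thin-lens equation
    1/f = 1/d_obj + 1/d_img (all distances in the same units)."""
    return 1.0 / (1.0 / f - 1.0 / d_obj)

def magnification(d_obj, d_img):
    """Transverse magnification; negative means the image is inverted."""
    return -d_img / d_obj
```

    For example, a 50 mm focal length and a 200 mm object distance give an image 66.7 mm behind the lens at one-third scale, which is the kind of focused, demagnified image a pinhole cannot form without sacrificing light collection.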

  2. Simulation in laparoscopic surgery.

    PubMed

    León Ferrufino, Felipe; Varas Cohen, Julián; Buckel Schaffner, Erwin; Crovari Eulufi, Fernando; Pimentel Müller, Fernando; Martínez Castillo, Jorge; Jarufe Cassis, Nicolás; Boza Wilson, Camilo

    2015-01-01

    Nowadays, surgical trainees face reduced surgical practice due to legal limitations and working-hour constraints, while surgeons are expected to master increasingly complex techniques such as laparoscopy. Simulation emerges as a complementary learning tool in laparoscopic surgery, allowing training in a safe, controlled, and standardized environment without jeopardizing patient safety. The objective of simulation is for the acquired skills to transfer to the operating room, shortening learning curves. The use of simulation has increased worldwide, becoming an important tool in different surgical residency programs and laparoscopic training courses. In several countries, approval of these training courses is a prerequisite for surgeon title certification. This article reviews the most important aspects of simulation in laparoscopic surgery, including the most widely used simulators and training programs, as well as the learning methodologies and the different key ways to assess learning in simulation. Copyright © 2013 AEC. Published by Elsevier España, S.L.U. All rights reserved.

  3. Word aligned bitmap compression method, data structure, and apparatus

    DOEpatents

    Wu, Kesheng; Shoshani, Arie; Otoo, Ekow

    2004-12-14

    The Word-Aligned Hybrid (WAH) bitmap compression method and data structure is a relatively efficient method for searching and for performing logical, counting, and pattern-location operations on large datasets. The technique comprises a data structure and methods that are optimized for computational efficiency by using the WAH compression method, which typically takes advantage of the target computing system's native word length. WAH is particularly well suited to infrequently varying databases, including those found in the on-line analytical processing (OLAP) industry, owing to the increased computational efficiency of the WAH-compressed bitmap index. Some commercial database products already include some version of a bitmap index, which could be replaced by the WAH bitmap compression techniques for potentially increased operation speed, as well as increased efficiency in constructing compressed bitmaps. Taken together, these techniques may be particularly useful for real-time business intelligence. Additional WAH applications may include scientific modeling, such as climate and combustion simulations, to minimize search time for analysis and subsequent data visualization.
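
    The core idea of WAH can be illustrated with a simplified sketch: the bitmap is cut into word-aligned groups of 31 bits (for a 32-bit word), and runs of identical all-zero or all-one groups collapse into single counted "fill" words. This is a toy illustration of the encoding, not the patented implementation:

```python
WORD = 32
GROUP = WORD - 1  # 31 payload bits per word, as in 32-bit WAH

def wah_encode(bits):
    """Simplified Word-Aligned Hybrid encoding (sketch).

    Pads the bit list to a multiple of 31, then emits:
      - literal words: MSB = 0, payload = the 31 bits verbatim
      - fill words:    MSB = 1, next bit = fill value, low 30 bits = run length
                       (number of consecutive identical all-0/all-1 groups)
    """
    bits = list(bits) + [0] * (-len(bits) % GROUP)
    groups = [bits[i:i + GROUP] for i in range(0, len(bits), GROUP)]
    words = []
    for g in groups:
        val = int("".join(map(str, g)), 2)
        if val in (0, 2**GROUP - 1):       # homogeneous group -> fill candidate
            fill_bit = 1 if val else 0
            if words and words[-1] >> (WORD - 1) and \
                    ((words[-1] >> (WORD - 2)) & 1) == fill_bit:
                words[-1] += 1             # extend the previous fill's run length
                continue
            words.append((1 << (WORD - 1)) | (fill_bit << (WORD - 2)) | 1)
        else:
            words.append(val)              # literal word (MSB already 0: val < 2**31)
    return words

def wah_decode(words, nbits):
    bits = []
    for w in words:
        if w >> (WORD - 1):                          # fill word
            fill = (w >> (WORD - 2)) & 1
            run = w & (2**(WORD - 2) - 1)
            bits.extend([fill] * (run * GROUP))
        else:                                        # literal word
            bits.extend(int(b) for b in format(w, f"0{GROUP}b"))
    return bits[:nbits]

# A sparse bitmap: the long zero runs compress to single fill words
bitmap = [0] * 100 + [1, 0, 1] + [0] * 200
enc = wah_encode(bitmap)
print(f"{len(bitmap)} bits -> {len(enc)} words")   # 3 words: fill, literal, fill
```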

  4. Superplastic Forming 40 Years and Still Growing

    NASA Astrophysics Data System (ADS)

    Barnes, A. J.

    2007-08-01

    In late 1964 Backofen, Turner & Avery, at MIT, published a paper in which they described the “extraordinary formability” exhibited when fine-grain zinc-aluminum eutectoid (Zn-22Al) was subjected to bulge testing under appropriate conditions. They concluded their research findings with the insightful comment: “even more appealing is the thought of applying to superplastic metals forming techniques borrowed from polymer and glass processing.” Since then their thought has become a substantial reality, with thousands of tons of metallic sheet materials now being superplastically formed each year. This paper reviews the significant advances that have taken place over the past 40 years, including alloy developments, improved forming techniques and equipment, and an ever increasing number of commercial applications. Current and likely future trends are discussed, including applications in the aerospace and automotive markets, faster-forming techniques to improve productivity, the increasing importance of computer modeling and simulation in tool design and process optimization, and new alloy developments including superplastic magnesium alloys.

  5. Hypervelocity Impact Test Facility: A gun for hire

    NASA Technical Reports Server (NTRS)

    Johnson, Calvin R.; Rose, M. F.; Hill, D. C.; Best, S.; Chaloupka, T.; Crawford, G.; Crumpler, M.; Stephens, B.

    1994-01-01

    An affordable technique has been developed to duplicate the types of impacts observed on spacecraft, including the Shuttle, by use of a certified Hypervelocity Impact Facility (HIF) which propels particulates using capacitor driven electric gun techniques. The fully operational facility provides a flux of particles in the 10-100 micron diameter range with a velocity distribution covering the space debris and interplanetary dust particle environment. HIF measurements of particle size, composition, impact angle and velocity distribution indicate that such parameters can be controlled in a specified, tailored test designed for or by the user. Unique diagnostics enable researchers to fully describe the impact for evaluating the 'targets' under full power or load. Users regularly evaluate space hardware, including solar cells, coatings, and materials, exposing selected portions of space-qualified items to a wide range of impact events and environmental conditions. Benefits include corroboration of data obtained from impact events, flight simulation of designs, accelerated aging of systems, and development of manufacturing techniques.

  6. Thought Experiments in Physics Problem-solving: On Intuition and Imagistic Simulation

    ERIC Educational Resources Information Center

    Georgiou, Andreas

    2005-01-01

    This study is part of a larger research agenda, which includes future doctoral study, aiming to investigate the psychological processes of thought experiments. How do thought-experimenters establish relations between their imaginary worlds and the physical one? How does a technique devoid of new sensory input result in new empirical knowledge? In…

  7. Outdoor Biology Instructional Strategies Trial Edition, Set IV.

    ERIC Educational Resources Information Center

    Throgmorton, Larry, Ed.; And Others

    Eight games are included in the 24 activities in the Outdoor Biology Instructional Strategies (OBIS) Trial Edition Set IV. There are also simulations, crafts, biological techniques, and organism investigations focusing on animal and plant life in the forest, desert, and snow. Designed for small groups of children ages 10 to 15 from schools and…

  8. Computing Newsletter for Schools of Business.

    ERIC Educational Resources Information Center

    Couger, J. Daniel, Ed.

    1973-01-01

    The first of the two issues included here reports on various developments concerning the use of computers for schools of business. One-page articles cover these topics: widespread use of simulation games, survey of computer use in higher education, ten new computer cases which teach techniques for management analysis, advantages of the use of…

  9. NOVA HIGH SCHOOL--DESCRIPTION OF TENTH-GRADE SOCIAL STUDIES COURSE.

    ERIC Educational Resources Information Center

    COGSWELL, JOHN F.

    SYSTEMS ANALYSIS AND COMPUTER SIMULATION TECHNIQUES WERE APPLIED IN A STUDY OF INNOVATION FOR A 10TH-GRADE SOCIAL STUDIES COURSE. THE COURSE CONTENT WAS AMERICAN HISTORY, WHICH WAS DIVIDED INTO 10 CONTENT AREAS SUCH AS COLONIAL, REVOLUTIONARY, AND CONSTITUTIONAL AMERICA. THE ACTIVITIES OF THE COURSE INCLUDED TEAM TEACHING, LECTURES, MEDIA…

  10. Techniques with Tangibles; a Manual for Teaching the Blind.

    ERIC Educational Resources Information Center

    Fulker, Wilber H.; Fulker, Mary

    The production and use of tangible aids for teaching complete mental concepts to the blind are discussed. The Thermoform vacuum duplicating machine which produces teaching aids simulating pictures or drawings used by sighted children is described; and examples of Thermoform masters are cited, including Mendel's law, the maze, four stages of cell…

  11. Using Technology to Meet the Challenges of Medical Education

    PubMed Central

    Guze, Phyllis A.

    2015-01-01

    Medical education is rapidly changing, influenced by many factors including the changing health care environment, the changing role of the physician, altered societal expectations, rapidly changing medical science, and the diversity of pedagogical techniques. Changes in societal expectations put patient safety at the forefront and raise the ethical issues of learning interactions and procedures on live patients, with the long-standing teaching method of “see one, do one, teach one” no longer acceptable. The educational goals of using technology in medical education include facilitating basic knowledge acquisition, improving decision making, enhancing perceptual variation, improving skill coordination, practicing for rare or critical events, learning team training, and improving psychomotor skills. Different technologies can address these goals. Technologies such as podcasts and videos with flipped classrooms, mobile devices with apps, video games, simulations (part-task trainers, integrated simulators, virtual reality), and wearable devices (Google Glass) are some of the techniques available to address the changing educational environment. This article presents how the use of technologies can provide the infrastructure and basis for addressing many of the challenges in providing medical education for the future. PMID:26330687

  12. Using Technology to Meet the Challenges of Medical Education.

    PubMed

    Guze, Phyllis A

    2015-01-01

    Medical education is rapidly changing, influenced by many factors including the changing health care environment, the changing role of the physician, altered societal expectations, rapidly changing medical science, and the diversity of pedagogical techniques. Changes in societal expectations put patient safety at the forefront and raise the ethical issues of learning interactions and procedures on live patients, with the long-standing teaching method of "see one, do one, teach one" no longer acceptable. The educational goals of using technology in medical education include facilitating basic knowledge acquisition, improving decision making, enhancing perceptual variation, improving skill coordination, practicing for rare or critical events, learning team training, and improving psychomotor skills. Different technologies can address these goals. Technologies such as podcasts and videos with flipped classrooms, mobile devices with apps, video games, simulations (part-task trainers, integrated simulators, virtual reality), and wearable devices (Google Glass) are some of the techniques available to address the changing educational environment. This article presents how the use of technologies can provide the infrastructure and basis for addressing many of the challenges in providing medical education for the future.

  13. A Boilerplate Capsule Test Technique for the Orion Parachute Test Program

    NASA Technical Reports Server (NTRS)

    Moore, James W.; Fraire, Usbaldo, Jr.

    2013-01-01

    The test program developing parachutes for the Orion/MPCV includes drop tests of a Parachute Test Vehicle designed to emulate the wake of the Orion capsule. Delivery of this test vehicle to the initial velocity, altitude, and orientation required for a test is a difficult problem involving multiple engineering disciplines. The available delivery aircraft options imposed constraints on the test vehicle development and concept of operations. This paper describes the development of this test technique. The engineering challenges include the extraction from an aircraft and separation of two aerodynamically unstable vehicles, one of which must be delivered to a specific orientation with reasonably small rates. The desired attitude is achieved by precisely targeting the separation point using on-board monitoring of the motion. The design of the test vehicle is described. The trajectory simulations and other analyses used to develop this technique and predict the behavior of the test article are reviewed in detail. The application of the technique on several successful drop tests is summarized.

  14. Predicting subscriber dissatisfaction and improving retention in the wireless telecommunications industry.

    PubMed

    Mozer, M C; Wolniewicz, R; Grimes, D B; Johnson, E; Kaushansky, H

    2000-01-01

    Competition in the wireless telecommunications industry is fierce. To maintain profitability, wireless carriers must control churn, the loss of subscribers who switch from one carrier to another. We explore techniques from statistical machine learning to predict churn and, based on these predictions, to determine what incentives should be offered to subscribers to improve retention and maximize profitability to the carrier. The techniques include logit regression, decision trees, neural networks, and boosting. Our experiments are based on a database of nearly 47,000 U.S. domestic subscribers that includes information about their usage, billing, credit, application, and complaint history. The experiments show that under a wide variety of assumptions concerning the cost of intervention and the retention rate resulting from intervention, using predictive techniques to identify potential churners and offering them incentives can yield significant savings to a carrier. We also show the importance of a data representation crafted by domain experts. Finally, we report on a real-world test of the techniques that validates our simulation experiments.
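
    Of the techniques listed, logit (logistic) regression is the simplest to sketch. The subscriber features and churn model below are synthetic stand-ins, not the study's 47,000-subscriber database:

```python
import math, random

random.seed(0)

# Synthetic stand-in data: each subscriber has (minutes of use, complaints),
# and true churn probability rises with complaints and falls with usage.
def make_subscriber():
    usage = random.gauss(300, 100)        # monthly minutes
    complaints = random.randint(0, 5)
    logit = -1.0 - 0.005 * usage + 0.8 * complaints
    churned = 1 if random.random() < 1 / (1 + math.exp(-logit)) else 0
    return [1.0, usage / 100.0, float(complaints)], churned  # bias + scaled features

data = [make_subscriber() for _ in range(2000)]

# Logit regression fitted by plain gradient descent on the log-loss
w = [0.0, 0.0, 0.0]
lr = 0.1
for _ in range(300):
    grad = [0.0] * 3
    for x, y in data:
        p = 1 / (1 + math.exp(-sum(wi * xi for wi, xi in zip(w, x))))
        for j in range(3):
            grad[j] += (p - y) * x[j]
    w = [wi - lr * g / len(data) for wi, g in zip(w, grad)]

# Rank subscribers by predicted churn risk; the top of the list would be
# the candidates for a retention incentive
def risk(x):
    return 1 / (1 + math.exp(-sum(wi * xi for wi, xi in zip(w, x))))

print("weights:", [round(wi, 2) for wi in w])
```

    With this setup the fitted weight on complaints comes out positive and the weight on usage negative, matching the generating model's signs.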

  15. Description of the GMAO OSSE for Weather Analysis Software Package: Version 3

    NASA Technical Reports Server (NTRS)

    Koster, Randal D. (Editor); Errico, Ronald M.; Prive, Nikki C.; Carvalho, David; Sienkiewicz, Meta; El Akkraoui, Amal; Guo, Jing; Todling, Ricardo; McCarty, Will; Putman, William M.

    2017-01-01

    The Global Modeling and Assimilation Office (GMAO) at the NASA Goddard Space Flight Center has developed software and products for conducting observing system simulation experiments (OSSEs) for weather analysis applications. Such applications include estimations of potential effects of new observing instruments or data assimilation techniques on improving weather analysis and forecasts. The GMAO software creates simulated observations from nature run (NR) data sets and adds simulated errors to those observations. The algorithms employed are much more sophisticated, adding a much greater degree of realism, compared with OSSE systems currently available elsewhere. The algorithms employed, software designs, and validation procedures are described in this document. Instructions for using the software are also provided.
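
    The basic OSSE recipe the abstract describes, sampling a nature run at observation locations and then adding simulated errors, can be sketched as follows. The 1-D truth field and the constant-bias-plus-Gaussian error model are deliberate simplifications of the far more realistic GMAO algorithms:

```python
import math, random

random.seed(1)

# Toy "nature run" (NR): temperature along a 1-D grid. A real NR is a full
# 4-D model state; this stand-in just gives us a truth field to sample.
nr = [280.0 + 15.0 * math.sin(2 * math.pi * i / 64) for i in range(64)]

def interp(field, x):
    """Perfect observation: the NR field linearly interpolated to the obs location."""
    i = int(x)
    frac = x - i
    return field[i] * (1 - frac) + field[(i + 1) % len(field)] * frac

def simulate_obs(field, locations, bias=0.2, sigma=0.5):
    """Simulated observations = perfect values + simulated instrument error.

    The error model (constant bias + uncorrelated Gaussian noise) is a toy;
    a realistic OSSE adds correlated, situation-dependent errors.
    """
    return [interp(field, x) + bias + random.gauss(0.0, sigma) for x in locations]

obs_locs = [random.uniform(0, 63) for _ in range(100)]
obs = simulate_obs(nr, obs_locs)
truth = [interp(nr, x) for x in obs_locs]
mean_err = sum(o - t for o, t in zip(obs, truth)) / len(obs)
print(f"mean obs-minus-truth: {mean_err:+.2f} K (prescribed bias 0.20 K)")
```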

  16. [Application of computer-assisted 3D imaging simulation for surgery].

    PubMed

    Matsushita, S; Suzuki, N

    1994-03-01

    This article describes trends in the application of various imaging technologies to surgical planning, navigation, and computer-aided surgery. Imaging information is an essential element of simulation in medicine. It includes three-dimensional (3D) image reconstruction, neurosurgical navigation, the creation of physical models from 3D imaging data, and so on. These developments depend mostly on 3D imaging techniques, which owe much to recent computer technology. 3D imaging can offer new, intuitive information to physicians and surgeons, and the method is well suited to mechanical control. By utilizing simulated results, we can obtain more precise surgical orientation, estimation, and operation. For further advancement, automatic and high-speed recognition of medical images is being developed.

  17. The viability of ADVANTG deterministic method for synthetic radiography generation

    NASA Astrophysics Data System (ADS)

    Bingham, Andrew; Lee, Hyoung K.

    2018-07-01

    Fast simulation techniques for generating high-resolution synthetic radiographic images are helpful when new radiation imaging systems are designed. However, the standard stochastic approach requires lengthy run times and yields poorer statistics at higher resolution. We explored the viability of a deterministic approach to synthetic radiography image generation, with the aim of quantifying the computational speedup over the stochastic method. ADVANTG was compared with MCNP in multiple scenarios, including a small radiography system prototype, to simulate high-resolution radiography images. Using the ADVANTG deterministic code to simulate radiography images decreased computational time by a factor of 10 to 13 compared with the MCNP stochastic approach while retaining image quality.
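
    The appeal of a deterministic approach can be seen in miniature: a ray-driven attenuation (Beer-Lambert) radiograph is computed exactly, with no statistical noise to average away. This toy parallel-beam sketch only illustrates the deterministic idea; ADVANTG itself is built on a discrete-ordinates transport solver, and the geometry and coefficients below are invented for illustration:

```python
import math

# Toy deterministic radiograph: parallel-beam transmission through a 2-D
# attenuation map, I/I0 = exp(-integral of mu along each ray) (Beer-Lambert).
NX, NY = 32, 32
mu = [[0.0] * NX for _ in range(NY)]   # attenuation coefficients (1/cm)
for y in range(12, 20):                 # a dense square object in the middle
    for x in range(12, 20):
        mu[y][x] = 0.5

dx = 0.1                                # pixel size along the ray (cm)

# One ray per row, traveling in +x: line integral is just a row sum times dx.
# The result is exact for this discretized map -- no Monte Carlo variance.
radiograph = [math.exp(-sum(mu[y][x] for x in range(NX)) * dx) for y in range(NY)]

print(f"open beam: {radiograph[0]:.3f}, behind object: {radiograph[15]:.3f}")
```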

  18. Spectral-element simulation of two-dimensional elastic wave propagation in fully heterogeneous media on a GPU cluster

    NASA Astrophysics Data System (ADS)

    Rudianto, Indra; Sudarmaji

    2018-04-01

    We present an implementation of the spectral-element method for simulation of two-dimensional elastic wave propagation in fully heterogeneous media. We have incorporated most realistic geological features in the model, including surface topography, curved layer interfaces, and 2-D wave-speed heterogeneity. To accommodate such complexity, we use an unstructured quadrilateral meshing technique. Simulation was performed on a GPU cluster consisting of 24 Intel Xeon CPU cores and 4 NVIDIA Quadro graphics cards, using a CUDA and MPI implementation. We speed up the computation by a factor of about 5 compared to MPI only, and by a factor of about 40 compared to a serial implementation.

  19. Summary of investigations of engine response to distorted inlet conditions

    NASA Technical Reports Server (NTRS)

    Biesiadny, T. J.; Braithwaite, W. M.; Soeder, R. H.; Abdelwahab, M.

    1986-01-01

    A survey is presented of experimental and analytical experience of the NASA Lewis Research Center in engine response to inlet temperature and pressure distortions. This includes a description of the hardware and techniques employed, and a summary of the highlights of experimental investigations and analytical modeling. Distortion devices successfully simulated inlet distortion, and knowledge was gained about compression system response to different types of distortion. A list of NASA research references is included.

  20. Continuous All-Optical Deceleration and Single-Photon Cooling of Molecular Beams

    DTIC Science & Technology

    2014-02-21

    Physical Review A 89, 023425 (2014): “Continuous all-optical deceleration and single-photon cooling of molecular beams,” A. M. Jayich, A. C. Vutha, M. ... details including multilevel numerical simulations of strontium monohydride. These techniques are applicable to a large number of molecular species and ... molecules that are considered difficult to directly laser cool, a class that includes many ...

  1. Reconstructing extreme AMOC events through nudging of the ocean surface: a perfect model approach

    NASA Astrophysics Data System (ADS)

    Ortega, Pablo; Guilyardi, Eric; Swingedouw, Didier; Mignot, Juliette; Nguyen, Sébastien

    2017-11-01

    While the Atlantic Meridional Overturning Circulation (AMOC) is thought to be a crucial component of the North Atlantic climate, past changes in its strength are challenging to quantify, and only limited information is available. In this study, we use a perfect model approach with the IPSL-CM5A-LR model to assess the performance of several surface nudging techniques in reconstructing the variability of the AMOC. Special attention is given to the reproducibility of an extreme positive AMOC peak from a preindustrial control simulation. Nudging includes standard relaxation techniques towards the sea surface temperature and salinity anomalies of this target control simulation, and/or the prescription of the wind-stress fields. Surface nudging approaches using standard fixed restoring terms succeed in reproducing most of the target AMOC variability, including the timing of the extreme event, but systematically underestimate its amplitude. A detailed analysis of the AMOC variability mechanisms reveals that the underestimation of the extreme AMOC maximum comes from a deficit in the formation of the dense water masses in the main convection region, located south of Iceland in the model. This issue is largely corrected after introducing a novel surface nudging approach, which uses a varying restoring coefficient that is proportional to the simulated mixed layer depth, which, in essence, keeps the restoring time scale constant. This new technique substantially improves water mass transformation in the regions of convection, and in particular, the formation of the densest waters, which are key for the representation of the AMOC extreme. It is therefore a promising strategy that may help to better constrain the AMOC variability and other ocean features in the models. As this restoring technique only uses surface data, for which better and longer observations are available, it opens up opportunities for improved reconstructions of the AMOC over the last few decades.
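
    The MLD-proportional restoring coefficient described above amounts to holding the relaxation time scale constant: for a surface heat-flux restoring, tau = rho * cp * h / gamma, so taking gamma proportional to the mixed layer depth h cancels the depth dependence. A numerical sketch (the 50 m reference depth and 40 W/m^2/K strength are illustrative assumptions, not the paper's values):

```python
# Surface heat relaxation: dT/dt = -gamma / (rho * cp * h) * (T - T_target),
# so the restoring time scale is tau = rho * cp * h / gamma.
rho, cp = 1025.0, 3990.0            # seawater density (kg/m^3), heat capacity (J/kg/K)

def tau_days(h, gamma):
    """Restoring time scale (days) for mixed layer depth h (m), strength gamma (W/m^2/K)."""
    return rho * cp * h / gamma / 86400.0

gamma_fixed = 40.0                  # a typical fixed restoring strength

for h in (30.0, 300.0, 1500.0):     # mixed layer depths: subtropics vs. convection site
    gamma_var = gamma_fixed * (h / 50.0)   # hypothetical MLD-proportional coefficient
    print(f"h={h:6.0f} m  fixed: tau={tau_days(h, gamma_fixed):7.1f} d"
          f"  MLD-scaled: tau={tau_days(h, gamma_var):5.1f} d")
```

    With the fixed coefficient the effective time scale stretches by a factor of 50 between a 30 m and a 1500 m mixed layer, weakening the constraint exactly where dense water forms; the MLD-scaled coefficient keeps it constant.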

  2. Reconstructing extreme AMOC events through nudging of the ocean surface: A perfect model approach

    NASA Astrophysics Data System (ADS)

    Ortega, Pablo; Guilyardi, Eric; Swingedouw, Didier; Mignot, Juliette; Nguyen, Sebastien

    2017-04-01

    While the Atlantic Meridional Overturning Circulation (AMOC) is thought to be a crucial component of the North Atlantic climate and its predictability, past changes in its strength are challenging to quantify, and only limited information is available. In this study, we use a perfect model approach with the IPSL-CM5A-LR model to assess the performance of several surface nudging techniques in reconstructing the variability of the AMOC. Special attention is given to the reproducibility of an extreme positive AMOC peak from a preindustrial control simulation. Nudging includes standard relaxation techniques towards the sea surface temperature and salinity anomalies of this target control simulation, and/or the prescription of the wind-stress fields. Surface nudging approaches using standard fixed restoring terms succeed in reproducing most of the target AMOC variability, including the timing of the extreme event, but systematically underestimate its amplitude. A detailed analysis of the AMOC variability mechanisms reveals that the underestimation of the extreme AMOC maximum comes from a deficit in the formation of the dense water masses in the main convection region, located south of Iceland in the model. This issue is largely corrected after introducing a novel surface nudging approach, which uses a varying restoring coefficient that is proportional to the simulated mixed layer depth, which, in essence, keeps the restoring time scale constant. This new technique substantially improves water mass transformation in the regions of convection, and in particular, the formation of the densest waters, which are key for the representation of the AMOC extreme. It is therefore a promising strategy that may help to better initialize the AMOC variability and other ocean features in the models, and thus improve decadal climate predictions. As this restoring technique only uses surface data, for which better and longer observations are available, it opens up opportunities for improved reconstructions of the AMOC over the last few decades.

  3. Numerical Simulations of the Digital Microfluidic Manipulation of Single Microparticles.

    PubMed

    Lan, Chuanjin; Pal, Souvik; Li, Zhen; Ma, Yanbao

    2015-09-08

    Single-cell analysis techniques have been developed as a valuable bioanalytical tool for elucidating cellular heterogeneity at genomic, proteomic, and cellular levels. Cell manipulation is an indispensable process for single-cell analysis. Digital microfluidics (DMF) is an important platform for conducting cell manipulation and single-cell analysis in a high-throughput fashion. However, the manipulation of single cells in DMF has not been quantitatively studied so far. In this article, we investigate the interaction of a single microparticle with a liquid droplet on a flat substrate using numerical simulations. The droplet is driven by capillary force generated from the wettability gradient of the substrate. Considering the Brownian motion of microparticles, we utilize many-body dissipative particle dynamics (MDPD), an off-lattice mesoscopic simulation technique, in this numerical study. The manipulation processes (including pickup, transport, and drop-off) of a single microparticle with a liquid droplet are simulated. Parametric studies are conducted to investigate the effects on the manipulation processes from the droplet size, wettability gradient, wetting properties of the microparticle, and particle-substrate friction coefficients. The numerical results show that the pickup, transport, and drop-off processes can be precisely controlled by these parameters. On the basis of the numerical results, a trap-free delivery of a hydrophobic microparticle to a destination on the substrate is demonstrated in the numerical simulations. The numerical results not only provide a fundamental understanding of interactions among the microparticle, the droplet, and the substrate but also demonstrate a new technique for the trap-free immobilization of single hydrophobic microparticles in the DMF design. Finally, our numerical method also provides a powerful design and optimization tool for the manipulation of microparticles in DMF systems.

  4. A new reference tip-timing test bench and simulator for blade synchronous and asynchronous vibrations

    NASA Astrophysics Data System (ADS)

    Hajnayeb, Ali; Nikpour, Masood; Moradi, Shapour; Rossi, Gianluca

    2018-02-01

    The blade tip-timing (BTT) measurement technique is at present the most promising technique for monitoring the blades of axial turbines and aircraft engines under operating conditions. It is generally used as an alternative to strain gauges in turbine testing. Compared with standard methods such as those based on strain gauges, the technique is not intrusive and does not require a complicated installation process. Despite its advantages, the experimental performance analysis of a new BTT method requires a test stand that includes a reference measurement system (e.g. strain gauges equipped with telemetry, or complex optical measurement systems such as rotating laser Doppler vibrometers). In this article, a new reliable, low-cost BTT test setup is proposed for simulating and analyzing blade vibrations based on kinematic inversion. In the proposed test bench, instead of the blades vibrating, it is the BTT sensor that vibrates. The vibration of the sensor is generated by a shaker and can therefore be easily controlled in frequency, amplitude, and waveform shape; its amplitude is measured by a simple accelerometer. After introducing the components of the simulator, the proposed test bench is used to simulate both synchronous and asynchronous vibration scenarios. Two BTT methods are then used to evaluate the quality of the acquired data. The results demonstrate that the proposed setup generates simulated pulse sequences that are almost the same as those produced by conventional BTT systems installed around a bladed disk. Moreover, the test setup enables its users to evaluate BTT methods using a limited number of sensors, which significantly reduces the total cost of the experiments.

  5. The Development and Comparison of Molecular Dynamics Simulation and Monte Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Chen, Jundong

    2018-03-01

    Molecular dynamics is an interdisciplinary technique that combines physics, mathematics, and chemistry. It is a computer simulation method and a powerful tool for studying condensed matter systems. The technique can not only produce the trajectories of atoms but also reveal the microscopic details of atomic motion. By studying the numerical integration algorithms used in molecular dynamics simulation, we can analyze the microstructure, the motion of particles, and their relationship to macroscopic material properties, and more conveniently study the connection between interatomic interactions and macroscopic behavior. Monte Carlo simulation, like molecular dynamics, is a tool for studying systems at the molecular and particle level. In this paper, the theoretical background of computer numerical simulation is introduced, and specific numerical integration methods are summarized, including the Verlet, leap-frog, and velocity Verlet methods. The method and principle of Monte Carlo simulation are also introduced. Finally, the similarities and differences between Monte Carlo simulation and molecular dynamics simulation are discussed.
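
    Of the three integrators the abstract names, velocity Verlet is the most widely used; it advances positions with the current acceleration, then updates velocities with the average of the old and new accelerations. A minimal sketch on a harmonic oscillator (not any specific MD code):

```python
import math

def velocity_verlet(x, v, accel, dt, steps):
    """Velocity Verlet: full-step position update, then averaged velocity update."""
    a = accel(x)
    traj = [(x, v)]
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt * dt
        a_new = accel(x)
        v = v + 0.5 * (a + a_new) * dt
        a = a_new
        traj.append((x, v))
    return traj

# Harmonic oscillator (unit mass, unit spring constant): exact period is 2*pi
accel = lambda x: -x
traj = velocity_verlet(x=1.0, v=0.0, accel=accel, dt=0.01, steps=int(2 * math.pi / 0.01))

x_end, v_end = traj[-1]
energy = lambda x, v: 0.5 * v * v + 0.5 * x * x
drift = abs(energy(x_end, v_end) - energy(1.0, 0.0))
print(f"after one period: x = {x_end:.4f}, energy drift = {drift:.2e}")
```

    The near-zero energy drift over a full period is the symplectic property that makes velocity Verlet the workhorse of MD.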

  6. Teaching aseptic technique for central venous access under ultrasound guidance: a randomized trial comparing didactic training alone to didactic plus simulation-based training.

    PubMed

    Latif, Rana K; Bautista, Alexander F; Memon, Saima B; Smith, Elizabeth A; Wang, Chenxi; Wadhwa, Anupama; Carter, Mary B; Akca, Ozan

    2012-03-01

    Our goal was to determine whether simulation combined with didactic training improves sterile technique during ultrasound (US)-guided central venous catheter (CVC) insertion compared with didactic training alone among novices. We hypothesized that novices who receive combined didactic and simulation-based training would perform similarly to experienced residents in aseptic technique, knowledge, and perception of comfort during US-guided CVC insertion on a simulator. Seventy-two subjects were enrolled in a randomized, controlled trial of an educational intervention. Fifty-four novices were randomized into either the didactic group or the simulation combined with didactic group. Both groups received didactic training but the simulation combined with didactic group also received simulation-based CVC insertion training. Both groups were tested by demonstrating US-guided CVC insertion on a simulator. Aseptic technique was scored on 8 steps as "yes/no" and also using a 7-point Likert scale with 7 being "excellent technique" by a rater blinded to subject randomization. After initial testing, the didactic group was offered simulation-based training and retesting. Both groups also took a pre- and posttraining test of knowledge and rated their comfort with US and CVC insertion pre- and posttraining on a 5-point Likert scale. Subsequently, 18 experienced residents also took the test of knowledge, rated their comfort level, and were scored while performing aseptic US-guided CVC insertion using a simulator. The simulation combined with didactic group achieved a 167% (95% confidence interval [CI] 133%-167%) incremental increase in yes/no scores and 115% (CI 112%-127%) incremental increase in Likert scale ratings on aseptic technique compared with novices in the didactic group. Compared with experienced residents, simulation combined with didactic trained novices achieved an increase in aseptic scores with a 33.3% (CI 16.7%-50%) increase in yes/no ratings and a 20% (CI 13.3%-40%) increase in Likert scaled ratings, and scored 2.5-fold higher on the test of knowledge. There was a 3-fold increase in knowledge and 2-fold increase in comfort level among all novices (P < 0.001) after combined didactic and simulation-based training. Simulation combined with didactic training is superior to didactic training alone for acquisition of clinical skills such as US-guided CVC insertion. After combined didactic and simulation-based training, novices can outperform experienced residents in aseptic technique as well as in measurements of knowledge.

  7. Simulation and augmented reality in endovascular neurosurgery: lessons from aviation.

    PubMed

    Mitha, Alim P; Almekhlafi, Mohammed A; Janjua, Major Jameel J; Albuquerque, Felipe C; McDougall, Cameron G

    2013-01-01

    Endovascular neurosurgery is a discipline strongly dependent on imaging. Therefore, technology that improves how much useful information we can garner from a single image has the potential to dramatically assist decision making during endovascular procedures. Furthermore, education in an image-enhanced environment, especially with the incorporation of simulation, can improve the safety of the procedures and give interventionalists and trainees the opportunity to study or perform simulated procedures before the intervention, much like what is practiced in the field of aviation. Here, we examine the use of simulators in the training of fighter pilots and discuss how similar benefits can compensate for current deficiencies in endovascular training. We describe the types of simulation used for endovascular procedures, including virtual reality, and discuss the relevant data on its utility in training. Finally, the benefit of augmented reality during endovascular procedures is discussed, along with future computerized image enhancement techniques.

  8. Simulating Wake Vortex Detection with the Sensivu Doppler Wind Lidar Simulator

    NASA Technical Reports Server (NTRS)

    Ramsey, Dan; Nguyen, Chi

    2014-01-01

    In support of NASA's Atmospheric Environment Safety Technologies NRA research topic on Wake Vortex Hazard Investigation, Aerospace Innovations (AI) investigated a set of techniques for detecting wake vortex hazards from arbitrary viewing angles, including axial perspectives. This technical report describes an approach to this problem and presents results from its implementation in a virtual lidar simulator developed at AI. Three-dimensional data volumes from NASA's Terminal Area Simulation System (TASS) containing strong turbulent vortices were used as the atmospheric domain for these studies, in addition to an analytical vortex model in 3-D space. By incorporating a third-party radiative transfer code (BACKSCAT 4), user-defined aerosol layers can be incorporated into atmospheric models, simulating attenuation and backscatter in different environmental conditions and altitudes. A hazard detection algorithm is described that uses a two-component spectral model to identify vortex signatures observable from arbitrary angles.
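    The two-component spectral idea can be illustrated with a toy model (not the report's actual algorithm): a wake vortex adds a second, broader component to the Doppler velocity spectrum, so a simple spectral-width test separates vortex returns from the ambient flow. The component parameters and threshold below are illustrative assumptions.

```python
import numpy as np

def doppler_spectrum(v, components):
    """Normalized Doppler spectrum as a sum of Gaussian components
    given as (weight, mean_velocity, width) tuples."""
    s = np.zeros_like(v)
    for w, mu, sigma in components:
        s += w * np.exp(-0.5 * ((v - mu) / sigma) ** 2)
    return s / s.sum()

def spectral_width(v, spectrum):
    """Second spectral moment (standard deviation) of the spectrum."""
    mean = np.sum(v * spectrum)
    return np.sqrt(np.sum((v - mean) ** 2 * spectrum))

v = np.linspace(-15.0, 15.0, 301)   # radial-velocity bins, m/s
# Ambient flow: one narrow component; vortex: an extra broad component.
ambient = doppler_spectrum(v, [(1.0, 2.0, 1.0)])
vortex = doppler_spectrum(v, [(0.7, 2.0, 1.0), (0.3, -6.0, 4.0)])

WIDTH_THRESHOLD = 2.5               # m/s, illustrative
width_a = spectral_width(v, ambient)
width_v = spectral_width(v, vortex)
hazard_detected = width_v > WIDTH_THRESHOLD
```

    A real detector would fit the two components per range gate and track signatures across angles; this sketch only shows why a broadened second component is detectable from any viewing geometry.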

  9. Optical design applications for enhanced illumination performance

    NASA Astrophysics Data System (ADS)

    Gilray, Carl; Lewin, Ian

    1995-08-01

    Nonimaging optical design techniques have been applied in the illumination industry for many years. Recently however, powerful software has been developed which allows accurate simulation and optimization of illumination devices. Wide experience has been obtained in using such design techniques for practical situations. These include automotive lighting where safety is of greatest importance, commercial lighting systems designed for energy efficiency, and numerous specialized applications. This presentation will discuss the performance requirements of a variety of illumination devices. It will further cover design methodology and present a variety of examples of practical applications for enhanced system performance.

  10. Non-Black-Box Simulation from One-Way Functions and Applications to Resettable Security

    DTIC Science & Technology

    2012-11-05

    from 2001, Barak (FOCS’01) introduced a novel non-black-box simulation technique. This technique enabled the construction of new cryptographic...primitives, such as resettably-sound zero-knowledge arguments, that cannot be proven secure using just black-box simulation techniques. The work of Barak ... Barak requires the existence of collision-resistant hash functions, and a very recent result by Bitansky and Paneth (FOCS’12) instead requires the

  11. NASA Handbook for Models and Simulations: An Implementation Guide for NASA-STD-7009

    NASA Technical Reports Server (NTRS)

    Steele, Martin J.

    2013-01-01

    The purpose of this Handbook is to provide technical information, clarification, examples, processes, and techniques to help institute good modeling and simulation practices in the National Aeronautics and Space Administration (NASA). As a companion guide to NASA-STD-7009, Standard for Models and Simulations, this Handbook provides a broader scope of information than may be included in a Standard and promotes good practices in the production, use, and consumption of NASA modeling and simulation products. NASA-STD-7009 specifies what a modeling and simulation activity shall or should do (in the requirements) but does not prescribe how the requirements are to be met, which varies with the specific engineering discipline, or who is responsible for complying with the requirements, which depends on the size and type of project. A guidance document, which is not constrained by the requirements of a Standard, is better suited to address these additional aspects and provide necessary clarification. This Handbook stems from the Space Shuttle Columbia Accident Investigation (2003), which called for Agency-wide improvements in the "development, documentation, and operation of models and simulations" that subsequently elicited additional guidance from the NASA Office of the Chief Engineer to include "a standard method to assess the credibility of the models and simulations." General methods applicable across the broad spectrum of model and simulation (M&S) disciplines were sought to help guide the modeling and simulation processes within NASA and to provide for consistent reporting of M&S activities and analysis results. From this, the standardized process for the M&S activity was developed. The major contents of this Handbook are the implementation details of the general M&S requirements of NASA-STD-7009, including explanations, examples, and suggestions for improving the credibility assessment of an M&S-based analysis.

  12. OPTESIM, a Versatile Toolbox for Numerical Simulation of Electron Spin Echo Envelope Modulation (ESEEM) that Features Hybrid Optimization and Statistical Assessment of Parameters

    PubMed Central

    Sun, Li; Hernandez-Guzman, Jessica; Warncke, Kurt

    2009-01-01

    Electron spin echo envelope modulation (ESEEM) is a technique of pulsed-electron paramagnetic resonance (EPR) spectroscopy. The analysis of ESEEM data to extract information about the nuclear and electronic structure of a disordered (powder) paramagnetic system requires accurate and efficient numerical simulations. A single coupled nucleus of known nuclear g value (gN) and spin I=1 can have up to eight adjustable parameters in the nuclear part of the spin Hamiltonian. We have developed OPTESIM, an ESEEM simulation toolbox, for automated numerical simulation of powder two- and three-pulse one-dimensional ESEEM for arbitrary number (N) and type (I, gN) of coupled nuclei, and arbitrary mutual orientations of the hyperfine tensor principal axis systems for N>1. OPTESIM is implemented in the Matlab environment, and includes the following features: (1) a fast algorithm for translation of the spin Hamiltonian into simulated ESEEM, (2) different optimization methods that can be hybridized to achieve an efficient coarse-to-fine grained search of the parameter space and convergence to a global minimum, (3) statistical analysis of the simulation parameters, which allows the identification of simultaneous confidence regions at specific confidence levels. OPTESIM also includes a geometry-preserving spherical averaging algorithm as default for N>1, and global optimization over multiple experimental conditions, such as the dephasing time for three-pulse ESEEM, and external magnetic field values. Application examples for simulation of 14N coupling (N=1, N=2) in biological and chemical model paramagnets are included. Automated, optimized simulations by using OPTESIM converge on dramatically shorter time scales than manual simulations. PMID:19553148
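    For orientation, the simplest case that OPTESIM generalizes, a single I=1/2 nucleus in the two-pulse experiment, has a closed-form envelope (the standard Mims expression), which a few lines of code can evaluate; the sketch below uses Python rather than OPTESIM's Matlab, and the coupling values are arbitrary illustrations.

```python
import numpy as np

def two_pulse_eseem(tau, w_i, A, B):
    """Mims two-pulse ESEEM envelope for S=1/2 coupled to one I=1/2
    nucleus. w_i: nuclear Zeeman frequency; A, B: secular and
    pseudo-secular hyperfine couplings (angular frequencies, rad/us)."""
    w_a = np.sqrt((w_i + A / 2.0) ** 2 + (B / 2.0) ** 2)  # alpha-manifold freq
    w_b = np.sqrt((w_i - A / 2.0) ** 2 + (B / 2.0) ** 2)  # beta-manifold freq
    k = (w_i * B / (w_a * w_b)) ** 2                      # modulation depth
    return 1.0 - (k / 4.0) * (2.0
                              - 2.0 * np.cos(w_a * tau)
                              - 2.0 * np.cos(w_b * tau)
                              + np.cos((w_a + w_b) * tau)
                              + np.cos((w_a - w_b) * tau))

tau = np.linspace(0.0, 10.0, 2048)   # inter-pulse delay, us
env = two_pulse_eseem(tau, w_i=2 * np.pi * 14.5,
                      A=2 * np.pi * 2.0, B=2 * np.pi * 3.0)
```

    Powder averaging, I=1 quadrupole terms, and multi-nucleus products are what make the general problem hard enough to need OPTESIM's optimization machinery.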

  13. Improving Project Management with Simulation and Completion Distribution Functions

    NASA Technical Reports Server (NTRS)

    Cates, Grant R.

    2004-01-01

    Despite the critical importance of project completion timeliness, management practices in place today remain inadequate for addressing the persistent problem of project completion tardiness. A major culprit in late projects is uncertainty, which most, if not all, projects are inherently subject to. This uncertainty resides in the estimates for activity durations, the occurrence of unplanned and unforeseen events, and the availability of critical resources. In response to this problem, this research developed a comprehensive simulation-based methodology for conducting quantitative project completion time risk analysis. It is called the Project Assessment by Simulation Technique (PAST). This new tool enables project stakeholders to visualize uncertainty or risk, i.e., the likelihood of their project completing late and the magnitude of the lateness, by providing them with a completion time distribution function of their projects. Discrete event simulation is used within PAST to determine the completion distribution function for the project of interest. The simulation is populated with both deterministic and stochastic elements. The deterministic inputs include planned project activities, precedence requirements, and resource requirements. The stochastic inputs include activity duration growth distributions, probabilities for events that can impact the project, and other dynamic constraints that may be placed upon project activities and milestones. These stochastic inputs are based upon past data from similar projects. The time for an entity to complete the simulation network, subject to both the deterministic and stochastic factors, represents the time to complete the project. Repeating the simulation hundreds or thousands of times allows one to create the project completion distribution function. The Project Assessment by Simulation Technique was demonstrated to be effective for the on-going NASA project to assemble the International Space Station. Approximately $500 million per month is being spent on this project, which is scheduled to complete by 2010. NASA project stakeholders participated in determining and managing completion distribution functions produced from PAST. The first result was that project stakeholders improved project completion risk awareness. Secondly, using PAST, mitigation options were analyzed to improve project completion performance and reduce total project cost.
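    The core of such a method can be sketched in a few lines: sample stochastic activity durations over a precedence network many times and read percentiles off the sorted completion times. This is a generic Monte Carlo illustration with a made-up four-activity network, not the PAST tool itself.

```python
import random

# Hypothetical network: name -> (predecessors, (low, mode, high) duration)
# Listed in topological order, which dict iteration preserves.
ACTIVITIES = {
    "A": ([], (2.0, 4.0, 8.0)),
    "B": (["A"], (3.0, 5.0, 10.0)),
    "C": (["A"], (1.0, 2.0, 6.0)),
    "D": (["B", "C"], (4.0, 6.0, 12.0)),
}

def one_run():
    """One simulated project: each activity starts when its
    predecessors finish; return the overall completion time."""
    finish = {}
    for name, (preds, (low, mode, high)) in ACTIVITIES.items():
        start = max((finish[p] for p in preds), default=0.0)
        finish[name] = start + random.triangular(low, high, mode)
    return max(finish.values())

random.seed(1)
runs = sorted(one_run() for _ in range(5000))
p50 = runs[len(runs) // 2]          # median completion time
p90 = runs[int(0.9 * len(runs))]    # 90th-percentile completion time
```

    The sorted `runs` list is an empirical completion distribution function; a real PAST-style model would add resource constraints and discrete risk events on top of the duration distributions.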

  14. Acoustic analysis of aft noise reduction techniques measured on a subsonic tip speed 50.8 cm (twenty inch) diameter fan. [quiet engine program

    NASA Technical Reports Server (NTRS)

    Stimpert, D. L.; Clemons, A.

    1977-01-01

    Sound data obtained during tests of a 50.8 cm diameter, subsonic tip speed, low pressure ratio fan were analyzed. The test matrix was divided into two major investigations: (1) source noise reduction techniques; and (2) aft duct noise reduction with acoustic treatment. The source noise reduction techniques investigated include minimizing second harmonic noise by varying vane/blade ratio, varying spacing, and lowering the Mach number through the vane row to reduce fan broadband noise. Aft-duct treatment was investigated, including flow noise effects, faceplate porosity, rotor-OGV treatment, slant-cell treatment, and splitter simulation with variable-depth treatment on the outer wall and constant-thickness treatment on the inner wall. Variable boundary conditions, such as variation in treatment panel thickness and orientation, and mixed porosity combined with variable thickness, were also examined. Significant results are reported.

  15. Technology development of fabrication techniques for advanced solar dynamic concentrators

    NASA Technical Reports Server (NTRS)

    Richter, Scott W.

    1991-01-01

    The objective of the advanced concentrator program is to develop the technology that will lead to lightweight, highly reflective, accurate, scaleable, and long lived space solar dynamic concentrators. The advanced concentrator program encompasses new and innovative concepts, fabrication techniques, materials selection, and simulated space environmental testing. Fabrication techniques include methods of fabricating the substrates and coating substrate surfaces to produce high quality optical surfaces, acceptable for further coating with vapor deposited optical films. The selected materials to obtain a high quality optical surface include microsheet glass and Eccocoat EP-3 epoxy, with DC-93-500 selected as a candidate silicone adhesive and levelizing layer. The following procedures are defined: cutting, cleaning, forming, and bonding microsheet glass. Procedures are also defined for surface cleaning, and EP-3 epoxy application. The results and analyses from atomic oxygen and thermal cycling tests are used to determine the effects of orbital conditions in a space environment.

  16. Failure detection and isolation analysis of a redundant strapdown inertial measurement unit

    NASA Technical Reports Server (NTRS)

    Motyka, P.; Landey, M.; Mckern, R.

    1981-01-01

    In this study, techniques for failure detection and isolation (FDI) algorithms for a dual fail/operational redundant strapdown inertial navigation system are defined and developed. The FDI techniques chosen include provisions for hard and soft failure detection in the context of flight control and navigation. Analyses were performed to determine error detection and switching levels for the inertial navigation system, which is intended for a conventional takeoff and landing (CTOL) operating environment. False alarms and missed alarms were also investigated for the FDI techniques developed, along with analyses of filters to be used in conjunction with FDI processing. Two specific FDI algorithms were compared: the generalized likelihood test and the edge vector test. A deterministic digital computer simulation was used to compare and evaluate the algorithms and FDI systems.
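    The generalized likelihood test for a redundant sensor array can be sketched as follows: project the measurements into parity space (the left null space of the sensor geometry matrix) and score each single-sensor failure hypothesis. The six-gyro geometry, injected bias, and threshold below are hypothetical, not those of the studied system.

```python
import numpy as np

# Hypothetical geometry: six skewed single-axis gyros measuring a 3-axis rate.
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.707, 0.707, 0.0],
              [0.0, 0.707, 0.707],
              [0.707, 0.0, 0.707]])

# Parity matrix: rows span the left null space of H, so V @ H = 0 and
# the parity vector V @ m is insensitive to the true rate.
U, _, _ = np.linalg.svd(H)
V = U[:, 3:].T                 # 3x6

def glt_isolate(m, threshold=1e-3):
    """Generalized likelihood test: compute the parity residual and a
    score DF_j for each single-sensor failure hypothesis."""
    p = V @ m
    scores = (V.T @ p) ** 2 / np.sum(V ** 2, axis=0)
    j = int(np.argmax(scores))
    return (j, scores[j]) if scores[j] > threshold else (None, scores[j])

true_rate = np.array([0.1, -0.2, 0.05])
m = H @ true_rate
m[4] += 0.5                    # inject a bias failure on sensor index 4
failed, score = glt_isolate(m)
```

    In a noise-free run like this the score of the failed sensor dominates; setting the threshold against noise statistics is exactly the false-alarm/missed-alarm trade-off the study analyzes.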

  17. Invited article: Dielectric material characterization techniques and designs of high-Q resonators for applications from micro to millimeter-waves frequencies applicable at room and cryogenic temperatures.

    PubMed

    Le Floch, Jean-Michel; Fan, Y; Humbert, Georges; Shan, Qingxiao; Férachou, Denis; Bara-Maillet, Romain; Aubourg, Michel; Hartnett, John G; Madrangeas, Valerie; Cros, Dominique; Blondy, Jean-Marc; Krupka, Jerzy; Tobar, Michael E

    2014-03-01

    Dielectric resonators are key elements in many applications in micro to millimeter wave circuits, including ultra-narrow band filters and frequency-determining components for precision frequency synthesis. Distributed-layered and bulk low-loss crystalline and polycrystalline dielectric structures have become very important for building these devices. Proper design requires careful electromagnetic characterization of low-loss material properties. This includes exact simulation with precision numerical software and precise measurements of resonant modes. For example, we have developed the Whispering Gallery mode technique for microwave applications, which has now become the standard for characterizing low-loss structures. This paper presents some of the most common characterization techniques used in the micro to millimeter wave regime at room and cryogenic temperatures for designing high-Q dielectric loaded cavities.

  18. Application of Contact Mode AFM to Manufacturing Processes

    NASA Astrophysics Data System (ADS)

    Giordano, Michael A.; Schmid, Steven R.

    A review of the application of contact mode atomic force microscopy (AFM) to manufacturing processes is presented. A brief introduction to common experimental techniques including hardness, scratch, and wear testing is presented, with a discussion of challenges in the extension of manufacturing scale investigations to the AFM. Differences between the macro- and nanoscale tests are discussed, including indentation size effects and their importance in the simulation of processes such as grinding. The basics of lubrication theory are presented and friction force microscopy is introduced as a method of investigating metal forming lubrication on the nano- and microscales that directly simulates tooling/workpiece asperity interactions. These concepts are followed by a discussion of their application to macroscale industrial manufacturing processes, and direct correlations are made.

  19. Study of atmospheric dynamics and pollution in the coastal area of English Channel using clustering technique

    NASA Astrophysics Data System (ADS)

    Sokolov, Anton; Dmitriev, Egor; Delbarre, Hervé; Augustin, Patrick; Gengembre, Cyril; Fourmenten, Marc

    2016-04-01

    The problem of atmospheric contamination by principal air pollutants was considered in the industrialized coastal region of the English Channel around Dunkirk, which is influenced by northern European metropolitan areas. Nested MESO-NH models were used to simulate the local atmospheric dynamics and to compute Lagrangian backward trajectories online with 15-minute temporal resolution and horizontal resolution down to 500 m. The one-month mesoscale numerical simulation was coupled with local pollution measurements of volatile organic compounds, particulate matter, ozone, sulphur dioxide and nitrogen oxides. Principal atmospheric pathways were determined by applying a clustering technique to the simulated backward trajectories. Six clusters were obtained that describe the local atmospheric dynamics: four correspond to winds blowing through the English Channel, one to flow coming from the south, and the largest to low wind speeds; this last cluster consists mostly of sea-breeze events. Analysis of the meteorological data and pollution measurements made it possible to relate the principal atmospheric pathways to local air contamination events. Contamination events were shown to be connected mostly with a channelling of pollution from local sources and low-turbulence states of the local atmosphere.
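    The clustering step can be illustrated with a minimal k-means applied to synthetic straight-line backward trajectories; the study's trajectories came from MESO-NH, and the three transport headings and noise level here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def synthetic_trajectories(n, heading):
    """n noisy straight backward trajectories of 8 hourly steps,
    flattened into 16-dimensional feature vectors for clustering."""
    steps = np.arange(1, 9)[None, :, None]            # hourly upwind distance
    direction = np.array([np.cos(heading), np.sin(heading)])
    base = steps * direction                          # (1, 8, 2) drift
    return (base + 0.3 * rng.standard_normal((n, 8, 2))).reshape(n, -1)

# Three invented transport regimes (east, north, west headings).
X = np.vstack([synthetic_trajectories(40, h) for h in (0.0, np.pi / 2, np.pi)])

def kmeans(X, k, iters=50):
    """Plain k-means; keeps a stale center if a cluster empties."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

labels, centers = kmeans(X, 3)
```

    Each cluster center is itself a mean trajectory, which is what lets clusters be read as "principal atmospheric pathways" and cross-tabulated against pollution events.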

  20. A Measurement and Simulation Based Methodology for Cache Performance Modeling and Tuning

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    We present a cache performance modeling methodology that facilitates the tuning of uniprocessor cache performance for applications executing on shared memory multiprocessors by accurately predicting the effects of source code level modifications. Measurements on a single processor are initially used for identifying parts of code where cache utilization improvements may significantly impact the overall performance. Cache simulation based on trace-driven techniques can be carried out without gathering detailed address traces. The minimal runtime information needed for modeling cache performance of a selected code block includes: base virtual addresses of arrays, virtual addresses of variables, and loop bounds for that code block. The rest of the information is obtained from the source code. We show that the cache performance predictions are as reliable as those obtained through trace-driven simulations. This technique is particularly helpful for exploring various "what-if" scenarios regarding the cache performance impact of alternative code structures. We explain and validate this methodology using a simple matrix-matrix multiplication program. We then apply this methodology to predict and tune the cache performance of two realistic scientific applications taken from the Computational Fluid Dynamics (CFD) domain.
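    A miss-count model of this kind can be sketched with a minimal direct-mapped cache fed a synthetic address stream built from just array base addresses and loop bounds; the matrix-traversal example below mirrors the paper's matrix-multiplication validation in spirit only, with illustrative cache parameters.

```python
class DirectMappedCache:
    """Minimal direct-mapped cache model: counts misses for a stream
    of byte addresses (illustrative, not the paper's tool)."""
    def __init__(self, size_bytes=32 * 1024, line_bytes=64):
        self.line = line_bytes
        self.nsets = size_bytes // line_bytes
        self.tags = [None] * self.nsets
        self.misses = 0
        self.accesses = 0

    def access(self, addr):
        self.accesses += 1
        block = addr // self.line
        idx = block % self.nsets
        if self.tags[idx] != block:   # cold or conflict miss
            self.tags[idx] = block
            self.misses += 1

def sweep(n=256, elem=8, row_major=True):
    """Miss rate for traversing an n x n array of elem-byte entries."""
    c = DirectMappedCache()
    for i in range(n):
        for j in range(n):
            r, col = (i, j) if row_major else (j, i)
            c.access((r * n + col) * elem)
    return c.misses / c.accesses

row_rate = sweep(row_major=True)    # sequential: ~1 miss per cached line
col_rate = sweep(row_major=False)   # strided: conflict misses dominate
```

    Comparing `row_rate` and `col_rate` is exactly the kind of what-if question the methodology targets: predicting how a loop-order change affects cache behavior without a full address trace.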

Top