#### Sample records for addition computer simulations

1. Calculators and Computers: Graphical Addition.

ERIC Educational Resources Information Center

Spero, Samuel W.

1978-01-01

A computer program is presented that generates problem sets involving sketching graphs of trigonometric functions using graphical addition. The students use calculators to sketch the graphs and a computer solution is used to check it. (MP)
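The checking step described here is easy to reproduce. The sketch below (modern Python rather than the calculator/BASIC setting of the original article; the function name and sample points are illustrative) tabulates the two component curves and their graphical sum so a hand-drawn sketch of y = a·sin x + b·cos x can be checked at sample points:

```python
import math

def addition_table(a, b, xs):
    """Tabulate y1 = a*sin(x), y2 = b*cos(x) and their graphical sum.

    Toy illustration (not the original program): the student sketches
    y1 + y2 point by point, and this table plays the role of the
    'computer solution' used to check the sketch.
    """
    rows = []
    for x in xs:
        y1 = a * math.sin(x)
        y2 = b * math.cos(x)
        rows.append((x, y1, y2, y1 + y2))
    return rows

# Check a few points of y = sin(x) + 2*cos(x)
table = addition_table(1.0, 2.0, [0.0, math.pi / 2, math.pi])
```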

2. Probabilistic Fatigue: Computational Simulation

NASA Technical Reports Server (NTRS)

Chamis, Christos C.

2002-01-01

Fatigue is a primary consideration in the design of aerospace structures for long-term durability and reliability. Several types of fatigue must be considered in the design, including low-cycle, high-cycle, and combined fatigue under different cyclic loading conditions - for example, mechanical, thermal, and erosive. The traditional approach to evaluating fatigue has been to conduct many tests in the various service-environment conditions that the component will be subjected to in a specific design. This approach is reasonable and robust for that specific design; however, it is time consuming and costly, and in general must be repeated for designs in different operating conditions. Recent research has demonstrated that the fatigue of structural components/structures can be evaluated by computational simulation based on a novel paradigm. The main features of this paradigm are progressive telescoping scale mechanics, progressive scale substructuring, and progressive structural fracture, encompassed by probabilistic simulation. These generic features probabilistically telescope local material-point damage all the way up to the structural component, and probabilistically decompose structural loads and boundary conditions all the way down to the material point. Additional features include a multifactor interaction model that probabilistically describes the evolution of material properties and changes due to cyclic loads and other mutually interacting effects. The objective of the proposed paper is to describe this novel paradigm of computational simulation and to present typical fatigue results for structural components. Additionally, the advantages, versatility, and inclusiveness of computational simulation versus testing are discussed, and guidelines for complementing simulated results with strategic testing are outlined. Typical results are shown for computational simulation of fatigue in metallic composite structures to demonstrate the

3. Computationally efficient multibody simulations

NASA Technical Reports Server (NTRS)

Ramakrishnan, Jayant; Kumar, Manoj

1994-01-01

Computationally efficient approaches to solving the dynamics of multibody systems are presented in this work. The computational efficiency derives from both the algorithmic and the implementation standpoint. Order(n) approaches provide a new formulation of the equations of motion that eliminates the assembly and numerical inversion of a system mass matrix, as required by conventional algorithms. Computational efficiency is also gained in the implementation phase through symbolic processing and parallel implementation of these equations. Comparison of this algorithm with existing multibody simulation programs illustrates the increased computational efficiency.
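The paper's O(n) recursive multibody formulation is not reproduced here, but the payoff of exploiting chain structure instead of assembling and inverting a full matrix can be illustrated on the simplest analogous problem: a tridiagonal "chain-coupled" linear system, solvable in O(n) by forward elimination and back substitution (the Thomas algorithm) rather than by general O(n^3) factorization. This is an analogy only, under that stated assumption, not the multibody algorithm itself:

```python
def solve_tridiagonal(a, b, c, d):
    """Thomas algorithm: solve a tridiagonal system in O(n).

    a: sub-diagonal (len n-1), b: diagonal (len n),
    c: super-diagonal (len n-1), d: right-hand side (len n).
    Analogy only -- the O(n) multibody algorithms in the paper use
    recursive body-by-body elimination, not this exact solver.
    """
    n = len(b)
    cp = [0.0] * n   # modified super-diagonal
    dp = [0.0] * n   # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        m = b[i] - a[i - 1] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

Each unknown is visited twice, so the work grows linearly with the length of the chain, which is the essence of the O(n) claim.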

4. Computer Modeling and Simulation

SciTech Connect

Pronskikh, V. S.

2014-05-09

Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance and have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to model’s relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or model’s general inadequacy to its target should be blamed in the case of the model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling and validation, which shares the common epistemology with experimentation, to simulation. To explain reasons of their intermittent entanglement I propose a weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation generally applicable in practice and based on differences in epistemic strategies and scopes

5. Accelerator simulation using computers

SciTech Connect

Lee, M.; Zambre, Y.; Corbett, W.

1992-01-01

Every accelerator or storage ring system consists of a charged particle beam propagating through a beam line. Although a number of computer programs exist that simulate the propagation of a beam in a given beam line, only a few provide the capabilities for designing, commissioning and operating the beam line. This paper shows how a "multi-track" simulation and analysis code can be used for these applications.
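The multi-track code itself is not reproduced here, but the backbone of any beam-line simulation of this kind is linear transfer-matrix tracking: a particle state (x, x') is pushed through the beam line by multiplying 2x2 element matrices. A minimal sketch using standard thin-lens optics (assumed, generic formalism, not the actual code from the paper):

```python
# Thin-lens transfer-matrix tracking: each beam-line element acts on a
# particle state vector (x, x') as a 2x2 matrix.

def drift(L):
    """Field-free drift of length L."""
    return [[1.0, L], [0.0, 1.0]]

def thin_lens(f):
    """Thin focusing element of focal length f (thin-lens quadrupole)."""
    return [[1.0, 0.0], [-1.0 / f, 1.0]]

def apply(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

def track(elements, v):
    """Push the state v through the elements in order."""
    for M in elements:
        v = apply(M, v)
    return v

# A parallel ray through a lens of focal length f crosses the axis
# after a drift of length f.
ray = track([thin_lens(2.0), drift(2.0)], [1.0, 0.0])
```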

7. Computer Simulation of Diffraction Patterns.

ERIC Educational Resources Information Center

Dodd, N. A.

1983-01-01

Describes an Apple computer program (listing available from author) which simulates Fraunhofer and Fresnel diffraction using vector addition techniques (vector chaining) and allows the user to experiment with differently shaped multiple apertures. Graphics outputs include vector resultants, phase differences, diffraction patterns, and the Cornu spiral…
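The vector-chaining idea is easy to demonstrate: the far-field amplitude from several point-like slits is the resultant of unit phasors, one per slit, each rotated by the geometric phase delay to the next slit. A minimal sketch of the Fraunhofer case (toy parameters; not the Apple program itself):

```python
import cmath
import math

def slit_intensity(n_slits, spacing, wavelength, theta):
    """Far-field (Fraunhofer) intensity for n point-like slits by
    phasor chaining: sum unit phasors separated by the phase delay
    between adjacent slits, then take |resultant|^2.
    """
    delta = 2 * math.pi * spacing * math.sin(theta) / wavelength
    resultant = sum(cmath.exp(1j * k * delta) for k in range(n_slits))
    return abs(resultant) ** 2

# Central maximum: all phasors aligned, so intensity = n_slits**2.
I0 = slit_intensity(4, 2e-6, 500e-9, 0.0)
```

At the first minimum the four phasors chain into a closed square and the resultant vanishes, which is exactly the picture the program draws.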

8. Computer-simulated phacoemulsification

Laurell, Carl-Gustaf; Nordh, Leif; Skarman, Eva; Andersson, Mats; Nordqvist, Per

2001-06-01

Phacoemulsification makes the cataract operation easier for the patient but involves a demanding technique for the surgeon. It is therefore important to increase the quality of surgical training in order to shorten the learning period for the beginner, which should diminish the risks to the patient. We are developing a computer-based simulator for training in phacoemulsification. The simulator is built on a platform that can serve as the basis for several different training simulators. A prototype has been built and partly tested by experienced surgeons.

9. Parallel Computing for Brain Simulation.

PubMed

Pastur-Romay, L A; Porto-Pazos, A B; Cedrón, F; Pazos, A

2016-11-04

The human brain is the most complex system in the known universe, yet it remains the least understood. It gives human beings extraordinary capacities, but we do not yet understand how and why most of these capacities arise. For decades, researchers have tried to reproduce them with computers: on one hand, to help understand the nervous system; on the other, to process data more efficiently than before. The aim is to make computers process information the way the brain does. Major technological developments and large multidisciplinary projects have made possible the first simulations with a number of neurons comparable to that of the human brain. This paper presents an updated review of the main research projects attempting to simulate and/or emulate the human brain. They employ different types of computational models using parallel computing: digital models, analog models and hybrid models. The review covers the current applications of these works as well as future trends. We have reviewed projects seeking a step forward in neuroscience and others seeking breakthroughs in computer science (neuromorphic hardware, machine learning techniques). We summarize their most outstanding characteristics and present the latest advances and future plans. In addition, this review stresses the importance of considering more than neurons: computational models of the brain should include glial cells, given the proven importance of astrocytes in information processing.

10. Computer simulation of microstructure

Xu, Ping; Morris, J. W.

1992-11-01

The microstructure that results from a martensitic transformation is largely determined by the elastic strain that develops as martensite particles grow and interact. To study the development of microstructure, it is useful to have computer simulation models that mimic the process. One such model is a finite-element model in which the transforming body is divided into elementary cells that transform when it is energetically favorable to do so. Using the linear elastic theory, the elastic energy of an arbitrary distribution of transformed cells can be calculated, and the elastic strain field can be monitored as the transformation proceeds. In the present article, a model of this type is developed and evaluated by testing its ability to generate the preferred configurations of isolated martensite particles, which can be predicted analytically from the linear elastic theory. Both two- and three-dimensional versions of the model are used. The computer model is in good agreement with analytic theory when the latter predicts single-variant martensite particles. The three-dimensional model also generates twinned martensite in reasonable agreement with the analytic predictions when the fractions of the two variants in the particle are near 0.5. It is less successful in reproducing twinned martensites when one variant is dominant; however, in this case, it does produce unusual morphologies, such as “butterfly martensite,” that have been observed experimentally. Neither the analytic theory nor the computer simulation predicts twinned martensites in the two-dimensional transformations considered here, revealing an inherent limitation of studies that are restricted to two dimensions.

11. Computer Maintenance Operations Center (CMOC), additional computer support equipment ...

Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

Computer Maintenance Operations Center (CMOC), additional computer support equipment - Beale Air Force Base, Perimeter Acquisition Vehicle Entry Phased-Array Warning System, Technical Equipment Building, End of Spencer Paul Road, north of Warren Shingle Road (14th Street), Marysville, Yuba County, CA

12. Decreased length of stay after addition of healthcare provider in emergency department triage: a comparison between computer-simulated and real-world interventions

PubMed Central

Al-Roubaie, Abdul Rahim; Goldlust, Eric Jonathan

2013-01-01

Objective (1) To determine the effects of adding a provider in triage on average length of stay (LOS) and the proportion of patients with LOS >6 h. (2) To assess the accuracy of computer simulation in predicting the magnitude of such effects on these metrics. Methods A group-level quasi-experimental trial comparing the St. Louis Veterans Affairs Medical Center emergency department (1) before intervention, (2) after institution of a provider in triage, and discrete event simulation (DES) models of similar (3) ‘before’ and (4) ‘after’ conditions. The outcome measures were daily mean LOS and the percentage of patients with LOS >6 h. Results The DES-modelled intervention predicted a decrease in the proportion of patients with LOS >6 h from 19.0% to 13.1%, and a drop in the daily mean LOS from 249 to 200 min (p<0.0001). Following the (actual) intervention, the proportion of patients with LOS >6 h decreased from 19.9% to 14.3% (p<0.0001), with the daily mean LOS decreasing from 247 to 210 min (p<0.0001). Conclusion Physician and mid-level provider coverage at triage significantly reduced emergency department LOS in this setting. DES accurately predicted the magnitude of this effect. These results suggest further work on the generalisability of triage providers and on the utility of DES for predicting the quantitative effects of process changes. PMID:22398851
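In its barest form, the "before" half of such a model reduces to a multi-server queue. The toy below is an assumed M/M/c simplification with illustrative rates, far coarser than the paper's DES model, but it shows the mechanism by which an added provider cuts LOS:

```python
import heapq
import random

def simulate_ed(n_patients, arrival_rate, service_rate, n_providers, seed=1):
    """Toy discrete-event model of an ED as an M/M/c queue (an assumed
    simplification; the paper's DES model is far more detailed).
    Returns the mean length of stay (waiting + treatment)."""
    rng = random.Random(seed)
    t = 0.0
    free = [0.0] * n_providers          # next-free times of the providers
    heapq.heapify(free)
    total_los = 0.0
    for _ in range(n_patients):
        t += rng.expovariate(arrival_rate)   # next arrival time
        service = rng.expovariate(service_rate)
        start = max(t, heapq.heappop(free))  # wait for the first free provider
        heapq.heappush(free, start + service)
        total_los += start + service - t
    return total_los / n_patients

# Same arrival stream (same seed), one extra provider: LOS drops.
los_before = simulate_ed(5000, arrival_rate=1.0, service_rate=0.7, n_providers=2)
los_after = simulate_ed(5000, arrival_rate=1.0, service_rate=0.7, n_providers=3)
```

Because both runs share the random seed, the comparison is a paired experiment, a common variance-reduction trick in DES studies.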

13. Computational Process Modeling for Additive Manufacturing

NASA Technical Reports Server (NTRS)

Bagg, Stacey; Zhang, Wei

2014-01-01

Computational process and material modeling of powder-bed additive manufacturing of IN 718. Goals: optimize material build parameters with reduced time and cost through modeling; increase understanding of build properties; increase the reliability of builds; decrease the time to adoption of the process for critical hardware; and potentially decrease post-build heat treatments. Approach: conduct single-track and coupon builds at various build parameters; record build-parameter information and QM Meltpool data; refine the Applied Optimization powder-bed AM process model using these data; report thermal modeling results; conduct metallography of build samples; calibrate STK models using the metallography findings; run STK models using AO thermal profiles and report the STK modeling results; and validate the modeling with an additional build. Findings: photodiode intensity measurements are highly linear with power input; melt-pool intensity is highly correlated with melt-pool size; and melt-pool size and intensity increase with power. Applied Optimization will use the data to develop a powder-bed additive manufacturing process model.

14. Grid computing and biomolecular simulation.

PubMed

Woods, Christopher J; Ng, Muan Hong; Johnston, Steven; Murdock, Stuart E; Wu, Bing; Tai, Kaihsu; Fangohr, Hans; Jeffreys, Paul; Cox, Simon; Frey, Jeremy G; Sansom, Mark S P; Essex, Jonathan W

2005-08-15

Biomolecular computer simulations are now widely used not only in an academic setting to understand the fundamental role of molecular dynamics on biological function, but also in the industrial context to assist in drug design. In this paper, two applications of Grid computing to this area will be outlined. The first, involving the coupling of distributed computing resources to dedicated Beowulf clusters, is targeted at simulating protein conformational change using the Replica Exchange methodology. In the second, the rationale and design of a database of biomolecular simulation trajectories is described. Both applications illustrate the increasingly important role modern computational methods are playing in the life sciences.
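The Replica Exchange methodology mentioned above has a compact core. The toy below runs Metropolis sampling on a 1D double-well energy at several temperatures with periodic swap attempts between neighbouring replicas (the potential and parameters are assumed for illustration; real biomolecular applications distribute the replicas across Grid resources):

```python
import math
import random

def replica_exchange(steps, betas, seed=2):
    """Minimal parallel-tempering sketch on the double-well energy
    U(x) = (x^2 - 1)^2 (an assumed toy potential, standing in for the
    protein systems in the paper).  Each replica performs Metropolis
    moves at its own inverse temperature; every 10 steps a pair of
    neighbouring replicas attempts to swap configurations."""
    rng = random.Random(seed)
    U = lambda x: (x * x - 1.0) ** 2
    xs = [0.0 for _ in betas]
    swaps = 0
    for step in range(steps):
        for i, beta in enumerate(betas):
            trial = xs[i] + rng.uniform(-0.5, 0.5)
            du = U(trial) - U(xs[i])
            if rng.random() < math.exp(min(0.0, -beta * du)):
                xs[i] = trial
        if step % 10 == 0:                       # swap attempt
            i = rng.randrange(len(betas) - 1)
            d = (betas[i] - betas[i + 1]) * (U(xs[i]) - U(xs[i + 1]))
            if rng.random() < math.exp(min(0.0, d)):
                xs[i], xs[i + 1] = xs[i + 1], xs[i]
                swaps += 1
    return xs, swaps
```

The hot replicas cross the barrier between wells easily and feed those crossings to the cold replicas through the swaps, which is why the method helps with conformational change.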

15. Software simulator for multiple computer simulation system

NASA Technical Reports Server (NTRS)

1983-01-01

A description is given of the structure and use of a computer program that simulates the operation of a parallel processor simulation system. The program is part of an investigation to determine algorithms that are suitable for simulating continuous systems on a parallel processor configuration. The simulator is designed to accurately simulate the problem-solving phase of a simulation study. Care has been taken to ensure the integrity and correctness of data exchanges and to correctly sequence periods of computation and periods of data exchange. It is pointed out that the functions performed during a problem-setup phase or a reset phase are not simulated. In particular, there is no attempt to simulate the downloading process that loads object code into the local, transfer, and mapping memories of processing elements or the memories of the run control processor and the system control processor. The main program of the simulator carries out some problem-setup functions of the system control processor in that it requests the user to enter values for simulation system parameters and problem parameters. The method by which these values are transferred to the other processors, however, is not simulated.

16. Computer Simulation of Mutagenesis.

ERIC Educational Resources Information Center

North, J. C.; Dent, M. T.

1978-01-01

A FORTRAN program is described which simulates point-substitution mutations in the DNA strands of typical organisms. Its objective is to help students to understand the significance and structure of the genetic code, and the mechanisms and effect of mutagenesis. (Author/BB)

17. Computer Simulations: An Integrating Tool.

ERIC Educational Resources Information Center

Bilan, Bohdan J.

This introduction to computer simulations as an integrated learning experience reports on their use with students in grades 5 through 10 using commercial software packages such as SimCity, SimAnt, SimEarth, and Civilization. Students spent an average of 60 hours with the simulation games and reported their experiences each week in a personal log.…

18. Composite Erosion by Computational Simulation

NASA Technical Reports Server (NTRS)

Chamis, Christos C.

2006-01-01

Composite degradation is evaluated by computational simulation when the erosion degradation occurs on a ply-by-ply basis and the degrading medium (device) is normal to the ply. The computational simulation is performed with a multifactor interaction model and an available multi-scale, multi-physics computer code. The erosion process degrades both the fiber and the matrix simultaneously in the same slice (ply). Both the fiber volume ratio and the matrix volume ratio approach zero, while the void volume ratio increases as the ply degrades. The multifactor interaction model simulates the erosion degradation, provided that the exponents and factor ratios are selected judiciously. Results obtained by computational composite mechanics show that most composite characterization properties degrade monotonically and approach "zero" as the ply degrades completely.
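The multifactor interaction model has a commonly published product form, which the sketch below assumes; the factor values and exponent are illustrative, not taken from the paper:

```python
def mfim(p0, factors):
    """Multifactor interaction model in its commonly published form
    (assumed here): P = P0 * prod_i [(A_iF - A_i)/(A_iF - A_i0)]^a_i,
    where A_i is the current value of factor i, A_i0 its reference
    value, A_iF its final (ultimate) value, and a_i an empirical
    exponent.  As any factor approaches its final value, the property
    degrades toward zero."""
    p = p0
    for current, ref, final, expo in factors:
        p *= ((final - current) / (final - ref)) ** expo
    return p

# Example: a modulus degraded by temperature approaching a final
# temperature T_F (illustrative numbers only).
E = mfim(10.0e6, [(400.0, 70.0, 500.0, 0.5)])
```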

19. Computational Process Modeling for Additive Manufacturing (OSU)

NASA Technical Reports Server (NTRS)

Bagg, Stacey; Zhang, Wei

2015-01-01

Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high-cost, or long-schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for spaceflight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

20. Computed tomography characterisation of additive manufacturing materials.

PubMed

Bibb, Richard; Thompson, Darren; Winder, John

2011-06-01

Additive manufacturing, covering processes frequently referred to as rapid prototyping and rapid manufacturing, provides new opportunities in the manufacture of highly complex and custom-fitting medical devices and products. Whilst many medical applications of AM have been explored and the physical properties of the resulting parts have been studied, the characterisation of AM materials in computed tomography has not been explored. The aim of this study was to determine the CT number of commonly used AM materials. There are many potential applications of the information resulting from this study in the design and manufacture of wearable medical devices, implants, prostheses and medical imaging test phantoms. A selection of 19 AM material samples were CT scanned and the resultant images analysed to ascertain the materials' CT numbers and appearance in the images. It was found that some AM materials have CT numbers very similar to those of human tissues; that FDM, SLA and SLS produce samples that appear uniform on CT images; and that 3D-printed materials show a variation in internal structure.

1. Computer simulation of astrophysical plasmas

NASA Technical Reports Server (NTRS)

Max, Claire E.

1991-01-01

The role of sophisticated numerical models and simulations in the field of plasma astrophysics is discussed. The need for an iteration between microphysics and macrophysics in order for astrophysical plasma physics to produce quantitative results that can be related to astronomical data is stressed. A discussion on computational requirements for simulations of astrophysical plasmas contrasts microscopic plasma simulations with macroscopic system models. An overview of particle-in-cell simulations (PICS) is given and two examples of PICS of astrophysical plasma are discussed including particle acceleration by collisionless shocks in relativistic plasmas and magnetic field reconnection in astrophysical plasmas.

2. Simulating chemistry using quantum computers.

PubMed

Kassal, Ivan; Whitfield, James D; Perdomo-Ortiz, Alejandro; Yung, Man-Hong; Aspuru-Guzik, Alán

2011-01-01

The difficulty of simulating quantum systems, well known to quantum chemists, prompted the idea of quantum computation. One can avoid the steep scaling associated with the exact simulation of increasingly large quantum systems on conventional computers, by mapping the quantum system to another, more controllable one. In this review, we discuss to what extent the ideas in quantum computation, now a well-established field, have been applied to chemical problems. We describe algorithms that achieve significant advantages for the electronic-structure problem, the simulation of chemical dynamics, protein folding, and other tasks. Although theory is still ahead of experiment, we outline recent advances that have led to the first chemical calculations on small quantum information processors.
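The advantage of splitting a Hamiltonian into simple pieces, central to the quantum-simulation algorithms reviewed here, can be checked classically by comparing a first-order Trotter product with the exact propagator for a single spin. A minimal numerical sketch, where H = a·sx + b·sz is an assumed toy Hamiltonian built from Pauli matrices:

```python
import math

def matmul(A, B):
    """2x2 complex matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def rot(axis, theta):
    """exp(-i*theta*sigma_axis), using exp(-i t s) = cos(t) I - i sin(t) s."""
    c, s = math.cos(theta), -1j * math.sin(theta)
    if axis == "x":
        return [[c, s], [s, c]]
    return [[c + s, 0], [0, c - s]]           # axis == "z"

def trotter(a, b, t, n):
    """First-order Trotter approximation of exp(-i (a*sx + b*sz) t)."""
    step = matmul(rot("x", a * t / n), rot("z", b * t / n))
    U = [[1, 0], [0, 1]]
    for _ in range(n):
        U = matmul(U, step)
    return U

def exact(a, b, t):
    """Exact propagator: exp(-iHt) = cos(wt) I - i sin(wt) H/w, w=|H|."""
    w = math.hypot(a, b)
    c, s = math.cos(w * t), math.sin(w * t)
    return [[c - 1j * s * b / w, -1j * s * a / w],
            [-1j * s * a / w, c + 1j * s * b / w]]

def error(a, b, t, n):
    U, V = trotter(a, b, t, n), exact(a, b, t)
    return max(abs(U[i][j] - V[i][j]) for i in range(2) for j in range(2))
```

The error shrinks roughly as 1/n, the hallmark of first-order Trotterization; higher-order splittings do better at the cost of more gates.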

3. Computer simulation of the threshold sensitivity determinations

NASA Technical Reports Server (NTRS)

Gayle, J. B.

1974-01-01

A computer simulation study was carried out to evaluate various methods for determining threshold stimulus levels for impact sensitivity tests. In addition, the influence of a number of variables (initial stimulus level, particular stimulus response curve, and increment size) on the apparent threshold values and on the corresponding population response levels was determined. Finally, a critical review of previous assumptions regarding the stimulus response curve for impact testing is presented in the light of the simulation results.
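The kind of experiment being simulated can be reproduced with a short staircase ("up-and-down") simulation against an assumed logistic stimulus-response curve; all parameters below are illustrative, not taken from the paper:

```python
import math
import random

def up_down_test(true_threshold, slope, start, step, n_trials, seed=3):
    """Bruceton-style up-and-down staircase against an assumed logistic
    stimulus-response curve (the kind of procedure the paper simulates).
    Raise the stimulus after a non-response, lower it after a response;
    the average tested level estimates the 50% threshold."""
    rng = random.Random(seed)
    level = start
    levels = []
    for _ in range(n_trials):
        levels.append(level)
        p = 1.0 / (1.0 + math.exp(-slope * (level - true_threshold)))
        if rng.random() < p:       # specimen responded: step down
            level -= step
        else:                      # no response: step up
            level += step
    return sum(levels) / len(levels)

estimate = up_down_test(true_threshold=10.0, slope=1.5, start=7.0,
                        step=1.0, n_trials=400)
```

Re-running this with different starting levels, increment sizes, and response-curve shapes is exactly the kind of sensitivity sweep the abstract describes.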

4. Computer Simulation Of Cyclic Oxidation

NASA Technical Reports Server (NTRS)

Probst, H. B.; Lowell, C. E.

1990-01-01

Computer model developed to simulate cyclic oxidation of metals. With relatively few input parameters, kinetics of cyclic oxidation simulated for wide variety of temperatures, durations of cycles, and total numbers of cycles. Program written in BASICA and run on any IBM-compatible microcomputer. Used in variety of ways to aid experimental research. In minutes, effects of duration of cycle and/or number of cycles on oxidation kinetics of material surveyed.
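A model of this kind can be caricatured in a few lines. The sketch below is an assumed toy (parabolic scale growth during each hot dwell, a fixed spall fraction on each cool-down), not the BASICA program described above:

```python
import math

def cyclic_oxidation(kp, spall_frac, n_cycles, dt=1.0):
    """Toy cyclic-oxidation model (assumed form, loosely in the spirit
    of COSP-type models).  Each cycle: parabolic scale growth during
    the hot dwell, then a fixed fraction of the scale spalls on
    cool-down.  Returns the retained-scale-thickness history."""
    scale = 0.0
    history = []
    for _ in range(n_cycles):
        # parabolic growth: scale^2 grows linearly with hot time
        scale = math.sqrt(scale * scale + kp * dt)
        scale *= (1.0 - spall_frac)      # spallation on thermal cycling
        history.append(scale)
    return history

h = cyclic_oxidation(kp=0.01, spall_frac=0.05, n_cycles=500)
```

Even this caricature reproduces the qualitative behaviour such programs survey: the retained scale saturates once per-cycle growth balances per-cycle spallation.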

5. Computer simulation for the growing probability of additional offspring with an advantageous reversal allele in the decoupled continuous-time mutation-selection model

Gill, Wonpyong

2016-01-01

This study calculated the growing probability of additional offspring with the advantageous reversal allele in an asymmetric sharply-peaked landscape using the decoupled continuous-time mutation-selection model. The growing probability was calculated for various population sizes, N, sequence lengths, L, selective advantages, s, fitness parameters, k, and measuring parameters, C. The saturated growing probability in the stochastic region was approximately the effective selective advantage, s*, when C≫1/Ns* and s*≪1. The present study suggests that the growing probability in the stochastic region in the decoupled continuous-time mutation-selection model can be described using the theoretical formula for the growing probability in the Moran two-allele model. The selective advantage ratio, which represents the ratio of the effective selective advantage to the selective advantage, does not depend on the population size, selective advantage, measuring parameter or fitness parameter; instead, the selective advantage ratio decreases with increasing sequence length.
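The Moran-model formula invoked above is simple to state and to check by direct simulation. Both the closed form and the Monte Carlo estimate below are standard results for the two-allele Moran process, not the paper's decoupled mutation-selection model:

```python
import random

def moran_fixation_theory(N, s):
    """Fixation probability of a single mutant with relative fitness
    r = 1 + s in the Moran two-allele model: (1 - 1/r) / (1 - 1/r^N)."""
    r = 1.0 + s
    return (1.0 - 1.0 / r) / (1.0 - r ** (-N))

def moran_fixation_mc(N, s, trials, seed=4):
    """Monte Carlo estimate: run the birth-death Moran process from a
    single mutant until fixation or extinction."""
    rng = random.Random(seed)
    r = 1.0 + s
    fixed = 0
    for _ in range(trials):
        k = 1                                   # current mutant count
        while 0 < k < N:
            # reproducer chosen proportional to fitness
            if rng.random() < r * k / (r * k + (N - k)):
                if rng.random() < (N - k) / N:  # offspring replaces a wild type
                    k += 1
            else:
                if rng.random() < k / N:        # offspring replaces a mutant
                    k -= 1
        fixed += k == N
    return fixed / trials

p_theory = moran_fixation_theory(50, 0.1)
p_mc = moran_fixation_mc(50, 0.1, 2000)
```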

6. Computer simulation: A modern day crystal ball?

NASA Technical Reports Server (NTRS)

Sham, Michael; Siprelle, Andrew

1994-01-01

It has long been the desire of managers to be able to look into the future and predict the outcome of decisions. With the advent of computer simulation and the tremendous capability provided by personal computers, that desire can now be realized. This paper presents an overview of computer simulation and modeling, and discusses the capabilities of Extend. Extend is an icon-driven Macintosh-based software tool that brings the power of simulation to the average computer user. An example of an Extend-based model is presented in the form of the Space Transportation System (STS) Processing Model. The STS Processing Model produces eight shuttle launches per year, yet it takes only about ten minutes to run. In addition, statistical data such as facility utilization, wait times, and processing bottlenecks are produced. The addition or deletion of resources, such as orbiters or facilities, can be easily modeled and their impact analyzed. Through the use of computer simulation, it is possible to look into the future to see the impact of today's decisions.

7. Enabling Computational Technologies for Terascale Scientific Simulations

SciTech Connect

Ashby, S.F.

2000-08-24

We develop scalable algorithms and object-oriented code frameworks for terascale scientific simulations on massively parallel processors (MPPs). Our research in multigrid-based linear solvers and adaptive mesh refinement enables Laboratory programs to use MPPs to explore important physical phenomena. For example, our research aids stockpile stewardship by making practical detailed 3D simulations of radiation transport. The need to solve large linear systems arises in many applications, including radiation transport, structural dynamics, combustion, and flow in porous media. These systems result from discretizations of partial differential equations on computational meshes. Our first research objective is to develop multigrid preconditioned iterative methods for such problems and to demonstrate their scalability on MPPs. Scalability describes how total computational work grows with problem size; it measures how effectively additional resources can help solve increasingly larger problems. Many factors contribute to scalability: computer architecture, parallel implementation, and choice of algorithm. Scalable algorithms have been shown to decrease simulation times by several orders of magnitude.
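The multigrid idea behind these scalable solvers can be sketched at its smallest scale: a two-grid cycle for the 1D Poisson equation. This is an illustrative toy (weighted Jacobi smoothing, full-weighting restriction, linear interpolation, direct coarse solve), not the Laboratory's production preconditioners:

```python
import numpy as np

def poisson_matrix(n, h):
    """1D discrete Laplacian (-u'') with zero boundary values."""
    return (np.diag(np.full(n, 2.0)) - np.diag(np.ones(n - 1), 1)
            - np.diag(np.ones(n - 1), -1)) / h**2

def weighted_jacobi(A, u, f, sweeps, omega=2/3):
    D = np.diag(A)
    for _ in range(sweeps):
        u = u + omega * (f - A @ u) / D
    return u

def two_grid(A, u, f, h):
    """One two-grid cycle: pre-smooth, restrict the residual, solve the
    coarse problem directly, interpolate the correction, post-smooth.
    Real codes recurse to many levels and run on MPPs."""
    n = len(u)
    u = weighted_jacobi(A, u, f, 3)
    r = f - A @ u
    nc = (n - 1) // 2
    # full-weighting restriction onto every other interior point
    rc = np.array([0.25 * r[2*j] + 0.5 * r[2*j+1] + 0.25 * r[2*j+2]
                   for j in range(nc)])
    ec = np.linalg.solve(poisson_matrix(nc, 2 * h), rc)
    # linear interpolation of the coarse correction back to the fine grid
    e = np.zeros(n)
    for j in range(nc):
        e[2*j + 1] += ec[j]
        e[2*j] += 0.5 * ec[j]
        e[2*j + 2] += 0.5 * ec[j]
    return weighted_jacobi(A, u + e, f, 3)

n, h = 63, 1.0 / 64
A = poisson_matrix(n, h)
f = np.ones(n)
u_exact = np.linalg.solve(A, f)
u = np.zeros(n)
for _ in range(5):
    u = two_grid(A, u, f, h)
```

The error contracts by a mesh-independent factor per cycle, which is the property that makes the work grow only linearly with problem size, i.e., the scalability discussed above.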

8. Using Computational Simulations to Confront Students' Mental Models

ERIC Educational Resources Information Center

Rodrigues, R.; Carvalho, P. Simeão

2014-01-01

In this paper we show an example of how to use a computational simulation to obtain visual feedback for students' mental models, and compare their predictions with the simulated system's behaviour. Additionally, we use the computational simulation to incrementally modify the students' mental models in order to accommodate new data,…

9. Computer simulation of martensitic transformations

SciTech Connect

Xu, Ping

1993-11-01

The characteristics of martensitic transformations in solids are largely determined by the elastic strain that develops as martensite particles grow and interact. To study the development of microstructure, a finite-element computer simulation model was constructed to mimic the transformation process. The transformation is athermal and simulated at each incremental step by transforming the cell which maximizes the decrease in the free energy. To determine the free energy change, the elastic energy developed during martensite growth is calculated from the theory of linear elasticity for elastically homogeneous media, and updated as the transformation proceeds.

10. Biomes computed from simulated climatologies

Claussen, Martin; Esch, Monika

1994-01-01

The biome model of Prentice et al. (1992a) is used to predict global patterns of potential natural plant formations, or biomes, from climatologies simulated by ECHAM, a model used for climate simulations at the Max-Planck-Institut fur Meteorologie. This study is undertaken in order to show the advantage of this biome model in diagnosing the performance of a climate model and assessing effects of past and future climate changes predicted by a climate model. Good overall agreement is found between global patterns of biomes computed from observed and simulated data of present climate. But there are also major discrepancies indicated by a difference in biomes in Australia, in the Kalahari Desert, and in the Middle West of North America. These discrepancies can be traced back to failures in simulated rainfall as well as summer or winter temperatures. Global patterns of biomes computed from an ice age simulation reveal that North America, Europe, and Siberia should have been covered largely by tundra and taiga, whereas only small differences are seen for the tropical rain forests. A potential northeast shift of biomes is expected from a simulation with enhanced CO2 concentration according to the IPCC Scenario A. Little change is seen in the tropical rain forest and the Sahara. Since the biome model used is not capable of predicting changes in vegetation patterns due to a rapid climate change, the latter simulation has to be taken as a prediction of changes in conditions favourable for the existence of certain biomes, not as a prediction of a future distribution of biomes.
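The rule-based flavour of such biome models can be conveyed in a few lines. The thresholds below are invented for illustration; the actual BIOME model assigns plant functional types from bioclimatic constraints (growing degree-days, coldest-month temperature, moisture) and derives biomes from their combinations:

```python
def classify_biome(gdd, t_cold, precip):
    """Toy rule-based biome assignment from bioclimatic variables
    (illustrative thresholds only, far simpler than Prentice et al.).
    gdd: growing degree-days; t_cold: coldest-month mean (deg C);
    precip: annual precipitation (mm)."""
    if precip < 200:
        return "desert"
    if gdd < 350:                    # season too short for trees
        return "tundra"
    if t_cold < -15.0:               # cold winters: boreal forest
        return "taiga"
    if t_cold > 15.0 and precip > 1500:
        return "tropical rain forest"
    if precip < 600:
        return "grassland"
    return "temperate forest"

biomes = [classify_biome(*c) for c in
          [(5000, 22.0, 2500),   # warm and wet
           (300, -25.0, 300),    # cold, very short season
           (1500, -20.0, 500)]]  # cold winters, moderate season
```

Applying such rules gridpoint by gridpoint to observed and simulated climatologies, and differencing the two maps, is precisely the diagnostic used in the study.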

11. Biomes computed from simulated climatologies

SciTech Connect

Claussen, M.; Esch, M.

1994-01-01

The biome model of Prentice et al. is used to predict global patterns of potential natural plant formations, or biomes, from climatologies simulated by ECHAM, a model used for climate simulations at the Max-Planck-Institut fuer Meteorologie. This study is undertaken in order to show the advantage of this biome model in diagnosing the performance of a climate model and assessing effects of past and future climate changes predicted by a climate model. Good overall agreement is found between global patterns of biomes computed from observed and simulated data of present climate. But there are also major discrepancies indicated by a difference in biomes in Australia, in the Kalahari Desert, and in the Middle West of North America. These discrepancies can be traced back to failures in simulated rainfall as well as summer or winter temperatures. Global patterns of biomes computed from an ice age simulation reveal that North America, Europe, and Siberia should have been covered largely by tundra and taiga, whereas only small differences are seen for the tropical rain forests. A potential northeast shift of biomes is expected from a simulation with enhanced CO2 concentration according to the IPCC Scenario A. Little change is seen in the tropical rain forest and the Sahara. Since the biome model used is not capable of predicting changes in vegetation patterns due to a rapid climate change, the latter simulation has to be taken as a prediction of changes in conditions favourable for the existence of certain biomes, not as a prediction of a future distribution of biomes. 15 refs., 8 figs., 2 tabs.

12. Proceedings of the 1990 Summer computer simulation conference

SciTech Connect

Svrcek, B.; McRae, J.

1990-01-01

This book covers simulation methodologies, computer systems and applications that will serve the simulation practitioner for the next decade. Specifically, the simulation applications range from Computer-Integrated-Manufacturing, Computer-Aided-Design, Radar and Communications, Transportation, Biomedical, Energy and the Environment, Government/Management and Social Sciences, and Training Simulators to Aerospace, Missiles and SDI. Additionally, new approaches to simulation are offered by neural networks, expert systems and parallel processing. Two applications deal with these new approaches: Intelligent Simulation Environments, and Advanced Information Processing and Simulation.

13. Inversion based on computational simulations

SciTech Connect

Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

1998-09-01

A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling, including the optimization of models to achieve a desired design goal.
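The adjoint-differentiation idea in this record can be sketched with a toy linear forward model, where the adjoint sweep is simply the transpose of the forward operator. The operator, sizes, and step size below are illustrative assumptions, not taken from the cited work:

```python
import numpy as np

# Toy inversion: recover a profile mu from smoothed "measurements".
# The forward model A (a stand-in for a full diffusion simulation) is
# a local averaging operator; its adjoint is simply A.T.
n = 50
A = np.zeros((n, n))
for i in range(n):
    for j in range(max(0, i - 2), min(n, i + 3)):
        A[i, j] = 1.0
A /= A.sum(axis=1, keepdims=True)

mu_true = np.exp(-((np.arange(n) - 25.0) ** 2) / 30.0)
data = A @ mu_true

# Objective phi(mu) = 0.5 * ||A mu - data||^2.  Adjoint differentiation
# gives grad phi = A.T @ (A mu - data): one forward run plus one
# adjoint run, no matter how many parameters mu has.
mu = np.zeros(n)
for _ in range(500):
    residual = A @ mu - data
    grad = A.T @ residual      # the adjoint ("backward") sweep
    mu -= 1.0 * grad           # fixed-step gradient descent
```

On a smooth target the averaging operator is well conditioned at low frequencies, so plain gradient descent converges quickly; a real inversion would use the advanced gradient-based optimizers the abstract mentions.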

14. Computer simulation in mechanical spectroscopy

Blanter, M. S.

2012-09-01

Several examples are given of the use of computer simulation in mechanical spectroscopy. On the one hand, simulation makes it possible to study relaxation mechanisms; on the other, it makes it possible to exploit the colossal accumulation of experimental material for the study of metals and alloys. The following examples are considered: the effect of Al atom ordering on the Snoek carbon peak in alloys of the system Fe - Al - C; the effect of plastic strain on Finkel'shtein - Rozin relaxation in Fe - Ni - C austenitic steel; and checking the adequacy of interaction energies of interstitial atoms, calculated on the basis of a first-principles model, by simulation of the concentration dependence of Snoek relaxation parameters in Nb - O.

15. Displaying Computer Simulations Of Physical Phenomena

NASA Technical Reports Server (NTRS)

Watson, Val

1991-01-01

Paper discusses computer simulation as means of experiencing and learning to understand physical phenomena. Covers both present simulation capabilities and major advances expected in near future. Visual, aural, tactile, and kinesthetic effects used to teach such physical sciences as dynamics of fluids. Recommends classrooms in universities, government, and industry be linked to advanced computing centers so computer simulations integrated into education process.

16. Computer Simulation of Martensitic Transformations.

Rifkin, Jonathan A.

This investigation attempted to determine the mechanism of martensitic nucleation by employing computer molecular dynamics; simulations were conducted of various lattice defects to see if they can serve as nucleation sites. As a prerequisite to the simulations, the relation between transformation properties and interatomic potential was studied. It was found that the interatomic potential must have specific properties to successfully simulate solid-solid transformations; in particular, it needs a long range oscillating tail. We also studied homogeneous transformations between BCC and FCC structures and concluded that it is unlikely that any has a lower energy barrier than the Bain transformation. A two dimensional solid was modelled first to gain experience on a relatively simple system; the transformation was from a square lattice to a triangular one. Next a three dimensional system was studied whose interatomic potential was chosen to mimic sodium. Because of the low transition temperature (18 K), the transformation from the low temperature phase to the high temperature phase was studied (FCC to BCC). The two dimensional system displayed many phenomena characteristic of real martensitic systems: defects promoted nucleation, the martensite grew in plates, some plates served to nucleate new plates (autocatalytic nucleation), and some defects gave rise to multiple plates (butterfly martensite). The three dimensional system did not undergo a permanent martensitic transformation, but it did show signs of temporary transformations in which some martensite formed and then dissipated. This happened following the dissociation of a screw dislocation into two partial dislocations.

17. Priority Queues for Computer Simulations

NASA Technical Reports Server (NTRS)

Steinman, Jeffrey S. (Inventor)

1998-01-01

The present invention is embodied in new priority queue data structures for event list management of computer simulations, and includes a new priority queue data structure and an improved event horizon applied to priority queue data structures. The new priority queue data structure is a Qheap and is made out of linked lists for robust, fast, reliable, and stable event list management; it uses a temporary unsorted list to store all items until one of the items is needed. The list is then sorted, the highest priority item is removed, and the rest of the list is inserted into the Qheap. Also, an event horizon is applied to binary tree and splay tree priority queue data structures to form the improved event horizon for event management.
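The defer-then-sort strategy described in this record can be sketched as follows. This is a hypothetical re-implementation for illustration only, using Python's `heapq` as a stand-in for the patented linked-list Qheap:

```python
import heapq  # stand-in for the patent's linked-list Qheap structure

class QheapLike:
    """Sketch of the event-list strategy described above: new events
    accumulate in an unsorted list; only when the next event is needed
    is that list sorted and folded into the main queue."""

    def __init__(self):
        self._sorted = []    # main priority structure
        self._pending = []   # temporary unsorted list

    def insert(self, timestamp, event):
        self._pending.append((timestamp, event))  # O(1), no sorting yet

    def pop_next(self):
        if self._pending:
            self._pending.sort()              # sort only on demand
            for item in self._pending:        # fold the rest into the queue
                heapq.heappush(self._sorted, item)
            self._pending.clear()
        return heapq.heappop(self._sorted)    # highest priority = earliest

q = QheapLike()
for t, e in [(5.0, "c"), (1.0, "a"), (3.0, "b")]:
    q.insert(t, e)
print(q.pop_next())  # -> (1.0, 'a')
```

Deferring the sort pays off in simulations where many events are scheduled between successive dequeues, which is the situation the event-horizon optimization targets.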

18. Computer Simulation for Emergency Incident Management

SciTech Connect

Brown, D L

2004-12-03

This report describes the findings and recommendations resulting from the Department of Homeland Security (DHS) Incident Management Simulation Workshop held by the DHS Advanced Scientific Computing Program in May 2004. This workshop brought senior representatives of the emergency response and incident-management communities together with modeling and simulation technologists from Department of Energy laboratories. The workshop provided an opportunity for incident responders to describe the nature and substance of the primary personnel roles in an incident response, to identify current and anticipated roles of modeling and simulation in support of incident response, and to begin a dialog between the incident response and simulation technology communities that will guide and inform planned modeling and simulation development for incident response. This report provides a summary of the discussions at the workshop as well as a summary of simulation capabilities that are relevant to incident-management training, and recommendations for the use of simulation in both incident management and in incident management training, based on the discussions at the workshop. In addition, the report discusses areas where further research and development will be required to support future needs in this area.

19. Computer Simulation of Radial Immunodiffusion

PubMed Central

Trautman, Rodes

1972-01-01

Theories of diffusion with chemical reaction are reviewed as to their contributions toward developing an algorithm needed for computer simulation of immunodiffusion. The Spiers-Augustin moving sink and the Engelberg stationary sink theories show how the antibody-antigen reaction can be incorporated into boundary conditions of the free diffusion differential equations. For this, a stoichiometric precipitate was assumed and the location of precipitin lines could be predicted. The Hill simultaneous linear adsorption theory provides a mathematical device for including another special type of antibody-antigen reaction in antigen excess regions of the gel. It permits an explanation for the lowered antigen diffusion coefficient, observed in the Oudin arrangement of single linear diffusion, but does not enable prediction of the location of precipitin lines. The most promising mathematical approach for a general solution is implied in the Augustin alternating cycle theory. This assumes the immunodiffusion process can be evaluated by alternating computation cycles: free diffusion without chemical reaction and chemical reaction without diffusion. The algorithm for the free diffusion update cycle, extended to both linear and radial geometries, is given in detail since it was based on gross flow rather than more conventional expressions in terms of net flow. Limitations on the numerical integration process using this algorithm are illustrated for free diffusion from a cylindrical well. PMID:4629869
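The alternating-cycle idea described in this record can be sketched as an operator-splitting loop: each step applies free diffusion without chemical reaction, then an instantaneous precipitation without diffusion. The grid size, diffusion coefficients, and the 1:1 stoichiometric rule below are illustrative assumptions, not values from the paper:

```python
import numpy as np

n, dt, dx = 100, 0.1, 1.0
D_ag, D_ab = 1.0, 0.5                 # diffusion coefficients (illustrative)
ag = np.zeros(n); ag[:5] = 10.0       # antigen loaded in the well
ab = np.full(n, 1.0)                  # antibody uniform in the gel
ppt = np.zeros(n)                     # immobile precipitate

def diffuse(c, D):
    # explicit finite-difference free-diffusion update (no reaction)
    lap = np.roll(c, 1) - 2 * c + np.roll(c, -1)
    lap[0] = c[1] - c[0]; lap[-1] = c[-2] - c[-1]   # no-flux ends
    return c + D * dt / dx**2 * lap

for _ in range(2000):
    ag, ab = diffuse(ag, D_ag), diffuse(ab, D_ab)   # diffusion half-cycle
    react = np.minimum(ag, ab)                      # reaction half-cycle:
    ag -= react; ab -= react; ppt += react          # 1:1 stoichiometric sink
```

The diffusion update conserves mass (no-flux boundaries), and the reaction half-cycle only transfers material into the precipitate, so the `ppt` profile that builds up marks where a precipitin line would form.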

20. Computer simulation studies of minerals

Oganov, Artem Romaevich

Applications of state-of-the-art computer simulations to important Earth- and rock-forming minerals (Al2SiO5 polymorphs, albite (NaAlSi3O8), and MgSiO3 perovskite) are described. Detailed introductions to equations of state and elasticity, phase transitions, computer simulations, and geophysical background are given. A new general classification of phase transitions is proposed, providing a natural framework for discussion of structural, thermodynamic, and kinetic aspects of phase transitions. The concept of critical bond distances is introduced. For Si-O bonds this critical distance is 2.25 A. Using atomistic simulations, anomalous Al-Si antiordering in albite is explained. A first-order isosymmetric transition associated with a change in the ordering scheme is predicted at high pressures. A quantum-mechanical study is presented for the Al2SiO5 polymorphs: kyanite, andalusite, sillimanite, and hypothetical pseudobrookite-like and V3O5-like phases (the latter phase was believed to be the main Al mineral of the lower mantle). It is shown that above 11 GPa all the Al2SiO5 phases break down into the mixture of oxides: corundum (Al2O3) and stishovite (SiO2). Atomisation energies, crystal structures and equations of state of all the Al2SiO5 polymorphs, corundum, stishovite, quartz (SiO2) have been determined. Metastable pressure-induced transitions in sillimanite and andalusite are predicted at ~30-50 GPa and analysed in terms of structural changes and lattice dynamics. Sillimanite (Pbnm) transforms into incommensurate and isosymmetric (Pbnm) phases; andalusite undergoes pressure-induced amorphisation. Accurate quantum-mechanical thermal equation of state is obtained for MgSiO3 perovskite, the main Earth-forming mineral. Results imply that a pure-perovskite mantle is unlikely. I show that MgSiO3 perovskite is not a Debye-like solid, contrary to a common assumption. First ever ab initio molecular dynamics calculations of elastic constants at finite temperatures are

1. Computer-Graphical Simulation Of Robotic Welding

NASA Technical Reports Server (NTRS)

Fernandez, Ken; Cook, George

1988-01-01

Computer program ROBOSIM, developed to simulate operations of robots, applied to preliminary design of robotic arc-welding operation. Limitations on equipment investigated in advance to prevent expensive mistakes. Computer makes drawing of robotic welder and workpiece on positioning table. Such numerical simulation used to perform rapid, safe experiments in computer-aided design or manufacturing.

2. Proceedings of the 1991 summer computer simulation conference

SciTech Connect

Pace, D.

1991-01-01

This book covers the following topics in computer simulation: validation, languages, algorithms, computer performance and advanced processing, intelligent simulation, simulations in power and propulsion systems, and biomedical simulations.

3. The Shuttle Mission Simulator computer generated imagery

NASA Technical Reports Server (NTRS)

Henderson, T. H.

1984-01-01

Equipment available in the primary training facility for the Space Transportation System (STS) flight crews includes the Fixed Base Simulator, the Motion Base Simulator, the Spacelab Simulator, and the Guidance and Navigation Simulator. The Shuttle Mission Simulator (SMS) consists of the Fixed Base Simulator and the Motion Base Simulator. The SMS utilizes four visual Computer Generated Image (CGI) systems. The Motion Base Simulator has a forward crew station with six-degrees of freedom motion simulation. Operation of the Spacelab Simulator is planned for the spring of 1983. The Guidance and Navigation Simulator went into operation in 1982. Aspects of orbital visual simulation are discussed, taking into account the earth scene, payload simulation, the generation and display of 1079 stars, the simulation of sun glare, and Reaction Control System jet firing plumes. Attention is also given to landing site visual simulation, and night launch and landing simulation.

4. Development of simulation computer complex specification

NASA Technical Reports Server (NTRS)

1973-01-01

The Training Simulation Computer Complex Study was one of three studies contracted in support of preparations for procurement of a shuttle mission simulator for shuttle crew training. The subject study was concerned with definition of the software loads to be imposed on the computer complex to be associated with the shuttle mission simulator and the development of procurement specifications based on the resulting computer requirements. These procurement specifications cover the computer hardware and system software as well as the data conversion equipment required to interface the computer to the simulator hardware. The development of the necessary hardware and software specifications required the execution of a number of related tasks which included: (1) simulation software sizing, (2) computer requirements definition, (3) data conversion equipment requirements definition, (4) system software requirements definition, (5) a simulation management plan, (6) a background survey, and (7) preparation of the specifications.

5. Protocols for Handling Messages Between Simulation Computers

NASA Technical Reports Server (NTRS)

Balcerowski, John P.; Dunnam, Milton

2006-01-01

Practical Simulator Network (PSimNet) is a set of data-communication protocols designed especially for use in handling messages between computers that are engaging cooperatively in real-time or nearly-real-time training simulations. In a typical application, computers that provide individualized training at widely dispersed locations would communicate, by use of PSimNet, with a central host computer that would provide a common computational- simulation environment and common data. Originally intended for use in supporting interfaces between training computers and computers that simulate the responses of spacecraft scientific payloads, PSimNet could be especially well suited for a variety of other applications -- for example, group automobile-driver training in a classroom. Another potential application might lie in networking of automobile-diagnostic computers at repair facilities to a central computer that would compile the expertise of numerous technicians and engineers and act as an expert consulting technician.

6. Simulating Drosophila Genetics with the Computer.

ERIC Educational Resources Information Center

Small, James W., Jr.; Edwards, Kathryn L.

1979-01-01

Presents some techniques developed to help improve student understanding of Mendelian principles through the use of a computer simulation model by the genetic system of the fruit fly. Includes discussion and evaluation of this computer assisted program. (MA)

7. Monte Carlo Computer Simulation of a Rainbow.

ERIC Educational Resources Information Center

Olson, Donald; And Others

1990-01-01

Discusses making a computer-simulated rainbow using principles of physics, such as reflection and refraction. Provides BASIC program for the simulation. Appends a program illustrating the effects of dispersion of the colors. (YP)

8. Computer Based Simulation of Laboratory Experiments.

ERIC Educational Resources Information Center

Edward, Norrie S.

1997-01-01

Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…

9. Parallel Proximity Detection for Computer Simulations

NASA Technical Reports Server (NTRS)

Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)

1998-01-01

The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.
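The grid check-in scheme described in this record can be sketched as follows. This is an illustrative serial re-implementation, not the patented parallel system: movers and sensor coverages register in coarse grid cells, and exact range tests are needed only for pairs that share a cell.

```python
from collections import defaultdict
from math import hypot

CELL = 10.0  # grid resolution; a fuzzy margin could pad the coverage

def cell_of(x, y):
    return (int(x // CELL), int(y // CELL))

def cells_covering(x, y, r):
    # all cells touched by a sensor disc of radius r
    x0, y0 = cell_of(x - r, y - r)
    x1, y1 = cell_of(x + r, y + r)
    return [(i, j) for i in range(x0, x1 + 1) for j in range(y0, y1 + 1)]

def detections(movers, sensors):
    grid = defaultdict(list)
    for m_id, (mx, my) in movers.items():
        grid[cell_of(mx, my)].append(m_id)           # mover checks in
    hits = set()
    for s_id, (sx, sy, r) in sensors.items():
        for c in cells_covering(sx, sy, r):          # coverage checks in
            for m_id in grid.get(c, []):
                mx, my = movers[m_id]
                if hypot(mx - sx, my - sy) <= r:     # exact test, few pairs
                    hits.add((s_id, m_id))
    return hits

movers = {"m1": (12.0, 12.0), "m2": (95.0, 95.0)}
sensors = {"s1": (10.0, 10.0, 5.0)}
print(detections(movers, sensors))  # -> {('s1', 'm1')}
```

The fuzzy-resolution idea in the patent goes further: by padding check-in regions, objects need not compute exact grid-crossing times as they move, at the cost of a few extra candidate pairs.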

10. Parallel Proximity Detection for Computer Simulation

NASA Technical Reports Server (NTRS)

Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)

1997-01-01

The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.

11. Computer Simulation of Colliding Galaxies

NASA Video Gallery

Simulation of the formation of the galaxy known as "The Mice." The simulation depicts the merger of two spiral galaxies, pausing and rotating at the stage resembling the Hubble Space Telescope Adva...

12. Computer Simulation in Undergraduate Instruction: A Symposium.

ERIC Educational Resources Information Center

Street, Warren R.; And Others

These symposium papers discuss the instructional use of computers in psychology, with emphasis on computer-produced simulations. The first, by Rich Edwards, briefly outlines LABSIM, a general purpose system of FORTRAN programs which simulate data collection in more than a dozen experimental models in psychology and are designed to train students…

13. A computer simulation of chromosomal instability

Goodwin, E.; Cornforth, M.

The transformation of a normal cell into a cancerous growth can be described as a process of mutation and selection occurring within the context of clonal expansion. Radiation, in addition to initial DNA damage, induces a persistent and still poorly understood genomic instability process that contributes to the mutational burden. It will be essential to include a quantitative description of this phenomenon in any attempt at science-based risk assessment. Monte Carlo computer simulations are a relatively simple way to model processes that are characterized by an element of randomness. A properly constructed simulation can capture the essence of a phenomenon that, as is often the case in biology, can be extraordinarily complex, and can do so even though the phenomenon itself is incompletely understood. A simple computer simulation of one manifestation of genomic instability known as chromosomal instability will be presented. The model simulates clonal expansion of a single chromosomally unstable cell into a colony. Instability is characterized by a single parameter, the rate of chromosomal rearrangement. With each new chromosome aberration, a unique subclone arises (subclones are defined as having a unique karyotype). The subclone initially has just one cell, but it can expand with cell division if the aberration is not lethal. The computer program automatically keeps track of the number of subclones within the expanding colony, and the number of cells within each subclone. Because chromosome aberrations kill some cells during colony growth, colonies arising from unstable cells tend to be smaller than those arising from stable cells. For any chosen level of instability, the computer program calculates the mean number of cells per colony averaged over many runs. This output should prove useful for investigating how such radiobiological phenomena as slow growth colonies, increased doubling time, and delayed cell death depend on chromosomal instability. Also of
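The clonal-expansion model described in this record can be sketched in a few lines. The rearrangement rate, lethality probability, and number of divisions below are illustrative assumptions, not parameters from the cited work:

```python
import random

# Each cell carries a subclone id; a new aberration at division either
# kills the daughter or founds a new subclone with a unique id (i.e. a
# unique karyotype).
def grow_colony(rate, divisions=10, p_lethal=0.5, seed=1):
    rng = random.Random(seed)
    cells = [0]          # start from a single founder cell, subclone 0
    next_id = 1
    for _ in range(divisions):
        daughters = []
        for clone in cells:
            for _ in range(2):                  # two daughters per cell
                if rng.random() < rate:         # chromosomal rearrangement
                    if rng.random() < p_lethal:
                        continue                # lethal aberration: cell dies
                    daughters.append(next_id)   # viable new subclone
                    next_id += 1
                else:
                    daughters.append(clone)     # inherits parent karyotype
        cells = daughters
    return cells

stable = grow_colony(rate=0.0)
unstable = grow_colony(rate=0.2)
# aberrations kill some daughters, so unstable colonies tend to be smaller
```

Averaging `len(grow_colony(...))` over many seeds reproduces the mean-colony-size output the abstract describes.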

14. Multiscale Computer Simulation of Failure in Aerogels

NASA Technical Reports Server (NTRS)

Good, Brian S.

2008-01-01

Aerogels have been of interest to the aerospace community primarily for their thermal properties, notably their low thermal conductivities. While such gels are typically fragile, recent advances in the application of conformal polymer layers to these gels has made them potentially useful as lightweight structural materials as well. We have previously performed computer simulations of aerogel thermal conductivity and tensile and compressive failure, with results that are in qualitative, and sometimes quantitative, agreement with experiment. However, recent experiments in our laboratory suggest that gels having similar densities may exhibit substantially different properties. In this work, we extend our original diffusion limited cluster aggregation (DLCA) model for gel structure to incorporate additional variation in DLCA simulation parameters, with the aim of producing DLCA clusters of similar densities that nevertheless have different fractal dimension and secondary particle coordination. We perform particle statics simulations of gel strain on these clusters, and consider the effects of differing DLCA simulation conditions, and the resultant differences in fractal dimension and coordination, on gel strain properties.

15. Computationally Lightweight Air-Traffic-Control Simulation

NASA Technical Reports Server (NTRS)

Knight, Russell

2005-01-01

An algorithm for computationally lightweight simulation of automated air traffic control (ATC) at a busy airport has been derived. The algorithm is expected to serve as the basis for development of software that would be incorporated into flight-simulator software, the ATC component of which is not yet capable of handling realistic airport loads. Software based on this algorithm could also be incorporated into other computer programs that simulate a variety of scenarios for purposes of training or amusement.

16. Computer Simulation of the Neuronal Action Potential.

ERIC Educational Resources Information Center

Solomon, Paul R.; And Others

1988-01-01

A series of computer simulations of the neuronal resting and action potentials are described. Discusses the use of simulations to overcome the difficulties of traditional instruction, such as blackboard illustration, which can only illustrate these events at one point in time. Describes systems requirements necessary to run the simulations.…

17. Computer Clinical Simulations in Health Sciences.

ERIC Educational Resources Information Center

Jones, Gary L; Keith, Kenneth D.

1983-01-01

Discusses the key characteristics of clinical simulation, some developmental foundations, two current research studies, and some implications for the future of health science education. Investigations of the effects of computer-based simulation indicate that acquisition of decision-making skills is greater than with noncomputerized simulations.…

18. Computed Tomography Inspection and Analysis for Additive Manufacturing Components

NASA Technical Reports Server (NTRS)

Beshears, Ronald D.

2016-01-01

Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws were inspected using a 2MeV linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on inspectability of objects with complex geometries.

19. FEL Simulation Using Distributed Computing

SciTech Connect

Einstein, Joshua; Bernabeu Altayo, Gerard; Biedron, Sandra; Freund, Henry; Milton, Stephen; van der Slot, Peter

2016-06-01

While simulation tools are available and have been used regularly for simulating light sources, the increasing availability and lower cost of GPU-based processing opens up new opportunities. This poster highlights a method of accelerating and parallelizing code processing through the use of COTS software interfaces.

20. Filtration theory using computer simulations

SciTech Connect

Bergman, W.; Corey, I.

1997-08-01

We have used commercially available fluid dynamics codes based on Navier-Stokes theory and the Langevin particle equation of motion to compute the particle capture efficiency and pressure drop through selected two- and three-dimensional fiber arrays. The approach we used was to first compute the air velocity vector field throughout a defined region containing the fiber matrix. The particle capture in the fiber matrix is then computed by superimposing the Langevin particle equation of motion over the flow velocity field. Using the Langevin equation combines the particle Brownian motion, inertia and interception mechanisms in a single equation. In contrast, most previous investigations treat the different capture mechanisms separately. We have computed the particle capture efficiency and the pressure drop through one 2-D and two 3-D fiber matrix elements. 5 refs., 11 figs.
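The Langevin tracking step this record describes can be sketched as follows. The flow field, relaxation time, and diffusion coefficient are illustrative stand-ins for a CFD velocity field and real particle properties; a capture test against fiber surfaces (interception) would be added inside the loop:

```python
import numpy as np

rng = np.random.default_rng(42)
tau = 1e-4                       # particle relaxation time, s (drag/inertia)
D = 1e-9                         # Brownian diffusion coefficient, m^2/s
dt = 1e-5                        # time step, s (dt < tau for stability)
sigma = np.sqrt(2.0 * D) / tau   # velocity-noise strength yielding diffusivity D

def air_velocity(x):
    # uniform flow as a surrogate for the precomputed CFD velocity field
    return np.array([1e-2, 0.0])

x = np.zeros(2)                  # particle position
v = np.zeros(2)                  # particle velocity
for _ in range(1000):
    v += (air_velocity(x) - v) / tau * dt              # Stokes drag toward flow
    v += sigma * np.sqrt(dt) * rng.standard_normal(2)  # Brownian forcing
    x += v * dt
```

The single update combines inertia (drag relaxation toward the local air velocity) and Brownian motion in one equation, which is the point the abstract makes about treating the capture mechanisms together rather than separately.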

1. Evaluation of Visual Computer Simulator for Computer Architecture Education

ERIC Educational Resources Information Center

Imai, Yoshiro; Imai, Masatoshi; Moritoh, Yoshio

2013-01-01

This paper presents trial evaluation of a visual computer simulator in 2009-2011, which has been developed to play some roles of both instruction facility and learning tool simultaneously. And it illustrates an example of Computer Architecture education for University students and usage of e-Learning tool for Assembly Programming in order to…

2. Computational Spectrum of Agent Model Simulation

SciTech Connect

Perumalla, Kalyan S

2010-01-01

The study of human social behavioral systems is finding renewed interest in military, homeland security and other applications. Simulation is the most generally applied approach to studying complex scenarios in such systems. Here, we outline some of the important considerations that underlie the computational aspects of simulation-based study of human social systems. The fundamental imprecision underlying questions and answers in social science makes it necessary to carefully distinguish among different simulation problem classes and to identify the most pertinent set of computational dimensions associated with those classes. We identify a few such classes and present their computational implications. The focus is then shifted to the most challenging combinations in the computational spectrum, namely, large-scale entity counts at moderate to high levels of fidelity. Recent developments in furthering the state-of-the-art in these challenging cases are outlined. A case study of large-scale agent simulation is provided in simulating large numbers (millions) of social entities at real-time speeds on inexpensive hardware. Recent computational results are identified that highlight the potential of modern high-end computing platforms to push the envelope with respect to speed, scale and fidelity of social system simulations. Finally, the problem of shielding the modeler or domain expert from the complex computational aspects is discussed and a few potential solution approaches are identified.

3. Computer simulation of upset welding

SciTech Connect

Spingarn, J R; Mason, W E; Swearengen, J C

1982-04-01

Useful process modeling of upset welding requires contributions from metallurgy, welding engineering, thermal analysis and experimental mechanics. In this report, the significant milestones for such an effort are outlined and probable difficult areas are pointed out. Progress to date is summarized and directions for future research are offered. With regard to the computational aspects of this problem, a 2-D heat conduction computer code has been modified to incorporate electrical heating, and computations have been run for an axisymmetric problem with simple viscous material laws and d.c. electrical boundary conditions. In the experimental endeavor, the boundary conditions have been measured during the welding process, although interpretation of voltage drop measurements is not straightforward. The ranges of strain, strain rate and temperature encountered during upset welding have been measured or calculated, and the need for a unifying constitutive law is described. Finally, the possible complications of microstructure and interfaces are clarified.

4. Teaching by Simulation with Personal Computers.

ERIC Educational Resources Information Center

Randall, James E.

1978-01-01

Describes the use of a small digital computer to simulate a peripheral nerve demonstration in which the action potential responses to pairs of stimuli are used to illustrate the properties of excitable membranes. (Author/MA)

5. Augmented Reality Simulations on Handheld Computers

ERIC Educational Resources Information Center

Squire, Kurt; Klopfer, Eric

2007-01-01

Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

6. Computer Simulation of Community Mental Health Centers.

ERIC Educational Resources Information Center

Cox, Gary B.; And Others

1985-01-01

Describes an ongoing research project designed to develop a computer model capable of simulating the service delivery activities of community mental health care centers and human service agencies. The goal and methodology of the project are described. (NB)

7. Computer Simulation of NMR Spectra.

ERIC Educational Resources Information Center

Ellison, A.

1983-01-01

Describes a PASCAL computer program which provides interactive analysis and display of high-resolution nuclear magnetic resonance (NMR) spectra from spin one-half nuclei using a hard-copy or monitor. Includes general and theoretical program descriptions, program capability, and examples of its use. (Source for program/documentation is included.)…

8. The addition of computer simulated noise to investigate radiation dose and image quality in images with spatial correlation of statistical noise: an example application to X-ray CT of the brain.

PubMed

Britten, A J; Crotty, M; Kiremidjian, H; Grundy, A; Adam, E J

2004-04-01

This study validates a method to add spatially correlated statistical noise to an image, applied to transaxial X-ray CT images of the head to simulate exposure reduction by up to 50%. 23 patients undergoing routine head CT had three additional slices acquired for validation purposes, two at the same clinical 420 mAs exposure and one at 300 mAs. Images at the level of the cerebrospinal fluid filled ventricles gave readings of noise from a single image, with subtraction of image pairs to obtain noise readings from non-uniform tissue regions. The spatial correlation of the noise was determined and added to the acquired 420 mAs image to simulate images at 340 mAs, 300 mAs, 260 mAs and 210 mAs. Two radiologists assessed the images, finding little difference between the 300 mAs simulated and acquired images. The presence of periventricular low density lesions (PVLD) was used as an example of the effect of simulated dose reduction on diagnostic accuracy, and visualization of the internal capsule was used as a measure of image quality. Diagnostic accuracy for the diagnosis of PVLD did not fall significantly even down to 210 mAs, though visualization of the internal capsule was poorer at lower exposure. Further work is needed to investigate means of measuring statistical noise without the need for uniform tissue areas, or image pairs. This technique has been shown to allow sufficiently accurate simulation of dose reduction and image quality degradation, even when the statistical noise is spatially correlated.
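
Since CT quantum noise variance scales inversely with mAs, the core idea of simulating a lower exposure can be sketched by adding zero-mean noise whose variance makes up the difference. The sketch below uses white Gaussian noise for brevity; the study's contribution is precisely that it additionally shapes the noise to match the measured spatial correlation. The function name and `sigma0` are illustrative, not from the paper.

```python
import numpy as np

def simulate_dose_reduction(image, sigma0, mas_acquired, mas_target, rng=None):
    """Add zero-mean Gaussian noise so an image acquired at mas_acquired
    appears as if acquired at the lower mas_target.  Noise variance scales
    as 1/mAs, and independent variances add, so:
        sigma_add^2 = sigma0^2 * (mas_acquired / mas_target - 1)
    White noise only -- the paper additionally applies the measured
    spatial correlation to the added noise."""
    rng = np.random.default_rng() if rng is None else rng
    sigma_add = sigma0 * np.sqrt(mas_acquired / mas_target - 1.0)
    return image + rng.normal(0.0, sigma_add, image.shape)

# Halving the exposure (420 -> 210 mAs) doubles the noise variance,
# so the added noise has the same sigma as the original image noise.
low_dose = simulate_dose_reduction(np.zeros((64, 64)), sigma0=5.0,
                                   mas_acquired=420, mas_target=210)
```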

9. Psychology on Computers: Simulations, Experiments and Projects.

ERIC Educational Resources Information Center

Belcher, Duane M.; Smith, Stephen D.

PSYCOM is a unique mixed media package which combines high interest projects on the computer with a written text of expository material. It goes beyond most computer-assisted instruction which emphasizes drill and practice and testing of knowledge. A project might consist of a simulation or an actual experiment, or it might be a demonstration, a…

10. Computer-simulated phacoemulsification improvements

Soederberg, Per G.; Laurell, Carl-Gustaf; Artzen, D.; Nordh, Leif; Skarman, Eva; Nordqvist, P.; Andersson, Mats

2002-06-01

A simulator for phacoemulsification cataract extraction is developed. A three-dimensional visual interface and foot pedals for phacoemulsification power, x-y positioning, zoom and focus were established. An algorithm that allows real-time visual feedback of the surgical field was developed. Cataract surgery is the most common surgical procedure. The operation requires input from both feet and both hands and provides visual feedback through the operation microscope, essentially without tactile feedback. Experience demonstrates that the number of complications for an experienced surgeon learning phacoemulsification decreases exponentially, reaching close to the asymptote after the first 500 procedures despite initial wet-lab training on animal eyes. Simulator training is anticipated to decrease training time, decrease the complication rate for the beginner and reduce expensive supervision by a high-volume surgeon.

11. [Animal experimentation, computer simulation and surgical research].

PubMed

Carpentier, Alain

2009-11-01

We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.

12. Criterion Standards for Evaluating Computer Simulation Courseware.

ERIC Educational Resources Information Center

Wholeben, Brent Edward

This paper explores the role of computerized simulations as a decision-modeling intervention strategy, and views the strategy's different attribute biases based upon the varying primary missions of instruction versus application. The common goals associated with computer simulations as a training technique are discussed and compared with goals of…

13. Simulations of Probabilities for Quantum Computing

NASA Technical Reports Server (NTRS)

Zak, M.

1996-01-01

It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without the use of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed.
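
The paper's specific construction couples chaos with non-Lipschitz dynamics; a much simpler illustration of the underlying idea, that deterministic chaos can reproduce coin-flip statistics without a random number generator, is the fully chaotic logistic map thresholded at 1/2. This is a stand-in sketch, not the paper's method.

```python
def chaotic_bits(x0=0.1234, n=100000):
    """Generate pseudo-random bits from the fully chaotic logistic map
    x -> 4x(1-x), thresholding at 1/2.  A deterministic stand-in for a
    coin flip -- illustrative only; the paper's construction uses
    non-Lipschitz dynamics rather than this map."""
    x, bits = x0, []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        bits.append(1 if x >= 0.5 else 0)
    return bits

# For typical seeds the bit frequency is close to 0.5, since the map's
# invariant density assigns equal probability to each half-interval.
freq = sum(chaotic_bits()) / 100000
```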

14. Salesperson Ethics: An Interactive Computer Simulation

ERIC Educational Resources Information Center

Castleberry, Stephen

2014-01-01

A new interactive computer simulation designed to teach sales ethics is described. Simulation learner objectives include gaining a better understanding of legal issues in selling; realizing that ethical dilemmas do arise in selling; realizing the need to be honest when selling; seeing that there are conflicting demands from a salesperson's…

15. Computer Simulation Of A Small Turboshaft Engine

NASA Technical Reports Server (NTRS)

Ballin, Mark G.

1991-01-01

Component-type mathematical model of small turboshaft engine developed for use in real-time computer simulations of dynamics of helicopter flight. Yields shaft speeds, torques, fuel-consumption rates, and other operating parameters with sufficient accuracy for use in real-time simulation of maneuvers involving large transients in power and/or severe accelerations.

16. Understanding Islamist political violence through computational social simulation

SciTech Connect

Watkins, Jennifer H; Mackerrow, Edward P; Patelli, Paolo G; Eberhardt, Ariane; Stradling, Seth G

2008-01-01

Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.

17. Biocellion: accelerating computer simulation of multicellular biological system models

PubMed Central

Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

2014-01-01

Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Availability and implementation: Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572

18. Computer simulation of gear tooth manufacturing processes

NASA Technical Reports Server (NTRS)

Mavriplis, Dimitri; Huston, Ronald L.

1990-01-01

The use of computer graphics to simulate gear tooth manufacturing procedures is discussed. An analytical basis for the simulation is established for spur gears. The simulation itself, however, is developed not only for spur gears, but for straight bevel gears as well. The applications of the developed procedure extend from the development of finite element models of heretofore intractable geometrical forms, to exploring the fabrication of nonstandard tooth forms.

19. Cluster computing software for GATE simulations.

PubMed

De Beenhouwer, Jan; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R

2007-06-01

Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied with an on-the-fly generated cluster specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values.
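
The job-splitting pattern that the paper automates for GATE can be sketched generically: divide the simulated events among jobs, give each job an independent seed, and merge results by summation. This is a schematic of the cluster-computing pattern only, not GATE's actual macro-rewriting mechanism; the function names are hypothetical.

```python
def split_simulation(total_events, n_jobs, base_seed=12345):
    """Partition a Monte Carlo run into independent cluster jobs: each
    job gets a near-equal share of the events and a distinct seed so the
    random streams are independent.  Generic sketch only -- GATE's
    splitter instead generates fully resolved macro files per job."""
    share, rem = divmod(total_events, n_jobs)
    jobs = []
    for i in range(n_jobs):
        jobs.append({"job_id": i,
                     "events": share + (1 if i < rem else 0),
                     "seed": base_seed + i})
    return jobs

def merge_outputs(partial_counts):
    """Merging independent-job output is a reduction; for histogrammed
    hit counts it is a plain sum."""
    return sum(partial_counts)

jobs = split_simulation(10**7, 3)
```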

20. Polymer Composites Corrosive Degradation: A Computational Simulation

NASA Technical Reports Server (NTRS)

Chamis, Christos C.; Minnetyan, Levon

2007-01-01

A computational simulation of polymer composites corrosive durability is presented. The corrosive environment is assumed to manage the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature and moisture which vary parabolically for voids and linearly for temperature and moisture through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined stress failure and laminate theories. This accounts for starting the simulation from constitutive material properties and up to the laminate scale which exposes the laminate to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply degradation degrades the laminate to the last one or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.

1. Computer Code for Nanostructure Simulation

NASA Technical Reports Server (NTRS)

Filikhin, Igor; Vlahovic, Branislav

2009-01-01

Due to their small size, nanostructures can have stress and thermal gradients that are larger than any macroscopic analogue. These gradients can lead to specific regions that are susceptible to failure via processes such as plastic deformation by dislocation emission, chemical debonding, and interfacial alloying. A program has been developed that rigorously simulates and predicts optoelectronic properties of nanostructures of virtually any geometrical complexity and material composition. It can be used in simulations of energy level structure, wave functions, density of states of spatially configured phonon-coupled electrons, excitons in quantum dots, quantum rings, quantum ring complexes, and more. The code can be used to calculate stress distributions and thermal transport properties for a variety of nanostructures and interfaces, transport and scattering at nanoscale interfaces and surfaces under various stress states, and alloy compositional gradients. The code allows users to perform modeling of charge transport processes through quantum-dot (QD) arrays as functions of inter-dot distance, array order versus disorder, QD orientation, shape, size, and chemical composition for applications in photovoltaics and physical properties of QD-based biochemical sensors. The code can be used to study the hot exciton formation/relaxation dynamics in arrays of QDs of different shapes and sizes at different temperatures. It also can be used to understand the relationship among the deposition parameters and inherent stresses, strain deformation, heat flow, and failure of nanostructures.

2. Computer simulation of bubble formation.

SciTech Connect

Insepov, Z.; Bazhirov, T.; Norman, G.; Stegailov, V.; Mathematics and Computer Science; Institute for High Energy Densities of Joint Institute for High Temperatures of RAS

2007-01-01

Properties of liquid metals (Li, Pb, Na) containing nanoscale cavities were studied by atomistic Molecular Dynamics (MD). Two atomistic models of cavity simulation were developed that cover a wide area in the phase diagram with negative pressure. In the first model, the thermodynamics of cavity formation, stability and the dynamics of cavity evolution in bulk liquid metals have been studied. Radial densities, pressures, surface tensions, and work functions of nano-scale cavities of various radii were calculated for liquid Li, Na, and Pb at various temperatures and densities, and at small negative pressures near the liquid-gas spinodal, and the work functions for cavity formation in liquid Li were compared with the available experimental data. The cavitation rate can further be obtained by using the classical nucleation theory (CNT). The second model is based on the stability study and on the kinetics of cavitation of the stretched liquid metals. An MD method was used to simulate cavitation in metastable Pb and Li melts and to determine the stability limits. States at temperatures below critical (T < 0.5Tc) and large negative pressures were considered. The kinetic boundary of liquid phase stability was shown to be different from the spinodal. The kinetics and dynamics of cavitation were studied. The pressure dependences of cavitation frequencies were obtained for several temperatures. The results of MD calculations were compared with estimates based on classical nucleation theory.
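
The classical nucleation theory estimate mentioned above can be written out explicitly: for a cavity in a liquid under tension of magnitude dP with surface tension sigma, the barrier is W* = 16*pi*sigma^3 / (3*dP^2) and the rate is J = J0 * exp(-W*/kB*T). The sketch below uses illustrative, roughly liquid-metal-scale numbers, not values from the paper; the prefactor J0 in particular is material-specific and assumed here.

```python
import math

def cnt_barrier(sigma, dP):
    """Classical nucleation theory barrier for cavity formation:
    W* = 16*pi*sigma^3 / (3*dP^2), with sigma the surface tension
    [J/m^2] and dP the magnitude of the tensile pressure [Pa]."""
    return 16.0 * math.pi * sigma**3 / (3.0 * dP**2)

def cnt_rate(J0, sigma, dP, T, kB=1.380649e-23):
    """Nucleation rate J = J0 * exp(-W*/(kB*T)).  The kinetic prefactor
    J0 is material-specific; the value used below is illustrative."""
    return J0 * math.exp(-cnt_barrier(sigma, dP) / (kB * T))

# Illustrative inputs (not from the paper): sigma = 0.4 J/m^2,
# tension 2 GPa, T = 1000 K, assumed prefactor 1e39 m^-3 s^-1.
rate = cnt_rate(J0=1e39, sigma=0.4, dP=2.0e9, T=1000.0)
```

Because the barrier falls as 1/dP^2, the predicted rate rises extremely steeply with tension, which is why MD-measured kinetic stability limits can differ from the thermodynamic spinodal.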

3. Creating science simulations through Computational Thinking Patterns

Basawapatna, Ashok Ram

Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction. One aim of the NSF is to integrate these and other computational thinking concepts into the classroom. End-user programming tools offer a unique opportunity to accomplish this goal. An end-user programming tool that allows students with little or no prior experience the ability to create simulations based on phenomena they see in class could be a first step towards meeting most, if not all, of the above computational thinking goals. This thesis describes the creation, implementation and initial testing of a programming tool, called the Simulation Creation Toolkit, with which users apply high-level agent interactions called Computational Thinking Patterns (CTPs) to create simulations. Employing Computational Thinking Patterns obviates lower behavior-level programming and allows users to directly create agent interactions in a simulation by making an analogy with real world phenomena they are trying to represent. Data collected from 21 sixth grade students with no prior programming experience and 45 seventh grade students with minimal programming experience indicates that this is an effective first step towards enabling students to create simulations in the classroom environment. Furthermore, an analogical reasoning study that looked at how users might apply patterns to create simulations from high-level descriptions with little guidance shows promising results. These initial results indicate that the high level strategy employed by the Simulation Creation Toolkit is a promising strategy towards incorporating Computational Thinking concepts in the classroom environment.

4. Computer Simulation of Animal Navigation

DTIC Science & Technology

1988-07-01

simply examining the gaits employed by animals. The preliminary model of information processing mechanisms used by goats in solving terrain problems...addition, distinct changes were seen in the gaits used by goats before and after encountering the compliant terrain. Results obtained from these...is performed on a local scale. The complex terrain environment was represented by a fractal array, in which each element value represents elevation

5. Computer simulation of space station computer steered high gain antenna

NASA Technical Reports Server (NTRS)

Beach, S. W.

1973-01-01

The mathematical modeling and programming of a complete simulation program for a space station computer-steered high gain antenna are described. The program provides for reading input data cards, numerically integrating up to 50 first order differential equations, and monitoring up to 48 variables on printed output and on plots. The program system consists of a high gain antenna, an antenna gimbal control system, an on board computer, and the environment in which all are to operate.

6. COMPARISON OF CLASSIFICATION STRATEGIES BY COMPUTER SIMULATION METHODS.

DTIC Science & Technology

NAVAL TRAINING, COMPUTER PROGRAMMING), (*NAVAL PERSONNEL, CLASSIFICATION), SELECTION, SIMULATION, CORRELATION TECHNIQUES, PROBABILITY, COSTS, OPTIMIZATION, PERSONNEL MANAGEMENT, DECISION THEORY, COMPUTERS

7. Flow simulation and high performance computing

Tezduyar, T.; Aliabadi, S.; Behr, M.; Johnson, A.; Kalro, V.; Litke, M.

1996-10-01

Flow simulation is a computational tool for exploring science and technology involving flow applications. It can provide cost-effective alternatives or complements to laboratory experiments, field tests and prototyping. Flow simulation relies heavily on high performance computing (HPC). We view HPC as having two major components. One is advanced algorithms capable of accurately simulating complex, real-world problems. The other is advanced computer hardware and networking with sufficient power, memory and bandwidth to execute those simulations. While HPC enables flow simulation, flow simulation motivates development of novel HPC techniques. This paper focuses on demonstrating that flow simulation has come a long way and is being applied to many complex, real-world problems in different fields of engineering and applied sciences, particularly in aerospace engineering and applied fluid mechanics. Flow simulation has come a long way because HPC has come a long way. This paper also provides a brief review of some of the recently developed HPC methods and tools that have played a major role in bringing flow simulation to where it is today. A number of 3D flow simulations are presented in this paper as examples of the level of computational capability reached with recent HPC methods and hardware. These examples are: flow around a fighter aircraft, flow around two trains passing in a tunnel, large ram-air parachutes, flow over hydraulic structures, contaminant dispersion in a model subway station, airflow past an automobile, multiple spheres falling in a liquid-filled tube, and dynamics of a paratrooper jumping from a cargo aircraft.

8. Numerical characteristics of quantum computer simulation

Chernyavskiy, A.; Khamitov, K.; Teplov, A.; Voevodin, V.; Voevodin, Vl.

2016-12-01

The simulation of quantum circuits is significantly important for the implementation of quantum information technologies. The main difficulty of such modeling is the exponential growth of dimensionality; thus the use of modern high-performance parallel computations is relevant. As is well known, an arbitrary quantum computation in the circuit model can be done with only single- and two-qubit gates, and we analyze the computational structure and properties of the simulation of such gates. We investigate how the unique properties of quantum nature lead to the computational properties of the considered algorithms: quantum parallelism makes the simulation of quantum gates highly parallel, while quantum entanglement leads to the problem of computational locality during simulation. We use the methodology of the AlgoWiki project (algowiki-project.org) to analyze the algorithm. This methodology consists of theoretical (sequential and parallel complexity, macro structure, and visual informational graph) and experimental (locality and memory access, scalability and more specific dynamic characteristics) parts. The experimental part was carried out using the petascale Lomonosov supercomputer (Moscow State University, Russia). We show that the simulation of quantum gates is a good basis for the research and testing of development methods for data-intensive parallel software, and the considered methodology of analysis can be successfully used for the improvement of algorithms in quantum information science.
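
The single-qubit-gate step the authors analyze can be sketched as a state-vector update: reshaping the 2^n amplitudes exposes the target qubit's axis, so the gate acts as a batch of independent 2x2 multiplies over all other indices, which is the source of the parallelism noted above. A minimal sketch, not the AlgoWiki implementation:

```python
import numpy as np

def apply_one_qubit_gate(state, gate, target, n_qubits):
    """Apply a 2x2 gate to qubit `target` of an n-qubit state vector of
    length 2**n.  Reshaping to shape (2,)*n exposes the target axis, and
    tensordot performs a batch of independent 2-vector multiplies -- one
    per configuration of the remaining qubits."""
    psi = state.reshape([2] * n_qubits)
    psi = np.tensordot(gate, psi, axes=([1], [target]))
    psi = np.moveaxis(psi, 0, target)  # restore original axis order
    return psi.reshape(-1)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                                # |000>
state = apply_one_qubit_gate(state, H, target=0, n_qubits=n)
```

Applying H to the first qubit of |000> puts equal amplitude 1/sqrt(2) on |000> and |100>; two-qubit gates work the same way with a 4x4 matrix contracted over two axes, and it is those gates, via entanglement, that destroy memory locality.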

9. Airport Simulations Using Distributed Computational Resources

NASA Technical Reports Server (NTRS)

McDermott, William J.; Maluf, David A.; Gawdiak, Yuri; Tran, Peter; Clancy, Daniel (Technical Monitor)

2002-01-01

The Virtual National Airspace Simulation (VNAS) will improve the safety of Air Transportation. In 2001, using simulation and information management software running over a distributed network of super-computers, researchers at NASA Ames, Glenn, and Langley Research Centers developed a working prototype of a virtual airspace. This VNAS prototype modeled daily operations of the Atlanta airport by integrating measured operational data and simulation data on up to 2,000 flights a day. The concepts and architecture developed by NASA for this prototype are integral to the National Airspace Simulation, which supports the development of strategies for improving aviation safety and identifying precursors to component failure.

10. Research in computer simulation of integrated circuits

Newton, A. R.; Pederson, D. O.

1983-07-01

The performance of the new LSI simulator CLASSIE is evaluated on several circuits with a few hundred to over one thousand semiconductor devices. A more accurate run time prediction formula has been found to be appropriate for circuit simulators. The design decisions for optimal performance under the constraints of the hardware (CRAY-1) are presented. Vector computers have an increased potential for fast, accurate simulation at the transistor level of Large-Scale-Integrated circuits. Design considerations for a new circuit simulator are developed based on the specifics of the vector computer architecture and of LSI circuits. The simulation of Large-Scale-Integrated (LSI) circuits requires very long run times on conventional circuit analysis programs such as SPICE2 and super-mini computers. A new simulator for LSI circuits, CLASSIE, which takes advantage of circuit hierarchy and repetitiveness, and array processors capable of high-speed floating-point computation are a promising combination. While a large number of powerful design verification tools have been developed for IC design at the transistor and logic gate levels, there are very few silicon-oriented tools for architectural design and evaluation.

11. Computational methods for coupling microstructural and micromechanical materials response simulations

SciTech Connect

HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.; FANG,HUEI ELIOT; RINTOUL,MARK DANIEL; VEDULA,VENKATA R.; GLASS,S. JILL; KNOROVSKY,GERALD A.; NEILSEN,MICHAEL K.; WELLMAN,GERALD W.; SULSKY,DEBORAH; SHEN,YU-LIN; SCHREYER,H. BUCK

2000-04-01

Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

12. Computer Series, 108. Computer Simulation of Chemical Equilibrium.

ERIC Educational Resources Information Center

Cullen, John F., Jr.

1989-01-01

Presented is a computer simulation called "The Great Chemical Bead Game" which can be used to teach the concepts of equilibrium and kinetics to introductory chemistry students more clearly than through an experiment. Discussed are the rules of the game, the application of rate laws and graphical analysis. (CW)
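
A bead-game-style stochastic model of a reversible reaction A <=> B can be sketched as follows; the rules and rate constants here are illustrative, not those of the published game:

```python
import random

def bead_game(n_beads=1000, kf=0.10, kr=0.05, steps=500, seed=42):
    """A bead-drawing analogue of A <=> B: each step, every A-bead flips
    to B with probability kf and every B-bead flips back with probability
    kr.  At equilibrium the expected ratio B/A approaches kf/kr, i.e. the
    equilibrium constant emerges from the kinetics.  (Illustrative rules,
    not the published game's exact procedure.)"""
    rng = random.Random(seed)
    a, b = n_beads, 0
    for _ in range(steps):
        a_to_b = sum(rng.random() < kf for _ in range(a))
        b_to_a = sum(rng.random() < kr for _ in range(b))
        a, b = a - a_to_b + b_to_a, b + a_to_b - b_to_a
    return a, b

a, b = bead_game()
```

With kf/kr = 2, roughly two thirds of the beads end up as B, fluctuating about that value, which makes the kinetic origin of equilibrium visible in a way a single titration experiment does not.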

13. Computer simulation of breathing systems for divers

SciTech Connect

Sexton, P.G.; Nuckols, M.L.

1983-02-01

A powerful new tool for the analysis and design of underwater breathing gas systems is being developed. A versatile computer simulator is described which makes possible the modular "construction" of any conceivable breathing gas system from computer memory-resident components. The analysis of a typical breathing gas system is demonstrated using this simulation technique, and the effects of system modifications on performance of the breathing system are shown. This modeling technique will ultimately serve as the foundation for a proposed breathing system simulator under development by the Navy. The marriage of this computer modeling technique with an interactive graphics system will provide the designer with an efficient, cost-effective tool for the development of new and improved diving systems.

14. Simulation methods for advanced scientific computing

SciTech Connect

Booth, T.E.; Carlson, J.A.; Forster, R.A.

1998-11-01

This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of the project was to create effective new algorithms for solving N-body problems by computer simulation. The authors concentrated on developing advanced classical and quantum Monte Carlo techniques. For simulations of phase transitions in classical systems, they produced a framework generalizing the famous Swendsen-Wang cluster algorithms for Ising and Potts models. For spin-glass-like problems, they demonstrated the effectiveness of an extension of the multicanonical method for the two-dimensional, random bond Ising model. For quantum mechanical systems, they generated a new method to compute the ground-state energy of systems of interacting electrons. They also improved methods to compute excited states when the diffusion quantum Monte Carlo method is used and to compute longer time dynamics when the stationary phase quantum Monte Carlo method is used.

15. Structural Composites Corrosive Management by Computational Simulation

NASA Technical Reports Server (NTRS)

Chamis, Christos C.; Minnetyan, Levon

2006-01-01

A simulation of corrosive management on polymer composites durability is presented. The corrosive environment is assumed to manage the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature, and moisture which vary parabolically for voids and linearly for temperature and moisture through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined stress failure, and laminate theories. This accounts for starting the simulation from constitutive material properties and up to the laminate scale which exposes the laminate to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply managed degradation degrades the laminate to the last one or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.

16. Combining Cases and Computer Simulations in Strategic Management Courses

ERIC Educational Resources Information Center

Mitchell, Rex C.

2004-01-01

In this study, the author compared the effectiveness of two different strategic management course designs: one centered on case discussions and the other combining a computer-based simulation with some cases. In addition to evaluation of the research literature, the study involved experiments with six course sections composed of 130 students. Both…

17. Traffic simulations on parallel computers using domain decomposition techniques

SciTech Connect

Hanebutte, U.R.; Tentner, A.M.

1995-12-31

Large scale simulations of Intelligent Transportation Systems (ITS) can only be achieved by using the computing resources offered by parallel computing architectures. Domain decomposition techniques are proposed which allow the performance of traffic simulations with the standard simulation package TRAF-NETSIM on a 128-node IBM SPx parallel supercomputer as well as on a cluster of SUN workstations. Whilst this particular parallel implementation is based on NETSIM, a microscopic traffic simulation model, the presented strategy is applicable to a broad class of traffic simulations. An outer iteration loop must be introduced in order to converge to a global solution. A performance study that utilizes a scalable test network consisting of square grids is presented, which addresses the performance penalty introduced by the additional iteration loop.

18. A computer management system for patient simulations.

PubMed

Finkelsteine, M W; Johnson, L A; Lilly, G E

1991-04-01

A series of interactive videodisc patient simulations is being used to teach clinical problem-solving skills, including diagnosis and management, to dental students. This series is called Oral Disease Simulations for Diagnosis and Management (ODSDM). A computer management system has been developed in response to the following needs. First, the sequence in which students perform simulations is critical. Second, maintaining records of completed simulations and student performance on each simulation is a time-consuming task for faculty. Third, the simulations require ongoing evaluation to ensure high quality instruction. The primary objective of the management system is to ensure that each student masters diagnosis. Mastery must be obtained at a specific level before advancing to the next level. The management system does this by individualizing the sequence of the simulations to adapt to the needs of each student. The management system generates reports which provide information about students or the simulations. Student reports contain demographic and performance information. System reports include information about individual patient simulations and act as a quality control mechanism for the simulations.
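
The mastery-gated sequencing the management system enforces can be sketched as a small rule; the 0-to-1 scoring, the threshold, and the window size below are invented for illustration, since the abstract does not state ODSDM's actual criteria.

```python
# Sketch of mastery-gated sequencing: a student advances to the next level of
# patient simulations only after the last few scores all meet a mastery
# threshold; otherwise the student repeats simulations at the current level.

def next_level(level, recent_scores, mastery=0.8, window=3):
    """Advance one level only when the last `window` scores all reach mastery."""
    tail = recent_scores[-window:]
    if len(tail) == window and min(tail) >= mastery:
        return level + 1
    return level

assert next_level(1, [0.9, 0.85, 0.95]) == 2   # mastered: advance
assert next_level(1, [0.9, 0.5, 0.95]) == 1    # one weak score: repeat level
```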

19. Perspective: Computer simulations of long time dynamics

SciTech Connect

Elber, Ron

2016-02-14

Atomically detailed computer simulations of complex molecular events have attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is that of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade significant advances were made in different directions (theory, software, and hardware) that considerably expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances.

20. Perspective: Computer simulations of long time dynamics

PubMed Central

Elber, Ron

2016-01-01

Atomically detailed computer simulations of complex molecular events have attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is that of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade significant advances were made in different directions (theory, software, and hardware) that considerably expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances. PMID:26874473

1. Simulating physical phenomena with a quantum computer

Ortiz, Gerardo

2003-03-01

In a keynote speech at MIT in 1981, Richard Feynman raised some provocative questions in connection to the exact simulation of physical systems using a special device named a ``quantum computer'' (QC). At the time it was known that deterministic simulations of quantum phenomena in classical computers required a number of resources that scaled exponentially with the number of degrees of freedom, and also that the probabilistic simulation of certain quantum problems was limited by the so-called sign or phase problem, a problem believed to be of exponential complexity. Such a QC was intended to mimic physical processes exactly as Nature does. Certainly, remarks coming from such an influential figure generated widespread interest in these ideas, and today, after 21 years, there are still some open questions. What kind of physical phenomena can be simulated with a QC? How? And what are its limitations? Addressing and attempting to answer these questions is what this talk is about. Ultimately, the goal of physics simulation using controllable quantum systems (``physics imitation'') is to exploit quantum laws to advantage, and thus accomplish efficient imitation. Fundamental is the connection between a quantum computational model and a physical system by transformations of operator algebras. This concept is a necessary one because in Quantum Mechanics each physical system is naturally associated with a language of operators and thus can be considered as a possible model of quantum computation. The remarkable result is that an arbitrary physical system is naturally simulatable by another physical system (or QC) whenever a ``dictionary'' between the two operator algebras exists. I will explain these concepts and address some of Feynman's concerns regarding the simulation of fermionic systems. Finally, I will illustrate the main ideas by imitating simple physical phenomena borrowed from condensed matter physics using quantum algorithms, and present experimental

2. Uncertainty and error in computational simulations

SciTech Connect

Oberkampf, W.L.; Diegert, K.V.; Alvin, K.F.; Rutherford, B.M.

1997-10-01

The present paper addresses the question: ``What are the general classes of uncertainty and error sources in complex, computational simulations?`` This is the first step of a two step process to develop a general methodology for quantitatively estimating the global modeling and simulation uncertainty in computational modeling and simulation. The second step is to develop a general mathematical procedure for representing, combining and propagating all of the individual sources through the simulation. The authors develop a comprehensive view of the general phases of modeling and simulation. The phases proposed are: conceptual modeling of the physical system, mathematical modeling of the system, discretization of the mathematical model, computer programming of the discrete model, numerical solution of the model, and interpretation of the results. This new view is built upon combining phases recognized in the disciplines of operations research and numerical solution methods for partial differential equations. The characteristics and activities of each of these phases are discussed in general, but examples are given for the fields of computational fluid dynamics and heat transfer. They argue that a clear distinction should be made between uncertainty and error that can arise in each of these phases. The present definitions for uncertainty and error are inadequate and, therefore, they propose comprehensive definitions for these terms. Specific classes of uncertainty and error sources are then defined that can occur in each phase of modeling and simulation. The numerical sources of error considered apply regardless of whether the discretization procedure is based on finite elements, finite volumes, or finite differences. To better explain the broad types of sources of uncertainty and error, and the utility of their categorization, they discuss a coupled-physics example simulation.
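
The paper's second step, propagating individual uncertainty sources through a simulation, can be illustrated in miniature with Monte Carlo sampling. The one-dimensional slab heat-flux model and the parameter distributions below are hypothetical stand-ins chosen for this sketch, not values from the paper.

```python
# Monte Carlo propagation of input-parameter uncertainty through a simple
# model: steady conduction through a slab, q = k * dT / L. The uncertain
# inputs k and dT are sampled; the spread of q quantifies how their
# uncertainty propagates to the simulation output.
import random
import statistics

def heat_flux(k, dT, L):
    """Steady conduction through a slab: heat flux in W/m^2."""
    return k * dT / L

random.seed(0)
samples = [heat_flux(k=random.gauss(15.0, 1.5),    # conductivity, W/(m*K)
                     dT=random.gauss(50.0, 2.0),   # temperature drop, K
                     L=0.1)                        # thickness, m (treated as exact)
           for _ in range(20_000)]

mean_q = statistics.fmean(samples)  # nominal flux is 15 * 50 / 0.1 = 7500 W/m^2
sd_q = statistics.stdev(samples)    # output spread induced by input uncertainty
```

This captures only the parameter-variability class of sources; the discretization and programming errors the authors classify would have to be quantified separately.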

3. Simulation Concept - How to Exploit Tools for Computing Hybrids

DTIC Science & Technology

2009-07-01

Fragmentary DTIC record. Recoverable content: the project concerns multiphysics design tools for biological systems and aims to provide an open-source environment for biological simulation tools. Acronyms recoverable from the fragment: SCHETCH (Simulation Concept – How to Exploit Tools for Computing); SIMBIOSYS (Simulation of Biological Systems).

4. CHARMM additive and polarizable force fields for biophysics and computer-aided drug design

PubMed Central

Vanommeslaeghe, K.

2014-01-01

Background Molecular Mechanics (MM) is the method of choice for computational studies of biomolecular systems owing to its modest computational cost, which makes it possible to routinely perform molecular dynamics (MD) simulations on chemical systems of biophysical and biomedical relevance. Scope of Review As one of the main factors limiting the accuracy of MD results is the empirical force field used, the present paper offers a review of recent developments in the CHARMM additive force field, one of the most popular biomolecular force fields. Additionally, we present a detailed discussion of the CHARMM Drude polarizable force field, anticipating a growth in the importance and utilization of polarizable force fields in the near future. Throughout the discussion emphasis is placed on the force fields’ parametrization philosophy and methodology. Major Conclusions Recent improvements in the CHARMM additive force field are mostly related to newly found weaknesses in the previous generation of additive force fields. Beyond the additive approximation is the newly available CHARMM Drude polarizable force field, which allows for MD simulations of up to 1 microsecond on proteins, DNA, lipids and carbohydrates. General Significance Addressing the limitations ensures the reliability of the new CHARMM36 additive force field for the types of calculations that are presently coming into routine computational reach, while the availability of the Drude polarizable force fields offers an inherently more accurate model of the underlying physical forces driving macromolecular structures and dynamics. PMID:25149274

5. Assessing Moderator Variables: Two Computer Simulation Studies.

ERIC Educational Resources Information Center

Mason, Craig A.; And Others

1996-01-01

A strategy is proposed for conceptualizing moderating relationships based on their type (strictly correlational and classically correlational) and form, whether continuous, noncontinuous, logistic, or quantum. Results of computer simulations comparing three statistical approaches for assessing moderator variables are presented, and advantages of…

6. Designing Online Scaffolds for Interactive Computer Simulation

ERIC Educational Resources Information Center

Chen, Ching-Huei; Wu, I-Chia; Jen, Fen-Lan

2013-01-01

The purpose of this study was to examine the effectiveness of online scaffolds in computer simulation to facilitate students' science learning. We first introduced online scaffolds to assist and model students' science learning and to demonstrate how a system embedded with online scaffolds can be designed and implemented to help high school…

7. Making Students Decide: The Vietnam Computer Simulation.

ERIC Educational Resources Information Center

O'Reilly, Kevin

1994-01-01

Contends that an important goal in history instruction is helping students understand the complexity of events. Describes the use of "Escalation," a commercially available computer simulation, in a high school U.S. history class. Includes excerpts from student journals kept during the activity. (CFR)

8. Progress in Computational Simulation of Earthquakes

NASA Technical Reports Server (NTRS)

Donnellan, Andrea; Parker, Jay; Lyzenga, Gregory; Judd, Michele; Li, P. Peggy; Norton, Charles; Tisdale, Edwin; Granat, Robert

2006-01-01

GeoFEST(P) is a computer program written for use in the QuakeSim project, which is devoted to development and improvement of means of computational simulation of earthquakes. GeoFEST(P) models interacting earthquake fault systems from the fault-nucleation to the tectonic scale. The development of GeoFEST(P) has involved coupling of two programs: GeoFEST and the Pyramid Adaptive Mesh Refinement Library. GeoFEST is a message-passing-interface-parallel code that utilizes a finite-element technique to simulate evolution of stress, fault slip, and plastic/elastic deformation in realistic materials like those of faulted regions of the crust of the Earth. The products of such simulations are synthetic observable time-dependent surface deformations on time scales from days to decades. Pyramid Adaptive Mesh Refinement Library is a software library that facilitates the generation of computational meshes for solving physical problems. In an application of GeoFEST(P), a computational grid can be dynamically adapted as stress grows on a fault. Simulations on workstations using a few tens of thousands of stress and displacement finite elements can now be expanded to multiple millions of elements with greater than 98-percent scaled efficiency on many hundreds of parallel processors (see figure).

9. Factors Promoting Engaged Exploration with Computer Simulations

ERIC Educational Resources Information Center

Podolefsky, Noah S.; Perkins, Katherine K.; Adams, Wendy K.

2010-01-01

This paper extends prior research on student use of computer simulations (sims) to engage with and explore science topics, in this case wave interference. We describe engaged exploration; a process that involves students actively interacting with educational materials, sense making, and exploring primarily via their own questioning. We analyze…

10. Macromod: Computer Simulation For Introductory Economics

ERIC Educational Resources Information Center

Ross, Thomas

1977-01-01

The Macroeconomic model (Macromod) is a computer assisted instruction simulation model designed for introductory economics courses. An evaluation of its utilization at a community college indicates that it yielded a 10 percent to 13 percent greater economic comprehension than lecture classes and that it met with high student approval. (DC)

11. Computer Graphics Simulations of Sampling Distributions.

ERIC Educational Resources Information Center

Gordon, Florence S.; Gordon, Sheldon P.

1989-01-01

Describes the use of computer graphics simulations to enhance student understanding of sampling distributions that arise in introductory statistics. Highlights include the distribution of sample proportions, the distribution of the difference of sample means, the distribution of the difference of sample proportions, and the distribution of sample…
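
A classroom simulation of the first case the record mentions, the distribution of sample proportions, can be sketched in a few lines; the population proportion and sample size below are arbitrary choices for illustration.

```python
# Build the sampling distribution of a sample proportion empirically and
# compare its spread with the normal-approximation standard error.
import random
import statistics

random.seed(1)
p, n, reps = 0.3, 100, 10_000

# Each replicate draws n Bernoulli(p) trials and records the sample proportion.
props = [sum(random.random() < p for _ in range(n)) / n for _ in range(reps)]

mean_hat = statistics.fmean(props)      # should be close to p = 0.3
sd_hat = statistics.stdev(props)        # should be close to sqrt(p(1-p)/n)
theory_sd = (p * (1 - p) / n) ** 0.5    # about 0.0458
```

Plotting a histogram of `props` against the normal curve with mean `p` and standard deviation `theory_sd` is the graphical step the article describes.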

12. Quantitative computer simulations of extraterrestrial processing operations

NASA Technical Reports Server (NTRS)

Vincent, T. L.; Nikravesh, P. E.

1989-01-01

The automation of a small, solid propellant mixer was studied. Temperature control is under investigation. A numerical simulation of the system is under development and will be tested using different control options. Control system hardware is currently being put into place. The construction of mathematical models and simulation techniques for understanding various engineering processes is also studied. Computer graphics packages were utilized for better visualization of the simulation results. The mechanical mixing of propellants is examined. Simulation of the mixing process is being done to study how one can control for chaotic behavior to meet specified mixing requirements. An experimental mixing chamber is also being built. It will allow visual tracking of particles under mixing. The experimental unit will be used to test ideas from chaos theory, as well as to verify simulation results. This project has applications to extraterrestrial propellant quality and reliability.

13. Computer simulations of WIGWAM underwater experiment

SciTech Connect

Kamegai, Minao; White, J.W.

1993-11-01

We performed computer simulations of the WIGWAM underwater experiment with a 2-D hydro-code, CALE. First, we calculated the bubble pulse and the signal strength at the closest gauge in one-dimensional geometry. The calculation shows excellent agreement with the measured data. Next, we made two-dimensional simulations of WIGWAM applying the gravity over-pressure, and calculated the signals at three selected gauge locations where measurements were recorded. The computed peak pressures at those gauge locations come well within the 15% experimental error bars. The signal at the farthest gauge is of the order of 200 bars. This is significant, because at this pressure the CALE output can be linked to a hydro-acoustics computer program, NPE Code (Nonlinear Progressive Wave-equation Code), to analyze the long distance propagation of acoustical signals from the underwater explosions on a global scale.

14. Computational algorithms for simulations in atmospheric optics.

PubMed

Konyaev, P A; Lukin, V P

2016-04-20

A computer simulation technique for atmospheric and adaptive optics based on parallel programing is discussed. A parallel propagation algorithm is designed and a modified spectral-phase method for computer generation of 2D time-variant random fields is developed. Temporal power spectra of Laguerre-Gaussian beam fluctuations are considered as an example to illustrate the applications discussed. Implementation of the proposed algorithms using Intel MKL and IPP libraries and NVIDIA CUDA technology is shown to be very fast and accurate. The hardware system for the computer simulation is an off-the-shelf desktop with an Intel Core i7-4790K CPU operating at a turbo-speed frequency up to 5 GHz and an NVIDIA GeForce GTX-960 graphics accelerator with 1024 1.5 GHz processors.
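
The spectral-phase idea for generating 2-D random fields can be sketched as follows, assuming NumPy is available. The power-law spectrum below is an arbitrary turbulence-like stand-in, not the authors' modified spectral-phase method, and no GPU or MKL/IPP acceleration is shown.

```python
# Synthesize a 2-D random field by assigning uniformly random phases to a
# prescribed power spectrum and inverse Fourier transforming. Different
# phase draws give statistically equivalent realizations (phase screens).
import numpy as np

rng = np.random.default_rng(0)
N = 64
kx = np.fft.fftfreq(N)[:, None]
ky = np.fft.fftfreq(N)[None, :]
k2 = kx**2 + ky**2
k2[0, 0] = np.inf                      # suppress the mean (k = 0) mode

amplitude = k2 ** (-11.0 / 12.0)       # |F| ~ k^(-11/6), a Kolmogorov-like law
phase = rng.uniform(0.0, 2.0 * np.pi, (N, N))
spectrum = amplitude * np.exp(1j * phase)

field = np.fft.ifft2(spectrum).real    # one random-field realization
```

Generating a time-variant field amounts to evolving the phases between frames, which is where the temporal power spectra studied in the paper come in.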

15. Simulating fermions on a quantum computer

Ortiz, G.; Gubernatis, J. E.; Knill, E.; Laflamme, R.

2002-07-01

The real-time probabilistic simulation of quantum systems in classical computers is known to be limited by the so-called dynamical sign problem, a problem leading to exponential complexity. In 1981 Richard Feynman raised some provocative questions in connection to the "exact imitation" of such systems using a special device named a "quantum computer". Feynman hesitated about the possibility of imitating fermion systems using such a device. Here we address some of his concerns and, in particular, investigate the simulation of fermionic systems. We show how quantum computers avoid the sign problem in some cases by reducing the complexity from exponential to polynomial. Our demonstration is based upon the use of isomorphisms of algebras. We present specific quantum algorithms that illustrate the main points of our algebraic approach.
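
The operator-algebra "dictionary" underlying this approach can be illustrated with the standard Jordan-Wigner mapping, which represents fermionic modes by qubit operators; the check below is a generic textbook construction offered to show the algebraic idea, not the paper's specific algorithms.

```python
# Build two fermionic annihilation operators on two qubits via Jordan-Wigner
# and verify the canonical anticommutation relations (CAR) numerically.
import numpy as np

I2 = np.eye(2)
Z = np.diag([1.0, -1.0])
sm = np.array([[0.0, 1.0],
               [0.0, 0.0]])          # sigma_minus: lowers |1> to |0>

# Jordan-Wigner on n qubits: a_j = Z (x) ... (x) Z (x) sigma_minus (x) I ...
a0 = np.kron(sm, I2)
a1 = np.kron(Z, sm)

def anti(A, B):
    """Anticommutator {A, B} = AB + BA."""
    return A @ B + B @ A

# CAR: {a_i, a_j} = 0 and {a_i, a_j^dagger} = delta_ij * Identity
```

Because the qubit operators reproduce the fermionic algebra exactly, any fermionic Hamiltonian can be rewritten in qubit language, which is the isomorphism the authors exploit to avoid the sign problem.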

16. Computer simulation of surface and film processes

NASA Technical Reports Server (NTRS)

Tiller, W. A.; Halicioglu, M. T.

1983-01-01

Adequate computer methods, based on interactions between discrete particles, provide information leading to an atomic level understanding of various physical processes. The success of these simulation methods, however, is related to the accuracy of the potential energy function representing the interactions among the particles. The development of a potential energy function for crystalline SiO2 forms that can be employed in lengthy computer modelling procedures was investigated. In many of the simulation methods which deal with discrete particles, semiempirical two body potentials were employed to analyze energy and structure related properties of the system. Many body interactions are required for a proper representation of the total energy for many systems. Many body interactions for simulations based on discrete particles are discussed.
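
The simplest case the abstract mentions, a semiempirical two-body potential summed over discrete particles, can be sketched as follows; the Lennard-Jones form and its parameters are generic illustrations, not a potential fitted to crystalline SiO2.

```python
# Total energy of a small cluster from a pairwise Lennard-Jones potential.
import itertools
import math

def lj(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones pair energy at separation r."""
    s6 = (sigma / r) ** 6
    return 4.0 * epsilon * (s6 * s6 - s6)

def total_energy(positions):
    """Sum the pair potential over all distinct particle pairs."""
    return sum(lj(math.dist(a, b))
               for a, b in itertools.combinations(positions, 2))

# A dimer at the LJ minimum distance r = 2**(1/6) * sigma has energy -epsilon.
e = total_energy([(0.0, 0.0, 0.0), (2 ** (1 / 6), 0.0, 0.0)])
```

A many-body potential of the kind the authors argue for would add terms depending on triplets (e.g. bond angles), which a pairwise sum like this cannot capture.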

17. Computational Simulation of Composite Structural Fatigue

NASA Technical Reports Server (NTRS)

Minnetyan, Levon; Chamis, Christos C. (Technical Monitor)

2005-01-01

Progressive damage and fracture of composite structures subjected to monotonically increasing static, tension-tension cyclic, pressurization, and flexural cyclic loading are evaluated via computational simulation. Constituent material properties, stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for composites. Damage initiation, growth, accumulation, and propagation to fracture due to monotonically increasing static and cyclic loads are included in the simulations. Results show the number of cycles to failure at different temperatures and the damage progression sequence during different degradation stages. A procedure is outlined for use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of results with insight for design decisions.

18. Cosmological Simulations on a Grid of Computers

Depardon, Benjamin; Caron, Eddy; Desprez, Frédéric; Blaizot, Jérémy; Courtois, Hélène

2010-06-01

The work presented in this paper aims at restricting the input parameter values of the semi-analytical model used in GALICS and MOMAF, so as to determine which parameters most influence the results, e.g., star formation, feedback and halo recycling efficiencies, etc. Our approach is to proceed empirically: we run a large number of simulations and derive the correct ranges of values. The computation time needed is so large that we need to run on a grid of computers. Hence, we model GALICS and MOMAF execution time and output file size, and run the simulations using a grid middleware: DIET. All the complexity of accessing resources, scheduling simulations and managing data is harnessed by DIET and hidden behind a web portal accessible to the users.
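
The approach of modeling each run's execution time and letting a middleware place runs on machines can be sketched with a greedy longest-processing-time scheduler; the cost model and parameter grid below are made-up stand-ins, not the GALICS/MOMAF runtime model or DIET's actual scheduler.

```python
# Schedule a parameter-sweep of simulations onto machines using predicted
# runtimes: longest predicted job goes to the currently least-loaded machine.
import heapq
import itertools

def predicted_runtime(star_eff, feedback_eff):
    """Hypothetical cost model: stronger feedback => longer run."""
    return 1.0 + 4.0 * star_eff + 10.0 * feedback_eff

grid = list(itertools.product([0.01, 0.05, 0.1],        # star formation eff.
                              [0.0, 0.25, 0.5, 1.0]))   # feedback eff.

jobs = sorted(grid, key=lambda p: -predicted_runtime(*p))
machines = [(0.0, i, []) for i in range(3)]    # (load, id, assigned jobs)
heapq.heapify(machines)
for job in jobs:
    load, i, assigned = heapq.heappop(machines)            # least-loaded node
    heapq.heappush(machines,
                   (load + predicted_runtime(*job), i, assigned + [job]))

makespan = max(load for load, _, _ in machines)   # time until the sweep ends
```

Balancing predicted loads this way is what makes an empirical sweep over the parameter space tractable on a grid.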

19. Computational Simulation of Composite Structural Fatigue

NASA Technical Reports Server (NTRS)

Minnetyan, Levon

2004-01-01

Progressive damage and fracture of composite structures subjected to monotonically increasing static, tension-tension cyclic, pressurization, and flexural cyclic loading are evaluated via computational simulation. Constituent material properties, stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for composites. Damage initiation, growth, accumulation, and propagation to fracture due to monotonically increasing static and cyclic loads are included in the simulations. Results show the number of cycles to failure at different temperatures and the damage progression sequence during different degradation stages. A procedure is outlined for use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of results with insight for design decisions.

20. Multiscale simulation process and application to additives in porous composite battery electrodes

Wieser, Christian; Prill, Torben; Schladitz, Katja

2015-03-01

Structure-resolving simulation of porous materials in electrochemical cells such as fuel cells and lithium ion batteries allows for correlating electrical performance with material morphology. In lithium ion batteries, characteristic length scales of active material particles and additives span several orders of magnitude. Hence, providing a computational mesh resolving all length scales is not feasible and requires alternative approaches. In the work presented here, a virtual process to simulate lithium ion batteries by bridging the scales is introduced. Representative lithium ion battery electrode coatings comprised of μm-scale graphite particles as active material and a nm-scale carbon/polymeric binder mixture as an additive are imaged with synchrotron radiation computed tomography (SR-CT) and sequential focused ion beam/scanning electron microscopy (FIB/SEM), respectively. Applying novel image processing methodologies for the FIB/SEM images, data sets are binarized to provide a computational grid for calculating the effective mass transport properties of the electrolyte phase in the nanoporous additive. Afterwards, the homogenized additive is virtually added to the micropores of the binarized SR-CT data set representing the active particle structure, and the resulting electrode structure is assembled to a virtual half-cell for electrochemical microheterogeneous simulation. Preliminary battery performance simulations indicate non-negligible impact of the consideration of the additive.
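
Homogenizing the nanoporous additive amounts to replacing its resolved pore structure with an effective transport property. As a simple illustration of that idea (the authors compute effective properties from the FIB/SEM geometry rather than from this closure), a common porosity correlation is the Bruggeman relation.

```python
# Bruggeman-type effective diffusivity: D_eff = D_bulk * porosity**1.5.
# A stand-in closure to illustrate what "homogenized additive" means.

def effective_diffusivity(d_bulk, porosity, exponent=1.5):
    """Effective electrolyte diffusivity in a porous layer."""
    return d_bulk * porosity ** exponent

# A 30%-porous additive phase transports at only about 16% of the bulk rate.
ratio = effective_diffusivity(1.0, 0.3)
```

In the multiscale workflow, numbers like `ratio` are what the nm-scale simulation hands up to the μm-scale half-cell model.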

1. Additive Manufacturing of Anatomical Models from Computed Tomography Scan Data.

PubMed

Gür, Y

2014-12-01

The purpose of the study presented here was to investigate the manufacturability of human anatomical models from Computed Tomography (CT) scan data via a 3D desktop printer which uses fused deposition modelling (FDM) technology. First, Digital Imaging and Communications in Medicine (DICOM) CT scan data were converted to 3D Standard Triangle Language (STL) format by using the InVesalius digital imaging program. Once this STL file is obtained, a 3D physical version of the anatomical model can be fabricated by a desktop 3D FDM printer. As a case study, a patient's skull CT scan data was considered, and a tangible version of the skull was manufactured by a 3D FDM desktop printer. During the 3D printing process, the skull was built using acrylonitrile-butadiene-styrene (ABS) co-polymer plastic. The printed model showed that the 3D FDM printing technology is able to fabricate anatomical models with high accuracy. As a result, the skull model can be used for preoperative surgical planning, medical training activities, implant design and simulation to show the potential of the FDM technology in the medical field. It will also improve communication between medical staff and patients. Current results indicate that a 3D desktop printer which uses FDM technology can be used to obtain accurate anatomical models.

2. Simulation of Laser Additive Manufacturing and its Applications

Lee, Yousub

Laser and metal powder based additive manufacturing (AM), a key category of advanced Direct Digital Manufacturing (DDM), produces metallic components directly from a digital representation of the part such as a CAD file. It is well suited for the production of high-value, customizable components with complex geometry and the repair of damaged components. Currently, the main challenges for laser and metal powder based AM include the formation of defects (e.g., porosity), low surface finish quality, and spatially non-uniform properties of material. Such challenges stem largely from the limited knowledge of complex physical processes in AM, especially the molten pool physics such as melting, molten metal flow, heat conduction, vaporization of alloying elements, and solidification. Direct experimental measurement of melt pool phenomena is highly difficult since the process is localized (on the order of 0.1 mm to 1 mm melt pool size) and transient (on the order of 1 m/s scanning speed). Furthermore, current optical and infrared cameras are limited to observing the melt pool surface. As a result, fluid flows in the melt pool, melt pool shape and formation of sub-surface defects are difficult to visualize by experiment. On the other hand, numerical simulation, based on rigorous solution of mass, momentum and energy transport equations, can provide important quantitative knowledge of the complex transport phenomena taking place in AM. The overarching goal of this dissertation research is to develop an analytical foundation for fundamental understanding of heat transfer, molten metal flow and free surface evolution. Two key types of laser AM processes are studied: a) powder injection, commonly used for repairing of turbine blades, and b) powder bed, commonly used for manufacturing of new parts with complex geometry. In the powder injection simulation, fluid convection, temperature gradient (G), solidification rate (R) and melt pool shape are calculated using a heat transfer

3. Computational Challenges in Nuclear Weapons Simulation

SciTech Connect

McMillain, C F; Adams, T F; McCoy, M G; Christensen, R B; Pudliner, B S; Zika, M R; Brantley, P S; Vetter, J S; May, J M

2003-08-29

After a decade of experience, the Stockpile Stewardship Program continues to ensure the safety, security and reliability of the nation's nuclear weapons. The Advanced Simulation and Computing (ASCI) program was established to provide leading edge, high-end simulation capabilities needed to meet the program's assessment and certification requirements. The great challenge of this program lies in developing the tools and resources necessary for the complex, highly coupled, multi-physics calculations required to simulate nuclear weapons. This paper describes the hardware and software environment we have applied to fulfill our nuclear weapons responsibilities. It also presents the characteristics of our algorithms and codes, especially as they relate to supercomputing resource capabilities and requirements. It then addresses impediments to the development and application of nuclear weapon simulation software and hardware and concludes with a summary of observations and recommendations on an approach for working with industry and government agencies to address these impediments.

4. Computer modeling and simulation of human movement.

PubMed

Pandy, M G

2001-01-01

Recent interest in using modeling and simulation to study movement is driven by the belief that this approach can provide insight into how the nervous system and muscles interact to produce coordinated motion of the body parts. With the computational resources available today, large-scale models of the body can be used to produce realistic simulations of movement that are an order of magnitude more complex than those produced just 10 years ago. This chapter reviews how the structure of the neuromusculoskeletal system is commonly represented in a multijoint model of movement, how modeling may be combined with optimization theory to simulate the dynamics of a motor task, and how model output can be analyzed to describe and explain muscle function. Some results obtained from simulations of jumping, pedaling, and walking are also reviewed to illustrate the approach.

5. Computer simulations of learning in neural systems.

PubMed

Salu, Y

1983-04-01

Recent experiments have shown that, in some cases, strengths of synaptic ties are being modified in learning. However, it is not known what the rules that control those modifications are, especially what determines which synapses will be modified and which will remain unchanged during a learning episode. Two postulated rules that may solve that problem are introduced. To check their effectiveness, the rules are tested in many computer models that simulate learning in neural systems. The simulations demonstrate that, theoretically, the two postulated rules are effective in organizing the synaptic changes. If they are found to also exist in biological systems, these postulated rules may be an important element in the learning process.
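
The abstract does not state the two postulated rules, so as a stand-in the sketch below uses the classic Hebbian rule with decay ("strengthen a synapse when pre- and post-synaptic activity coincide") on a toy two-input neuron; every detail here (rates, threshold, input statistics) is invented for illustration.

```python
# A local learning rule selectively modifies only the synapses whose inputs
# repeatedly coincide with post-synaptic firing; rarely co-active synapses
# strengthen far less.
import random

random.seed(0)
weights = [0.1, 0.1]        # synaptic strengths for inputs 0 and 1
eta, decay = 0.05, 0.001    # learning rate and passive decay

for _ in range(500):
    # Input 0 is frequently active, input 1 rarely.
    x = [1 if random.random() < 0.9 else 0,
         1 if random.random() < 0.1 else 0]
    post = 1 if (weights[0] * x[0] + weights[1] * x[1]) > 0.05 else 0
    # Hebbian term (pre * post) plus uniform decay.
    weights = [w + eta * xi * post - decay * w for w, xi in zip(weights, x)]

# After training, the frequently co-active synapse dominates.
```

A simulation like this is the computational counterpart of the question the paper tests: which synapses get modified during a learning episode, and which are left unchanged.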

6. Weld fracture criteria for computer simulation

NASA Technical Reports Server (NTRS)

Jemian, Wartan A.

1993-01-01

Due to the complexity of welding, not all of the important factors are always properly considered and controlled. An automatic system is required. This report outlines a simulation method and all the important considerations to do this. As in many situations where a defect or failure has occurred, it is frequently necessary to troubleshoot the system and eventually identify those factors that were neglected. This is expensive and time consuming. Very frequently the causes are materials-related and might have been anticipated. Computer simulation can automatically consider all important variables. The major goal of this presentation is to identify the proper relationship of design, processing, and materials variables to welding.

7. Unsteady flow simulation on a parallel computer

Faden, M.; Pokorny, S.; Engel, K.

For the simulation of the flow through compressor stages, an interactive flow simulation system is set up on an MIMD-type parallel computer. An explicit scheme is used in order to resolve the time-dependent interaction between the blades. The 2D Navier-Stokes equations are transformed into their general moving coordinates. The parallelization of the solver is based on the idea of domain decomposition. Results are presented for a problem of fixed size (4096 grid nodes for the Hakkinen case).

8. Computer Simulation of the VASIMR Engine

NASA Technical Reports Server (NTRS)

Garrison, David

2005-01-01

The goal of this project is to develop a magneto-hydrodynamic (MHD) computer code for simulation of the VASIMR engine. This code is designed to be easy to modify and use. We achieve this using the Cactus framework, a system originally developed for research in numerical relativity. Since its release, Cactus has become an extremely powerful and flexible open source framework. The development of the code will be done in stages, starting with a basic fluid dynamic simulation and working towards a more complex MHD code. Once developed, this code can be used by students and researchers in order to further test and improve the VASIMR engine.

9. Computational plasticity algorithm for particle dynamics simulations

Krabbenhoft, K.; Lyamin, A. V.; Vignes, C.

2017-03-01

The problem of particle dynamics simulation is interpreted in the framework of computational plasticity leading to an algorithm which is mathematically indistinguishable from the common implicit scheme widely used in the finite element analysis of elastoplastic boundary value problems. This algorithm provides somewhat of a unification of two particle methods, the discrete element method and the contact dynamics method, which usually are thought of as being quite disparate. In particular, it is shown that the former appears as the special case where the time stepping is explicit while the use of implicit time stepping leads to the kind of schemes usually labelled contact dynamics methods. The framing of particle dynamics simulation within computational plasticity paves the way for new approaches similar (or identical) to those frequently employed in nonlinear finite element analysis. These include mixed implicit-explicit time stepping, dynamic relaxation and domain decomposition schemes.
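
The explicit/implicit distinction the paper draws can be sketched on a toy one-particle contact (illustrative only, not the paper's algorithm): a point mass above a rigid floor at x = 0. The explicit variant applies a penalty force from the current penetration, discrete-element style; the implicit variant solves for the velocity that leaves the end of the step exactly non-penetrating, contact-dynamics style.

```python
def explicit_dem_step(x, v, dt, g=-9.81, k=1e4):
    f = g + (k * -x if x < 0 else 0.0)   # penalty force from current overlap
    v = v + dt * f                       # semi-implicit Euler: velocity first
    return x + dt * v, v

def implicit_cd_step(x, v, dt, g=-9.81):
    v_free = v + dt * g                  # unconstrained (free-flight) velocity
    x_free = x + dt * v_free
    if x_free < 0.0:                     # constraint active: solve for impulse
        return 0.0, -x / dt              # end-of-step position is exactly 0
    return x_free, v_free
```

The implicit step never penetrates, at the cost of solving a (here trivial) complementarity condition; with many contacts this becomes the kind of implicit system the paper maps onto computational plasticity.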

10. Understanding Membrane Fouling Mechanisms through Computational Simulations

Xiang, Yuan

This dissertation focuses on a computational simulation study of the organic fouling mechanisms of reverse osmosis and nanofiltration (RO/NF) membranes, which have been widely used in industry for water purification. The research shows that, by establishing a realistic computational model based on available experimental data, we are able to develop a deep understanding of membrane fouling mechanisms. This knowledge is critical for providing a strategic plan for the membrane experimental community and the RO/NF industry for further improvements in membrane technology for water treatment. The dissertation comprises three major research components: (1) development of realistic molecular models that represent the membrane surface properties well; (2) investigation of the interactions between the membrane surface and foulants by steered molecular dynamics simulations, in order to determine the major factors that contribute to surface fouling; and (3) studies of the interactions with surface-modified membranes (polyethylene glycol) to provide strategies for antifouling.

12. X-ray computed tomography for additive manufacturing: a review

Thompson, A.; Maskery, I.; Leach, R. K.

2016-07-01

In this review, the use of x-ray computed tomography (XCT) is examined, identifying the requirement for volumetric dimensional measurements in industrial verification of additively manufactured (AM) parts. The XCT technology and AM processes are summarised, and their historical use is documented. The use of XCT and AM as tools for medical reverse engineering is discussed, and the transition of XCT from a tool used solely for imaging to a vital metrological instrument is documented. The current states of the combined technologies are then examined in detail, separated into porosity measurements and general dimensional measurements. In the conclusions of this review, the limitation of resolution on improvement of porosity measurements and the lack of research regarding the measurement of surface texture are identified as the primary barriers to ongoing adoption of XCT in AM. The limitations of both AM and XCT regarding slow speeds and high costs, when compared to other manufacturing and measurement techniques, are also noted as general barriers to continued adoption of XCT and AM.

13. Additional support for the TDK/MABL computer program

NASA Technical Reports Server (NTRS)

Nickerson, G. R.; Dunn, Stuart S.

1993-01-01

An advanced version of the Two-Dimensional Kinetics (TDK) computer program was developed under contract and released to the propulsion community in early 1989. Exposure of the code to this community indicated a need for improvements in certain areas. In particular, the TDK code needed to be adapted to the special requirements imposed by the Space Transportation Main Engine (STME) development program. This engine utilizes injection of the gas generator exhaust into the primary nozzle by means of a set of slots. The subsequent mixing of this secondary stream with the primary stream with finite rate chemical reaction can have a major impact on the engine performance and the thermal protection of the nozzle wall. In attempting to calculate this reacting boundary layer problem, the Mass Addition Boundary Layer (MABL) module of TDK was found to be deficient in several respects. For example, when finite rate chemistry was used to determine gas properties (MABL-K option), the program run times became excessive because extremely small step sizes were required to maintain numerical stability. A robust solution algorithm was required so that the MABL-K option could be viable as a rocket propulsion industry design tool. Solving this problem was a primary goal of the phase 1 work effort.

14. Computer simulation of the micropulse imaging lidar

Dai, Yongjiang; Zhao, Hongwei; Zhao, Yu; Wang, Xiaoou

2000-10-01

In this paper, a design method for the Micro Pulse Lidar (MPL) based on computer simulation is introduced. Some MPL parameters related to atmospheric scattering, and their effects on the performance of the lidar, are discussed. The design software for a lidar with a diode-pumped solid-state laser is programmed in MATLAB. The software consists of six modules: transmitter, atmosphere, target, receiver, processor, and display system. The method can be extended to other kinds of lidar.
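
End-to-end, such a transmitter-atmosphere-target-receiver chain evaluates the single-scattering lidar equation, P(R) = C · E · β · T(R)² / R². A minimal sketch (Python here rather than MATLAB, with an illustrative system constant C):

```python
import math

def lidar_return(E, beta, alpha, R, C=1.0):
    """Received power from range R in a homogeneous atmosphere.

    E: pulse energy, beta: backscatter coefficient, alpha: extinction
    coefficient, C: illustrative lumped system constant (optics, area).
    """
    T = math.exp(-alpha * R)             # one-way atmospheric transmission
    return C * E * beta * T * T / (R * R)
```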

15. Computer simulation improves remedial cementing success

SciTech Connect

Kulakofsky, D.; Creel, P.

1992-11-01

This paper reports that computer simulation has been used successfully to design remedial cement squeeze jobs and efficiently evaluate actual downhole performance and results. The program uses fluid properties, well parameters and wellbore configuration to estimate surface pressure at progressive stages of pumping operations. This new tool predicts surface pumping pressures in advance, allowing operators to effectively address changes that occur downhole during workover operations.
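
A back-of-envelope sketch of the core estimate such a program makes (illustrative only; a real design tool adds friction correlations, staged fluid columns, and a pumping schedule): the surface pressure needed to hold a bottomhole target is the target minus the hydrostatic head of the fluid column, plus friction losses.

```python
G = 9.81  # gravitational acceleration, m/s^2

def surface_pressure(p_bottomhole_pa, density_kg_m3, depth_m, friction_pa=0.0):
    """Surface pumping pressure for a single-fluid column (SI units)."""
    hydrostatic = density_kg_m3 * G * depth_m
    return p_bottomhole_pa - hydrostatic + friction_pa
```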

16. Integrated computer simulation on FIR FEL dynamics

SciTech Connect

Furukawa, H.; Kuruma, S.; Imasaki, K.

1995-12-31

An integrated computer simulation code has been developed to analyze RF-linac FEL dynamics. First, a simulation code for the electron beam acceleration and transport processes in the RF linac (LUNA) has been developed to analyze the characteristics of the electron beam and to optimize the parameters of the RF linac. Second, a space-time dependent 3D FEL simulation code (Shipout) has been developed. Total RF-linac FEL simulations have been performed by using the electron beam data from LUNA in Shipout. The number of particles used in a total simulation is approximately 1000, and the CPU time for simulating one round trip is about 1.5 minutes. At ILT/ILE, Osaka, an 8.5 MeV RF linac with a photo-cathode RF gun is used for FEL oscillation experiments. Using a 2 cm wiggler, FEL oscillation at wavelengths of approximately 46 μm is investigated. Simulations using LUNA with the parameters of an ILT/ILE experiment estimate the pulse shape and energy spectra of the electron beam at the end of the linac; the pulse shape has a sharp rise and decays slowly in time. Total RF-linac FEL simulations with the same parameters estimate how the start-up of the FEL oscillations depends on the pulse shape of the electron beam at the end of the linac. Coherent spontaneous emission effects and a quick start-up of FEL oscillations have been observed in these simulations.

17. Accelerating Climate Simulations Through Hybrid Computing

NASA Technical Reports Server (NTRS)

Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

2009-01-01

Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPUs) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for the connection, we identified two challenges: (1) an identical MPI implementation is required in both systems; and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors and two IBM QS22 Cell blades, connected with InfiniBand), allowing compute-intensive functions to be seamlessly offloaded to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.
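
The offload pattern described can be sketched as follows (illustrative only): the host hands a compute-intensive function to accelerator workers. A thread pool stands in for the remote Cell blades, and `radiation_kernel` is a hypothetical stand-in for the solar radiation routine, not the actual model code.

```python
from concurrent.futures import ThreadPoolExecutor

def radiation_kernel(column):
    """Hypothetical compute-heavy function applied to one atmospheric column."""
    return sum(x * x for x in column)

def offload(columns, workers=2):
    """Fan columns out to worker 'accelerators' and gather results in order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(radiation_kernel, columns))
```

The appeal of the DAV-style approach in the abstract is that the host-side call site looks like this simple fan-out, with no changes to the model's existing MPI decomposition.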

18. Experiments and simulation models of a basic computation element of an autonomous molecular computing system.

PubMed

Takinoue, Masahiro; Kiga, Daisuke; Shohda, Koh-Ichiroh; Suyama, Akira

2008-10-01

Autonomous DNA computers have been attracting much attention because of their ability to integrate into living cells. Autonomous DNA computers can process information through DNA molecules and their molecular reactions. We have already proposed an idea of an autonomous molecular computer with high computational ability, which is now named Reverse-transcription-and-TRanscription-based Autonomous Computing System (RTRACS). In this study, we first report an experimental demonstration of a basic computation element of RTRACS and a mathematical modeling method for RTRACS. We focus on an AND gate, which produces an output RNA molecule only when two input RNA molecules exist, because it is one of the most basic computation elements in RTRACS. Experimental results demonstrated that the basic computation element worked as designed. In addition, its behaviors were analyzed using a mathematical model describing the molecular reactions of the RTRACS computation elements. A comparison between experiments and simulations confirmed the validity of the mathematical modeling method. This study will accelerate construction of various kinds of computation elements and computational circuits of RTRACS, and thus advance the research on autonomous DNA computers.

19. Computer model to simulate testing at the National Transonic Facility

NASA Technical Reports Server (NTRS)

Mineck, Raymond E.; Owens, Lewis R., Jr.; Wahls, Richard A.; Hannon, Judith A.

1995-01-01

A computer model has been developed to simulate the processes involved in the operation of the National Transonic Facility (NTF), a large cryogenic wind tunnel at the Langley Research Center. The simulation was verified by comparing the simulated results with previously acquired data from three experimental wind tunnel test programs in the NTF. The comparisons suggest that the computer model simulates reasonably well the processes that determine the liquid nitrogen (LN2) consumption, electrical consumption, fan-on time, and the test time required to complete a test plan at the NTF. From these limited comparisons, it appears that the results from the simulation model are generally within about 10 percent of the actual NTF test results. The use of actual data acquisition times in the simulation produced better estimates of the LN2 usage, as expected. Additional comparisons are needed to refine the model constants. The model will typically produce optimistic results since the times and rates included in the model are typically the optimum values. Any deviation from the optimum values will lead to longer times or increased LN2 and electrical consumption for the proposed test plan. Computer code operating instructions and listings of sample input and output files have been included.

20. Additional Developments in Atmosphere Revitalization Modeling and Simulation

NASA Technical Reports Server (NTRS)

Coker, Robert F.; Knox, James C.; Cummings, Ramona; Brooks, Thomas; Schunk, Richard G.; Gomez, Carlos

2013-01-01

NASA's Advanced Exploration Systems (AES) program is developing prototype systems, demonstrating key capabilities, and validating operational concepts for future human missions beyond Earth orbit. These forays beyond the confines of Earth's gravity will place unprecedented demands on launch systems. They must launch the supplies needed to sustain a crew over longer periods for exploration missions beyond Earth's moon. Thus all spacecraft systems, including those for the separation of metabolic carbon dioxide and water from a crewed vehicle, must be minimized with respect to mass, power, and volume. Emphasis is also placed on system robustness, both to minimize replacement parts and to ensure crew safety when a quick return to Earth is not possible. Current efforts are focused on improving the current state-of-the-art systems utilizing fixed beds of sorbent pellets by evaluating structured sorbents, seeking more robust pelletized sorbents, and examining alternate bed configurations to improve system efficiency and reliability. These development efforts combine testing of sub-scale systems with multi-physics computer simulations to evaluate candidate approaches, select the best performing options, and optimize the configuration of the selected approach. This paper describes the continuing development of atmosphere revitalization models and simulations in support of the Atmosphere Revitalization Recovery and Environmental Monitoring (ARREM) project within the AES program.

2. Neural network computer simulation of medical aerosols.

PubMed

Richardson, C J; Barlow, D J

1996-06-01

Preliminary investigations have been conducted to assess the potential for using artificial neural networks to simulate aerosol behaviour, with a view to employing this type of methodology in the evaluation and design of pulmonary drug-delivery systems. Details are presented of the general purpose software developed for these tasks; it implements a feed-forward back-propagation algorithm with weight decay and connection pruning, the user having complete run-time control of the network architecture and mode of training. A series of exploratory investigations is then reported in which different network structures and training strategies are assessed in terms of their ability to simulate known patterns of fluid flow in simple model systems. The first of these involves simulations of cellular automata-generated data for fluid flow through a partially obstructed two-dimensional pipe. The artificial neural networks are shown to be highly successful in simulating the behaviour of this simple linear system, but with important provisos relating to the information content of the training data and the criteria used to judge when the network is properly trained. A second set of investigations is then reported in which similar networks are used to simulate patterns of fluid flow through aerosol generation devices, using training data furnished through rigorous computational fluid dynamics modelling. These more complex three-dimensional systems are modelled with equal success. It is concluded that carefully tailored, well trained networks could provide valuable tools not just for predicting but also for analysing the spatial dynamics of pharmaceutical aerosols.
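
Weight decay, the regularizer named above, can be illustrated on a single linear unit trained by stochastic gradient descent (a sketch, not the authors' network): each update shrinks the weight toward zero in proportion to `decay`, which is the mechanism behind pruning uninformative connections.

```python
def train(xs, ys, lr=0.1, decay=0.01, epochs=500):
    """Fit y ~ w*x + b by SGD with an L2 weight-decay penalty on w."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            err = (w * x + b) - y
            w -= lr * (err * x + decay * w)   # data gradient + decay term
            b -= lr * err                     # bias left undecayed, as is common
    return w, b
```

On data generated by y = 2x + 1, the learned slope settles slightly below 2; that small shrinkage is the price of the penalty.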

3. New Computer Simulations of Macular Neural Functioning

NASA Technical Reports Server (NTRS)

Ross, Muriel D.; Doshay, D.; Linton, S.; Parnas, B.; Montgomery, K.; Chimento, T.

1994-01-01

We use high performance graphics workstations and supercomputers to study the functional significance of the three-dimensional (3-D) organization of gravity sensors. These sensors have a prototypic architecture foreshadowing more complex systems. Scaled-down simulations run on a Silicon Graphics workstation and scaled-up, 3-D versions run on a Cray Y-MP supercomputer. A semi-automated method of reconstruction of neural tissue from serial sections studied in a transmission electron microscope has been developed to eliminate tedious conventional photography. The reconstructions use a mesh as a step in generating a neural surface for visualization. Two meshes are required to model calyx surfaces. The meshes are connected and the resulting prisms represent the cytoplasm and the bounding membranes. A finite volume analysis method is employed to simulate voltage changes along the calyx in response to synapse activation on the calyx or on calyceal processes. The finite volume method insures that charge is conserved at the calyx-process junction. These and other models indicate that efferent processes act as voltage followers, and that the morphology of some afferent processes affects their functioning. In a final application, morphological information is symbolically represented in three dimensions in a computer. The possible functioning of the connectivities is tested using mathematical interpretations of physiological parameters taken from the literature. Symbolic, 3-D simulations are in progress to probe the functional significance of the connectivities. This research is expected to advance computer-based studies of macular functioning and of synaptic plasticity.

4. Computer Simulation of the Beating Human Heart

Peskin, Charles S.; McQueen, David M.

2001-06-01

The mechanical function of the human heart couples together the fluid mechanics of blood and the soft tissue mechanics of the muscular heart walls and flexible heart valve leaflets. We discuss a unified mathematical formulation of this problem in which the soft tissue looks like a specialized part of the fluid in which additional forces are applied. This leads to a computational scheme known as the Immersed Boundary (IB) method for solving the coupled equations of motion of the whole system. The IB method is used to construct a three-dimensional Virtual Heart, including representations of all four chambers of the heart and all four valves, in addition to the large arteries and veins that connect the heart to the rest of the circulation. The chambers, valves, and vessels are all modeled as collections of elastic (and where appropriate, actively contractile) fibers immersed in viscous incompressible fluid. Results are shown as a computer-generated video animation of the beating heart.
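
The fiber-fluid coupling at the heart of the IB method spreads each Lagrangian force onto nearby grid nodes with a smoothed delta function. A 1D sketch using Peskin's 4-point cosine discrete delta (unit grid spacing; illustrative, not the heart model's code):

```python
import math

def delta(r):
    """4-point cosine discrete delta; support |r| < 2, sums to 1 on the grid."""
    return (1 + math.cos(math.pi * r / 2)) / 4 if abs(r) < 2 else 0.0

def spread(force, x_fiber, nx):
    """Spread a point force at x_fiber onto grid nodes 0..nx-1."""
    return [force * delta(x_fiber - i) for i in range(nx)]
```

The same delta function interpolates the fluid velocity back to the fiber points, which is what lets the elastic fibers move with, and push on, the surrounding viscous incompressible fluid.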

5. Human shank experimental investigation and computer simulation

Krasnoschekov, Viktor V.; Maslov, Leonid B.

2000-01-01

A new combined approach to analyzing the physiological state of the human shank is developed. The vibration research complex investigated here automatically records the resonance curve of the shank tissues for different kinds of vibration excitation and for various positions of the foot. A special computer model is implemented for the estimation of the experimental data, for a priori prognosis of the bio-object's behavior, and for its dynamic characteristics in the case of various kinds and degrees of injury. The shank is described by a viscoelastic, non-homogeneous 1D continuum equation, which is solved by the finite element method; the problem in the shank cross-section is solved by the boundary element method. The analysis of computer-simulated resonance curves makes it possible to interpret the experimental data correctly and to check the diagnostic criteria for injury.
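
The resonance curves the rig records can be illustrated by the steady-state amplitude of a driven, viscously damped oscillator versus drive frequency (parameter values are placeholders, not shank tissue properties):

```python
import math

def amplitude(omega, omega0=10.0, zeta=0.05):
    """|X(omega)| for unit forcing of x'' + 2*zeta*omega0*x' + omega0^2*x = f."""
    return 1.0 / math.sqrt((omega0**2 - omega**2)**2 + (2*zeta*omega0*omega)**2)
```

An injury-like increase in damping (larger `zeta`) lowers and broadens the resonance peak, which is the kind of change diagnostic criteria based on such curves look for.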

6. Investigation of Carbohydrate Recognition via Computer Simulation

SciTech Connect

Johnson, Quentin R.; Lindsay, Richard J.; Petridis, Loukas; Shen, Tongye

2015-04-28

Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Interest has arisen due to potential protein and drug design and future bioengineering applications. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. In this review, we focus on utilizing computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition. With increasing computational resources, better algorithms, and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years.

8. Fast computation algorithms for speckle pattern simulation

SciTech Connect

Nascov, Victor; Samoilă, Cornel; Ursuţiu, Doru

2013-11-13

We present our development of a series of efficient computation algorithms, generally applicable to light diffraction calculations and particularly to speckle pattern simulation. We use mainly scalar diffraction theory in the form of the Rayleigh-Sommerfeld diffraction formula and its Fresnel approximation. Our algorithms are based on a special form of the convolution theorem and the fast Fourier transform. They evaluate the diffraction formula much faster than direct computation, and we have circumvented the restrictions on the relative sizes of the input and output domains encountered in commonly used procedures. Moreover, the input and output planes can be tilted with respect to each other, and the output domain can be shifted off-axis.
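
The core building block of such algorithms can be sketched as follows (a self-contained sketch, not the authors' code): evaluate a convolution via the convolution theorem with a radix-2 FFT. Zero padding makes the circular convolution reproduce the linear one, the same trick that underlies FFT-based evaluation of the Fresnel diffraction integral.

```python
import cmath

def fft(a, inverse=False):
    """Recursive radix-2 FFT (unscaled); len(a) must be a power of two."""
    n = len(a)
    if n == 1:
        return list(a)
    sign = 1 if inverse else -1
    even, odd = fft(a[0::2], inverse), fft(a[1::2], inverse)
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(sign * 2j * cmath.pi * k / n) * odd[k]
        out[k], out[k + n // 2] = even[k] + t, even[k] - t
    return out

def convolve(f, g):
    """Linear convolution of f and g via zero-padded FFTs."""
    n = 1
    while n < len(f) + len(g) - 1:
        n *= 2                                   # pad to a power of two
    F = fft(list(f) + [0j] * (n - len(f)))
    G = fft(list(g) + [0j] * (n - len(g)))
    y = fft([a * b for a, b in zip(F, G)], inverse=True)
    return [v / n for v in y[:len(f) + len(g) - 1]]
```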

9. Computational performance of a smoothed particle hydrodynamics simulation for shared-memory parallel computing

Nishiura, Daisuke; Furuichi, Mikito; Sakaguchi, Hide

2015-09-01

The computational performance of a smoothed particle hydrodynamics (SPH) simulation is investigated for three types of current shared-memory parallel computer devices: many integrated core (MIC) processors, graphics processing units (GPUs), and multi-core CPUs. We are especially interested in efficient shared-memory allocation methods for each chipset, because the efficient data access patterns differ between compute unified device architecture (CUDA) programming for GPUs and OpenMP programming for MIC processors and multi-core CPUs. We first introduce several parallel implementation techniques for the SPH code, and then examine these on our target computer architectures to determine the most effective algorithms for each processor unit. In addition, we evaluate the effective computing performance and power efficiency of the SPH simulation on each architecture, as these are critical metrics for overall performance in a multi-device environment. In our benchmark test, the GPU is found to produce the best arithmetic performance as a standalone device unit, and gives the most efficient power consumption. The multi-core CPU obtains the most effective computing performance. The computational speed of the MIC processor on Xeon Phi approached that of two Xeon CPUs. This indicates that using MICs is an attractive choice for existing SPH codes on multi-core CPUs parallelized by OpenMP, as it gains computational acceleration without the need for significant changes to the source code.

10. Computer simulations of charged colloids in confinement.

PubMed

Puertas, Antonio M; de las Nieves, F Javier; Cuetos, Alejandro

2015-02-15

We study by computer simulations the interaction between two similarly charged colloidal particles confined between parallel planes, in salt free conditions. Both the colloids and ions are simulated explicitly, in a fine-mesh lattice, and the electrostatic interaction is calculated using Ewald summation in two dimensions. The internal energy is measured by setting the colloidal particles at a given position and equilibrating the ions, whereas the free energy is obtained introducing a bias (attractive) potential between the colloids. Our results show that upon confining the system, the internal energy decreases, resulting in an attractive contribution to the interaction potential for large charges and strong confinement. However, the loss of entropy of the ions is the dominant mechanism in the interaction, irrespective of the confinement of the system. The interaction potential is therefore repulsive in all cases, and is well described by the DLVO functional form, but effective values have to be used for the interaction strength and Debye length.

11. Computational simulation of the blood separation process.

PubMed

De Gruttola, Sandro; Boomsma, Kevin; Poulikakos, Dimos; Ventikos, Yiannis

2005-08-01

The aim of this work is to construct a computational fluid dynamics model capable of simulating the quasitransient process of apheresis. To this end, a Lagrangian-Eulerian model has been developed which tracks the blood particles within a delineated two-dimensional flow domain. Within the Eulerian method, the fluid flow conservation equations within the separator are solved. Taking the calculated values of the flow field and using a Lagrangian method, the displacement of the blood particles is calculated. Thus, the local blood density within the separator at a given time step is known. Subsequently, the flow field in the separator is recalculated. This process continues until a quasisteady behavior is reached. The simulations show good agreement with experimental results. They show complete separation of plasma and red blood cells, as well as nearly complete separation of red blood cells and platelets. The white blood cells build clusters in the low concentrate cell bed.

12. Computer simulation of solder joint failure

SciTech Connect

Burchett, S.N.; Frear, D.R.; Rashid, M.M.

1997-04-01

The thermomechanical fatigue failure of solder joints is increasingly becoming an important reliability issue for electronic packages. The purpose of this Laboratory Directed Research and Development (LDRD) project was to develop computational tools for simulating the behavior of solder joints under strain and temperature cycling, taking into account the microstructural heterogeneities that exist in as-solidified near eutectic Sn-Pb joints, as well as subsequent microstructural evolution. The authors present two computational constitutive models, a two-phase model and a single-phase model, that were developed to predict the behavior of near eutectic Sn-Pb solder joints under fatigue conditions. Unique metallurgical tests provide the fundamental input for the constitutive relations. The two-phase model mathematically predicts the heterogeneous coarsening behavior of near eutectic Sn-Pb solder. The finite element simulations with this model agree qualitatively with experimental thermomechanical fatigue tests. The simulations show that the presence of an initial heterogeneity in the solder microstructure could significantly degrade the fatigue lifetime. The single-phase model was developed to predict solder joint behavior using materials data for constitutive relation constants that could be determined through straightforward metallurgical experiments. Special thermomechanical fatigue tests were developed to give fundamental materials input to the models, and an in situ SEM thermomechanical fatigue test system was developed to characterize microstructural evolution and the mechanical behavior of solder joints during the test. A shear/torsion test sample was developed to impose strain in two different orientations. Materials constants were derived from these tests. The simulation results from the two-phase model showed good fit to the experimental test results.

13. Computer Simulation of Fracture in Aerogels

NASA Technical Reports Server (NTRS)

Good, Brian S.

2006-01-01

Aerogels are of interest to the aerospace community primarily for their thermal properties, notably their low thermal conductivities. While the gels are typically fragile, recent advances in the application of conformal polymer layers to these gels has made them potentially useful as lightweight structural materials as well. In this work, we investigate the strength and fracture behavior of silica aerogels using a molecular statics-based computer simulation technique. The gels' structure is simulated via a Diffusion Limited Cluster Aggregation (DLCA) algorithm, which produces fractal structures representing experimentally observed aggregates of so-called secondary particles, themselves composed of amorphous silica primary particles an order of magnitude smaller. We have performed multi-length-scale simulations of fracture in silica aerogels, in which the interaction between two secondary particles is assumed to be described by a Morse pair potential parameterized such that the potential range is much smaller than the secondary particle size. These Morse parameters are obtained by atomistic simulation of models of the experimentally-observed amorphous silica "bridges," with the fracture behavior of these bridges modeled via molecular statics using a Morse/Coulomb potential for silica. We consider the energetics of the fracture, and compare qualitative features of low-and high-density gel fracture.
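
The Morse pair potential named above has the closed form E(r) = D·(1 − e^(−a·(r − r0)))² − D, with a minimum of −D at r = r0. A sketch with illustrative parameters (not the fitted silica-bridge values):

```python
import math

def morse(r, D=1.0, a=2.0, r0=1.0):
    """Morse pair energy: well depth D, stiffness a, equilibrium separation r0."""
    x = math.exp(-a * (r - r0))
    return D * (1 - x)**2 - D
```

Choosing a large `a` relative to the secondary-particle size gives the short interaction range the simulations assume.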

14. The Learning Effects of Computer Simulations in Science Education

ERIC Educational Resources Information Center

Rutten, Nico; van Joolingen, Wouter R.; van der Veen, Jan T.

2012-01-01

This article reviews the (quasi)experimental research of the past decade on the learning effects of computer simulations in science education. The focus is on two questions: how the use of computer simulations can enhance traditional education, and how computer simulations are best used in order to improve learning processes and outcomes. We report on…

15. Using computer simulations to facilitate conceptual understanding of electromagnetic induction

Lee, Yu-Fen

This study investigated the use of computer simulations to facilitate conceptual understanding in physics. The use of computer simulations in the present study was grounded in a conceptual framework drawn from findings related to the use of computer simulations in physics education. To achieve the goal of effective utilization of computers for physics education, I first reviewed studies pertaining to computer simulations in physics education categorized by three different learning frameworks and studies comparing the effects of different simulation environments. My intent was to identify the learning context and factors for successful use of computer simulations in past studies and to learn from the studies which did not obtain a significant result. Based on the analysis of reviewed literature, I proposed effective approaches to integrate computer simulations in physics education. These approaches are consistent with well-established education principles such as those suggested by How People Learn (Bransford, Brown, Cocking, Donovan, & Pellegrino, 2000). These research-based approaches to integrating computer simulations in physics education form a learning framework called Concept Learning with Computer Simulations (CLCS) in the current study. The second component of this study was to examine the CLCS learning framework empirically. The participants were recruited from a public high school in Beijing, China. All participating students were randomly assigned to two groups, the experimental (CLCS) group and the control (TRAD) group. Research-based computer simulations developed by the physics education research group at the University of Colorado at Boulder were used to tackle common conceptual difficulties in learning electromagnetic induction. While interacting with computer simulations, CLCS students were asked to answer reflective questions designed to stimulate qualitative reasoning and explanation. After receiving model reasoning online, students were asked to submit

16. Computational simulation of liquid fuel rocket injectors

NASA Technical Reports Server (NTRS)

Landrum, D. Brian

1994-01-01

A major component of any liquid propellant rocket is the propellant injection system. Issues of interest include the degree of liquid vaporization and its impact on the combustion process, the pressure and temperature fields in the combustion chamber, and the cooling of the injector face and chamber walls. The Finite Difference Navier-Stokes (FDNS) code is a primary computational tool used in the MSFC Computational Fluid Dynamics Branch. The branch has dedicated a significant amount of resources to development of this code for prediction of both liquid and solid fuel rocket performance. The FDNS code is currently being upgraded to include the capability to model liquid/gas multi-phase flows for fuel injection simulation. An important aspect of this effort is benchmarking the code capabilities to predict existing experimental injection data. The objective of this MSFC/ASEE Summer Faculty Fellowship term was to evaluate the capabilities of the modified FDNS code to predict flow fields with liquid injection. Comparisons were made between code predictions and existing experimental data. A significant portion of the effort included a search for appropriate validation data. Also, code simulation deficiencies were identified.

17. A Computational Framework for Bioimaging Simulation

PubMed Central

Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

2015-01-01

Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508

18. Computational simulation for concurrent engineering of aerospace propulsion systems

NASA Technical Reports Server (NTRS)

Chamis, C. C.; Singhal, S. N.

1993-01-01

Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties--fundamental to develop such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

19. Computational simulation of concurrent engineering for aerospace propulsion systems

NASA Technical Reports Server (NTRS)

Chamis, C. C.; Singhal, S. N.

1992-01-01

Results are summarized of an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties - fundamental in developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

20. Ku-Band rendezvous radar performance computer simulation model

NASA Technical Reports Server (NTRS)

Magnusson, H. G.; Goff, M. F.

1984-01-01

All work performed on the Ku-band rendezvous radar performance computer simulation model program since the release of the preliminary final report is summarized. Developments on the program fall into three distinct categories: (1) modifications to the existing Ku-band radar tracking performance computer model; (2) the addition of a highly accurate, nonrealtime search and acquisition performance computer model to the total software package developed on this program; and (3) development of radar cross section (RCS) computation models for three additional satellites. All changes in the tracking model involved improvements in the automatic gain control (AGC) and the radar signal strength (RSS) computer models. Although the search and acquisition computer models were developed under the auspices of the Hughes Aircraft Company Ku-Band Integrated Radar and Communications Subsystem program office, they have been supplied to NASA as part of the Ku-band radar performance computer model package. Their purpose is to predict Ku-band acquisition performance for specific satellite targets on specific missions. The RCS models were developed for three satellites: the Long Duration Exposure Facility (LDEF) spacecraft, the Solar Maximum Mission (SMM) spacecraft, and the Space Telescopes.

1. Computer simulation of fatigue under diametrical compression

SciTech Connect

Carmona, H. A.; Kun, F.; Andrade, J. S. Jr.; Herrmann, H. J.

2007-04-15

We study the fatigue fracture of disordered materials by means of computer simulations of a discrete element model. We extend a two-dimensional fracture model to capture the microscopic mechanisms relevant for fatigue and we simulate the diametric compression of a disc-shaped specimen under a constant external force. The model allows us to follow the development of the fracture process on the macrolevel and microlevel, varying the relative influence of the mechanisms of damage accumulation over the load history and healing of microcracks. As a specific example we consider recent experimental results on the fatigue fracture of asphalt. Our numerical simulations show that for intermediate applied loads the lifetime of the specimen exhibits power-law behavior. Under the effect of healing, more prominent for small loads compared to the tensile strength of the material, the lifetime of the sample increases and a fatigue limit emerges below which no macroscopic failure occurs. The numerical results are in good qualitative agreement with the experimental findings.
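The emergence of a fatigue limit described above can be caricatured by a Basquin-type power law whose lifetime diverges below a threshold load. This is a schematic sketch only; the exponent, threshold, and prefactor are hypothetical, not fitted to the asphalt data:

```python
import math

def cycles_to_failure(load, fatigue_limit=0.2, alpha=2.0, c=1.0):
    """Schematic lifetime law: lifetime decays as a power of the excess
    load above the fatigue limit, and below the limit healing wins,
    modeled here as an infinite lifetime. All parameters illustrative.
    """
    if load <= fatigue_limit:
        return math.inf  # no macroscopic failure below the fatigue limit
    return c * (load - fatigue_limit) ** (-alpha)
```

The qualitative shape matches the abstract: intermediate loads follow a power law, and the curve diverges as the load approaches the fatigue limit from above.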

2. Computer simulation of surface and film processes

NASA Technical Reports Server (NTRS)

Tiller, W. A.; Halicioglu, M. T.

1984-01-01

All the investigations which were performed employed in one way or another a computer simulation technique based on atomistic level considerations. In general, three types of simulation methods were used for modeling systems with discrete particles that interact via well defined potential functions: molecular dynamics (a general method for solving the classical equations of motion of a model system); Monte Carlo (the use of Markov chain ensemble averaging technique to model equilibrium properties of a system); and molecular statics (provides properties of a system at T = 0 K). The effects of three-body forces on the vibrational frequencies of triatomic clusters were investigated. The multilayer relaxation phenomena for low index planes of an fcc crystal were also analyzed as a function of the three-body interactions. Various surface properties for the Si and SiC systems were calculated. Results obtained from static simulation calculations for slip formation were presented. The more elaborate molecular dynamics calculations on the propagation of cracks in two-dimensional systems were outlined.

3. Adding computationally efficient realism to Monte Carlo turbulence simulation

NASA Technical Reports Server (NTRS)

Campbell, C. W.

1985-01-01

Frequently in aerospace vehicle flight simulation, random turbulence is generated using the assumption that the craft is small compared to the length scales of turbulence. The turbulence is presumed to vary only along the flight path of the vehicle but not across the vehicle span. The addition of the realism of three-dimensionality is a worthy goal, but any such attempt will not gain acceptance in the simulator community unless it is computationally efficient. A concept for adding three-dimensional realism with a minimum of computational complexity is presented. The concept involves the use of close rational approximations to irrational spectra and cross-spectra so that systems of stable, explicit difference equations can be used to generate the turbulence.
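The idea of driving stable, explicit difference equations with white noise can be illustrated in its simplest form: a first-order recursion whose output has an exponentially decaying correlation, i.e. a rational (Lorentzian-like) spectrum. The parameter values and the mapping a = exp(-speed*dt/length_scale) are illustrative assumptions, not the cross-spectral model of the paper:

```python
import math
import random

def simulate_gust(n, dt=0.01, length_scale=100.0, speed=50.0, sigma=1.0, seed=0):
    """Generate n turbulence samples along a flight path via the stable
    explicit difference equation x[k+1] = a*x[k] + b*w[k], w ~ N(0, 1).

    a = exp(-speed*dt/length_scale) keeps |a| < 1 (stability), and b is
    chosen so the stationary variance equals sigma**2. Illustrative only.
    """
    rng = random.Random(seed)
    a = math.exp(-speed * dt / length_scale)
    b = sigma * math.sqrt(1.0 - a * a)
    x, samples = 0.0, []
    for _ in range(n):
        x = a * x + b * rng.gauss(0.0, 1.0)
        samples.append(x)
    return samples
```

Because the recursion is explicit and stable, the cost per sample is constant, which is exactly the computational-efficiency property the abstract emphasizes for simulator use.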

4. A framework of modeling detector systems for computed tomography simulations

Youn, H.; Kim, D.; Kim, S. H.; Kam, S.; Jeon, H.; Nam, J.; Kim, H. K.

2016-01-01

The ultimate development in computed tomography (CT) technology may be a system that provides images with excellent lesion conspicuity at the lowest possible patient dose. Imaging simulation tools have been cost-effectively used for these developments and will continue to be. For a more accurate and realistic imaging simulation, the signal and noise propagation through a CT detector system has been modeled in this study using cascaded linear-systems theory. The simulation results are validated by comparison with measurements from a laboratory flat-panel micro-CT system. Although the image noise obtained from the simulations at higher exposures is slightly smaller than that obtained from the measurements, the difference between them is reasonably acceptable. According to the simulation results for various exposure levels and additive electronic noise levels, x-ray quantum noise dominates the additive electronic noise. The framework for modeling a CT detector system suggested in this study will be helpful for the development of an accurate and realistic projection simulation model.
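The core accounting in a cascaded linear-systems noise model is that the variances of independent stages add. A toy two-stage version is sketched below; the gain and electronic-noise values are hypothetical and far simpler than a full CT detector cascade:

```python
def signal_and_variance(quanta, gain=1.0, electronic_sigma=5.0):
    """Mean signal and total noise variance for an amplified Poisson
    (quantum) stage followed by additive electronic read noise.
    Variances of the independent stages add."""
    mean = gain * quanta
    variance = gain ** 2 * quanta + electronic_sigma ** 2
    return mean, variance

def quantum_fraction(quanta, gain=1.0, electronic_sigma=5.0):
    """Share of the total variance contributed by quantum noise."""
    q = gain ** 2 * quanta
    return q / (q + electronic_sigma ** 2)
```

At high exposure the Poisson term grows linearly with the number of quanta while the read-noise term stays fixed, so quantum noise dominates, consistent with the abstract's observation.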

5. Chip level simulation of fault tolerant computers

NASA Technical Reports Server (NTRS)

Armstrong, J. R.

1983-01-01

Chip level modeling techniques, functional fault simulation, simulation software development, a more efficient, high level version of GSP, and a parallel architecture for functional simulation are discussed.

6. Additional extensions to the NASCAP computer code, volume 3

NASA Technical Reports Server (NTRS)

Mandell, M. J.; Cooke, D. L.

1981-01-01

The ION computer code is designed to calculate charge exchange ion densities, electric potentials, plasma temperatures, and current densities external to a neutralized ion engine in R-Z geometry. The present version assumes the beam ion current and density to be known and specified, and the neutralizing electrons to originate from a hot-wire ring surrounding the beam orifice. The plasma is treated as being resistive, with an electron relaxation time comparable to the plasma frequency. Together with the thermal and electrical boundary conditions described below and other straightforward engine parameters, these assumptions suffice to determine the required quantities. The ION code, written in ASCII FORTRAN for UNIVAC 1100 series computers, is designed to be run interactively, although it can also be run in batch mode. The input is free-format, and the output is mainly graphical, using the machine-independent graphics developed for the NASCAP code. The executive routine calls the code's major subroutines in user-specified order, and the code allows great latitude for restart and parameter change.

7. A Mass Spectrometer Simulator in Your Computer

Gagnon, Michel

2012-12-01

Introduced to study components of ionized gas, the mass spectrometer has evolved into a highly accurate device now used in many undergraduate and research laboratories. Unfortunately, despite their importance in the formation of future scientists, mass spectrometers remain beyond the financial reach of many high schools and colleges. As a result, it is not possible for instructors to take full advantage of this equipment. Therefore, to facilitate accessibility to this tool, we have developed a realistic computer-based simulator. Using this software, students are able to practice their ability to identify the components of the original gas, thereby gaining a better understanding of the underlying physical laws. The software is available as a free download.

8. Miller experiments in atomistic computer simulations

PubMed Central

Saitta, Antonino Marco; Saija, Franz

2014-01-01

The celebrated Miller experiments reported on the spontaneous formation of amino acids from a mixture of simple molecules reacting under an electric discharge, giving birth to the research field of prebiotic chemistry. However, the chemical reactions involved in those experiments have never been studied at the atomic level. Here we report on, to our knowledge, the first ab initio computer simulations of Miller-like experiments in the condensed phase. Our study, based on the recent method of treatment of aqueous systems under electric fields and on metadynamics analysis of chemical reactions, shows that glycine spontaneously forms from mixtures of simple molecules once an electric field is switched on and identifies formic acid and formamide as key intermediate products of the early steps of the Miller reactions, and the crucible of formation of complex biological molecules. PMID:25201948

9. Protein Dynamics from NMR and Computer Simulation

Wu, Qiong; Kravchenko, Olga; Kemple, Marvin; Likic, Vladimir; Klimtchuk, Elena; Prendergast, Franklyn

2002-03-01

Proteins exhibit internal motions from the millisecond to sub-nanosecond time scale. The challenge is to relate these internal motions to biological function. A strategy to address this aim is to apply a combination of several techniques including high-resolution NMR, computer simulation of molecular dynamics (MD), molecular graphics, and finally molecular biology, the latter to generate appropriate samples. Two difficulties that arise are: (1) the time scale which is most directly biologically relevant (ms to μs) is not readily accessible by these techniques and (2) the techniques focus on local and not collective motions. We will outline methods using ¹³C-NMR to help alleviate the second problem, as applied to intestinal fatty acid binding protein, a relatively small intracellular protein believed to be involved in fatty acid transport and metabolism. This work is supported in part by PHS Grant GM34847 (FGP) and by a fellowship from the American Heart Association (QW).

10. Advanced Simulation for Additive Manufacturing: 11/2014 Workshop Report for U.S. DOE/EERE/AMO

SciTech Connect

Turner, John A.; Babu, Sudarsanam Suresh; Blue, Craig A.

2015-07-01

The overarching question for the workshop was as follows: How do we best utilize advanced modeling and high-performance computing (HPC) to address key challenges and opportunities in order to realize the full potential of additive manufacturing? What are the key challenges of additive manufacturing to which modeling and simulation can contribute solutions, and what will it take to meet them?

11. Computer simulation of super-resolution point source image detection

Fillard, Jean-Pierre; M'timet, H.; Lussert, Jean-Marc; Castagne, Michel

1993-11-01

We present a computer simulation of the analysis of an `in-focus' 2D Airy disk. Two competing methods are used to calculate the coordinates of the center of this point spread function image. The first one is the classical technique that relies on the 2D `centroid' of the image, and the second one is a more original method that uses the frequency dependence of the argument of the Fourier transform. Comparative simulations show that the latter technique [Fourier phase shift (FPS)] achieves a precision better than 1% of a pixel spacing after quantization. Perturbations such as dc offset reduction, quantization noise, and additive Gaussian noise are introduced in the simulation. The results show that there is an improved perturbation immunity for the FPS method.
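The two competing estimators can be sketched in one dimension: a centroid, and the Fourier phase shift (FPS), which reads the sub-pixel position off the phase of the transform via the shift theorem. The grid size, Gaussian spot width, and test position below are arbitrary assumptions standing in for the 2D Airy disk:

```python
import cmath
import math

def centroid(f):
    """Center of mass of the sampled intensity profile."""
    total = sum(f)
    return sum(i * v for i, v in enumerate(f)) / total

def fps_center(f):
    """Fourier phase shift: for a symmetric spot at position c, the first
    DFT coefficient has phase -2*pi*c/N, so c is recovered from its angle."""
    n = len(f)
    f1 = sum(v * cmath.exp(-2j * math.pi * i / n) for i, v in enumerate(f))
    return (-cmath.phase(f1) * n / (2 * math.pi)) % n

n, true_center, width = 64, 20.37, 3.0  # arbitrary test values
spot = [math.exp(-((i - true_center) ** 2) / (2 * width ** 2)) for i in range(n)]
# Both estimators recover the sub-pixel position to well under a pixel.
```

For a clean, well-centered spot the two agree; the FPS route differs in how perturbations (offsets, quantization, additive noise) propagate into the estimate, which is what the simulations above compare.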

12. Ceramic matrix composite behavior -- Computational simulation

SciTech Connect

Chamis, C.C.; Murthy, P.L.N.; Mital, S.K.

1996-10-01

Development of analytical modeling and computational capabilities for the prediction of high temperature ceramic matrix composite behavior has been an ongoing research activity at NASA-Lewis Research Center. These research activities have resulted in the development of micromechanics based methodologies to evaluate different aspects of ceramic matrix composite behavior. The basis of the approach is micromechanics together with a unique fiber substructuring concept. In this new concept the conventional unit cell (the smallest representative volume element of the composite) of micromechanics approach has been modified by substructuring the unit cell into several slices and developing the micromechanics based equations at the slice level. Main advantage of this technique is that it can provide a much greater detail in the response of composite behavior as compared to a conventional micromechanics based analysis and still maintains a very high computational efficiency. This methodology has recently been extended to model plain weave ceramic composites. The objective of the present paper is to describe the important features of the modeling and simulation and illustrate with select examples of laminated as well as woven composites.

13. Experiential Learning through Computer-Based Simulations.

ERIC Educational Resources Information Center

Maynes, Bill; And Others

1992-01-01

Describes experiential learning instructional model and simulation for student principals. Describes interactive laser videodisc simulation. Reports preliminary findings about student principal learning from simulation. Examines learning approaches by unsuccessful and successful students and learning levels of model learners. Simulation's success…

14. Additional extensions to the NASCAP computer code, volume 1

NASA Technical Reports Server (NTRS)

Mandell, M. J.; Katz, I.; Stannard, P. R.

1981-01-01

Extensions and revisions to a computer code that comprehensively analyzes problems of spacecraft charging (NASCAP) are documented. Using a fully three dimensional approach, it can accurately predict spacecraft potentials under a variety of conditions. Among the extensions are a multiple electron/ion gun test tank capability, and the ability to model anisotropic and time dependent space environments. Also documented are a greatly extended MATCHG program and the preliminary version of NASCAP/LEO. The interactive MATCHG code was developed into an extremely powerful tool for the study of material-environment interactions. The NASCAP/LEO, a three dimensional code to study current collection under conditions of high voltages and short Debye lengths, was distributed for preliminary testing.

15. Computational Modeling and Simulation of Genital Tubercle ...

EPA Pesticide Factsheets

Hypospadias is a developmental defect of urethral tube closure that has a complex etiology. Here, we describe a multicellular agent-based model of genital tubercle development that simulates urethrogenesis from the urethral plate stage to urethral tube closure in differentiating male embryos. The model, constructed in CompuCell3D, implemented spatially dynamic signals from SHH, FGF10, and androgen signaling pathways. These signals modulated stochastic cell behaviors, such as differential adhesion, cell motility, proliferation, and apoptosis. Urethral tube closure was an emergent property of the model that was quantitatively dependent on SHH and FGF10 induced effects on mesenchymal proliferation and endodermal apoptosis, ultimately linked to androgen signaling. In the absence of androgenization, simulated genital tubercle development defaulted to the female condition. Intermediate phenotypes associated with partial androgen deficiency resulted in incomplete closure. Using this computer model, complex relationships between urethral tube closure defects and disruption of underlying signaling pathways could be probed theoretically in multiplex disturbance scenarios and modeled into probabilistic predictions for individual risk for hypospadias and potentially other developmental defects of the male genital tubercle. We identify the minimal molecular network that determines the outcome of male genital tubercle development in mice.

16. Computer simulations of the mouse spermatogenic cycle.

PubMed

Ray, Debjit; Pitts, Philip B; Hogarth, Cathryn A; Whitmore, Leanne S; Griswold, Michael D; Ye, Ping

2014-12-12

The spermatogenic cycle describes the periodic development of germ cells in the testicular tissue. The temporal-spatial dynamics of the cycle highlight the unique, complex, and interdependent interaction between germ and somatic cells, and are the key to continual sperm production. Although understanding the spermatogenic cycle has important clinical relevance for male fertility and contraception, there are a number of experimental obstacles. For example, the lengthy process cannot be visualized through dynamic imaging, and the precise action of germ cells that leads to the emergence of testicular morphology remains uncharacterized. Here, we report an agent-based model that simulates the mouse spermatogenic cycle on a cross-section of the seminiferous tubule over a time scale of hours to years, while considering feedback regulation, mitotic and meiotic division, differentiation, apoptosis, and movement. The computer model is able to elaborate the germ cell dynamics in a time-lapse movie format, allowing us to trace individual cells as they change state and location. More importantly, the model provides mechanistic understanding of the fundamentals of male fertility, namely how testicular morphology and sperm production are achieved. By manipulating cellular behaviors either individually or collectively in silico, the model predicts causal events for the altered arrangement of germ cells upon genetic or environmental perturbations. This in silico platform can serve as an interactive tool to perform long-term simulation and to identify optimal approaches for infertility treatment and contraceptive development.

17. Space Shuttle flight crew/computer interface simulation studies.

NASA Technical Reports Server (NTRS)

Callihan, J. C.; Rybarczyk, D. T.

1972-01-01

An approach to achieving an optimized set of crew/computer interface requirements on the Space Shuttle program is described. It consists of defining the mission phases and crew timelines, developing a functional description of the crew/computer interface displays and controls software, conducting real-time simulations using pilot evaluation of the interface displays and controls, and developing a set of crew/computer functional requirements specifications. The simulator is a two-man crew station which includes three CRTs with keyboards for simulating the crew/computer interface. The programs simulate the mission phases and the flight hardware, including the flight computer and CRT displays.

18. Comparing Computer Run Time of Building Simulation Programs

SciTech Connect

Hong, Tianzhen; Buhl, Fred; Haves, Philip; Selkowitz, Stephen; Wetter, Michael

2008-07-23

This paper presents an approach to comparing the computer run time of building simulation programs. The computing run time of a simulation program depends on several key factors, including the calculation algorithm and modeling capabilities of the program, the run period, the simulation time step, the complexity of the energy models, the run control settings, and the software and hardware configurations of the computer that is used to make the simulation runs. To demonstrate the approach, simulation runs are performed for several representative DOE-2.1E and EnergyPlus energy models. The computer run times of these energy models are then compared and analyzed.

19. Engineering Fracking Fluids with Computer Simulation

Shaqfeh, Eric

2015-11-01

There are no comprehensive simulation-based tools for engineering the flows of viscoelastic fluid-particle suspensions in fully three-dimensional geometries. On the other hand, the need for such a tool in engineering applications is immense. Suspensions of rigid particles in viscoelastic fluids play key roles in many energy applications. For example, in oil drilling the ``drilling mud'' is a very viscous, viscoelastic fluid designed to shear-thin during drilling, but thicken at stoppage so that the ``cuttings'' can remain suspended. In a related application known as hydraulic fracturing, suspensions of solids called ``proppant'' are used to prop open the fracture by pumping them into the well. It is well known that particle flow and settling in a viscoelastic fluid can be quite different from that which is observed in Newtonian fluids. First, it is now well known that the ``fluid particle split'' at bifurcation cracks is controlled by fluid rheology in a manner that is not understood. Second, in Newtonian fluids, the presence of an imposed shear flow in the direction perpendicular to gravity (which we term a cross or orthogonal shear flow) has no effect on the settling of a spherical particle in Stokes flow (i.e. at vanishingly small Reynolds number). By contrast, in a non-Newtonian liquid, the complex rheological properties induce a nonlinear coupling between the sedimentation and shear flow. Recent experimental data have shown that both the shear thinning and the elasticity of the suspending polymeric solutions significantly affect the fluid-particle split at bifurcations, as well as the settling rate of the solids. In the present work, we use the Immersed Boundary Method to develop computer simulations of viscoelastic flow in suspensions of spheres to study these problems. These simulations allow us to understand the detailed physical mechanisms for the remarkable physical behavior seen in practice, and actually suggest design rules for creating new fluid recipes.

20. Simulation of Powder Layer Deposition in Additive Manufacturing Processes Using the Discrete Element Method

SciTech Connect

Herbold, E. B.; Walton, O.; Homel, M. A.

2015-10-26

This document serves as a final report to a small effort where several improvements were added to an LLNL code, GEODYN-L, to develop Discrete Element Method (DEM) algorithms coupled to Lagrangian Finite Element (FE) solvers to investigate powder-bed formation problems for additive manufacturing. The results from these simulations will be assessed for inclusion as the initial conditions for Direct Metal Laser Sintering (DMLS) simulations performed with ALE3D. The algorithms were written and performed on parallel computing platforms at LLNL. The total funding level was 3-4 weeks of an FTE split amongst two staff scientists and one post-doc. The DEM simulations emulated, as much as was feasible, the physical process of depositing a new layer of powder over a bed of existing powder. The DEM simulations utilized truncated size distributions spanning realistic size ranges with a size distribution profile consistent with a realistic sample set. A minimum simulation sample size on the order of 40 particles square by 10 particles deep was utilized in these scoping studies in order to evaluate the potential effects of size segregation variation with distance displaced in front of a screed blade. A reasonable method for evaluating the problem was developed and validated. Several simulations were performed to show the viability of the approach. Future investigations will focus on running various simulations investigating powder particle sizing and screed geometries.

1. COMPUTER MODEL AND SIMULATION OF A GLOVE BOX PROCESS

SciTech Connect

C. FOSTER; ET AL

2001-01-01

The development of facilities to deal with the disposition of nuclear materials at an acceptable level of Occupational Radiation Exposure (ORE) is a significant issue facing the nuclear community. One solution is to minimize the worker's exposure through the use of automated systems. However, the adoption of automated systems for these tasks is hampered by the challenging requirements that these systems must meet in order to be cost-effective solutions in the hazardous nuclear materials processing environment. Retrofitting current glove box technologies with automation systems represents potential near-term technology that can be applied to reduce worker ORE associated with work in nuclear materials processing facilities. Successful deployment of automation systems for these applications requires the development of testing and deployment strategies to ensure the highest level of safety and effectiveness. Historically, safety tests are conducted with glove box mock-ups around the finished design. This late detection of problems leads to expensive redesigns and costly deployment delays. With the widespread availability of computers and cost-effective simulation software, it is possible to discover and fix problems early in the design stages. Computer simulators can easily create a complete model of the system, allowing a safe medium for testing potential failures and design shortcomings. The majority of design specification is now done on computers, and moving that information to a model is relatively straightforward. With a complete model and results from a Failure Mode Effect Analysis (FMEA), redesigns can be addressed early. Additional issues such as user accessibility, component replacement, and alignment problems can be tackled early in the virtual environment provided by computer simulation. In this case, a commercial simulation package is used to simulate a lathe process operation at the Los Alamos National Laboratory (LANL). The lathe process operation is indicative of

2. Computer-aided Instructional System for Transmission Line Simulation.

ERIC Educational Resources Information Center

Reinhard, Erwin A.; Roth, Charles H., Jr.

A computer-aided instructional system has been developed which utilizes dynamic computer-controlled graphic displays and which requires student interaction with a computer simulation in an instructional mode. A numerical scheme has been developed for digital simulation of a uniform, distortionless transmission line with resistive terminations and…

3. Research of the grid computing system applied in optical simulation

Jin, Wei-wei; Wang, Yu-dong; Liu, Qiangsheng; Cen, Zhao-feng; Li, Xiao-tong; Lin, Yi-qun

2008-03-01

A grid computing system for optics is presented in this paper. Firstly, the basic principles and research background of grid computing are outlined, along with an overview of its applications and current state of development; several typical task scheduling algorithms are also discussed. Secondly, the paper focuses on a task scheduling scheme for grid computing applied to optical computation, giving details of the task scheduling system, including task partitioning, granularity selection and task allocation, and especially the structure of the system. In addition, some details of communication in grid computing are illustrated. In this system, the "makespan" and "load balancing" are comprehensively considered. Finally, we build a grid model to test the task scheduling strategy, and the results are analyzed in detail. Compared to one isolated computer, a grid comprising one server and four processors can shorten the makespan to 1/4. The experimental results of the simulation also show that the proposed scheduling system is able to balance the loads of all processors. In short, the system schedules well in the grid environment.
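
As a toy illustration of the kind of scheduling the abstract describes, the following sketch (not the paper's algorithm; the task costs and processor count are invented) assigns tasks greedily to the least-loaded of four processors and compares the resulting makespan to serial execution:

```python
import heapq

def greedy_makespan(tasks, n_procs):
    # Greedy list scheduling: each task goes to the currently
    # least-loaded processor; returns the resulting makespan.
    heap = [(0, p) for p in range(n_procs)]  # (load, processor id)
    heapq.heapify(heap)
    for t in sorted(tasks, reverse=True):    # longest task first
        load, p = heapq.heappop(heap)
        heapq.heappush(heap, (load + t, p))
    return max(load for load, _ in heap)

tasks = [4, 3, 3, 2, 2, 2, 1, 1]   # hypothetical task costs
serial = sum(tasks)                # one isolated computer
parallel = greedy_makespan(tasks, 4)
print(serial, parallel)            # -> 18 5
```

With four processors the makespan drops from 18 to 5, close to the ideal 1/4 of the serial time that the authors report for their server-plus-four-processor grid.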

4. Computer-aided simulation study of photomultiplier tubes

NASA Technical Reports Server (NTRS)

Zaghloul, Mona E.; Rhee, Do Jun

1989-01-01

A computer model that simulates the response of photomultiplier tubes (PMTs) and the associated voltage divider circuit is developed. An equivalent circuit that approximates the operation of the device is derived and then used to develop a computer simulation of the PMT. Simulation results are presented and discussed.

5. Computational calculation of equilibrium constants: addition to carbonyl compounds.

PubMed

Gómez-Bombarelli, Rafael; González-Pérez, Marina; Pérez-Prior, María Teresa; Calle, Emilio; Casado, Julio

2009-10-22

Hydration reactions are relevant for understanding many organic mechanisms. Since the experimental determination of hydration and hemiacetalization equilibrium constants is fairly complex, computational calculations now offer a useful alternative to experimental measurements. In this work, carbonyl hydration and hemiacetalization constants were calculated from the free energy differences between compounds in solution, using absolute and relative approaches. The following conclusions can be drawn: (i) The use of a relative approach in the calculation of hydration and hemiacetalization constants allows compensation of systematic errors in the solvation energies. (ii) On average, the methodology proposed here can predict hydration constants within +/- 0.5 log K(hyd) units for aldehydes. (iii) Hydration constants can be calculated for ketones and carboxylic acid derivatives within less than +/- 1.0 log K(hyd), on average, at the CBS-Q level of theory. (iv) The proposed methodology can predict hemiacetal formation constants accurately at the MP2 6-31++G(d,p) level using a common reference. If group references are used, the results obtained using the much cheaper DFT-B3LYP 6-31++G(d,p) level are almost as accurate. (v) In general, the best results are obtained if a common reference for all compounds is used. The use of group references improves the results at the lower levels of theory, but at higher levels, this becomes unnecessary.
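
The conversion that underlies this approach is the standard relation K = exp(-ΔG/RT) between a computed reaction free energy in solution and the equilibrium constant. A minimal sketch (the ΔG value below is hypothetical, not taken from the paper):

```python
import math

R = 8.314462618e-3   # gas constant, kJ/(mol*K)
T = 298.15           # temperature, K

def log10_K(delta_G):
    # Equilibrium constant from a reaction free energy (kJ/mol):
    # K = exp(-dG/RT), reported here as log10 K as in the paper.
    return -delta_G / (R * T * math.log(10))

# Hypothetical dG = -5 kJ/mol favouring the hydrate
print(round(log10_K(-5.0), 2))   # -> 0.88
```

In a relative approach, ΔG would be computed against a reference compound first, so that systematic solvation-energy errors cancel before this conversion.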

6. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

SciTech Connect

Mike Bockelie; Dave Swensen; Martin Denison; Zumao Chen; Temi Linjewile; Mike Maguire; Adel Sarofim; Connie Senior; Changguan Yang; Hong-Shig Shim

2004-04-28

This is the fourteenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a Virtual Engineering-based framework for simulating the performance of Advanced Power Systems. Within the last quarter, good progress has been made on all aspects of the project. Software development efforts have focused primarily on completing a prototype detachable user interface for the framework and on integrating Carnegie Mellon University's IECM model core with the computational engine. In addition to this work, progress has been made on several other development and modeling tasks for the program. These include: (1) improvements to the infrastructure code of the computational engine, (2) enhancements to the model interfacing specifications, (3) additional development to increase the robustness of all framework components, (4) enhanced coupling of the computational and visualization engine components, (5) a series of detailed simulations studying the effects of gasifier inlet conditions on the heat flux to the gasifier injector, and (6) detailed plans for implementing models for mercury capture for both warm and cold gas cleanup.

7. Computer simulation and the features of novel empirical data.

PubMed

Lusk, Greg

2016-04-01

In an attempt to determine the epistemic status of computer simulation results, philosophers of science have recently explored the similarities and differences between computer simulations and experiments. One question that arises is whether and, if so, when, simulation results constitute novel empirical data. It is often supposed that computer simulation results could never be empirical or novel because simulations never interact with their targets, and cannot go beyond their programming. This paper argues against this position by examining whether, and under what conditions, the features of empiricality and novelty could be displayed by computer simulation data. I show that, to the extent that certain familiar measurement results have these features, so can some computer simulation results.

8. Towards the design of new and improved drilling fluid additives using molecular dynamics simulations.

PubMed

Anderson, Richard L; Greenwell, H Christopher; Suter, James L; Jarvis, Rebecca M; Coveney, Peter V

2010-03-01

During exploration for oil and gas, a technical drilling fluid is used to lubricate the drill bit, maintain hydrostatic pressure, transmit sensor readings, remove rock cuttings and inhibit swelling of unstable clay-based reactive shale formations. Increasing environmental awareness and the resulting legislation have led to the search for new, improved biodegradable drilling fluid components. In the case of additives for clay swelling inhibition, an understanding of how existing effective additives interact with clays must be gained to allow the design of improved molecules. Owing to the disordered nature and nanoscopic dimensions of the interlayer pores of clay minerals, computer simulations have become an increasingly useful tool for studying clay-swelling inhibitor interactions. In this work we briefly review the history of the development of technical drilling fluids, the environmental impact of drilling fluids and the use of computer simulations to study the interactions between clay minerals and swelling inhibitors. We report on results from some recent large-scale molecular dynamics simulation studies on low molecular weight water-soluble macromolecular inhibitor molecules. The structure and interactions of poly(propylene oxide)-diamine, poly(ethylene glycol) and poly(ethylene oxide)-diacrylate inhibitor molecules with montmorillonite clay are studied.

9. Simulating granular media on the computer

Herrmann, H. J.

Granular materials, like sand or powder, can present very intriguing effects. When shaken, sheared or poured they show segregation, convection and spontaneous fluctuations in densities and stresses. I will discuss the modeling of a granular medium on a computer by simulating a packing of elastic spheres via Molecular Dynamics. Dissipation of energy and shear friction at collisions are included. In the physical range the friction coefficient is found to be a linear function of the angle of repose. On a vibrating plate the formation of convection cells due to walls or amplitude modulations can be observed. The onset of fluidization can be determined and is in good agreement with experiments. Segregation of larger particles is found to be always accompanied by convection cells. There is also ample experimental evidence showing the existence of spontaneous density patterns in granular material flowing through pipes or hoppers. The Molecular Dynamics simulations show that these density fluctuations follow a 1/f^α spectrum. I compare this behavior to deterministic one-dimensional traffic models. A model with continuous positions and velocities shows self-organized critical jamming behind a slower car. The experimentally observed effects are also reproduced by Lattice Gas and Boltzmann Lattice Models. Density waves are spontaneously generated when the viscosity has a nonlinear dependence on density, which characterizes granular flow. We also briefly sketch a thermodynamic formalism for loose granular material. In a dense packing, non-linear acoustic phenomena, like the pressure dependence of the sound velocity, are studied. Finally the plastic shear bands occurring in large scale deformations of compactified granular media are investigated using an explicit Lagrangian technique.

10. Computer simulation of vasectomy for wolf control

USGS Publications Warehouse

Haight, R.G.; Mech, L.D.

1997-01-01

Recovering gray wolf (Canis lupus) populations in the Lake Superior region of the United States are prompting state management agencies to consider strategies to control population growth. In addition to wolf removal, vasectomy has been proposed. To predict the population effects of different sterilization and removal strategies, we developed a simulation model of wolf dynamics using simple rules for demography and dispersal. Simulations suggested that the effects of vasectomy and removal in a disjunct population depend largely on the degree of annual immigration. With low immigration, periodic sterilization reduced pup production and resulted in lower rates of territory recolonization. Consequently, average pack size, number of packs, and population size were significantly less than those for an untreated population. Periodically removing a proportion of the population produced roughly the same trends as did sterilization; however, more than twice as many wolves had to be removed than sterilized. With high immigration, periodic sterilization reduced pup production but not territory recolonization and produced only moderate reductions in population size relative to an untreated population. Similar reductions in population size were obtained by periodically removing large numbers of wolves. Our analysis does not address the possible effects of vasectomy on larger wolf populations, but it suggests that the subject should be considered through modeling or field testing.

11. Computer simulation of FCC riser reactors.

SciTech Connect

Chang, S. L.; Golchert, B.; Lottes, S. A.; Petrick, M.; Zhou, C. Q.

1999-04-20

A three-dimensional computational fluid dynamics (CFD) code, ICRKFLO, was developed to simulate the multiphase reacting flow system in a fluid catalytic cracking (FCC) riser reactor. The code solves flow properties based on fundamental conservation laws of mass, momentum, and energy for gas, liquid, and solid phases. Useful phenomenological models were developed to represent the controlling FCC processes, including droplet dispersion and evaporation, particle-solid interactions, and interfacial heat transfer between gas, droplets, and particles. Techniques were also developed to facilitate numerical calculations. These techniques include a hybrid flow-kinetic treatment to include detailed kinetic calculations, a time-integral approach to overcome numerical stiffness problems of chemical reactions, and a sectional coupling and blocked-cell technique for handling complex geometry. The copyrighted ICRKFLO software has been validated with experimental data from pilot- and commercial-scale FCC units. The code can be used to evaluate the impacts of design and operating conditions on the production of gasoline and other oil products.

12. Computer Simulation Methods for Defect Configurations and Nanoscale Structures

SciTech Connect

Gao, Fei

2010-01-01

This chapter will describe general computer simulation methods, including ab initio calculations, molecular dynamics and the kinetic Monte Carlo method, and their applications to the calculations of defect configurations in various materials (metals, ceramics and oxides) and the simulations of nanoscale structures due to ion-solid interactions. The multiscale theory, modeling, and simulation techniques (both time scale and space scale) will be emphasized, and comparisons between computer simulation results and experimental observations will be made.

13. Computing Environment for Adaptive Multiscale Simulation

DTIC Science & Technology

2014-09-24

Scientific Computation Research Center (SCOREC). The primary component is a parallel computing cluster, purchased with the DURIP funds, consisting of 22 Dell R620 compute nodes, each with two 8-core 2.6 GHz Intel Xeon processors (352 processors in total) and a direct connection to both a 56Gbps...

14. Computer simulations for internal dosimetry using voxel models.

PubMed

Kinase, Sakae; Mohammadi, Akram; Takahashi, Masa; Saito, Kimiaki; Zankl, Maria; Kramer, Richard

2011-07-01

In the Japan Atomic Energy Agency, several studies have been conducted on the use of voxel models for internal dosimetry. Absorbed fractions (AFs) and S values have been evaluated for preclinical assessments of radiopharmaceuticals using human voxel models and a mouse voxel model. Computational calibration of an in vivo measurement system has also been performed using Japanese and Caucasian voxel models. In addition, for radiation protection of the environment, AFs have been evaluated using a frog voxel model. Each study was performed using Monte Carlo simulations. Consequently, it was concluded that Monte Carlo simulations with voxel models could adequately reproduce measurement results. Voxel models were found to be a significant tool for internal dosimetry since the models are anatomically realistic. This fact indicates that several studies on correction of the in vivo measurement efficiency for the variability of human subjects and on interspecies scaling of organ doses will succeed.
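
The absorbed-fraction idea can be illustrated with a deliberately crude Monte Carlo toy: a uniform spherical source in place of an anatomical voxel model, and single exponential flights in place of full radiation transport. All geometry and parameters below are invented for illustration.

```python
import math
import random

def self_absorbed_fraction(radius, mfp, n=5000, seed=42):
    # Fraction of emissions that terminate inside the source sphere:
    # emit at a uniform random point, fly an exponentially distributed
    # distance (mean free path mfp) in an isotropic direction.
    random.seed(seed)
    inside = 0
    for _ in range(n):
        while True:  # rejection-sample a uniform point in the sphere
            p = [random.uniform(-radius, radius) for _ in range(3)]
            if sum(c * c for c in p) <= radius * radius:
                break
        z = random.uniform(-1, 1)               # isotropic direction
        phi = random.uniform(0, 2 * math.pi)
        s = math.sqrt(1 - z * z)
        d = (s * math.cos(phi), s * math.sin(phi), z)
        step = random.expovariate(1 / mfp)
        q = [p[i] + step * d[i] for i in range(3)]
        if sum(c * c for c in q) <= radius * radius:
            inside += 1
    return inside / n

print(self_absorbed_fraction(1.0, 0.1))  # short mean free path: AF near 1
```

Real voxel dosimetry replaces the sphere with millions of anatomical voxels and tracks energy deposition per organ, but the estimator has this same hit-counting form.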

15. Towards mitigating Asynchronous Computing effects in largescale simulations

Mittal, Ankita; Girimaji, Sharath

2016-11-01

Synchronization of processing elements (PEs) in massively parallel simulations has been shown to significantly affect the scalability of scientific applications. Relaxing this synchronization among PEs (asynchrony) preserves the stability condition but severely degrades accuracy, reducing the average order of accuracy to first order regardless of the original scheme. At present, several approaches are under consideration to improve the order of asynchronous computations. In this work, we propose to modify the original governing equation to obtain a Proxy-Equation which, when solved asynchronously, recovers the order of accuracy of the original numerical scheme. Performing 1D simulations of the advection-diffusion equation, we observe that the wave speed and the viscosity must be increased in the vicinity of PE boundaries to counteract the effect of asynchrony. In addition to recovering accuracy, this method shows lower average error magnitudes than existing asynchrony-tolerant methods. Similar results are also presented for a 1D viscous Burgers equation.
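
For context, a synchronous baseline for the 1D advection-diffusion equation u_t + c u_x = ν u_xx looks like the sketch below (explicit upwind/central scheme on a periodic grid; the grid size and coefficients are arbitrary, and the paper's asynchronous update and Proxy-Equation modification are not shown):

```python
import math

nx, c, nu = 64, 1.0, 0.01
dx = 1.0 / nx
dt = 0.4 * min(dx / c, dx * dx / (2 * nu))  # CFL-limited time step
u = [math.sin(2 * math.pi * i * dx) for i in range(nx)]  # initial wave

for _ in range(100):
    u = [u[i]
         - c * dt * (u[i] - u[i - 1]) / dx                            # upwind advection
         + nu * dt * (u[(i + 1) % nx] - 2 * u[i] + u[i - 1]) / dx**2  # central diffusion
         for i in range(nx)]

amp = max(abs(v) for v in u)
print(amp < 1.0)  # -> True: the wave is advected and damped
```

In the asynchronous setting, the neighbor values u[i-1] and u[(i+1) % nx] at a PE boundary may come from stale time levels; the Proxy-Equation idea compensates by locally increasing c and ν near those boundaries.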

16. Multipole Algorithms for Molecular Dynamics Simulation on High Performance Computers.

Elliott, William Dewey

1995-01-01

A fundamental problem in modeling large molecular systems with molecular dynamics (MD) simulations is the underlying N-body problem of computing the interactions between all pairs of N atoms. The simplest algorithm to compute pair-wise atomic interactions scales in runtime O(N^2), making it impractical for interesting biomolecular systems, which can contain millions of atoms. Recently, several algorithms have become available that solve the N-body problem by computing the effects of all pair-wise interactions while scaling in runtime less than O(N^2). One algorithm, which scales O(N) for a uniform distribution of particles, is called the Greengard-Rokhlin Fast Multipole Algorithm (FMA). This work describes an FMA-like algorithm called the Molecular Dynamics Multipole Algorithm (MDMA). The algorithm contains several features that are new to N-body algorithms. MDMA uses new, efficient series expansion equations to compute general 1/r^n potentials to arbitrary accuracy. In particular, the 1/r Coulomb potential and the 1/r^6 portion of the Lennard-Jones potential are implemented. The new equations are based on multivariate Taylor series expansions. In addition, MDMA uses a cell-to-cell interaction region of cells that is closely tied to worst case error bounds. The worst case error bounds for MDMA are also derived in this work. These bounds apply to other multipole algorithms as well. Several implementation enhancements are described which apply to MDMA as well as other N-body algorithms such as FMA and tree codes. The mathematics of the cell-to-cell interactions are converted to the Fourier domain for reduced operation count and faster computation. A relative indexing scheme was devised to locate cells in the interaction region, which allows efficient pre-computation of redundant information and prestorage of much of the cell-to-cell interaction. Also, MDMA was integrated into the MD program SIgMA to demonstrate the performance of the program over
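
The O(N^2) direct sum that multipole methods avoid is simple to state. A minimal sketch of the pairwise 1/r energy (illustrative only; MDMA's cell expansions are far beyond this):

```python
import itertools
import math

def direct_pair_energy(positions, charges):
    # Naive O(N^2) pairwise 1/r energy: every pair is visited once,
    # which is the cost FMA-like methods reduce to roughly O(N).
    E = 0.0
    for (i, ri), (j, rj) in itertools.combinations(enumerate(positions), 2):
        r = math.sqrt(sum((a - b) ** 2 for a, b in zip(ri, rj)))
        E += charges[i] * charges[j] / r
    return E

# Three unit charges on a line at x = 0, 1, 2:
# pairs contribute 1/1 + 1/2 + 1/1 = 2.5
print(direct_pair_energy([(0, 0, 0), (1, 0, 0), (2, 0, 0)],
                         [1.0, 1.0, 1.0]))  # -> 2.5
```

The same double loop with a 1/r^6 term gives the attractive part of the Lennard-Jones potential mentioned in the abstract.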

17. Insights from molecular dynamics simulations for computational protein design.

PubMed

Childers, Matthew Carter; Daggett, Valerie

2017-02-01

A grand challenge in the field of structural biology is to design and engineer proteins that exhibit targeted functions. Although much success on this front has been achieved, design success rates remain low, an ever-present reminder of our limited understanding of the relationship between amino acid sequences and the structures they adopt. In addition to experimental techniques and rational design strategies, computational methods have been employed to aid in the design and engineering of proteins. Molecular dynamics (MD) is one such method that simulates the motions of proteins according to classical dynamics. Here, we review how insights into protein dynamics derived from MD simulations have influenced the design of proteins. One of the greatest strengths of MD is its capacity to reveal information beyond what is available in the static structures deposited in the Protein Data Bank. In this regard simulations can be used to directly guide protein design by providing atomistic details of the dynamic molecular interactions contributing to protein stability and function. MD simulations can also be used as a virtual screening tool to rank, select, identify, and assess potential designs. MD is uniquely poised to inform protein design efforts where the application requires realistic models of protein dynamics and atomic level descriptions of the relationship between dynamics and function. Here, we review cases where MD simulations was used to modulate protein stability and protein function by providing information regarding the conformation(s), conformational transitions, interactions, and dynamics that govern stability and function. In addition, we discuss cases where conformations from protein folding/unfolding simulations have been exploited for protein design, yielding novel outcomes that could not be obtained from static structures.

18. Prediction of RNA secondary structure, including pseudoknotting, by computer simulation.

PubMed Central

Abrahams, J P; van den Berg, M; van Batenburg, E; Pleij, C

1990-01-01

A computer program is presented which determines the secondary structure of linear RNA molecules by simulating a hypothetical process of folding. This process implies the concept of 'nucleation centres', regions in RNA which locally trigger the folding. During the simulation, the RNA is allowed to fold into pseudoknotted structures, unlike all other programs predicting RNA secondary structure. The simulation uses published, experimentally determined free energy values for nearest neighbour base pair stackings and loop regions, except for new extrapolated values for loops larger than seven nucleotides. The free energy value for a loop arising from pseudoknot formation is set to a single, estimated value of 4.2 kcal/mole. Especially in the case of long RNA sequences, our program appears superior to other secondary structure predicting programs described so far, as tests on tRNAs, the LSU intron of Tetrahymena thermophila and a number of plant viral RNAs show. In addition, pseudoknotted structures are often predicted successfully. The program is written in mainframe APL and is adapted to run on IBM compatible PCs, Atari ST and Macintosh personal computers. On an 8 MHz 8088 standard PC without coprocessor, using STSC APL, it folds a sequence of 700 nucleotides in one and a half hours. PMID:1693421

19. Simulation of reliability in multiserver computer networks

Minkevičius, Saulius

2012-11-01

The reliability performance of multiserver computer networks motivates this paper. A probability limit theorem on the extreme queue length in open multiserver queueing networks in heavy traffic is derived and applied to a reliability model for multiserver computer networks, in which the time to failure of the network is related to the system parameters.

20. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

2015-12-01

The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

1. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

SciTech Connect

King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

2015-12-29

The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this study, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

2. Additive Manufacturing and High-Performance Computing: a Disruptive Latent Technology

Goodwin, Bruce

2015-03-01

This presentation will discuss the relationship between recent advances in Additive Manufacturing (AM) technology, High-Performance Computing (HPC) simulation and design capabilities, and related advances in Uncertainty Quantification (UQ), and then examines their impacts upon national and international security. The presentation surveys how AM accelerates the fabrication process, while HPC combined with UQ provides a fast track for the engineering design cycle. The combination of AM and HPC/UQ almost eliminates the engineering design and prototype iterative cycle, thereby dramatically reducing cost of production and time-to-market. These methods thereby present significant benefits for US national interests, both civilian and military, in an age of austerity. Finally, considering cyber security issues and the advent of the ``cloud,'' these disruptive, currently latent technologies may well enable proliferation and so challenge both nuclear and non-nuclear aspects of international security.

3. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

SciTech Connect

King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Khairallah, S. A.; Kamath, C.; Rubenchik, A. M.

2015-12-15

The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

4. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

DOE PAGES

King, W. E.; Anderson, A. T.; Ferencz, R. M.; ...

2015-12-29

The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this study, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

5. Lubricant additives

NASA Technical Reports Server (NTRS)

Smalheer, C. V.

1973-01-01

The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

6. Computer simulator for a mobile telephone system

NASA Technical Reports Server (NTRS)

Schilling, D. L.; Ziegler, C.

1983-01-01

A software simulator was developed to help NASA in the design of the LMSS. The simulator will be used to study the characteristics and implementation requirements of the LMSS's configuration, with specifications as outlined by NASA.

7. How Effective Is Instructional Support for Learning with Computer Simulations?

ERIC Educational Resources Information Center

Eckhardt, Marc; Urhahne, Detlef; Conrad, Olaf; Harms, Ute

2013-01-01

The study examined the effects of two different instructional interventions as support for scientific discovery learning using computer simulations. In two well-known categories of difficulty, data interpretation and self-regulation, instructional interventions for learning with computer simulations on the topic "ecosystem water" were developed…

8. New Pedagogies on Teaching Science with Computer Simulations

ERIC Educational Resources Information Center

Khan, Samia

2011-01-01

Teaching science with computer simulations is a complex undertaking. This case study examines how an experienced science teacher taught chemistry using computer simulations and the impact of his teaching on his students. Classroom observations over 3 semesters, teacher interviews, and student surveys were collected. The data was analyzed for (1)…

9. Computer Simulation (Microcultures): An Effective Model for Multicultural Education.

ERIC Educational Resources Information Center

Nelson, Jorge O.

This paper presents a rationale for using high-fidelity computer simulation in planning for and implementing effective multicultural education strategies. Using computer simulation, educators can begin to understand and plan for the concept of cultural sensitivity in delivering instruction. The model promises to emphasize teachers' understanding…

10. Evaluation of Computer Simulations for Teaching Apparel Merchandising Concepts.

ERIC Educational Resources Information Center

Jolly, Laura D.; Sisler, Grovalynn

1988-01-01

The study developed and evaluated computer simulations for teaching apparel merchandising concepts. Evaluation results indicated that teaching method (computer simulation versus case study) does not significantly affect cognitive learning. Student attitudes varied, however, according to topic (profitable merchandising analysis versus retailing…

11. The Role of Computer Simulations in Engineering Education.

ERIC Educational Resources Information Center

Smith, P. R.; Pollard, D.

1986-01-01

Discusses role of computer simulation in complementing and extending conventional components of undergraduate engineering education process in United Kingdom universities and polytechnics. Aspects of computer-based learning are reviewed (laboratory simulation, lecture and tutorial support, inservice teacher education) with reference to programs in…

12. Nonlinear simulations with and computational issues for NIMROD

SciTech Connect

Sovinec, C.R.

1998-12-31

The NIMROD (Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion) code development project was commissioned by the US Department of Energy in February, 1996 to provide the fusion research community with a computational tool for studying low-frequency behavior in experiments. Specific problems of interest include the neoclassical evolution of magnetic islands and the nonlinear behavior of tearing modes in the presence of rotation and nonideal walls in tokamaks; they also include topics relevant to innovative confinement concepts such as magnetic turbulence. Besides having physics models appropriate for these phenomena, an additional requirement is the ability to perform the computations in realistic geometries. The NIMROD Team is using contemporary management and computational methods to develop a computational tool for investigating low-frequency behavior in plasma fusion experiments. The authors intend to make the code freely available, and are taking steps to make it as easy to learn and use as possible. An example application for NIMROD is the nonlinear toroidal RFP simulation--the first in a series to investigate how toroidal geometry affects MHD activity in RFPs. Finally, the most important issue facing the project is execution time, and they are exploring better matrix solvers and a better parallel decomposition to address this.

13. Simulation models for computational plasma physics: Concluding report

SciTech Connect

Hewett, D.W.

1994-03-05

In this project, the authors enhanced their ability to numerically simulate bounded plasmas that are dominated by low-frequency electric and magnetic fields. They moved towards this goal in several ways; they are now in a position to play significant roles in the modeling of low-frequency electromagnetic plasmas in several new industrial applications. They have significantly increased their facility with the computational methods invented to solve the low frequency limit of Maxwell's equations (DiPeso, Hewett, accepted, J. Comp. Phys., 1993). This low frequency model, called the Streamlined Darwin Field (SDF) model (Hewett, Larson, and Doss, J. Comp. Phys., 1992), has now been implemented in a fully non-neutral SDF code, BEAGLE (Larson, Ph.D. dissertation, 1993), and has been further extended to the quasi-neutral limit (DiPeso, Hewett, Comp. Phys. Comm., 1993). In addition, they have resurrected the quasi-neutral, zero-electron-inertia model (ZMR) and began the task of incorporating internal boundary conditions into this model that have the flexibility of those in GYMNOS, a magnetostatic code now used in ion source work (Hewett, Chen, ICF Quarterly Report, July--September, 1993). Finally, near the end of this project, they invented a new type of banded matrix solver that can be implemented on a massively parallel computer, thus opening the door for the use of all their ADI schemes on these new computer architectures (Mattor, Williams, Hewett, submitted to Parallel Computing, 1993).

14. Computers for real time flight simulation: A market survey

NASA Technical Reports Server (NTRS)

Bekey, G. A.; Karplus, W. J.

1977-01-01

An extensive computer market survey was made to determine those available systems suitable for current and future flight simulation studies at Ames Research Center. The primary requirement is for the computation of relatively high frequency content (5 Hz) math models representing powered lift flight vehicles. The Rotor Systems Research Aircraft (RSRA) was used as a benchmark vehicle for computation comparison studies. The general nature of helicopter simulations and a description of the benchmark model are presented, and some of the sources of simulation difficulties are examined. A description of various applicable computer architectures is presented, along with detailed discussions of leading candidate systems and comparisons between them.

15. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

NASA Technical Reports Server (NTRS)

Stocker, John C.; Golomb, Andrew M.

2011-01-01

Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
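The two-part framework this abstract describes (a probability distribution for demand plus computing resource constraints) can be sketched as a minimal discrete event simulation. The arrival rate, service-time distribution, and server count below are illustrative assumptions, not values from the study.

```python
import heapq
import random

def simulate(n_requests=1000, n_servers=4, arrival_rate=5.0, mean_service=0.5, seed=1):
    """Mean wait for requests contending for a pool of servers (FCFS)."""
    rng = random.Random(seed)
    # Arrival times from a Poisson process (exponential inter-arrival gaps).
    t, arrivals = 0.0, []
    for _ in range(n_requests):
        t += rng.expovariate(arrival_rate)
        arrivals.append(t)
    free_at = [0.0] * n_servers        # event list: when each server is next idle
    heapq.heapify(free_at)
    total_wait = 0.0
    for arrive in arrivals:
        earliest = heapq.heappop(free_at)  # server that frees up soonest
        start = max(arrive, earliest)      # request waits if all servers are busy
        total_wait += start - arrive
        heapq.heappush(free_at, start + rng.expovariate(1.0 / mean_service))
    return total_wait / n_requests

print(simulate())             # mean wait with 4 servers
print(simulate(n_servers=8))  # mean wait with 8 servers
```

Varying `n_servers` against the same demand stream is the essence of the provisioning trade-off the paper studies.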

16. Case Studies in Computer Adaptive Test Design through Simulation.

ERIC Educational Resources Information Center

Eignor, Daniel R.; And Others

The extensive computer simulation work done in developing the computer adaptive versions of the Graduate Record Examinations (GRE) Board General Test and the College Board Admissions Testing Program (ATP) Scholastic Aptitude Test (SAT) is described in this report. Both the GRE General and SAT computer adaptive tests (CATs), which are fixed length…

17. Interactive Electronic Circuit Simulation on Small Computer Systems

DTIC Science & Technology

1979-11-01

this is the most effective way of completing a computer-aided engineering design cycle. Comparisons of the interactive versus batch simulation...run on almost any computer system with few if any modifications. Also included are the four benchmark test circuits which were used in many of the...the ensuing FORTRAN version. 2.2 Circuit Simulation Using BIAS-D (BASIC Version) Any circuit-simulation program can be divided into three

18. Matching of additive and polarizable force fields for multiscale condensed phase simulations

PubMed Central

Baker, Christopher M.; Best, Robert B.

2013-01-01

Inclusion of electronic polarization effects is one of the key aspects in which the accuracy of current biomolecular force fields may be improved. The principal drawback of such approaches is the computational cost, which typically ranges from 3 – 10 times that of the equivalent additive model, and may be greater for more sophisticated treatments of polarization or other many-body effects. Here, we present a multiscale approach which may be used to enhance the sampling in simulations with polarizable models, by using the additive model as a tool to explore configuration space. We use a method based on information theory to determine the charges for an additive model that has optimal overlap with the polarizable one, and we demonstrate the feasibility of enhancing sampling via a hybrid replica exchange scheme for several model systems. An additional advantage is that, in the process, we obtain a systematic method for deriving charges for an additive model that will be the natural complement to its polarizable parent. The additive charges are found by an effective coarse-graining of the polarizable force field, rather than by ad hoc procedures. PMID:23997691

19. Digital computer simulation of synthetic aperture systems and images

Camporeale, Claudio; Galati, Gaspare

1991-06-01

Digital computer simulation is a powerful tool for the design, the mission planning and the image quality analysis of advanced SAR systems. 'End-to-end' simulators describe the whole process of SAR imaging, including the generation of the coherent echoes and their processing, and, unlike 'product simulators', allow the effects of the various impairments on the final image to be evaluated. The main disadvantage of the 'end-to-end' approach, as described in this paper, is the heavy computational burden; therefore, a new type of simulator is presented that attempts to reduce this burden while offering a greater degree of completeness and realism than existing SAR product simulators.

20. Creating Science Simulations through Computational Thinking Patterns

ERIC Educational Resources Information Center

Basawapatna, Ashok Ram

2012-01-01

Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction.…

1. Computational Electromagnetics (CEM) Laboratory: Simulation Planning Guide

NASA Technical Reports Server (NTRS)

Khayat, Michael A.

2011-01-01

The simulation process, milestones and inputs are unknowns to first-time users of the CEM Laboratory. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

2. A Review of Computer Simulations in Teacher Education

ERIC Educational Resources Information Center

2014-01-01

Computer simulations can provide guided practice for a variety of situations that pre-service teachers would not frequently experience during their teacher education studies. Pre-service teachers can use simulations to turn the knowledge they have gained in their coursework into real experience. Teacher simulation training has come a long way over…

3. Computer Simulations as an Integral Part of Intermediate Macroeconomics.

ERIC Educational Resources Information Center

Millerd, Frank W.; Robertson, Alastair R.

1987-01-01

Describes the development of two interactive computer simulations which were fully integrated with other course materials. The simulations illustrate the effects of various real and monetary "demand shocks" on aggregate income, interest rates, and components of spending and economic output. Includes an evaluation of the simulations'…

4. Genetic crossing vs cloning by computer simulation

SciTech Connect

Dasgupta, S.

1997-06-01

We perform Monte Carlo simulations using Penna's bit string model, and compare the process of asexual reproduction by cloning with that by genetic crossover. We find them to be comparable as regards survival of a species, and also when a natural disaster is simulated.
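A minimal sketch of a Penna-style bit-string simulation with asexual (cloning) reproduction may clarify the model class being compared here; the genome length, mutation threshold, and Verhulst capacity below are illustrative choices rather than the parameters of this study.

```python
import random

BITS = 32       # genome length = maximum attainable age
THRESHOLD = 3   # die once this many deleterious mutations are active

def step(pop, rng, capacity=1000, birth_rate=0.2):
    """One time step: age and cull the population, then clone with mutation."""
    survivors = []
    for genome, age in pop:
        age += 1
        if age >= BITS:
            continue                       # death from old age
        # Mutations in bit positions 0..age-1 have switched on by this age.
        active = bin(genome & ((1 << age) - 1)).count("1")
        if active >= THRESHOLD:
            continue                       # death from accumulated mutations
        survivors.append((genome, age))
    offspring = []
    for genome, age in survivors:
        # Verhulst factor: crowding suppresses births near carrying capacity.
        # (The full model's minimum reproduction age is omitted for brevity.)
        if rng.random() < birth_rate * (1 - len(survivors) / capacity):
            child = genome | (1 << rng.randrange(BITS))  # one new mutation
            offspring.append((child, 0))
    return survivors + offspring

rng = random.Random(0)
pop = [(0, 0)] * 500          # mutation-free newborns
for _ in range(50):
    pop = step(pop, rng)
print(len(pop))               # population size after 50 steps
```

Genetic crossover, the alternative studied in the paper, would replace the cloning line with a recombination of two parent bit strings.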

5. Spatial Learning and Computer Simulations in Science

ERIC Educational Resources Information Center

Lindgren, Robb; Schwartz, Daniel L.

2009-01-01

Interactive simulations are entering mainstream science education. Their effects on cognition and learning are often framed by the legacy of information processing, which emphasized amodal problem solving and conceptual organization. In contrast, this paper reviews simulations from the vantage of research on perception and spatial learning,…

6. Computer formulations of aircraft models for simulation studies

NASA Technical Reports Server (NTRS)

Howard, J. C.

1979-01-01

Recent developments in formula manipulation compilers and the design of several symbol manipulation languages enable computers to be used for symbolic mathematical computation. A computer system and language that can be used to perform symbolic manipulations in an interactive mode are used to formulate a mathematical model of an aeronautical system. The example demonstrates that once the procedure is established, the formulation and modification of models for simulation studies can be reduced to a series of routine computer operations.

7. Radiotherapy Monte Carlo simulation using cloud computing technology.

PubMed

Poole, C M; Cornelius, I; Trapp, J V; Langton, C M

2012-12-01

Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate, as a proof of principle, the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware.
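The reported scaling (completion time falling as 1/n, cost optimal when n divides the total machine-hours) can be reproduced with a toy billing model; per-started-hour billing is an assumption here, not a detail from the paper.

```python
import math

def completion_time(total_hours, n):
    """Wall-clock hours when work parallelizes perfectly over n machines."""
    return total_hours / n

def relative_cost(total_hours, n):
    """Billed machine-hours if each of n machines bills per *started* hour."""
    return n * math.ceil(total_hours / n)

T = 12  # total simulation time in machine-hours (illustrative)
for n in (2, 3, 4, 5, 6):
    print(n, completion_time(T, n), relative_cost(T, n))
# Cost stays at the minimum (12 machine-hours) exactly when n divides T;
# n = 5 leaves part-hours idle yet billed, raising the cost to 15.
```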

8. Methodology of modeling and measuring computer architectures for plasma simulations

NASA Technical Reports Server (NTRS)

Wang, L. P. T.

1977-01-01

A brief introduction to plasma simulation using computers and the difficulties encountered on currently available computers is given. Through the use of an analyzing and measuring methodology, SARA, the control flow and data flow of a particle simulation model, REM2-1/2D, are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

9. Computer-Based Simulation for Man-Computer System Design,

DTIC Science & Technology

1980-02-01

simulations to investigate human factors and crew size problems posed by man-computer interactions in proposed ... The experiment design was a three-... reflected in less flying time, fewer instances of high ... and hence, reduced ...

10. High Fidelity Simulation of a Computer Room

NASA Technical Reports Server (NTRS)

2005-01-01

This viewgraph presentation reviews NASA's Columbia supercomputer and the mesh technology used to test the adequacy of the fluid and cooling of a computer room. A technical description of the Columbia supercomputer is also presented along with its performance capability.

11. Neutron stimulated emission computed tomography: a Monte Carlo simulation approach.

PubMed

Sharma, A C; Harrawood, B P; Bender, J E; Tourassi, G D; Kapadia, A J

2007-10-21

A Monte Carlo simulation has been developed for neutron stimulated emission computed tomography (NSECT) using the GEANT4 toolkit. NSECT is a new approach to biomedical imaging that allows spectral analysis of the elements present within the sample. In NSECT, a beam of high-energy neutrons interrogates a sample and the nuclei in the sample are stimulated to an excited state by inelastic scattering of the neutrons. The characteristic gammas emitted by the excited nuclei are captured in a spectrometer to form multi-energy spectra. Currently, a tomographic image is formed using a collimated neutron beam to define the line integral paths for the tomographic projections. These projection data are reconstructed to form a representation of the distribution of individual elements in the sample. To facilitate the development of this technique, a Monte Carlo simulation model has been constructed from the GEANT4 toolkit. This simulation includes modeling of the neutron beam source and collimation, the samples, the neutron interactions within the samples, the emission of characteristic gammas, and the detection of these gammas in a Germanium crystal. In addition, the model allows the absorbed radiation dose to be calculated for internal components of the sample. NSECT presents challenges not typically addressed in Monte Carlo modeling of high-energy physics applications. In order to address issues critical to the clinical development of NSECT, this paper will describe the GEANT4 simulation environment and three separate simulations performed to accomplish three specific aims. First, comparison of a simulation to a tomographic experiment will verify the accuracy of both the gamma energy spectra produced and the positioning of the beam relative to the sample. Second, parametric analysis of simulations performed with different user-defined variables will determine the best way to effectively model low energy neutrons in tissue, which is a concern with the high hydrogen content in

12. Atomistic Simulations of Ti Additions to NiAl

NASA Technical Reports Server (NTRS)

Bozzolo, Guillermo; Noebe, Ronald D.; Garg, Anita; Ferrante, John; Amador, Carlos

1997-01-01

The development of more efficient engines and power plants for future supersonic transports depends on the advancement of new high-temperature materials with temperature capabilities exceeding those of Ni-based superalloys. Having theoretical modelling techniques to aid in the design of these alloys would greatly facilitate this development. The present paper discusses a successful attempt to correlate theoretical predictions of alloy properties with experimental confirmation for ternary NiAl-Ti alloys. The B.F.S. (Bozzolo-Ferrante-Smith) method for alloys is used to predict the solubility limit and site preference energies for Ti additions of 1 to 25 at.% to NiAl. The results show the solubility limit to be around 5% Ti, above which the formation of Heusler precipitates is favored. These results were confirmed by transmission electron microscopy performed on a series of NiAl-Ti alloys.

13. Some theoretical issues on computer simulations

SciTech Connect

Barrett, C.L.; Reidys, C.M.

1998-02-01

The subject of this paper is the development of mathematical foundations for a theory of simulation. Sequentially updated cellular automata (sCA) over arbitrary graphs are employed as a paradigmatic framework. In the development of the theory, the authors focus on the properties of causal dependencies among local mappings in a simulation. The main object of study is the mapping between a graph representing the dependencies among entities of a simulation and a graph representing the equivalence classes of systems obtained by all possible updates.
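A sequentially updated cellular automaton over a graph can be illustrated concretely; the path graph and majority rule below are toy choices for exposition, not taken from the paper.

```python
def sca_sweep(graph, state, rule, order):
    """One sequential sweep: each vertex is updated in turn from its neighborhood.

    graph: adjacency dict, state: dict vertex -> 0/1, order: update sequence.
    Updates happen in place, so later vertices already see earlier new values,
    which is what distinguishes sequential from synchronous CA updates.
    """
    for v in order:
        neighborhood = [state[v]] + [state[u] for u in graph[v]]
        state[v] = rule(neighborhood)
    return state

def majority(values):
    """Toy local rule: adopt the majority value of self plus neighbors."""
    return 1 if sum(values) * 2 > len(values) else 0

line = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}    # path graph 0-1-2-3
print(sca_sweep(line, {0: 1, 1: 1, 2: 0, 3: 0}, majority, [0, 1, 2, 3]))
```

Varying `order` is exactly the degree of freedom whose equivalence classes the paper studies: different update sequences can yield different system behavior.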

14. Simulation models for computational plasma physics: Concluding report

Hewett, D. W.

1994-03-01

In this project, the authors enhanced their ability to numerically simulate bounded plasmas that are dominated by low-frequency electric and magnetic fields. They moved towards this goal in several ways; they are now in a position to play significant roles in the modeling of low-frequency electromagnetic plasmas in several new industrial applications. They have significantly increased their facility with the computational methods invented to solve the low frequency limit of Maxwell's equations. This low frequency model, called the streamlined Darwin field model, has now been implemented in a fully non-neutral SDF code BEAGLE and has been further extended to the quasi-neutral limit. In addition, they have resurrected the quasi-neutral, zero electron inertia model (ZMR) and began the task of incorporating internal boundary conditions into this model that have the flexibility of those in GYMNOS, a magnetostatic code now used in ion source work. Finally, near the end of this project, they invented a new type of banded matrix solver that can be implemented on a massively parallel computer, thus opening the door for the use of all their ADI schemes on these new computer architectures.

15. Computer simulation results of attitude estimation of earth orbiting satellites

NASA Technical Reports Server (NTRS)

Kou, S. R.

1976-01-01

Computer simulation results of attitude estimation of Earth-orbiting satellites (including Space Telescope) subjected to environmental disturbances and noises are presented. A decomposed linear recursive filter and a Kalman filter were used as estimation tools. Six programs were developed for this simulation; all were written in the BASIC language and were run on HP 9830A and HP 9866A computers. Simulation results show that the decomposed linear recursive filter is accurate in estimation and fast in response time. Furthermore, for higher order systems, this filter has computational advantages (i.e., fewer integration and roundoff errors) over a Kalman filter.
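As a toy illustration of the recursive estimation compared in this study, a scalar Kalman-style filter takes only a few lines; the constant-state model and the noise variances below are illustrative, not those of the satellite simulations.

```python
import random

def kalman_1d(measurements, q=1e-4, r=0.04, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a (nearly) constant state observed in noise."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p += q                 # predict: state drifts only slightly
        k = p / (p + r)        # Kalman gain
        x += k * (z - x)       # correct with the innovation
        p *= 1 - k
        estimates.append(x)
    return estimates

rng = random.Random(42)
truth = 1.5                                   # true attitude angle (arbitrary units)
zs = [truth + rng.gauss(0, 0.2) for _ in range(200)]
est = kalman_1d(zs)
print(est[-1])                                # final estimate, close to 1.5
```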

16. Computer Simulation Performed for Columbia Project Cooling System

NASA Technical Reports Server (NTRS)

2005-01-01

This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia system (10,240 Intel Itanium processors). The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses in any modern computer room.

17. Additions and Improvements to the FLASH Code for Simulating High Energy Density Physics Experiments

Lamb, D. Q.; Daley, C.; Dubey, A.; Fatenejad, M.; Flocke, N.; Graziani, C.; Lee, D.; Tzeferacos, P.; Weide, K.

2015-11-01

FLASH is an open source, finite-volume Eulerian, spatially adaptive radiation hydrodynamics and magnetohydrodynamics code that incorporates capabilities for a broad range of physical processes, performs well on a wide range of computer architectures, and has a broad user base. Extensive capabilities have been added to FLASH to make it an open toolset for the academic high energy density physics (HEDP) community. We summarize these capabilities, with particular emphasis on recent additions and improvements. These include advancements in the optical ray tracing laser package, with methods such as bi-cubic 2D and tri-cubic 3D interpolation of electron number density, adaptive stepping and 2nd-, 3rd-, and 4th-order Runge-Kutta integration methods. Moreover, we showcase the simulated magnetic field diagnostic capabilities of the code, including induction coils, Faraday rotation, and proton radiography. We also describe several collaborations with the National Laboratories and the academic community in which FLASH has been used to simulate HEDP experiments. This work was supported in part at the University of Chicago by the DOE NNSA ASC through the Argonne Institute for Computing in Science under field work proposal 57789; and the NSF under grant PHY-0903997.
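The 2nd- through 4th-order Runge-Kutta integration added for ray tracing can be illustrated with the classical 4th-order step in isolation; the test equation below is an arbitrary ODE with a known solution, not FLASH's ray equations.

```python
import math

def rk4_step(f, t, y, h):
    """One classical 4th-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Test problem y' = y, y(0) = 1, integrated to t = 1; exact answer is e.
y, t, h = 1.0, 0.0, 0.01
while t < 1.0 - 1e-12:
    y = rk4_step(lambda t, y: y, t, y, h)
    t += h
print(y, math.e)   # agree to roughly 10 significant digits
```

In a ray tracer the state `y` would be the ray position and direction, and `f` would be derived from the interpolated electron number density field.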

18. A scalable parallel black oil simulator on distributed memory parallel computers

Wang, Kun; Liu, Hui; Chen, Zhangxin

2015-11-01

This paper presents our work on developing a parallel black oil simulator for distributed memory computers based on our in-house parallel platform. The parallel simulator is designed to overcome the performance issues of common simulators that are implemented for personal computers and workstations. The finite difference method is applied to discretize the black oil model. In addition, some advanced techniques are employed to strengthen the robustness and parallel scalability of the simulator, including an inexact Newton method, matrix decoupling methods, and algebraic multigrid methods. A new multi-stage preconditioner is proposed to accelerate the solution of linear systems from the Newton methods. Numerical experiments show that our simulator is scalable and efficient, and is capable of simulating extremely large-scale black oil problems with tens of millions of grid blocks using thousands of MPI processes on parallel computers.

19. Super-computer simulation for galaxy formation

Jing, Yipeng

2001-06-01

Numerical simulations are widely used in studies of galaxy formation. Here we briefly review their important role in galaxy formation research, their relations with analytical models, and their limitations as well. Then a progress report is given on our collaboration with a group at the University of Tokyo, including the simulation samples we have obtained, some of the results we have published, and the joint projects that are in progress.

20. Computer simulation of water reclamation processors

NASA Technical Reports Server (NTRS)

Fisher, John W.; Hightower, T. M.; Flynn, Michael T.

1991-01-01

The development of detailed simulation models of water reclamation processors based on the ASPEN PLUS simulation program is discussed. Individual models have been developed for vapor compression distillation, vapor phase catalytic ammonia removal, and supercritical water oxidation. These models are used for predicting the process behavior. Particular attention is given to methodology which is used to complete this work, and the insights which are gained by this type of model development.

1. Computer simulation of current voltage response of electrocatalytic sensor

Jasinski, Piotr; Jasinski, Grzegorz; Chachulski, Bogdan; Nowakowski, Antoni

2003-09-01

In the present paper, results of a computer simulation of cyclic voltammetry applied to an electrocatalytic solid state sensor are presented. The computer software, developed by D. Gosser, is based on the explicit finite difference method and is intended for simulating cyclic voltammetry experiments in liquid electrochemistry. However, because the software is based on general electrochemical rules, it may also be used to simulate experiments in solid state electrochemistry. The electrocatalytic sensor does not have a reference electrode, so a virtual reference electrode must be incorporated into the model of the sensor. Data obtained from the simulation are similar to measured data, which confirms the correctness of the assumed sensing mechanism.
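The explicit finite-difference approach mentioned can be sketched for the diffusion part of a voltammetry simulation: a dimensionless concentration profile stepped with the FTCS update, with a Nernstian surface boundary condition driven by a triangular potential sweep. All parameters are illustrative, not taken from the paper.

```python
import math

def diffuse_cv(n_x=100, n_t=2000, d_model=0.4):
    """FTCS diffusion with a potential-driven (Nernstian) surface concentration.

    d_model = D*dt/dx**2 must stay below 0.5 for the explicit scheme's stability.
    Returns a dimensionless current proxy (surface flux) at each time step.
    """
    c = [1.0] * n_x                  # bulk-normalized concentration profile
    current = []
    for step in range(n_t):
        # Triangular potential sweep, as in cyclic voltammetry (illustrative).
        theta = 10.0 * (1.0 - abs(2.0 * step / n_t - 1.0)) - 5.0
        c[0] = 1.0 / (1.0 + math.exp(theta))      # Nernst equation at electrode
        new = c[:]
        for i in range(1, n_x - 1):               # far boundary stays at bulk value
            new[i] = c[i] + d_model * (c[i + 1] - 2.0 * c[i] + c[i - 1])
        c = new
        current.append(c[1] - c[0])               # flux at the surface ~ current
    return current

i_t = diffuse_cv()
print(max(i_t))   # peak of the current-like response
```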

2. Two inviscid computational simulations of separated flow about airfoils

NASA Technical Reports Server (NTRS)

Barnwell, R. W.

1976-01-01

Two inviscid computational simulations of separated flow about airfoils are described. The basic computational method is the line relaxation finite-difference method. Viscous separation is approximated with inviscid free-streamline separation. The point of separation is specified, and the pressure in the separation region is calculated. In the first simulation, the empiricism of constant pressure in the separation region is employed. This empiricism is easier to implement with the present method than with singularity methods. In the second simulation, acoustic theory is used to determine the pressure in the separation region. The results of both simulations are compared with experiment.

3. Icing simulation: A survey of computer models and experimental facilities

NASA Technical Reports Server (NTRS)

Potapczuk, M. G.; Reinmann, J. J.

1991-01-01

A survey of the current methods for simulation of the response of an aircraft or aircraft subsystem to an icing encounter is presented. The topics discussed include a computer code modeling of aircraft icing and performance degradation, an evaluation of experimental facility simulation capabilities, and ice protection system evaluation tests in simulated icing conditions. Current research focused on upgrading simulation fidelity of both experimental and computational methods is discussed. The need for increased understanding of the physical processes governing ice accretion, ice shedding, and iced airfoil aerodynamics is examined.

4. A Fast Synthetic Aperture Radar Raw Data Simulation Using Cloud Computing.

PubMed

Li, Zhixin; Su, Dandan; Zhu, Haijiang; Li, Wei; Zhang, Fan; Li, Ruirui

2017-01-08

Synthetic Aperture Radar (SAR) raw data simulation is a fundamental problem in radar system design and imaging algorithm research. The growth of surveying swath and resolution results in a significant increase in data volume and simulation period, which can be considered to be a comprehensive data intensive and computing intensive issue. Although several high performance computing (HPC) methods have demonstrated their potential for accelerating simulation, the input/output (I/O) bottleneck of huge raw data has not been eased. In this paper, we propose a cloud computing based SAR raw data simulation algorithm, which employs the MapReduce model to accelerate the raw data computing and the Hadoop distributed file system (HDFS) for fast I/O access. The MapReduce model is designed for the irregular parallel accumulation of raw data simulation, which greatly reduces the parallel efficiency of graphics processing unit (GPU) based simulation methods. In addition, three kinds of optimization strategies are put forward from the aspects of programming model, HDFS configuration and scheduling. The experimental results show that the cloud computing based algorithm achieves 4× speedup over the baseline serial approach in an 8-node cloud environment, and each optimization strategy can improve about 20%. This work proves that the proposed cloud algorithm is capable of solving the computing intensive and data intensive issues in SAR raw data simulation, and is easily extended to large scale computing to achieve higher acceleration.
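The MapReduce pattern described (independent map tasks producing partial raw-data arrays, reduced by accumulation) can be sketched with Python's multiprocessing standing in for Hadoop; the echo model here is a toy placeholder, not the paper's SAR signal model.

```python
from functools import reduce
from multiprocessing import Pool

N_SAMPLES = 64

def map_chunk(targets):
    """Map task: simulate raw-data contributions for one chunk of point targets."""
    echo = [0.0] * N_SAMPLES
    for amplitude, delay in targets:
        echo[delay % N_SAMPLES] += amplitude   # toy echo accumulation
    return echo

def reduce_echo(a, b):
    """Reduce: accumulate two partial echo arrays sample by sample."""
    return [x + y for x, y in zip(a, b)]

if __name__ == "__main__":
    targets = [(1.0, i * 7) for i in range(1000)]
    chunks = [targets[i::4] for i in range(4)]   # 4 independent work units
    with Pool(4) as pool:
        partials = pool.map(map_chunk, chunks)   # parallel map phase
    raw = reduce(reduce_echo, partials)          # accumulation (reduce) phase
    print(sum(raw))                              # total energy: 1000.0
```

Because the reduce step is a plain elementwise sum, the result is independent of how the targets are partitioned across map tasks, which is what makes the accumulation safe to parallelize.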

5. A Fast Synthetic Aperture Radar Raw Data Simulation Using Cloud Computing

PubMed Central

Li, Zhixin; Su, Dandan; Zhu, Haijiang; Li, Wei; Zhang, Fan; Li, Ruirui

2017-01-01

Synthetic Aperture Radar (SAR) raw data simulation is a fundamental problem in radar system design and imaging algorithm research. The growth of surveying swath and resolution results in a significant increase in data volume and simulation period, which can be considered to be a comprehensive data intensive and computing intensive issue. Although several high performance computing (HPC) methods have demonstrated their potential for accelerating simulation, the input/output (I/O) bottleneck of huge raw data has not been eased. In this paper, we propose a cloud computing based SAR raw data simulation algorithm, which employs the MapReduce model to accelerate the raw data computing and the Hadoop distributed file system (HDFS) for fast I/O access. The MapReduce model is designed for the irregular parallel accumulation of raw data simulation, which greatly reduces the parallel efficiency of graphics processing unit (GPU) based simulation methods. In addition, three kinds of optimization strategies are put forward from the aspects of programming model, HDFS configuration and scheduling. The experimental results show that the cloud computing based algorithm achieves 4× speedup over the baseline serial approach in an 8-node cloud environment, and each optimization strategy can improve about 20%. This work proves that the proposed cloud algorithm is capable of solving the computing intensive and data intensive issues in SAR raw data simulation, and is easily extended to large scale computing to achieve higher acceleration. PMID:28075343

6. Computer simulator for a mobile telephone system

NASA Technical Reports Server (NTRS)

Schilling, D. L.

1981-01-01

A software simulator was developed to assist NASA in the design of the land mobile satellite service. Structured programming techniques were used: the algorithm was developed in an ALGOL-like pseudo-language and then encoded in FORTRAN IV. The basic input data to the system is a sine wave signal, although future plans call for actual sampled voice as the input signal. The simulator is capable of studying all the possible combinations of types and modes of calls through the use of five communication scenarios: single hop systems; double hop, single gateway systems; double hop, double gateway systems; mobile to wireline systems; and wireline to mobile systems. The transmitter, fading channel, and interference source simulation are also discussed.

7. An Exercise in Biometrical Genetics Based on a Computer Simulation.

ERIC Educational Resources Information Center

Murphy, P. J.

1983-01-01

Describes an exercise in biometrical genetics based on the noninteractive use of a computer simulation of a wheat hydridization program. Advantages of using the material in this way are also discussed. (Author/JN)

8. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

PubMed

Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

2016-10-01

Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix.
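The article's worked examples are in MATLAB and R; the same embarrassingly parallel pattern can be sketched in Python, with a toy loss model standing in for a real risk simulation.

```python
from multiprocessing import Pool
import random

def one_replication(seed):
    """One independent Monte Carlo replication of a toy loss model."""
    rng = random.Random(seed)
    # Illustrative risk model: total loss across 100 exposure units.
    return sum(max(0.0, rng.gauss(1.0, 2.0)) for _ in range(100))

def run_parallel(n_reps=1000, workers=4):
    """Embarrassingly parallel: replications share no state, only a seed each."""
    with Pool(workers) as pool:
        losses = pool.map(one_replication, range(n_reps))
    return sum(losses) / n_reps

if __name__ == "__main__":
    print(run_parallel())   # mean simulated loss
```

Giving each replication its own seed keeps the runs independent and the whole experiment reproducible regardless of how work is distributed across processes.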

9. MINEXP, A Computer-Simulated Mineral Exploration Program

ERIC Educational Resources Information Center

Smith, Michael J.; And Others

1978-01-01

This computer simulation is designed to put students into a realistic decision making situation in mineral exploration. This program can be used with different exploration situations such as ore deposits, petroleum, ground water, etc. (MR)

10. A Computing Cluster for Numerical Simulation

DTIC Science & Technology

2006-10-23

"Contact and Friction for Cloth Animation", SIGGRAPH 2002, ACM TOG 21, 594-603 (2002). "* [BHTF] Bao, Z., Hong, J.-M., Teran, J. and Fedkiw, R...Simulation of Large Bodies of Water by Coupling Two and Three Dimensional Techniques", SIGGRAPH 2006, ACM TOG 25, 805-811 (2006). "* [ITF] Irving, G., Teran...O'Brien (2006) "* [TSBNLF] Teran, J., Sifakis, E., Blemker, S., Ng Thow Hing, V., Lau, C. and Fedkiw, R., "Creating and Simulating Skeletal Muscle from the

11. Simulation of flow in a continuous galvanizing bath: Part I. Thermal effects of ingot addition

Ajersch, F.; Ilinca, F.; Hétu, J.-F.

2004-02-01

A numerical analysis has been developed to simulate the velocity and temperature fields in an industrial galvanizing bath for the continuous coating of steel strip. Operating variables such as ingot addition, line speed, and inductor mixing were evaluated in order to determine their effect on the velocity and temperature distribution in the bath. The simulations were carried out using high-performance computational fluid-dynamics software developed at the Industrial Materials Institute of the National Research Council Canada (IMI-NRC), which solves the incompressible Navier-Stokes equations for steady-state and transient turbulent flow using the k-ɛ model. Cases with and without temperature-dependent density conditions were considered. It was found that the strip velocity does not alter the global flow pattern but modifies the velocities in the snout, near the strip, and near the sink and guide rolls. At a low inductor capacity, the effect of induced mixing is small but is considerably increased at the maximum inductor capacities used during ingot-melting periods. When considering the thermal effects, the flow is affected by variations in density especially near the inductors and the ingot, while little effect is observed near the sheet-and-roller region. Thermal effects are also amplified when the inductor operates at high capacity during ingot melting. The simulations allow visualization of regions of varying velocity and temperature fields and clearly illustrate the mixed and stagnant zones for different operating conditions.

12. Computational Fluid Dynamic simulations of pipe elbow flow.

SciTech Connect

Homicz, Gregory Francis

2004-08-01

One problem facing today's nuclear power industry is flow-accelerated corrosion and erosion in pipe elbows. The Korean Atomic Energy Research Institute (KAERI) is performing experiments in their Flow-Accelerated Corrosion (FAC) test loop to better characterize these phenomena, and develop advanced sensor technologies for the condition monitoring of critical elbows on a continuous basis. In parallel with these experiments, Sandia National Laboratories is performing Computational Fluid Dynamic (CFD) simulations of the flow in one elbow of the FAC test loop. The simulations are being performed using the FLUENT commercial software developed and marketed by Fluent, Inc. The model geometry and mesh were created using the GAMBIT software, also from Fluent, Inc. This report documents the results of the simulations that have been made to date; baseline results employing the RNG k-ε turbulence model are presented. The predicted value for the diametrical pressure coefficient is in reasonably good agreement with published correlations. Plots of the velocities, pressure field, wall shear stress, and turbulent kinetic energy adjacent to the wall are shown within the elbow section. Somewhat to our surprise, these indicate that the maximum values of both wall shear stress and turbulent kinetic energy occur near the elbow entrance, on the inner radius of the bend. Additional simulations were performed for the same conditions, but with the RNG k-ε model replaced by either the standard k-ε or the realizable k-ε turbulence model. The predictions using the standard k-ε model are quite similar to those obtained in the baseline simulation. However, with the realizable k-ε model, more significant differences are evident. The maximums in both wall shear stress and turbulent kinetic energy now appear on the outer radius, near the elbow exit, and are ~11% and 14% greater, respectively, than those predicted in the baseline calculation.

13. Computational Simulation of Explosively Generated Pulsed Power Devices

DTIC Science & Technology

2013-03-21

COMPUTATIONAL SIMULATION OF EXPLOSIVELY GENERATED PULSED POWER DEVICES THESIS Mollie C. Drumm, Captain, USAF AFIT-ENY-13-M-11 DEPARTMENT OF THE AIR...copyright protection in the United States. AFIT-ENY-13-M-11 COMPUTATIONAL SIMULATION OF EXPLOSIVELY GENERATED PULSED POWER DEVICES THESIS Presented to the...OF EXPLOSIVELY GENERATED PULSED POWER DEVICES Mollie C. Drumm, BS Captain, USAF Approved: Dr. Robert B. Greendyke (Chairman) Date Capt. David Liu

14. Computer Simulations of Canada's RADARSAT2 GMTI

DTIC Science & Technology

2000-10-01

UNCLASSIFIED Defense Technical Information Center Compilation Part Notice ADP010837 TITLE: Computer Simulations of Canada's RADARSAT2 GMTI...ADP010842 UNCLASSIFIED 45-1 Computer Simulations of Canada's RADARSAT2 GMTI Shen Chiu and Chuck Livingstone Space Systems and Technology Section, Defence...Associates Ltd. 13800 Commerce Parkway, Richmond, B.C., Canada V6V 2J3 Abstract The detection probability and the estimation accuracy Canada's RADARSAT2

15. GATE Monte Carlo simulation in a cloud computing environment

Rowedder, Blake Austin

The GEANT4-based GATE is a unique and powerful Monte Carlo (MC) platform, which provides a single code library allowing the simulation of specific medical physics applications, e.g. PET, SPECT, CT, radiotherapy, and hadron therapy. However, this rigorous yet flexible platform is used only sparingly in the clinic due to its lengthy calculation time. By accessing the powerful computational resources of a cloud computing environment, GATE's runtime can be significantly reduced to clinically feasible levels without the sizable investment of a local high performance cluster. This study investigated a reliable and efficient execution of GATE MC simulations using a commercial cloud computing service. Amazon's Elastic Compute Cloud was used to launch several nodes equipped with GATE. Job data was initially broken up on the local computer, then uploaded to the worker nodes on the cloud. The results were automatically downloaded and aggregated on the local computer for display and analysis. Five simulations were repeated for every cluster size between 1 and 20 nodes. Ultimately, increasing cluster size resulted in a decrease in calculation time that could be expressed with an inverse power model. Comparing the benchmark results to the published values and error margins indicated that the simulation results were not affected by the cluster size and thus that the integrity of a calculation is preserved in a cloud computing environment. The runtime of a 53-minute simulation was decreased to 3.11 minutes when run on a 20-node cluster. The ability to improve the speed of simulation suggests that fast MC simulations are viable for imaging and radiotherapy applications. With high performance computing continuing to fall in price and grow in accessibility, implementing Monte Carlo techniques with cloud computing for clinical applications will continue to become more attractive.
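
The inverse power relationship between cluster size and runtime reported above can be recovered from timing data with a log-log least-squares fit. The sketch below uses synthetic timings with ideal 1/n scaling, not the study's actual benchmark data:

```python
import math

def fit_inverse_power(nodes, runtimes):
    """Least-squares fit of T(n) = a * n**(-b) in log-log space:
    log T = log a - b * log n, so b is minus the slope."""
    xs = [math.log(n) for n in nodes]
    ys = [math.log(t) for t in runtimes]
    xbar = sum(xs) / len(xs)
    ybar = sum(ys) / len(ys)
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    intercept = ybar - slope * xbar
    return math.exp(intercept), -slope  # (a, b)

# Synthetic timings: a 53-minute job with perfect 1/n scaling
nodes = [1, 2, 4, 8, 16, 20]
runtimes = [53.0 / n for n in nodes]
a, b = fit_inverse_power(nodes, runtimes)  # expect a ≈ 53, b ≈ 1
```

With real benchmark data, a fitted exponent b below 1 would quantify the departure from ideal scaling caused by upload, download, and aggregation overheads.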

16. Computer simulation of gamma-ray spectra from semiconductor detectors

Lund, Jim C.; Olschner, Fred; Shah, Kanai S.

1992-12-01

Traditionally, researchers developing improved gamma ray detectors have used analytical techniques or, rarely, computer simulations to predict the performance of new detectors. However, with the advent of inexpensive personal computers, it is now possible for virtually all detector researchers to perform some form of numerical computation to predict detector performance. Although general purpose code systems for semiconductor detector performance do not yet exist, it is possible to perform many useful calculations using commercially available, general purpose numerical software packages (such as `spreadsheet' programs intended for business use). With a knowledge of the rudimentary mechanics of detector simulation most researchers, including those with no programming skills, can effectively use numerical simulation methods to predict gamma ray detector performance. In this paper we discuss the details of the numerical simulation of gamma ray detectors with the hope of communicating the simplicity and effectiveness of these methods. In particular, we discuss the steps involved in simulating the pulse height spectrum produced by a semiconductor detector.
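
To give a flavor of the kind of calculation the authors describe, the sketch below builds a toy pulse-height spectrum from two ingredients: a Gaussian-broadened full-energy peak and a crude uniform stand-in for the Compton continuum. All parameter values are illustrative assumptions, not taken from the paper:

```python
import random

def simulate_spectrum(n_events, peak_kev=662.0, fwhm_kev=10.0,
                      photofraction=0.3, n_bins=100, seed=1):
    """Toy pulse-height spectrum: each gamma either deposits its full
    energy (Gaussian-broadened photopeak) or a uniform partial energy
    (crude stand-in for the Compton continuum)."""
    rng = random.Random(seed)
    sigma = fwhm_kev / 2.3548          # convert FWHM to standard deviation
    bin_w = (peak_kev * 1.2) / n_bins  # spectrum spans 0 to 1.2x the peak
    hist = [0] * n_bins
    for _ in range(n_events):
        if rng.random() < photofraction:
            e = rng.gauss(peak_kev, sigma)     # full-energy deposition
        else:
            e = rng.random() * peak_kev * 0.8  # partial deposition
        b = int(e / bin_w)
        if 0 <= b < n_bins:
            hist[b] += 1
    return hist
```

A spreadsheet implementation of the same idea, as the authors suggest, would replace the loop with one row per event and a histogram over an energy column.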

17. Computer Simulation of Classic Studies in Psychology.

ERIC Educational Resources Information Center

This paper describes DATASIM, a comprehensive software package which generates simulated data for actual or hypothetical research designs. DATASIM is primarily intended for use in statistics and research methods courses, where it is used to generate "individualized" datasets for students to analyze, and later to correct their answers.…

18. Bodies Falling with Air Resistance: Computer Simulation.

ERIC Educational Resources Information Center

Vest, Floyd

1982-01-01

Two models are presented. The first assumes that air resistance is proportional to the velocity of the falling body. The second assumes that air resistance is proportional to the square of the velocity. A program written in BASIC that simulates the second model is presented. (MP)
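
The second model (air resistance proportional to the square of the velocity) can be integrated with a simple Euler scheme. The Python sketch below is a modern stand-in for the original BASIC program, with illustrative parameter values:

```python
def fall_with_drag(mass=1.0, k=0.05, dt=0.01, t_end=20.0, g=9.8):
    """Euler integration of dv/dt = g - (k/m) * v**2, a body falling
    from rest with quadratic air resistance. Returns (t, v) samples."""
    v, t, history = 0.0, 0.0, []
    while t < t_end:
        a = g - (k / mass) * v * v  # net acceleration
        v += a * dt
        t += dt
        history.append((t, v))
    return history

# With these values the terminal velocity is sqrt(m*g/k) = sqrt(196) = 14 m/s,
# which the simulated velocity approaches from below.
```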

19. ASC Business Plan

SciTech Connect

Rummel, E.

2015-07-09

To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with industry partners—partners on whom the ASC Program relies for today’s and tomorrow’s high performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.

20. The Forward Observer Personal Computer Simulator (FOPCSIM)

DTIC Science & Technology

2002-09-01

Environment (DVTE) (CD-ROM). Produced by Andy Jackson through the Combat Visual Information Center, Marine Corps Base, Quantico, Virginia. 19 Dylan ...part of VIRTE’s forward observer training simulation. 20 LCDR Dylan Schmorrow (USN), Virtual...load the conversion data. There are software applications available to rapidly generate terrain from satellite images such as the Evans and

1. Quantum chemistry simulation on quantum computers: theories and experiments.

PubMed

Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng

2012-07-14

It has been claimed that quantum computers can mimic quantum systems efficiently, with polynomially scaling resources. Traditionally, those simulations are carried out numerically on classical computers, which are inevitably confronted with the exponential growth of required resources as the size of the quantum system increases. Quantum computers avoid this problem, and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and the development in both theories and experiments. We then present a brief introduction to quantum chemistry evaluated via classical computers, followed by typical procedures of quantum simulation towards quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations, via a small quantum computer, which include the evaluation of the static molecular eigenenergy and the simulation of chemical reaction dynamics. Although the experimental development is still behind the theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry over classical computations.

2. Launch Site Computer Simulation and its Application to Processes

NASA Technical Reports Server (NTRS)

Sham, Michael D.

1995-01-01

This paper provides an overview of computer simulation, the Lockheed developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon driven model that uses commercial off the shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.

3. Monte Carlo simulations on SIMD computer architectures

SciTech Connect

Burmester, C.P.; Gronsky, R.; Wille, L.T.

1992-03-01

Algorithmic considerations regarding the implementation of various materials science applications of the Monte Carlo technique to single instruction multiple data (SIMD) computer architectures are presented. In particular, implementation of the Ising model with nearest, next nearest, and long range screened Coulomb interactions on the SIMD architecture MasPar MP-1 (DEC mpp-12000) series of massively parallel computers is demonstrated. Methods of code development which optimize processor array use and minimize inter-processor communication are presented, including lattice partitioning and the use of processor array spanning tree structures for data reduction. Both geometric and algorithmic parallel approaches are utilized. Benchmarks in terms of Monte Carlo updates per second for the MasPar architecture are presented and compared to values reported in the literature from comparable studies on other architectures.
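
The lattice-partitioning idea can be illustrated with a checkerboard decomposition of the 2D nearest-neighbour Ising model: sites of one color share no bonds, so all of them can be updated simultaneously in SIMD lockstep. The serial Python sketch below shows only the decomposition and is not the authors' MasPar code:

```python
import math
import random

def checkerboard_sweep(spins, beta, rng):
    """One Metropolis sweep of a 2D nearest-neighbour Ising model (J = 1,
    periodic boundaries) using the checkerboard decomposition. Within each
    color, every site update is independent of the others, which is what
    allows lockstep execution on a processor array."""
    n = len(spins)
    for color in (0, 1):
        for i in range(n):
            for j in range(n):
                if (i + j) % 2 != color:
                    continue
                nb = (spins[(i + 1) % n][j] + spins[(i - 1) % n][j]
                      + spins[i][(j + 1) % n] + spins[i][(j - 1) % n])
                dE = 2.0 * spins[i][j] * nb  # energy cost of flipping
                if dE <= 0 or rng.random() < math.exp(-beta * dE):
                    spins[i][j] = -spins[i][j]
    return spins
```

On a SIMD machine each processor would own a sub-lattice and exchange only boundary spins with its neighbours, which is the inter-processor communication the authors minimize.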

4. COFLO: A Computer Aid for Teaching Ecological Simulation.

ERIC Educational Resources Information Center

Levow, Roy B.

A computer-assisted course was designed to provide students with an understanding of modeling and simulation techniques in quantitative ecology. It deals with continuous systems and has two segments. One develops mathematical and computer tools, beginning with abstract systems and their relation to physical systems. Modeling principles are next…

5. Application Of Computer Simulation To The Entertainment Industry

Mittelman, Phillip S.

1983-10-01

Images generated by computer have started to appear in feature films (TRON, Star Trek II), in television commercials and in animated films. Of particular interest is the use of computer generated imagery which simulates the images which a real camera might have made if the imaged objects had been real.

6. Use of Computer Simulations in Microbial and Molecular Genetics.

ERIC Educational Resources Information Center

Wood, Peter

1984-01-01

Describes five computer programs: four simulations of genetic and physical mapping experiments and one interactive learning program on the genetic coding mechanism. The programs were originally written in BASIC for the VAX-11/750 V.3. mainframe computer and have been translated into Applesoft BASIC for Apple IIe microcomputers. (JN)

7. Evaluation of a Computer Simulation in a Therapeutics Case Discussion.

ERIC Educational Resources Information Center

1995-01-01

A computer program was used to simulate a case presentation in pharmacotherapeutics. Students (n=24) used their knowledge of the disease (glaucoma) and various topical agents on the computer program's formulary to "treat" the patient. Comparison of results with a control group found the method as effective as traditional case…

8. Cardiovascular Physiology Teaching: Computer Simulations vs. Animal Demonstrations.

ERIC Educational Resources Information Center

Samsel, Richard W.; And Others

1994-01-01

At the introductory level, the computer provides an effective alternative to using animals for laboratory teaching. Computer software can simulate the operation of multiple organ systems. Advantages of software include alteration of variables that are not easily changed in vivo, repeated interventions, and cost-effective hands-on student access.…

9. Teaching Macroeconomics with a Computer Simulation. Final Report.

ERIC Educational Resources Information Center

Dolbear, F. Trenery, Jr.

The study of macroeconomics--the determination and control of aggregative variables such as gross national product, unemployment and inflation--may be facilitated by the use of a computer simulation policy game. An aggregative model of the economy was constructed and programed for a computer and (hypothetical) historical data were generated. The…

10. Coached, Interactive Computer Simulations: A New Technology for Training.

ERIC Educational Resources Information Center

Hummel, Thomas J.

This paper provides an overview of a prototype simulation-centered intelligent computer-based training (CBT) system--implemented using expert system technology--which provides: (1) an environment in which trainees can learn and practice complex skills; (2) a computer-based coach or mentor to critique performance, suggest improvements, and provide…

11. Parallel computing with graphics processing units for high-speed Monte Carlo simulation of photon migration.

PubMed

Alerstam, Erik; Svensson, Tomas; Andersson-Engels, Stefan

2008-01-01

General-purpose computing on graphics processing units (GPGPU) is shown to dramatically increase the speed of Monte Carlo simulations of photon migration. In a standard simulation of time-resolved photon migration in a semi-infinite geometry, the proposed methodology executed on a low-cost graphics processing unit (GPU) is a factor of 1000 faster than simulation performed on a single standard processor. In addition, we address important technical aspects of GPU-based simulations of photon migration. The technique is expected to become a standard method in Monte Carlo simulations of photon migration.

12. Computational Aerothermodynamic Simulation Issues on Unstructured Grids

NASA Technical Reports Server (NTRS)

Gnoffo, Peter A.; White, Jeffery A.

2004-01-01

The synthesis of physical models for gas chemistry and turbulence from the structured grid codes LAURA and VULCAN into the unstructured grid code FUN3D is described. A directionally Symmetric, Total Variation Diminishing (STVD) algorithm and an entropy fix (eigenvalue limiter) keyed to local cell Reynolds number are introduced to improve solution quality for hypersonic aeroheating applications. A simple grid-adaptation procedure is incorporated within the flow solver. Simulations of flow over an ellipsoid (perfect gas, inviscid), Shuttle Orbiter (viscous, chemical nonequilibrium) and comparisons to the structured grid solvers LAURA (cylinder, Shuttle Orbiter) and VULCAN (flat plate) are presented to show current capabilities. The quality of heating in 3D stagnation regions is very sensitive to algorithm options; in general, high aspect ratio tetrahedral elements complicate the simulation of high Reynolds number, viscous flow as compared to locally structured meshes aligned with the flow.

13. Phase diagram of silica from computer simulation

Saika-Voivod, Ivan; Sciortino, Francesco; Grande, Tor; Poole, Peter H.

2004-12-01

We evaluate the phase diagram of the “BKS” potential [van Beest, Kramer, and van Santen, Phys. Rev. Lett. 64, 1955 (1990)], a model of silica widely used in molecular dynamics (MD) simulations. We conduct MD simulations of the liquid and three crystals (β-quartz, coesite, and stishovite) over wide ranges of temperature and density, and evaluate the total Gibbs free energy of each phase. The phase boundaries are determined by the intersection of these free energy surfaces. Not unexpectedly for a classical pair potential, our results reveal quantitative discrepancies between the locations of the BKS and real silica phase boundaries. At the same time, we find that the topology of the real phase diagram is reproduced, confirming that the BKS model provides a satisfactory qualitative description of a silicalike material. We also compare the phase boundaries with the locations of liquid-state thermodynamic anomalies identified in previous studies of the BKS model.

14. Computer simulation of surface and film processes

NASA Technical Reports Server (NTRS)

Tiller, W. A.

1981-01-01

A molecular dynamics technique based upon Lennard-Jones type pair interactions is used to investigate time-dependent as well as equilibrium properties. The case study deals with systems containing Si and O atoms. In this case a more involved potential energy function (PEF) is employed and the system is simulated via a Monte-Carlo procedure. This furnishes the equilibrium properties of the system at its interfaces and surfaces as well as in the bulk.

15. A Computer Simulation of Braitenberg Vehicles

DTIC Science & Technology

1991-03-01

and that have the ability to adapt their behavior, using a learning algorithm developed by Teuvo Kohonen. The vehicle designer is free to select...learning algorithm, adapting behavior to improve food-finding performance. The initial evaluations failed to provide convincing proof that the simple...Preface The purpose of this effort was to simulate simple, biological learning behavior using an artificial neural network to

16. Computer Simulation of Shipboard Electrical Distribution Systems

DTIC Science & Technology

1989-06-01

variable. If used properly, the Euler Backward method for integrating differential equations approaches the same solution. Fast modes can also be...synchronous machines as well as other elements of a power network. EMTP handles stiff systems by using the Euler Backward method for integration. In general...simulations, however, there are three methods that work well. The first is the Euler Forward method, which is considered an explicit technique since it
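
The stability contrast between the two Euler methods mentioned in this excerpt shows up already on the scalar stiff test equation dy/dt = -λy: forward (explicit) Euler diverges when λ·Δt > 2, while backward (implicit) Euler is stable for any step size. A minimal sketch, with illustrative values:

```python
def forward_euler(y0, lam, dt, steps):
    """Explicit: y_{n+1} = y_n + dt * (-lam * y_n).
    The amplification factor (1 - lam*dt) exceeds 1 in magnitude
    when lam*dt > 2, so the iteration blows up."""
    y = y0
    for _ in range(steps):
        y = y + dt * (-lam * y)
    return y

def backward_euler(y0, lam, dt, steps):
    """Implicit: y_{n+1} = y_n + dt * (-lam * y_{n+1}), which solves to
    y_{n+1} = y_n / (1 + lam*dt). The factor 1/(1 + lam*dt) is always
    below 1, so the method is stable for any positive step size."""
    y = y0
    for _ in range(steps):
        y = y / (1.0 + lam * dt)
    return y
```

For lam = 1000 and dt = 0.01 (so λ·Δt = 10), forward Euler multiplies the solution by -9 each step and diverges, while backward Euler decays toward zero as the true solution does. This is why implicit integration suits the stiff systems that arise from power networks.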

17. Computational Simulation of High Energy Density Plasmas

DTIC Science & Technology

2009-10-30

flow. NumerEx used MACH2 to simulate the flow using compressible, inviscid hydrodynamics with the SESAME equations of state. The depth of the...Figure 1 shows the liner state versus the radius of a collapsing 10 cm tall lithium liner driven by an RLC circuit model of Shiva Star. This work...the coaxial gun section, and Figure 4 shows the physical state of the plasma just prior to pinch. Figure 5 shows neutron yield reaching 10^14 in this

18. Computer simulation of a geomagnetic substorm

NASA Technical Reports Server (NTRS)

Lyon, J. G.; Brecht, S. H.; Huba, J. D.; Fedder, J. A.; Palmadesso, P. J.

1981-01-01

A global two-dimensional simulation of a substormlike process occurring in earth's magnetosphere is presented. The results are consistent with an empirical substorm model - the neutral-line model. Specifically, the introduction of a southward interplanetary magnetic field forms an open magnetosphere. Subsequently, a substorm neutral line forms at about 15 earth radii or closer in the magnetotail, and plasma sheet thinning and plasma acceleration occur. Eventually the substorm neutral line moves tailward toward its presubstorm position.

19. Computer simulation of the NASA water vapor electrolysis reactor

NASA Technical Reports Server (NTRS)

Bloom, A. M.

1974-01-01

The water vapor electrolysis (WVE) reactor is a spacecraft waste reclamation system for extended-mission manned spacecraft. The WVE reactor's raw material is water, its product oxygen. A computer simulation of the WVE operational processes provided the data required for an optimal design of the WVE unit. The simulation process was implemented with the aid of a FORTRAN IV routine.

20. Effectiveness of an Endodontic Diagnosis Computer Simulation Program.

ERIC Educational Resources Information Center

Fouad, Ashraf F.; Burleson, Joseph A.

1997-01-01

Effectiveness of a computer simulation to teach endodontic diagnosis was assessed using three groups (n=34,32,24) of dental students. All were lectured on diagnosis, pathology, and radiographic interpretation. One group then used the simulation, another had a seminar on the same material, and the third group had no further instruction. Results…

1. The Design, Development, and Evaluation of an Evaluative Computer Simulation.

ERIC Educational Resources Information Center

Ehrlich, Lisa R.

This paper discusses evaluation design considerations for a computer based evaluation simulation developed at the University of Iowa College of Medicine in Cardiology to assess the diagnostic skills of primary care physicians and medical students. The simulation developed allows for the assessment of diagnostic skills of physicians in the…

2. Computer Simulation of Incomplete-Data Interpretation Exercise.

ERIC Educational Resources Information Center

Robertson, Douglas Frederick

1987-01-01

Described is a computer simulation that was used to help general education students enrolled in a large introductory geology course. The purpose of the simulation is to learn to interpret incomplete data. Students design a plan to collect bathymetric data for an area of the ocean. Procedures used by the students and instructor are included.…

3. Investigating the Effectiveness of Computer Simulations for Chemistry Learning

ERIC Educational Resources Information Center

Plass, Jan L.; Milne, Catherine; Homer, Bruce D.; Schwartz, Ruth N.; Hayward, Elizabeth O.; Jordan, Trace; Verkuilen, Jay; Ng, Florrie; Wang, Yan; Barrientos, Juan

2012-01-01

Are well-designed computer simulations an effective tool to support student understanding of complex concepts in chemistry when integrated into high school science classrooms? We investigated scaling up the use of a sequence of simulations of kinetic molecular theory and associated topics of diffusion, gas laws, and phase change, which we designed…

4. Computer Simulation of Laboratory Experiments: An Unrealized Potential.

ERIC Educational Resources Information Center

Magin, D. J.; Reizes, J. A.

1990-01-01

Discussion of the use of computer simulation for laboratory experiments in undergraduate engineering education focuses on work at the University of New South Wales in the instructional design and software development of a package simulating a heat exchange device. The importance of integrating theory, design, and experimentation is also discussed.…

5. Design Model for Learner-Centered, Computer-Based Simulations.

ERIC Educational Resources Information Center

Hawley, Chandra L.; Duffy, Thomas M.

This paper presents a model for designing computer-based simulation environments within a constructivist framework for the K-12 school setting. The following primary criteria for the development of simulations are proposed: (1) the problem needs to be authentic; (2) the cognitive demand in learning should be authentic; (3) scaffolding supports a…

6. Computer Simulation of the Population Growth (Schizosaccharomyces Pombe) Experiment.

ERIC Educational Resources Information Center

Daley, Michael; Hillier, Douglas

1981-01-01

Describes a computer program (available from authors) developed to simulate "Growth of a Population (Yeast) Experiment." Students actively revise the counting techniques with realistically simulated haemocytometer or eye-piece grid and are reminded of the necessary dilution technique. Program can be modified to introduce such variables…

7. Simulation of Robot Kinematics Using Interactive Computer Graphics.

ERIC Educational Resources Information Center

Leu, M. C.; Mahajan, R.

1984-01-01

Development of a robot simulation program based on geometric transformation softwares available in most computer graphics systems and program features are described. The program can be extended to simulate robots coordinating with external devices (such as tools, fixtures, conveyors) using geometric transformations to describe the…

8. Stochastic Simulation Service: Bridging the Gap between the Computational Expert and the Biologist.

PubMed

Drawert, Brian; Hellander, Andreas; Bales, Ben; Banerjee, Debjani; Bellesia, Giovanni; Daigle, Bernie J; Douglas, Geoffrey; Gu, Mengyuan; Gupta, Anand; Hellander, Stefan; Horuk, Chris; Nath, Dibyendu; Takkar, Aviral; Wu, Sheng; Lötstedt, Per; Krintz, Chandra; Petzold, Linda R

2016-12-01

We present StochSS: Stochastic Simulation as a Service, an integrated development environment for modeling and simulation of both deterministic and discrete stochastic biochemical systems in up to three dimensions. An easy to use graphical user interface enables researchers to quickly develop and simulate a biological model on a desktop or laptop, which can then be expanded to incorporate increasing levels of complexity. StochSS features state-of-the-art simulation engines. As the demand for computational power increases, StochSS can seamlessly scale computing resources in the cloud. In addition, StochSS can be deployed as a multi-user software environment where collaborators share computational resources and exchange models via a public model repository. We demonstrate the capabilities and ease of use of StochSS with an example of model development and simulation at increasing levels of complexity.
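
The discrete stochastic engines that tools like StochSS automate are typically variants of Gillespie's direct method (SSA). As a minimal illustration, not StochSS's own engine code, here is the SSA for a single degradation reaction X -> 0 with propensity k*X:

```python
import random

def gillespie_decay(x0, k, t_end, seed=0):
    """Gillespie direct method for the single reaction X -> 0.
    With one reaction channel there is no channel selection step:
    draw an exponential waiting time from the total propensity,
    then fire the reaction."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    while x > 0:
        a = k * x                 # total propensity
        t += rng.expovariate(a)   # time to the next reaction event
        if t > t_end:
            break                 # next event falls past the horizon
        x -= 1                    # fire the decay reaction
    return x

# The ensemble mean of x at time t matches the deterministic
# solution x0 * exp(-k * t), while individual trajectories fluctuate.
```

A full SSA adds a second random draw to pick which reaction fires, weighted by the per-channel propensities; the graphical model-building and cloud scaling described above sit on top of engines of this kind.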

9. Stochastic Simulation Service: Bridging the Gap between the Computational Expert and the Biologist

PubMed Central

Banerjee, Debjani; Bellesia, Giovanni; Daigle, Bernie J.; Douglas, Geoffrey; Gu, Mengyuan; Gupta, Anand; Hellander, Stefan; Horuk, Chris; Nath, Dibyendu; Takkar, Aviral; Lötstedt, Per; Petzold, Linda R.

2016-01-01

We present StochSS: Stochastic Simulation as a Service, an integrated development environment for modeling and simulation of both deterministic and discrete stochastic biochemical systems in up to three dimensions. An easy to use graphical user interface enables researchers to quickly develop and simulate a biological model on a desktop or laptop, which can then be expanded to incorporate increasing levels of complexity. StochSS features state-of-the-art simulation engines. As the demand for computational power increases, StochSS can seamlessly scale computing resources in the cloud. In addition, StochSS can be deployed as a multi-user software environment where collaborators share computational resources and exchange models via a public model repository. We demonstrate the capabilities and ease of use of StochSS with an example of model development and simulation at increasing levels of complexity. PMID:27930676

10. Computer Simulation of Auxiliary Power Systems.

DTIC Science & Technology

1980-03-01

[Only OCR fragments of the DTIC report documentation page survive.] Keywords: gas turbine engine, computer programs, auxiliary power unit, aircraft engine starter. The fragments mention three turbine configuration choices (see Figure 2), including a one-stage turbine and a two-stage turbine, and sample program output values (main combustion efficiency 0.995, design fuel flow 150 lb/hr, JP4 fuel heating value at T4 of 18,400).

11. Computer Simulations to Support Science Instruction and Learning: A critical review of the literature

Smetana, Lara Kathleen; Bell, Randy L.

2012-06-01

Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is currently known and providing guidance for future research. We report on the outcomes of 61 empirical studies dealing with the efficacy of, and implications for, computer simulations in science instruction. The overall findings suggest that simulations can be as effective as, and in many ways more effective than, traditional (i.e. lecture-based, textbook-based and/or physical hands-on) instructional practices in promoting science content knowledge, developing process skills, and facilitating conceptual change. As with any other educational tool, the effectiveness of computer simulations is dependent upon the ways in which they are used. Thus, we outline specific research-based guidelines for best practice. Computer simulations are most effective when they (a) are used as supplements; (b) incorporate high-quality support structures; (c) encourage student reflection; and (d) promote cognitive dissonance. Used appropriately, computer simulations involve students in inquiry-based, authentic science explorations. Additionally, as educational technologies continue to evolve, advantages such as flexibility, safety, and efficiency deserve attention.

12. MIA computer simulation test results report. [space shuttle avionics

NASA Technical Reports Server (NTRS)

Unger, G. E.

1974-01-01

Results of the first noise susceptibility computer simulation tests of the complete MIA receiver analytical model are presented. Computer simulation tests were conducted with both Gaussian and pulse noise inputs. The results of the Gaussian noise tests were compared to results predicted previously and were found to be in substantial agreement. The results of the pulse noise tests will be compared to the results of planned analogous tests in the Data Bus Evaluation Laboratory at a later time. The MIA computer model is considered to be fully operational at this time.

13. Computer simulations for minds-on learning with ``Project Spectra!''

Wood, E. L.; Renfrow, S.; Marks, N.; Christofferson, R.

2010-12-01

How do we gain information about the Sun? How do we know Mars has CO2 or that Titan has a nitrogen-rich atmosphere? How do we use light in astronomy? These concepts are something education professionals generally struggle with because they are abstract. Making use of visualizations and presenting material so it can be manipulated is the easiest way to conquer abstractions and bring them home to students. Using simulations and computer interactives (games) where students experience and manipulate the information makes concepts accessible. “Project Spectra!” is a science and engineering program that uses computer-based Flash interactives to expose students to astronomical spectroscopy and actual data in a way that is not possible with traditional in-class activities. Visualizing lessons with multi-media is a way to solidify understanding and retention of knowledge and is completely unlike its paper-and-pencil counterpart. To engage students in “Project Spectra!”, students are given a mission, which connects them with the research at hand. Missions include exploring remote planetary atmospheres and surfaces, experimenting with the Sun using different filters, and comparing spectroscopic atmospheric features between different bodies. Additionally, students have an opportunity to learn about NASA missions, view movies, and see images connected with their mission. In the end, students are asked critical thinking questions and conduct web-based research. These interactives complement the in-class activities where students engineer spectrographs and explore the electromagnetic spectrum.

14. Computer simulations of granular materials: the effects of mesoscopic forces

Kohring, G. A.

1994-12-01

The problem of the relatively small angles of repose reported by computer simulations of granular materials is discussed. It is shown that this problem can be partially understood as resulting from mesoscopic forces which are commonly neglected in the simulations. After including mesoscopic forces, characterized by the easily measurable surface energy, 2D computer simulations indicate that the angle of repose should increase as the size of the granular grains decreases, an effect not seen without mesoscopic forces. The exact magnitude of this effect depends upon the value of the surface energy and the coordination number of the granular pile.

15. Computer Models Simulate Fine Particle Dispersion

NASA Technical Reports Server (NTRS)

2010-01-01

Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

16. Use of computer graphics simulation for teaching of flexible sigmoidoscopy.

PubMed

Baillie, J; Jowell, P; Evangelou, H; Bickel, W; Cotton, P

1991-05-01

The concept of simulation training in endoscopy is now well-established. The systems currently under development employ either computer graphics simulation or interactive video technology; each has its strengths and weaknesses. A flexible sigmoidoscopy training device has been designed which uses graphics techniques--such as object-oriented programming and double buffering--in entirely new ways. These programming techniques compensate for the limitations of currently available desk-top microcomputers. By boosting existing computer 'horsepower' with next generation coprocessors and sophisticated graphics tools such as intensity interpolation (Gouraud shading), the realism of computer simulation of flexible sigmoidoscopy is being greatly enhanced. The computer program has teaching and scoring capabilities, making it a truly interactive system. Use has been made of this ability to record, grade and store each trainee encounter in computer memory as part of a multi-center, prospective trial of simulation training being conducted currently in the USA. A new input device, a dummy endoscope, has been designed that allows application of variable resistance to the insertion tube. This greatly enhances tactile feedback, such as resistance during looping. If carefully designed trials show that computer simulation is an attractive and effective training tool, it is expected that this technology will evolve rapidly and be made widely available to trainee endoscopists.
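The intensity interpolation (Gouraud shading) mentioned in this record reduces, in its simplest form, to linear interpolation of per-vertex intensities along a polygon edge. A minimal sketch of that core operation, with invented names and values (not the trainer's actual code):

```python
def gouraud_edge(i0, i1, steps):
    """Linearly interpolate vertex intensities i0 -> i1 along one edge.

    This is the core operation of Gouraud (smooth) shading: intensities
    computed at the vertices are interpolated across edges and scanlines
    instead of being recomputed per pixel.
    """
    return [i0 + (i1 - i0) * k / steps for k in range(steps + 1)]

# Five samples along an edge whose endpoint intensities are 0.2 and 1.0:
edge = gouraud_edge(0.2, 1.0, 4)  # [0.2, 0.4, 0.6, 0.8, 1.0], up to float rounding
```

A full renderer repeats this interpolation across each scanline between edges; hardware and OpenGL-era implementations perform the same linear blend per fragment.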

17. Micromechanics-Based Computational Simulation of Ceramic Matrix Composites

NASA Technical Reports Server (NTRS)

Murthy, Pappu L. N.; Mutal, Subodh K.; Duff, Dennis L. (Technical Monitor)

2003-01-01

Advanced high-temperature Ceramic Matrix Composites (CMC) hold an enormous potential for use in aerospace propulsion system components and certain land-based applications. However, being relatively new materials, a reliable design properties database of sufficient fidelity does not yet exist. To characterize these materials solely by testing is cost and time prohibitive. Computational simulation then becomes very useful to limit the experimental effort and reduce the design cycle time. The authors have been involved for over a decade in developing micromechanics-based computational simulation techniques (computer codes) to simulate all aspects of CMC behavior, including quantification of the scatter that these materials exhibit. A brief summary of the capabilities of these computer codes, with typical examples of their use in the design/analysis of certain structural components, is the subject matter of this presentation.

18. Computational challenges in modeling and simulating living matter

Sena, Alexandre C.; Silva, Dilson; Marzulo, Leandro A. J.; de Castro, Maria Clicia Stelling

2016-12-01

Computational modeling has been successfully used to help scientists understand physical and biological phenomena. Recent technological advances allow the simulation of larger systems with greater accuracy. However, devising those systems requires new approaches and novel architectures, such as the use of parallel programming, so that applications can run in the new high performance environments, which are often computer clusters composed of different computation devices, such as traditional CPUs, GPGPUs, Xeon Phis and even FPGAs. It is expected that scientists will take advantage of the increasing computational power to model and simulate more complex structures and even merge different models into larger and more extensive ones. This paper aims at discussing the challenges of using those devices to simulate such complex systems.

19. Parallel computations using a cluster of workstations to simulate elasticity problems

Darmawan, J. B. B.; Mungkasi, S.

2016-11-01

Computational physics has played important roles in real world problems. This paper is within the applied computational physics area. The aim of this study is to observe the performance of parallel computations using a cluster of workstations (COW) to simulate elasticity problems. Parallel computations with the COW configuration are conducted using the Message Passing Interface (MPI) standard. In parallel computations with COW, we consider five scenarios with twenty simulations. In addition to the execution time, efficiency is used to evaluate programming algorithm scenarios. Sequential and parallel programming performances are evaluated based on their execution time and efficiency. Results show that the one-dimensional elasticity equations are not appropriate to be solved in parallel with MPI_Send and MPI_Recv technique in the MPI standard, because the total amount of time to exchange data is considered more dominant compared with the total amount of time to conduct the basic elasticity computation.

20. A demonstrative model of a lunar base simulation on a personal computer

NASA Technical Reports Server (NTRS)

1985-01-01

The initial demonstration model of a lunar base simulation is described. This initial model was developed at the personal computer level to demonstrate feasibility and technique before proceeding to a larger computer-based model. The demonstration model was built with Lotus Symphony Version 1.1 software on a personal computer running MS-DOS. The personal computer-based model determined the applicability of lunar base modeling techniques developed at an LSPI/NASA workshop. In addition, the personal computer-based demonstration model defined a modeling structure that could be employed on a larger, more comprehensive VAX-based lunar base simulation. Refinement of this personal computer model and the development of a VAX-based model are planned in the near future.

1. Positive Wigner functions render classical simulation of quantum computation efficient.

PubMed

Mari, A; Eisert, J

2012-12-07

We show that quantum circuits where the initial state and all the following quantum operations can be represented by positive Wigner functions can be classically efficiently simulated. This is true for continuous-variable as well as discrete-variable systems in odd prime dimensions, two cases which are treated on entirely the same footing. Noting the fact that Clifford and Gaussian operations preserve the positivity of the Wigner function, our result generalizes the Gottesman-Knill theorem. Our algorithm provides a way of sampling from the output distribution of a computation or a simulation, including the efficient sampling from an approximate output distribution in the case of sampling imperfections for initial states, gates, or measurements. In this sense, this work highlights the role of the positive Wigner function as separating classically efficiently simulable systems from those that are potentially universal for quantum computing and simulation, and it emphasizes the role of negativity of the Wigner function as a computational resource.

2. A heterogeneous computing environment for simulating astrophysical fluid flows

NASA Technical Reports Server (NTRS)

Cazes, J.

1994-01-01

In the Concurrent Computing Laboratory in the Department of Physics and Astronomy at Louisiana State University we have constructed a heterogeneous computing environment that permits us to routinely simulate complicated three-dimensional fluid flows and to readily visualize the results of each simulation via three-dimensional animation sequences. An 8192-node MasPar MP-1 computer with 0.5 GBytes of RAM provides 250 MFlops of execution speed for our fluid flow simulations. Utilizing the parallel virtual machine (PVM) language, at periodic intervals data is automatically transferred from the MP-1 to a cluster of workstations where individual three-dimensional images are rendered for inclusion in a single animation sequence. Work is underway to replace executions on the MP-1 with simulations performed on the 512-node CM-5 at NCSA and to simultaneously gain access to more potent volume rendering workstations.

3. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

2011-09-01

Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. This work was presented in part at the 2010 Annual Meeting of the American Association of Physicists in Medicine (AAPM), Philadelphia, PA.
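The reported 47× figure follows directly from the quoted times (2.58 h on one local machine versus 3.3 min on 100 cloud nodes), as a quick arithmetic check shows:

```python
local_minutes = 2.58 * 60   # 2.58 h on the local computer = 154.8 min
cloud_minutes = 3.3         # same simulation on 100 cloud nodes

speedup = local_minutes / cloud_minutes
print(round(speedup))       # 47, matching the reported 47x speed-up

# Implied parallel efficiency on 100 nodes: roughly 47%
efficiency = speedup / 100
```

The efficiency below 100% is consistent with the abstract's note that some parallelization overhead remains, though it becomes negligible for larger simulations.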

4. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.

PubMed

Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

2011-09-07

Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.

5. Computation simulation of the nonlinear response of suspension bridges

SciTech Connect

McCallen, D.B.; Astaneh-Asl, A.

1997-10-01

Accurate computational simulation of the dynamic response of long-span bridges presents one of the greatest challenges facing the earthquake engineering community. The size of these structures, in terms of physical dimensions and number of main load bearing members, makes computational simulation of transient response an arduous task. Discretization of a large bridge with general purpose finite element software often results in a computational model of such size that excessive computational effort is required for three dimensional nonlinear analyses. The aim of the current study was the development of efficient, computationally based methodologies for the nonlinear analysis of cable supported bridge systems which would allow accurate characterization of a bridge with a relatively small number of degrees of freedom. This work has led to the development of a special purpose software program for the nonlinear analysis of cable supported bridges, and the methodologies and software are described and illustrated in this paper.

6. The computer scene generation for star simulator hardware-in-the-loop simulation

Zhang, Ying; Yu, Hong; Du, Huijie; Lei, Jie

2011-08-01

The star sensor simulation system is used to test star sensor performance on the ground and is designed for the star identification and attitude determination functions of a spacecraft. A computer star scene based on an astronomical star chart is generated with OpenGL for hardware-in-the-loop simulation of the star sensor simulation system.

7. Computational Simulations and the Scientific Method

NASA Technical Reports Server (NTRS)

Kleb, Bil; Wood, Bill

2005-01-01

As scientific simulation software becomes more complicated, the scientific-software implementor's need for component tests from new model developers becomes more crucial. The community's ability to follow the basic premise of the Scientific Method requires independently repeatable experiments, and model innovators are in the best position to create these test fixtures. Scientific software developers also need to quickly judge the value of the new model, i.e., its cost-to-benefit ratio in terms of gains provided by the new model and implementation risks such as cost, time, and quality. This paper asks two questions. The first is whether other scientific software developers would find published component tests useful, and the second is whether model innovators think publishing test fixtures is a feasible approach.

8. Osmosis : a molecular dynamics computer simulation study

Lion, Thomas

Osmosis is a phenomenon of critical importance in a variety of processes, ranging from the transport of ions across cell membranes and the regulation of blood salt levels by the kidneys to the desalination of water and the production of clean energy using potential osmotic power plants. However, despite its importance and over one hundred years of study, there is ongoing confusion concerning the nature of the microscopic dynamics of the solvent particles in their transfer across the membrane. In this thesis the microscopic dynamical processes underlying osmotic pressure and concentration gradients are investigated using molecular dynamics (MD) simulations. I first present a new derivation for the local pressure that can be used for determining osmotic pressure gradients. Using this result, the steady-state osmotic pressure is studied in a minimal model for an osmotic system, and the steady-state density gradients are explained using a simple mechanistic hopping model for the solvent particles. The simulation setup is then modified, allowing us to explore the timescales involved in the relaxation dynamics of the system in the period preceding the steady state. Further consideration is also given to the relative roles of diffusive and non-diffusive solvent transport in this period. Finally, in a novel modification to the classic osmosis experiment, the solute particles are driven out of equilibrium by the input of energy. The effect of this modification on the osmotic pressure and the osmotic flow is studied, and we find that active solute particles can cause reverse osmosis to occur. The possibility of defining a new "osmotic effective temperature" is also considered and compared to the results of diffusive and kinetic temperatures.

9. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

SciTech Connect

Mike Bockelie; Dave Swensen; Martin Denison; Zumao Chen; Mike Maguire; Adel Sarofim; Changguan Yang; Hong-Shig Shim

2004-01-28

This is the thirteenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a Virtual Engineering-based framework for simulating the performance of Advanced Power Systems. Within the last quarter, good progress has been made on all aspects of the project. Software development efforts have focused on a preliminary detailed software design for the enhanced framework. Given the complexity of the individual software tools from each team (i.e., Reaction Engineering International, Carnegie Mellon University, Iowa State University), a robust, extensible design is required for the success of the project. In addition to achieving a preliminary software design, significant progress has been made on several development tasks for the program. These include: (1) the enhancement of the controller user interface to support detachment from the Computational Engine and support for multiple computer platforms, (2) modification of the Iowa State University interface-to-kernel communication mechanisms to meet the requirements of the new software design, (3) decoupling of the Carnegie Mellon University computational models from their parent IECM (Integrated Environmental Control Model) user interface for integration with the new framework and (4) development of a new CORBA-based model interfacing specification. A benchmarking exercise to compare process and CFD based models for entrained flow gasifiers was completed. A summary of our work on intrinsic kinetics for modeling coal gasification has been completed. Plans for implementing soot and tar models into our entrained flow gasifier models are outlined. Plans for implementing a model for mercury capture based on conventional capture technology, but applied to an IGCC system, are outlined.

10. Assessment methodology for computer-based instructional simulations.

PubMed

Koenig, Alan; Iseli, Markus; Wainess, Richard; Lee, John J

2013-10-01

Computer-based instructional simulations are becoming more and more ubiquitous, particularly in military and medical domains. As the technology that drives these simulations grows ever more sophisticated, the underlying pedagogical models for how instruction, assessment, and feedback are implemented within these systems must evolve accordingly. In this article, we review some of the existing educational approaches to medical simulations, and present pedagogical methodologies that have been used in the design and development of games and simulations at the University of California, Los Angeles, Center for Research on Evaluation, Standards, and Student Testing. In particular, we present a methodology for how automated assessments of computer-based simulations can be implemented using ontologies and Bayesian networks, and discuss their advantages and design considerations for pedagogical use.
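At its smallest scale, the ontology-plus-Bayesian-network assessment described in this record reduces to a single Bayes update of a skill-mastery probability from one observed action. A sketch with invented probabilities (the article's actual networks are far larger, and these parameter names are illustrative, not from the paper):

```python
def posterior_mastery(prior, p_correct_given_mastery,
                      p_correct_given_no_mastery, observed_correct):
    """One Bayesian update of P(mastery) from a single observed action.

    Bayes' rule: P(M | obs) = P(obs | M) P(M) / P(obs).
    """
    if observed_correct:
        num = p_correct_given_mastery * prior
        den = num + p_correct_given_no_mastery * (1 - prior)
    else:
        num = (1 - p_correct_given_mastery) * prior
        den = num + (1 - p_correct_given_no_mastery) * (1 - prior)
    return num / den

# Invented parameters: prior 0.5, slip-free success rate 0.9, guess rate 0.2.
p = posterior_mastery(0.5, 0.9, 0.2, observed_correct=True)
# p = 0.45 / 0.55, approximately 0.818: one correct action raises P(mastery)
```

Chaining such updates over a sequence of logged simulation actions, with the conditional probabilities organized by an ontology of skills, is the general shape of the automated-assessment approach the abstract outlines.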

11. Mapping an expanding territory: computer simulations in evolutionary biology.

PubMed

Huneman, Philippe

2014-08-01

The pervasive use of computer simulations in the sciences brings novel epistemological issues that have been discussed in the philosophy of science literature for about a decade. Evolutionary biology strongly relies on such simulations, and in relation to it there exists a research program (Artificial Life) that mainly studies simulations themselves. This paper addresses the specificity of computer simulations in evolutionary biology, in the context (described in Sect. 1) of a set of questions about their scope as explanations, the nature of validation processes, and the relation between simulations and true experiments or mathematical models. After making distinctions, especially between a weak use where simulations test hypotheses about the world and a strong use where they allow one to explore sets of evolutionary dynamics not necessarily extant in our world, I argue in Sect. 2 that (weak) simulations are likely to represent in virtue of the fact that they instantiate specific features of causal processes that may be isomorphic to features of some causal processes in the world, though the latter are always intertwined with a myriad of different processes and hence unlikely to be directly manipulated and studied. I therefore argue that these simulations are merely able to provide candidate explanations for real patterns. Section 3 concludes by placing strong and weak simulations in Levins' triangle, which conceives of simulations as devices trying to fulfil one or two among three incompatible epistemic values (precision, realism, genericity).

12. Combining high performance simulation, data acquisition, and graphics display computers

NASA Technical Reports Server (NTRS)

Hickman, Robert J.

1989-01-01

Issues involved in the continuing development of an advanced simulation complex are discussed. This approach provides the capability to perform the majority of tests on advanced systems, non-destructively. The controlled test environments can be replicated to examine the response of the systems under test to alternative treatments of the system control design, or to test the function and qualification of specific hardware. Field tests verify that the elements simulated in the laboratories are sufficient. The digital computer is hosted by a Digital Equipment Corp. MicroVAX computer with an Aptec Computer Systems Model 24 I/O computer performing the communication function. An Applied Dynamics International AD100 performs the high speed simulation computing and an Evans and Sutherland PS350 performs on-line graphics display. A Scientific Computer Systems SCS40 acts as a high performance FORTRAN program processor to support the complex by generating numerous large files, from programs coded in FORTRAN, that are required for the real time processing. Four programming languages are involved in the process: FORTRAN, ADSIM, ADRIO, and STAPLE. FORTRAN is employed on the MicroVAX host to initialize and terminate the simulation runs on the system. The generation of the data files on the SCS40 also is performed with FORTRAN programs. ADSIM and ADRIO are used to program the processing elements of the AD100 and its IOCP processor. STAPLE is used to program the Aptec DIP and DIA processors.

13. Computer simulation tests of optimized neutron powder diffractometer configurations

Cussen, L. D.; Lieutenant, K.

2016-06-01

Recent work has developed a new mathematical approach to optimally choose beam elements for constant wavelength neutron powder diffractometers. This article compares Monte Carlo computer simulations of existing instruments with simulations of instruments using configurations chosen using the new approach. The simulations show that large performance improvements over current best practice are possible. The tests here are limited to instruments optimized for samples with a cubic structure which differs from the optimization for triclinic structure samples. A novel primary spectrometer design is discussed and simulation tests show that it performs as expected and allows a single instrument to operate flexibly over a wide range of measurement resolution.

14. Computer Simulation of Metallo-Supramolecular Networks

Wang, Shihu; Chen, Chun-Chung; Dormidontova, Elena

2009-03-01

Using Monte Carlo simulation we studied formation of reversible metallo-supramolecular networks based on 3:1 ligand-metal complexes between end-functionalized oligomers and metal ions. The fraction of 1:1, 2:1 and 3:1 ligand-metal complexes was obtained and analyzed using an analytical approach as a function of oligomer concentration, c, and metal-to-oligomer ratio. We found that at low concentration the maximum in the number-average molecular weight is achieved near the stoichiometric composition and it shifts to higher metal-to-oligomer ratios at larger concentrations. Predictions are made regarding the onset of network formation, which occurs in a limited range of metal-to-oligomer ratios at sufficiently large oligomer concentrations. The average molecular weight between effective crosslinks decreases with oligomer concentration and reaches its minimum at the stoichiometric composition, where the high-frequency elastic plateau modulus approaches its maximal value. At high oligomer concentrations the plateau modulus follows a c^1.8 concentration dependence, similar to recent experimental results for metallo-supramolecular networks.

15. Computer Simulation of Glioma Growth and Morphology

PubMed Central

Frieboes, Hermann B.; Lowengrub, John S.; Wise, S.; Zheng, X.; Macklin, Paul; Bearer, Elaine; Cristini, Vittorio

2007-01-01

Despite major advances in the study of glioma, the quantitative links between intra-tumor molecular/cellular properties, clinically observable properties such as morphology, and critical tumor behaviors such as growth and invasiveness remain unclear, hampering more effective coupling of tumor physical characteristics with implications for prognosis and therapy. Although molecular biology, histopathology, and radiological imaging are employed in this endeavor, studies are severely challenged by the multitude of different physical scales involved in tumor growth, i.e., from molecular nanoscale to cell microscale and finally to tissue centimeter scale. Consequently, it is often difficult to determine the underlying dynamics across dimensions. New techniques are needed to tackle these issues. Here, we address this multi-scalar problem by employing a novel predictive three-dimensional mathematical and computational model based on first-principle equations (conservation laws of physics) that describe mathematically the diffusion of cell substrates and other processes determining tumor mass growth and invasion. The model uses conserved variables to represent known determinants of glioma behavior, e.g., cell density and oxygen concentration, as well as biological functional relationships and parameters linking phenomena at different scales whose specific forms and values are hypothesized and calculated based on in-vitro and in-vivo experiments and from histopathology of tissue specimens from human gliomas. This model enables correlation of glioma morphology to tumor growth by quantifying interdependence of tumor mass on the microenvironment (e.g., hypoxia, tissue disruption) and on the cellular phenotypes (e.g., mitosis and apoptosis rates, cell adhesion strength). Once functional relationships between variables and associated parameter values have been informed, e.g. from histopathology or intra-operative analysis, this model can be used for disease diagnosis

16. Computer simulations of athermal and glassy systems

Xu, Ning

2005-12-01

We performed extensive molecular dynamics simulations to better understand athermal and glassy systems near jamming transitions. We focused on four related projects. In the first project, we decomposed the probability distribution P(φ) of finding a collectively jammed state at packing fraction φ into two distinct contributions: the density of CJ states ρ(φ) and their basins of attraction β(φ). In bidisperse systems, it is likely that ρ(φ) controls the shape of P(φ) in the large system size limit, and thus the most likely random jammed state may be used as a protocol-independent definition of random close packing in this system. In the second project, we measured the yield stress in two different ensembles: constant shear rate and constant stress. The yield stress measured in the constant stress ensemble is larger than that measured in the constant shear rate ensemble; however, the difference between these two measurements decreases with increasing system size. In the third project, we investigated under what circumstances nonlinear velocity profiles form in frictionless granular systems undergoing boundary-driven planar shear flow. Nonlinear velocity profiles occur at short times, but evolve into linear profiles at long times. Nonlinear velocity profiles can be stabilized by vibrating these systems. The velocity profile can become highly localized when the shear stress of the system is below the constant force yield stress, provided that the granular temperature difference across the system is sufficiently large. In the fourth project, we measured the effective temperature defined from equilibrium fluctuation-dissipation relations in athermal and glassy systems sheared at constant pressure. We found that the effective temperature is strongly controlled by pressure in the slowly sheared regime. Thus, this effective temperature and pressure are not independent variables in this regime.
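The decomposition described in the first project can be restated compactly (notation inferred from the abstract, reading β(φ) as the average basin-of-attraction volume of jammed states at packing fraction φ):

```latex
P(\phi) \;\propto\; \rho(\phi)\,\beta(\phi)
```

On this reading, the stated result is that ρ(φ), rather than β(φ), controls the shape of P(φ) as the system size grows.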

17. Trijunctions in crystalline materials: A computer simulation study

Srivilliputhur, Srinivasan Gopalan

The molecular dynamics (MD) method has been extensively used to gain atomistic insight into material properties. Massively parallel computers offer a cost-effective way to dramatically increase the scope and accuracy of MD. With this emerging scenario in mind, we set and realized a twofold goal: (i) to develop an efficient large-scale parallel MD code for atoms interacting via short-range forces, and (ii) to apply our MD method to study the structure and energetics of trijunctions (TJ) in an FCC polycrystal. Using our parallel MD code, we performed atomistic simulations of a three-dimensional, periodic Lennard-Jones polycrystalline system and found that the TJ line energies can have a negative value, in agreement with the suggestion of J. W. Gibbs. Our system consisted of three FCC grains rotated 30 degrees about a common <001> axis. This configuration yields six TJ's also along <001>, with symmetries m, 3 and 3m in the color group terminology of Cahn and Kalonji. Associated with these TJ's are three 30° and six 60° symmetric tilt grain boundaries (STGB). An aluminum specimen with such grains, and highly symmetric TJ's, was epitaxially grown on a silicon (111) surface and studied using high-resolution electron microscopy by Dahmen et al. For the first time it has been possible to simultaneously and unambiguously separate the TJ and STGB contributions to the system's excess energy, in addition to gaining insight into its atomic structure. Interfaces in our energy-minimized polycrystal systems were compared with similar STGB's in pure bi-crystals. The polycrystal STGB width was found to be practically the same in all the system sizes investigated. As indicated by the systematic common-neighbor local structure analysis, and the radial and energy distribution functions, there exists a high level of interfacial order and slightly lower overall densities in the TJ systems (compared to an FCC crystal). Further, as indicated by these and the atom centered hydrostatic

18. A computer simulation of aircraft evacuation with fire

NASA Technical Reports Server (NTRS)

Middleton, V. E.

1983-01-01

A computer simulation was developed to assess passenger survival during the post-crash evacuation of a transport category aircraft when fire is a major threat. The computer code, FIREVAC, computes individual passenger exit paths and times to exit, taking into account delays and congestion caused by the interaction among the passengers and changing cabin conditions. Simple models for the physiological effects of the toxic cabin atmosphere are included with provision for including more sophisticated models as they become available. Both wide-body and standard-body aircraft may be simulated. Passenger characteristics are assigned stochastically from experimentally derived distributions. Results of simulations of evacuation trials and hypothetical evacuations under fire conditions are presented.

19. Computational simulation of drug delivery at molecular level.

PubMed

Li, Youyong; Hou, Tingjun

2010-01-01

The field of drug delivery is advancing rapidly. By controlling the precise level and/or location of a given drug in the body, side effects are reduced, doses are lowered, and new therapies are possible. Nonetheless, substantial challenges remain for delivering specific drugs into specific cells. Computational methods to predict the binding and dynamics between drug molecule and its carrier are increasingly desirable to minimize the investment in drug design and development. Significant progress in computational simulation is making it possible to understand the mechanism of drug delivery. This review summarizes the computational methods and progress of four categories of drug delivery systems: dendrimers, polymer micelle, liposome and carbon nanotubes. Computational simulations are particularly valuable in designing better drug carriers and addressing issues that are difficult to be explored by laboratory experiments, such as diffusion, dynamics, etc.

20. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

SciTech Connect

McCoy, Michel; Archer, Bill; Hendrickson, Bruce; Wade, Doug; Hoang, Thuc

2016-08-29

The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resource, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

1. KU-Band rendezvous radar performance computer simulation model

NASA Technical Reports Server (NTRS)

Griffin, J. W.

1980-01-01

The preparation of a real time computer simulation model of the KU band rendezvous radar to be integrated into the shuttle mission simulator (SMS), the shuttle engineering simulator (SES), and the shuttle avionics integration laboratory (SAIL) simulator is described. To meet crew training requirements a radar tracking performance model, and a target modeling method were developed. The parent simulation/radar simulation interface requirements, and the method selected to model target scattering properties, including an application of this method to the SPAS spacecraft are described. The radar search and acquisition mode performance model and the radar track mode signal processor model are examined and analyzed. The angle, angle rate, range, and range rate tracking loops are also discussed.

2. Towards accurate quantum simulations of large systems with small computers

Yang, Yonggang

2017-01-01

Numerical simulations are important for many systems. In particular, various standard computer programs have been developed for solving the quantum Schrödinger equations. However, the accuracy of these calculations is limited by computer capabilities. In this work, an iterative method is introduced to enhance the accuracy of these numerical calculations, which is otherwise prohibitive by conventional methods. The method is easily implementable and general for many systems.

3. Towards accurate quantum simulations of large systems with small computers.

PubMed

Yang, Yonggang

2017-01-24

Numerical simulations are important for many systems. In particular, various standard computer programs have been developed for solving the quantum Schrödinger equations. However, the accuracy of these calculations is limited by computer capabilities. In this work, an iterative method is introduced to enhance the accuracy of these numerical calculations, which is otherwise prohibitive by conventional methods. The method is easily implementable and general for many systems.

4. Towards accurate quantum simulations of large systems with small computers

PubMed Central

Yang, Yonggang

2017-01-01

Numerical simulations are important for many systems. In particular, various standard computer programs have been developed for solving the quantum Schrödinger equations. However, the accuracy of these calculations is limited by computer capabilities. In this work, an iterative method is introduced to enhance the accuracy of these numerical calculations, which is otherwise prohibitive by conventional methods. The method is easily implementable and general for many systems. PMID:28117366

5. Energy Efficient Biomolecular Simulations with FPGA-based Reconfigurable Computing

SciTech Connect

Hampton, Scott S; Agarwal, Pratul K

2010-05-01

Reconfigurable computing (RC) is being investigated as a hardware solution for improving time-to-solution for biomolecular simulations. A number of popular molecular dynamics (MD) codes are used to study various aspects of biomolecules. These codes are now capable of simulating nanosecond time-scale trajectories per day on conventional microprocessor-based hardware, but biomolecular processes often occur at the microsecond time-scale or longer. A wide gap exists between the desired and achievable simulation capability; therefore, there is considerable interest in alternative algorithms and hardware for improving the time-to-solution of MD codes. The fine-grain parallelism provided by Field Programmable Gate Arrays (FPGA) combined with their low power consumption make them an attractive solution for improving the performance of MD simulations. In this work, we use an FPGA-based coprocessor to accelerate the compute-intensive calculations of LAMMPS, a popular MD code, achieving up to 5.5 fold speed-up on the non-bonded force computations of the particle mesh Ewald method and up to 2.2 fold speed-up in overall time-to-solution, and potentially an increase by a factor of 9 in power-performance efficiencies for the pair-wise computations. The results presented here provide an example of the multi-faceted benefits to an application in a heterogeneous computing environment.

6. SSBN Tactical Security Exercise Simulator and Tactical Development Computer Program

DTIC Science & Technology

1990-05-08

Final Report: SSBN Tactical Security Exercise Simulator and Tactical Development Computer Program. Physical Dynamics, Inc., RES Operations, P.O. Box 9505, Arlington, VA 22209. Contract N00014-87-C-0063, 8 May 1990 (Report RES-FR-009-90).

7. Environments for online maritime simulators with cloud computing capabilities

Raicu, Gabriel; Raicu, Alexandra

2016-12-01

This paper presents the cloud computing environments, network principles and methods for graphical development in realistic naval simulation, naval robotics and virtual interactions. The aim of this approach is to achieve a good simulation quality in large networked environments using open source solutions designed for educational purposes. Realistic rendering of maritime environments requires near real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts coupled with the latest achievements in virtual and augmented reality will enhance the overall experience, leading to new developments and innovations. We have to deal with a multiprocessing situation using advanced technologies and distributed applications, using remote ship scenarios and automation of ship operations.

8. Method for simulating paint mixing on computer monitors

Carabott, Ferdinand; Lewis, Garth; Piehl, Simon

2002-06-01

Computer programs like Adobe Photoshop can generate a mixture of two 'computer' colors by using the Gradient control. However, the resulting colors diverge from the equivalent paint mixtures in both hue and value. This study examines why programs like Photoshop are unable to simulate paint or pigment mixtures, and offers a solution using Photoshop's existing tools. The article discusses how a library of colors, simulating paint mixtures, is created from 13 artists' colors. The mixtures can be imported into Photoshop as a color swatch palette of 1248 colors and as 78 continuous or stepped gradient files, all accessed in a new software package, Chromafile.
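The divergence noted here can be seen in a toy comparison: a Gradient-style blend interpolates linearly in RGB (additive light), whereas pigments mix subtractively. The crude multiplicative model below is only a sketch; real paint mixing is closer to Kubelka-Munk theory, which is why the study builds a library of measured mixtures instead.

```python
# Toy contrast between additive (linear RGB) interpolation, as in a Photoshop
# gradient, and a crude subtractive model of pigment mixing. Both fail for
# paint: neither yields the painter's green from yellow and blue.

def mix_additive(c1, c2, t=0.5):
    """Linear interpolation in RGB, per channel (gradient-style blending)."""
    return tuple(round((1 - t) * a + t * b) for a, b in zip(c1, c2))

def mix_subtractive(c1, c2):
    """Multiply per-channel reflectances -- a rough pigment-mixing proxy."""
    return tuple(round(255 * (a / 255) * (b / 255)) for a, b in zip(c1, c2))

yellow, blue = (255, 255, 0), (0, 0, 255)
print(mix_additive(yellow, blue))     # → (128, 128, 128), a mid gray
print(mix_subtractive(yellow, blue))  # → (0, 0, 0), black
```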

9. Parallelized computation for computer simulation of electrocardiograms using personal computers with multi-core CPU and general-purpose GPU.

PubMed

Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong

2010-10-01

Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel CPU of Core (TM) 2 Quad Q6600 and a GPU of Geforce 8800GT, with software support by OpenMP and CUDA. It was tested in three parallelization device setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus 1 core of CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setup (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setup (a), 16.8 in setup (b), and 20.0 in setup (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies.
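A load-prediction dynamic scheduler of the general kind described splits each step's work between devices in proportion to the throughput predicted from the previous step's timings. The sketch below illustrates the idea; the device names, speeds, and cost model are hypothetical, not taken from the paper.

```python
# Load-prediction dynamic scheduling between two "devices" (e.g. a multi-core
# CPU and a GPU): each step, work is split in proportion to the throughput
# predicted from the previous step's measured work/time ratios.

def schedule(total_work, throughput):
    """Split work proportionally to predicted device throughput."""
    s = sum(throughput.values())
    return {dev: total_work * tp / s for dev, tp in throughput.items()}

throughput = {"cpu": 1.0, "gpu": 1.0}   # initial guess: equal speed
true_speed = {"cpu": 4.0, "gpu": 16.0}  # hidden actual speeds (arbitrary units)

for step in range(10):
    shares = schedule(1000.0, throughput)
    times = {dev: shares[dev] / true_speed[dev] for dev in shares}  # "measure"
    throughput = {dev: shares[dev] / times[dev] for dev in shares}  # predict next

print({dev: round(w) for dev, w in schedule(1000.0, throughput).items()})
# → {'cpu': 200, 'gpu': 800}
```

With noise-free timings the split converges to the 4:16 speed ratio after one step; a real scheduler would smooth the prediction over several steps.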

10. Quantum ring-polymer contraction method: Including nuclear quantum effects at no additional computational cost in comparison to ab initio molecular dynamics

John, Christopher; Spura, Thomas; Habershon, Scott; Kühne, Thomas D.

2016-04-01

We present a simple and accurate computational method which facilitates ab initio path-integral molecular dynamics simulations, where the quantum-mechanical nature of the nuclei is explicitly taken into account, at essentially no additional computational cost in comparison to the corresponding calculation using classical nuclei. The predictive power of the proposed quantum ring-polymer contraction method is demonstrated by computing various static and dynamic properties of liquid water at ambient conditions using density functional theory. This development will enable routine inclusion of nuclear quantum effects in ab initio molecular dynamics simulations of condensed-phase systems.
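The general idea of ring-polymer contraction, evaluating the expensive potential on a contracted representation of the ring polymer rather than on every bead, can be sketched in its simplest (centroid) form. The potential below is an arbitrary stand-in for the ab initio surface, and the one-dimensional setup is purely illustrative.

```python
import numpy as np

# Centroid-level ring-polymer contraction: instead of n_beads expensive force
# calls (one per bead), make a single call at the bead centroid. For a slowly
# varying potential the two agree closely, at 1/n_beads of the cost.

def expensive_force(x):
    return -x**3  # placeholder for a costly (e.g. DFT) force evaluation

n_beads = 32
beads = np.linspace(-0.1, 0.1, n_beads) + 1.0  # bead positions of one ring polymer

full = expensive_force(beads).mean()        # n_beads expensive calls
contracted = expensive_force(beads.mean())  # one expensive call at the centroid

print(round(float(full), 6), round(float(contracted), 6))
```

The full scheme in the paper contracts to a small number of Fourier modes rather than a single centroid, but the cost argument is the same: the quantum spread of the beads is narrow, so few expensive evaluations suffice.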

11. [Computer simulated images of radiopharmaceutical distributions in anthropomorphic phantoms]

SciTech Connect

Not Available

1991-05-17

We have constructed an anatomically correct human geometry, which can be used to store radioisotope concentrations in 51 internal organs. Each organ is associated with an index number which references its attenuating characteristics (composition and density). The initial development of Computer Simulated Images of Radiopharmaceutical Distributions in Anthropomorphic Phantoms (CSIRDAP) over the first 3 years has been very successful. All components of the simulation have been coded, made operational and debugged.

12. LAWS simulation: Sampling strategies and wind computation algorithms

NASA Technical Reports Server (NTRS)

Emmitt, G. D. A.; Wood, S. A.; Houston, S. H.

1989-01-01

In general, work has continued on developing and evaluating algorithms designed to manage the Laser Atmospheric Wind Sounder (LAWS) lidar pulses and to compute the horizontal wind vectors from the line-of-sight (LOS) measurements. These efforts fall into three categories: Improvements to the shot management and multi-pair algorithms (SMA/MPA); observing system simulation experiments; and ground-based simulations of LAWS.

13. Computer simulations of ions in radio-frequency traps

NASA Technical Reports Server (NTRS)

Williams, A.; Prestage, J. D.; Maleki, L.; Djomehri, J.; Harabetian, E.

1990-01-01

The motion of ions in a trapped-ion frequency standard affects the stability of the standard. In order to study the motion and structures of large ion clouds in a radio-frequency (RF) trap, a computer simulation of the system that incorporates the effect of thermal excitation of the ions was developed. Results are presented from the simulation for cloud sizes up to 512 ions, emphasizing cloud structures in the low-temperature regime.
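A stripped-down sketch of this kind of simulation: two ions in a one-dimensional harmonic pseudopotential, repelling via Coulomb forces and relaxed to their zero-temperature configuration by overdamped dynamics. Units and constants are arbitrary; a real RF-trap simulation like the one described would also include micromotion and the thermal excitation the authors model.

```python
# Two ions in a harmonic trap: equilibrium separation balances the confining
# spring force against Coulomb repulsion. Relax by overdamped dynamics and
# compare with the analytic result 2*(C/(4k))**(1/3).

k = 1.0   # pseudopotential spring constant (arbitrary units)
C = 1.0   # Coulomb constant times charge squared (arbitrary units)

x = [-1.0, 2.0]                 # 1-D ion positions
for _ in range(20000):
    d = x[1] - x[0]
    f_coul = C / d**2           # repulsion magnitude
    f = [-k * x[0] - f_coul, -k * x[1] + f_coul]
    x = [xi + 1e-3 * fi for xi, fi in zip(x, f)]   # overdamped relaxation

sep = x[1] - x[0]
print(round(sep, 4))            # analytic equilibrium: 2*(C/(4k))**(1/3) ≈ 1.2599
```

Scaling this balance up to hundreds of ions is what produces the shell-like cloud structures studied in the low-temperature regime.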

14. Sandia Laboratories hybrid computer and motion simulator facilities

SciTech Connect

Curry, W. H.; French, R. E.

1980-05-01

Hybrid computer and motion simulator facilities at Sandia National Laboratories include an AD/FIVE-AD10-PDP11/60, an AD/FIVE-PDP11/45, an EAI7800-EAI640, an EAI580/TR48-Nova 800, and two Carco S-45OR-3/R-493A three-axis motion simulators. An EAI680 is used in the analog mode only. This report describes the current equipment.

15. A Computer Simulation of Community Pharmacy Practice for Educational Use

PubMed Central

Ling, Tristan; Bereznicki, Luke; Westbury, Juanita; Chalmers, Leanne; Peterson, Gregory; Ollington, Robert

2014-01-01

Objective. To provide a computer-based learning method for pharmacy practice that is as effective as paper-based scenarios, but more engaging and less labor-intensive. Design. We developed a flexible and customizable computer simulation of community pharmacy. Using it, students can work through scenarios that encapsulate the entirety of a patient presentation. We compared the traditional paper-based teaching method to our computer-based approach using equivalent scenarios. The paper-based group had 2 tutors while the computer group had none. Both groups were given a prescenario and postscenario clinical knowledge quiz and survey. Assessment. Students in the computer-based group had generally greater improvements in their clinical knowledge score, and third-year students using the computer-based method also showed more improvements in history taking and counseling competencies. Third-year students also found the simulation fun and engaging. Conclusion. Our simulation of community pharmacy provided an educational experience as effective as the paper-based alternative, despite the lack of a human tutor. PMID:26056406

16. Computational Simulation of Equivalence Class Formation Using the go/no-go Procedure with Compound Stimuli.

PubMed

Vernucio, Renato Roberto; Debert, Paula

Research on equivalence has commonly used human participants as experimental subjects. More recently, computational models have been capable of reproducing performances observed in experiments with humans. The computational model often used is called RELNET, and it simulates training and testing trials of conditional relations using the matching-to-sample procedure (MTS). The differentiation between the sample stimulus and comparison stimuli, indispensable in MTS, implies operational difficulties for simulations. For this reason, new studies seek to use alternative procedures to MTS, which do not differentiate the functions of the antecedent stimuli. This work evaluated the possibility of developing a new computational model to simulate equivalence class formation using the go/no-go procedure with compound stimuli. In Experiment 1, artificial neural networks were used to simulate training of the AB and BC relations as well as testing of the AC relation. The results showed that four out of six runs demonstrated equivalence class formation. Experiment 2 evaluated whether the additional class training performed in Experiment 1, which was analogous to simulating the pre-experimental experience of human participants, would be essential for simulating the establishment of equivalence classes. It was found that it was not possible to simulate equivalence class formation without the additional class training. Altogether, the experiments show that it is possible to simulate equivalence class formation using the go/no-go procedure with compound stimuli and that it is necessary to conduct additional class training. The model developed is, therefore, an alternative to RELNET for the study of equivalence relations using computational simulations.

17. The advanced computational testing and simulation toolkit (ACTS)

SciTech Connect

Drummond, L.A.; Marques, O.

2002-05-21

During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

18. Utilizing a Collaborative Cross Number Puzzle Game to Develop the Computing Ability of Addition and Subtraction

ERIC Educational Resources Information Center

Chen, Yen-Hua; Looi, Chee-Kit; Lin, Chiu-Pin; Shao, Yin-Juan; Chan, Tak-Wai

2012-01-01

While addition and subtraction is a key mathematical skill for young children, a typical activity for them in classrooms involves doing repetitive arithmetic calculation exercises. In this study, we explore a collaborative way for students to learn these skills in a technology-enabled way with wireless computers. Two classes, comprising a total of…

19. AKSATINT - SATELLITE INTERFERENCE ANALYSIS AND SIMULATION USING PERSONAL COMPUTERS

NASA Technical Reports Server (NTRS)

Kantak, A.

1994-01-01

In the late seventies, the number of communication satellites in service increased, and interference became an increasingly important consideration in designing satellite/ground station communications systems. Satellite Interference Analysis and Simulation Using Personal Computers, AKSATINT, models the interference experienced by a generic satellite communications receiving station due to an interfering satellite. Both the desired and the interfering satellites are considered to be in elliptical orbits. The simulation contains computation of orbital positions of both satellites using classical orbital elements, calculation of the satellite antennae look angles for both satellites and elevation angles at the desired-satellite ground-station antenna, and computation of Doppler effect due to the motions of the satellites and the Earth's rotation. AKSATINT also computes the interference-to-signal-power ratio, taking into account losses suffered by the links. After computing the interference-to-signal-power ratio, the program computes the statistical quantities. The statistical formulation of the interference effect is presented in the form of a histogram of the interference-to-desired-signal power ratio. The program includes a flowchart, a sample run, and results of that run. AKSATINT is expected to be of general use to system designers and frequency managers in selecting the proper frequency under an interference scenario. The AKSATINT program is written in BASIC. It was designed to operate on the IBM Personal Computer AT or compatibles, and has been implemented under MS DOS 3.2. AKSATINT was developed in 1987.
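Computing a satellite's orbital position from classical orbital elements, as any simulation of elliptical orbits must, requires solving Kepler's equation M = E − e·sin E for the eccentric anomaly E. A standard Newton iteration is sketched below; this is an illustrative sketch, not AKSATINT's BASIC implementation.

```python
import math

# Newton iteration for Kepler's equation M = E - e*sin(E), followed by
# conversion of the eccentric anomaly E to the true anomaly nu.

def eccentric_anomaly(M, e, tol=1e-12):
    E = M if e < 0.8 else math.pi   # conventional starting guess
    while True:
        dE = (E - e * math.sin(E) - M) / (1 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            return E

M, e = 1.0, 0.3                      # mean anomaly (rad), eccentricity
E = eccentric_anomaly(M, e)
nu = 2 * math.atan2(math.sqrt(1 + e) * math.sin(E / 2),
                    math.sqrt(1 - e) * math.cos(E / 2))  # true anomaly
print(round(E, 6), round(nu, 6))
```

From the true anomaly and the remaining elements (semi-major axis, inclination, node, argument of perigee), the position vector and hence look angles and Doppler follow by standard rotations.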

20. Dynamical localization simulated on a few-qubit quantum computer

SciTech Connect

Benenti, Giuliano; Montangero, Simone; Casati, Giulio; Shepelyansky, Dima L.

2003-05-01

We show that a quantum computer operating with a small number of qubits can simulate the dynamical localization of classical chaos in a system described by the quantum sawtooth map model. The dynamics of the system is computed efficiently up to a time t ≥ l, and then the localization length l can be obtained with accuracy ν by means of order 1/ν² computer runs, followed by coarse-grained projective measurements on the computational basis. We also show that in the presence of static imperfections, a reliable computation of the localization length is possible without error correction up to an imperfection threshold which drops polynomially with the number of qubits.

1. Incorporation of shuttle CCT parameters in computer simulation models

NASA Technical Reports Server (NTRS)

Huntsberger, Terry

1990-01-01

Computer simulations of shuttle missions have become increasingly important during recent years. The complexity of mission planning for satellite launch and repair operations which usually involve EVA has led to the need for accurate visibility and access studies. The PLAID modeling package used in the Man-Systems Division at Johnson currently has the necessary capabilities for such studies. In addition, the modeling package is used for spatial location and orientation of shuttle components for film overlay studies such as the current investigation of the hydrogen leaks found in the shuttle flight. However, there are a number of differences between the simulation studies and actual mission viewing. These include image blur caused by the finite resolution of the CCT monitors in the shuttle and signal noise from the video tubes of the cameras. During the course of this investigation the shuttle CCT camera and monitor parameters are incorporated into the existing PLAID framework. These parameters are specific for certain camera/lens combinations and the SNR characteristics of these combinations are included in the noise models. The monitor resolution is incorporated using a Gaussian spread function such as that found in the screen phosphors in the shuttle monitors. Another difference between the traditional PLAID generated images and actual mission viewing lies in the lack of shadows and reflections of light from surfaces. Ray tracing of the scene explicitly includes the lighting and material characteristics of surfaces. The results of some preliminary studies using ray tracing techniques for the image generation process combined with the camera and monitor effects are also reported.
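The Gaussian spread function mentioned for the monitor model amounts to convolving the rendered image with a Gaussian kernel whose width reflects the monitor's finite resolution. The separable-blur sketch below is illustrative; the kernel width and size are arbitrary choices, not the shuttle monitors' measured parameters.

```python
import numpy as np

# Monitor blur as a separable 2-D Gaussian convolution: filter each row with a
# 1-D Gaussian kernel, then each column. A single bright pixel spreads into a
# Gaussian spot while total intensity is preserved.

def gaussian_kernel(sigma, radius):
    x = np.arange(-radius, radius + 1)
    g = np.exp(-x**2 / (2 * sigma**2))
    return g / g.sum()              # normalize so blur conserves intensity

def monitor_blur(image, sigma=1.5, radius=4):
    k = gaussian_kernel(sigma, radius)
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, image)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, rows)

img = np.zeros((9, 9)); img[4, 4] = 1.0   # single bright pixel
out = monitor_blur(img)
print(round(float(out[4, 4]), 4), round(float(out.sum()), 4))
```

Camera SNR effects would be layered on top of this, e.g. by adding noise drawn from the measured camera/lens noise model before the blur is applied.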

2. Multiscale approaches for simulation of nucleation, growth, and additive chemistry during electrochemical deposition of thin metal films

Stephens, Ryan Mark

Molecularly engineered deposition processes require computational algorithms that efficiently capture phenomena present at widely varying length and time scales. In this work, the island dynamics method was applied to simulation of kinetically-limited metal nucleation and growth by electrodeposition in the presence of additives. The model included additive kinetics, surface diffusion of adatoms, nucleation, and growth. The model was demonstrated for copper deposition in acid sulfate electrolyte containing [bis(3-sulfopropyl)disulfide], polyethylene glycol, and chloride. Simulation results were compared with kinetic Monte Carlo (KMC) calculations and found to be within 1% for fractional coverage values, and within 10% for nucleation density. The computation was more than 10 times faster than comparable KMC simulations over the range studied. The island dynamics algorithm was applied to the electrodeposition of a metal onto a substrate initially configured with an array of hemispherical seed clusters. It was found that the presence of chloride in the model additive system caused high densities of nuclei on the substrate surrounding the initial seed clusters, which led to the formation of a continuous thin metal film. Simulations carried out under low-chloride conditions resulted in the growth only of the initial seed clusters, without significant nucleation or thin film formation. Additional phenomena were explored by linking the molecular scale island dynamics algorithm to a continuum model that described the migration and diffusion in the diffusion layer near the electrode surface. The multiscale linkage allowed simulation of nucleation, growth, and additive chemistry under mass transport limited conditions, including the formation of nucleation exclusion zones surrounding growing nuclei. A two-step approach was used to calculate the spatial distribution of nucleation events on an electrode undergoing deposition by electrolysis under the influence of mass

3. A Computational Workbench Environment For Virtual Power Plant Simulation

SciTech Connect

Bockelie, Michael J.; Swensen, David A.; Denison, Martin K.; Sarofim, Adel F.

2001-11-06

In this paper we describe our progress toward creating a computational workbench for performing virtual simulations of Vision 21 power plants. The workbench provides a framework for incorporating a full complement of models, ranging from simple heat/mass balance reactor models that run in minutes to detailed models that can require several hours to execute. The workbench is being developed using the SCIRun software system. To leverage a broad range of visualization tools the OpenDX visualization package has been interfaced to the workbench. In Year One our efforts have focused on developing a prototype workbench for a conventional pulverized coal fired power plant. The prototype workbench uses a CFD model for the radiant furnace box and reactor models for downstream equipment. In Year Two and Year Three, the focus of the project will be on creating models for gasifier based systems and implementing these models into an improved workbench. In this paper we describe our work effort for Year One and outline our plans for future work. We discuss the models included in the prototype workbench and the software design issues that have been addressed to incorporate such a diverse range of models into a single software environment. In addition, we highlight our plans for developing the energyplex based workbench that will be developed in Year Two and Year Three.

4. Computer simulation of multibody dynamics and control

NASA Technical Reports Server (NTRS)

Swaminadham, M.; Moon, Young I.; Venkayya, V. B.

1990-01-01

The objective is to set up and analyze benchmark problems in multibody dynamics and to verify the predictions of two multibody computer simulation codes. TREETOPS and DISCOS were used to run three example problems: a one degree-of-freedom spring-mass-dashpot system, an inverted pendulum system, and a triple pendulum. To study the interaction of dynamics and control, an inverted planar pendulum with an external body force and a torsional control spring was modeled as a hinge-connected two-rigid-body system. TREETOPS and DISCOS each performed time-history simulations of this problem, and the system state-space variables and their time derivatives from the two codes were compared.

5. Computer simulation of plasma and N-body problems

NASA Technical Reports Server (NTRS)

Harries, W. L.; Miller, J. B.

1975-01-01

The following FORTRAN-language computer codes are presented: (1) efficient two- and three-dimensional central-force potential solvers; (2) a three-dimensional simulator of an isolated galaxy that incorporates the potential solver; (3) a two-dimensional particle-in-cell simulator of the Jeans instability in an infinite self-gravitating compressible gas; and (4) a two-dimensional particle-in-cell simulator of a rotating self-gravitating compressible gaseous system, of which rectangular-coordinate and polar-coordinate versions were written.

6. Multi-threaded, discrete event simulation of distributed computing systems

Legrand, Iosif; MONARC Collaboration

2001-10-01

The LHC experiments have envisaged computing systems of unprecedented complexity, for which it is necessary to provide a realistic description and modeling of data access patterns and of many jobs running concurrently on large-scale distributed systems while exchanging very large amounts of data. A process-oriented approach to discrete event simulation is well suited to describing the various activities that run concurrently, as well as the stochastic arrival patterns specific to this type of simulation. Threaded objects or "Active Objects" can provide a natural way to map the specific behaviour of distributed data processing into the simulation program. The simulation tool developed within MONARC is based on Java (TM) technology, which provides adequate tools for developing a flexible, distributed, process-oriented simulation. Proper graphics tools, and ways to analyze data interactively, are essential in any simulation project. The design elements, status, and features of the MONARC simulation tool are presented. The program allows realistic modeling of complex data access patterns by multiple concurrent users in large-scale computing systems across a wide range of possible architectures, from centralized to highly distributed. A comparison between queuing theory and realistic client-server measurements is also presented.

7. Decomposition strategies in the problems of simulation of additive laser technology processes

Khomenko, M. D.; Dubrov, A. V.; Mirzade, F. Kh.

2016-11-01

The development of additive technologies and their application in industry is associated with the possibility of predicting the final properties of a crystallized added material. This paper describes the problem characterized by a dynamic and spatially nonuniform computational complexity, which, in the case of uniform decomposition of a computational domain, leads to an unbalanced load on computational cores. The strategy of partitioning of the computational domain is used, which minimizes the CPU time losses in the serial computations of the additive technological process. The chosen strategy is optimal from the standpoint of a priori unknown dynamic computational load distribution. The scaling of the computational problem on the cluster of the Institute on Laser and Information Technologies (RAS) that uses the InfiniBand interconnect is determined. The use of the parallel code with optimal decomposition made it possible to significantly reduce the computational time (down to several hours), which is important in the context of development of the software package for support of engineering activity in the field of additive technology.

8. Teaching Physics (and Some Computation) Using Intentionally Incorrect Simulations

ERIC Educational Resources Information Center

Cox, Anne J.; Junkin, William F., III; Christian, Wolfgang; Belloni, Mario; Esquembre, Francisco

2011-01-01

Computer simulations are widely used in physics instruction because they can aid student visualization of abstract concepts, they can provide multiple representations of concepts (graphical, trajectories, charts), they can approximate real-world examples, and they can engage students interactively, all of which can enhance student understanding of…

9. Monte Carlo simulation by computer for life-cycle costing

NASA Technical Reports Server (NTRS)

Gralow, F. H.; Larson, W. J.

1969-01-01

Predicting behavior and support requirements over the entire life cycle of a system enables accurate cost estimates through computer-based Monte Carlo simulation. The approach reduces the ultimate cost to the procuring agency because it takes into consideration the costs of initial procurement, operation, and maintenance.
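
The core idea, repeatedly sampling uncertain operating and maintenance costs to build a distribution of total life-cycle cost, can be sketched as follows. All cost figures and distributions here are invented for illustration and are not taken from the report.

```python
import random

def life_cycle_cost(n_trials=10_000, seed=42):
    """Hypothetical Monte Carlo life-cycle cost: acquisition is fixed,
    while annual operation and maintenance costs over a 10-year service
    life are sampled from assumed distributions."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_trials):
        acquisition = 100_000.0
        # Annual operating cost: assumed normal, mean 8,000, sd 1,000.
        operation = sum(rng.gauss(8_000, 1_000) for _ in range(10))
        # Annual maintenance cost: assumed exponential, mean 2,000.
        maintenance = sum(rng.expovariate(1 / 2_000) for _ in range(10))
        totals.append(acquisition + operation + maintenance)
    mean = sum(totals) / n_trials
    return mean, min(totals), max(totals)

mean, low, high = life_cycle_cost()
print(f"mean total cost ≈ {mean:,.0f} (range {low:,.0f} to {high:,.0f})")
```

Besides the mean, the trial-by-trial totals give the procuring agency a spread of plausible outcomes, which is the practical payoff of the Monte Carlo approach over a single-point estimate.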

10. Learner Perceptions of Realism and Magic in Computer Simulations.

ERIC Educational Resources Information Center

Hennessy, Sara; O'Shea, Tim

1993-01-01

Discusses the possible lack of credibility in educational interactive computer simulations. Topics addressed include "Shopping on Mars," a collaborative adventure game for arithmetic calculation that uses direct manipulation in the microworld; the Alternative Reality Kit, a graphical animated environment for creating interactive…

11. Effectiveness of Computer Simulation for Enhancing Higher Order Thinking.

ERIC Educational Resources Information Center

Gokhale, Anu A.

1996-01-01

Electronics students (16 controls, 16 experimentals) designed, built, and tested an amplifier. The experimental group first designed the circuit through computer simulation (using Electronics Workbench software). The experimental group performed significantly better on problem-solving tests; both groups did the same on drill-and-practice tests. (SK)

12. Advanced Simulation and Computing Co-Design Strategy

SciTech Connect

Ang, James A.; Hoang, Thuc T.; Kelly, Suzanne M.; McPherson, Allen; Neely, Rob

2015-11-01

This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

13. A Computer Simulated Experiment in Complex Order Kinetics

ERIC Educational Resources Information Center

Merrill, J. C.; And Others

1975-01-01

Describes a computer simulation experiment in which physical chemistry students can determine all of the kinetic parameters of a reaction, such as order of the reaction with respect to each reagent, forward and reverse rate constants for the overall reaction, and forward and reverse activation energies. (MLH)

14. Robotics, Artificial Intelligence, Computer Simulation: Future Applications in Special Education.

ERIC Educational Resources Information Center

Moore, Gwendolyn B.; And Others

The report describes three advanced technologies--robotics, artificial intelligence, and computer simulation--and identifies the ways in which they might contribute to special education. A hybrid methodology was employed to identify existing technology and forecast future needs. Following this framework, each of the technologies is defined,…

15. Graphical Visualization on Computational Simulation Using Shared Memory

Lima, A. B.; Correa, Eberth

2014-03-01

The Shared Memory technique is a powerful tool for parallelizing computer codes. In particular, it can be used to visualize results "on the fly" without stopping the running simulation. In this presentation we discuss and show how to use the technique in conjunction with a visualization code based on OpenGL.

16. Accelerating sino-atrium computer simulations with graphic processing units.

PubMed

Zhang, Hong; Xiao, Zheng; Lin, Shien-fong

2015-01-01

Sino-atrial node cells (SANCs) play a significant role in rhythmic firing. To investigate their role in arrhythmia and interactions with the atrium, computer simulations based on cellular dynamic mathematical models are generally used. However, the large-scale computation usually makes research difficult, given the limited computational power of Central Processing Units (CPUs). In this paper, an accelerating approach with Graphic Processing Units (GPUs) is proposed in a simulation consisting of the SAN tissue and the adjoining atrium. By using the operator splitting method, the computational task was made parallel. Three parallelization strategies were then put forward. The strategy with the shortest running time was further optimized by considering block size, data transfer, and partition. The results showed that for a simulation with 500 SANCs and 30 atrial cells, the execution time taken by the non-optimized program decreased by 62% with respect to a serial program running on a CPU. The execution time decreased by 80% after the program was optimized. The larger the tissue was, the more significant the acceleration became. The results demonstrated the effectiveness of the proposed GPU-accelerating methods and their promising applications in more complicated biological simulations.
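
The operator-splitting idea, advancing the pointwise reaction (cell) term separately from the diffusion (coupling) term so that the reaction step parallelizes trivially across cells, can be sketched in NumPy. This is a toy 1-D bistable excitation model, not the paper's SAN/atrial cell equations.

```python
import numpy as np

def step_split(v, dt, dx, D, reaction):
    """One operator-splitting step: first the local reaction sub-step
    (pointwise, hence embarrassingly parallel on a GPU), then an
    explicit finite-difference diffusion sub-step."""
    v = v + dt * reaction(v)                     # reaction sub-step
    lap = np.zeros_like(v)
    lap[1:-1] = (v[2:] - 2 * v[1:-1] + v[:-2]) / dx**2
    return v + dt * D * lap                      # diffusion sub-step

# Toy bistable reaction term; real SAN models carry many gating variables.
f = lambda v: v * (1 - v) * (v - 0.1)

v = np.zeros(100)
v[:10] = 1.0                                     # excited region at one end
for _ in range(200):
    v = step_split(v, dt=0.01, dx=1.0, D=1.0, reaction=f)
```

The split structure is what maps naturally to GPU kernels: the reaction update touches each cell independently, while the diffusion stencil needs only nearest-neighbor communication.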

17. Time Advice and Learning Questions in Computer Simulations

ERIC Educational Resources Information Center

Rey, Gunter Daniel

2011-01-01

Students (N = 101) used an introductory text and a computer simulation to learn fundamental concepts about statistical analyses (e.g., analysis of variance, regression analysis and General Linear Model). Each learner was randomly assigned to one cell of a 2 (with or without time advice) x 3 (with learning questions and corrective feedback, with…

18. Computer Simulation and Laboratory Work in the Teaching of Mechanics.

ERIC Educational Resources Information Center

Borghi, L.; And Others

1987-01-01

Describes a teaching strategy designed to help high school students learn mechanics by involving them in simple experimental work, observing didactic films, running computer simulations, and executing more complex laboratory experiments. Provides an example of the strategy as it is applied to the topic of projectile motion. (TW)

19. Computers With Wings: Flight Simulation and Personalized Landscapes

Oss, Stefano

2005-03-01

We propose, as a special way to explore the physics of flying objects, the use of a flight simulator with personalized scenery that reproduces the territory where students live. This approach increases students' participation in and attention to physics classes, and it also creates several opportunities for side activities and discussions of various kinds, from history to geography, computer science, and much more.

20. Improving a Computer Networks Course Using the Partov Simulation Engine

ERIC Educational Resources Information Center

Momeni, B.; Kharrazi, M.

2012-01-01

Computer networks courses are hard to teach as there are many details in the protocols and techniques involved that are difficult to grasp. Employing programming assignments as part of the course helps students to obtain a better understanding and gain further insight into the theoretical lectures. In this paper, the Partov simulation engine and…

1. Interview and Interrogation Training using a Computer-Simulated Subject.

ERIC Educational Resources Information Center

Olsen, Dale E.

Interactive, multimedia software involving a simulated subject has been created to help trainees develop interview and interrogation techniques using personal computers, because practice interviews are not always realistic and are too expensive. New and experienced law enforcement agents, among others, need such extensive training in techniques…

2. Biology Students Building Computer Simulations Using StarLogo TNG

ERIC Educational Resources Information Center

Smith, V. Anne; Duncan, Ishbel

2011-01-01

Confidence is an important issue for biology students in handling computational concepts. This paper describes a practical in which honours-level bioscience students simulate complex animal behaviour using StarLogo TNG, a freely-available graphical programming environment. The practical consists of two sessions, the first of which guides students…

3. Computer simulations and neutron reflectivity of proteins at interfaces.

PubMed

Mungikar, Amol A; Forciniti, Daniel

2002-12-16

Computer simulations in conjunction with neutron reflectivity are an excellent combination for the study of biological materials at solid-liquid interfaces: both techniques have excellent resolution (of the order of ångströms) and both are mature. A stronger interaction between physicists and biologists will allow the use of these two approaches in topics of biological and biomedical interest.

4. Social Choice in a Computer-Assisted Simulation

ERIC Educational Resources Information Center

Thavikulwat, Precha

2009-01-01

Pursuing a line of inquiry suggested by Crookall, Martin, Saunders, and Coote, the author applied, within the framework of design science, an optimal-design approach to incorporate into a computer-assisted simulation two innovative social choice processes: the multiple period double auction and continuous voting. Expectations that the…

5. Computational Simulation of a Water-Cooled Heat Pump

NASA Technical Reports Server (NTRS)

Bozarth, Duane

2008-01-01

A Fortran-language computer program for simulating the operation of a water-cooled vapor-compression heat pump in any orientation with respect to gravity has been developed by modifying a prior general-purpose heat-pump design code used at Oak Ridge National Laboratory (ORNL).

6. Robotics, Artificial Intelligence, Computer Simulation: Future Applications in Special Education.

ERIC Educational Resources Information Center

Moore, Gwendolyn B.; And Others

1986-01-01

Describes possible applications of new technologies to special education. Discusses results of a study designed to explore the use of robotics, artificial intelligence, and computer simulations to aid people with handicapping conditions. Presents several scenarios in which specific technological advances may contribute to special education…

7. Computational Modelling and Simulation Fostering New Approaches in Learning Probability

ERIC Educational Resources Information Center

Kuhn, Markus; Hoppe, Ulrich; Lingnau, Andreas; Wichmann, Astrid

2006-01-01

Discovery learning in mathematics in the domain of probability based on hands-on experiments is normally limited because of the difficulty in providing sufficient materials and data volume in terms of repetitions of the experiments. Our cooperative, computational modelling and simulation environment engages students and teachers in composing and…

8. Computational simulation of periodic thermal processes in the roof deck

Stastnik, S.

2016-06-01

The climate changes in Central Europe highlight the importance of protecting buildings against overheating during the summer season, with the roof deck playing a crucial thermal insulation and accumulation role. This paper demonstrates the possibility of computational simulation of such periodic thermal processes, applying the evaluation of thermal attenuation using complex arithmetic, in comparison with real experimental data.

ERIC Educational Resources Information Center

Rey, Gunter Daniel

2010-01-01

Undergraduate students (N = 97) used an introductory text and a computer simulation to learn fundamental concepts about statistical analyses (e.g., analysis of variance, regression analysis and General Linear Model). Each learner was randomly assigned to one cell of a 2 (with or without instructional advice) x 2 (with or without time advice) x 2…

10. Comparison of Computer-Simulated Conventional and Branching Tests.

ERIC Educational Resources Information Center

Waters, Carrie W.

As part of a continuing research program to improve screening of enlisted men, this study was undertaken to compare a variety of computer-simulated conventional and branching tests and to extend the theoretical analysis of branching techniques. The degree of coordination between test scores and underlying ability is used to compare conventional…

11. Computer Simulation of Small Group Decisions: Model Three.

ERIC Educational Resources Information Center

Hare, A.P.; Scheiblechner, Hartmann

In a test of three computer models to simulate group decisions, data were used from 31 American and Austrian groups on a total of 307 trials. The task for each group was to predict a series of answers of an unknown subject on a value-orientation questionnaire, after being given a sample of his typical responses. The first model used the mean of…

12. Simulations Using a Computer/Videodisc System: Instructional Design Considerations.

ERIC Educational Resources Information Center

Ehrlich, Lisa R.

Instructional design considerations involved in using level four videodisc systems when designing simulations are explored. Discussion of the hardware and software system characteristics notes that computer based training offers the features of text, graphics, color, animation, and highlighting techniques, while a videodisc player offers all of…

13. Interval sampling methods and measurement error: a computer simulation.

PubMed

Wirth, Oliver; Slaven, James; Taylor, Matthew A

2014-01-01

A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments.
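
The three methods and their known biases can be illustrated with a small simulation. This is a simplified sketch with independent 1-second occupancy samples, not the study's event-duration model: partial-interval recording tends to overestimate time-occupied, whole-interval recording to underestimate it, and momentary time sampling to be roughly unbiased.

```python
import random

def simulate_interval_methods(p_occupied=0.3, session=600, interval=10, seed=1):
    """Score one simulated observation session with three interval
    sampling methods and return each estimate of time-occupied."""
    rng = random.Random(seed)
    occupancy = [rng.random() < p_occupied for _ in range(session)]
    true_prop = sum(occupancy) / session
    n = session // interval
    blocks = [occupancy[i * interval:(i + 1) * interval] for i in range(n)]
    pir = sum(any(b) for b in blocks) / n   # partial-interval: any occurrence scores
    wir = sum(all(b) for b in blocks) / n   # whole-interval: only full occupancy scores
    mts = sum(b[-1] for b in blocks) / n    # momentary: sample the final instant
    return true_prop, pir, wir, mts

true_prop, pir, wir, mts = simulate_interval_methods()
print(f"true={true_prop:.2f}  PIR={pir:.2f}  WIR={wir:.2f}  MTS={mts:.2f}")
```

Repeating such runs across combinations of interval duration and event prevalence is essentially how error tables of the kind described above are produced.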

14. Symbolic Quantum Computation Simulation in SymPy

Cugini, Addison; Curry, Matt; Granger, Brian

2010-10-01

Quantum computing is an emerging field which aims to use quantum mechanics to solve difficult computational problems with greater efficiency than a classical computer. There is a need for software that (i) helps newcomers learn the field, (ii) enables practitioners to design and simulate quantum circuits, and (iii) provides an open foundation for further research in the field. Toward these ends we have created a package, in the open-source symbolic computation library SymPy, that simulates the quantum circuit model of quantum computation using Dirac notation. The framework builds on the extant powerful symbolic capabilities of SymPy to perform its simulations in a fully symbolic manner. We use object-oriented design to abstract circuits as ordered collections of quantum gate and qubit objects. The gate objects can either be applied directly to the qubit objects or be represented as matrices in different bases. The package is also capable of performing the quantum Fourier transform and Shor's algorithm. A notion of measurement is made possible through the use of a non-commutative gate object. In this talk, we describe the software and show examples of quantum circuits on single- and multi-qubit states that involve common algorithms, gates, and measurements.
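
A minimal sketch of this kind of symbolic circuit manipulation, using the `sympy.physics.quantum` subpackage as it exists in recent SymPy releases (not the authors' original demonstration):

```python
from sympy import sqrt
from sympy.physics.quantum.qubit import Qubit, measure_all
from sympy.physics.quantum.qapply import qapply
from sympy.physics.quantum.gate import H, CNOT

# Build a two-qubit Bell state: Hadamard on qubit 1, then a CNOT
# with qubit 1 as control and qubit 0 as target.
state = qapply(CNOT(1, 0) * H(1) * Qubit('00'))
print(state)  # a Bell state, sqrt(2)/2 * (|00> + |11>) in Dirac notation

# Symbolic measurement returns (outcome state, probability) pairs.
for outcome, prob in measure_all(state):
    print(outcome, prob)
```

Because the amplitudes stay symbolic (exact `sqrt(2)/2` coefficients rather than floats), the same machinery composes with the rest of SymPy for simplification and matrix representations.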

15. A Computational Framework for Efficient Low Temperature Plasma Simulations

Verma, Abhishek Kumar; Venkattraman, Ayyaswamy

2016-10-01

Over the past years, scientific computing has emerged as an essential tool for the investigation and prediction of low-temperature plasma (LTP) applications, which include electronics, nanomaterial synthesis, metamaterials, etc. To further explore LTP behavior with greater fidelity, we present a computational toolbox developed to perform LTP simulations. This framework allows us to enhance our understanding of multiscale plasma phenomena using high-performance computing tools based mainly on the OpenFOAM FVM distribution. Although aimed at microplasma simulations, the modular framework is able to perform multiscale, multiphysics simulations of physical systems comprising LTPs. Salient features include the capability to perform parallel, 3D simulations of LTP applications on unstructured meshes. The performance of the solver is assessed, in both accuracy and efficiency, against benchmark problems in microdischarge devices. Numerical simulation of a microplasma reactor at atmospheric pressure with hemispherical dielectric-coated electrodes will be discussed, providing an overview of the applicability and future scope of this framework.

16. Bibliography for Verification and Validation in Computational Simulations

SciTech Connect

Oberkampf, W.L.

1998-10-01

A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

17. A decade of computer simulations for Space Shuttle aerodynamics

NASA Technical Reports Server (NTRS)

Inouye, Mamoru

1988-01-01

Ten years ago, computer simulations of the flow field around the Space Shuttle Orbiter were limited to inviscid calculations for the windward side of the forebody and viscous calculations for selected 2-D problems. Advances in computer hardware and numerical methods during the past ten years have made it possible to calculate viscous flow over the complete orbiter configuration at angle of attack. The equations solved are the Reynolds-averaged Navier-Stokes equations, simplified by either the thin-layer or the parabolized approximation. An algebraic eddy viscosity model is used for turbulent flow. The free stream is assumed to be a perfect gas for wind tunnel conditions and a real gas in thermodynamic equilibrium for flight conditions. Four examples of recent computer simulations are presented. Flow field results include oil flow patterns on the surface and Mach number contours, isobars, and cross-flow velocity vectors in the shock layer.

19. The DYNAMO Simulation Language--An Alternate Approach to Computer Science Education.

ERIC Educational Resources Information Center

Bronson, Richard

1986-01-01

Suggests the use of computer simulation of continuous systems as a problem solving approach to computer languages. Outlines the procedures that the system dynamics approach employs in computer simulations. Explains the advantages of the special purpose language, DYNAMO. (ML)

20. Simulation of emission tomography using grid middleware for distributed computing.

PubMed

Thomason, M G; Longton, R F; Gregor, J; Smith, G T; Hutson, R K

2004-09-01

SimSET is Monte Carlo simulation software for emission tomography. This paper describes a simple but effective scheme for parallel execution of SimSET using NetSolve, a client-server system for distributed computation. NetSolve (version 1.4.1) is "grid middleware" which enables a user (the client) to run specific computations remotely and simultaneously on a grid of networked computers (the servers). Since the servers do not have to be identical machines, computation may take place in a heterogeneous environment. To take advantage of diversity in machines and their workloads, a client-side scheduler was implemented for the Monte Carlo simulation. The scheduler partitions the total decay events by taking into account the inherent compute-speeds and recent average workloads, i.e., the scheduler assigns more decay events to processors expected to give faster service and fewer decay events to those expected to give slower service. When compute-speeds and sustained workloads are taken into account, the speed-up is essentially linear in the number of equivalent "maximum-service" processors. One modification in the SimSET code (version 2.6.2.3) was made to ensure that the total number of decay events specified by the user is maintained in the distributed simulation. No other modifications in the standard SimSET code were made. Each processor runs complete SimSET code for its assignment of decay events, independently of others running simultaneously. Empirical results are reported for simulation of a clinical-quality lung perfusion study.
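
The client-side scheduling idea, assigning more decay events to servers expected to give faster service, amounts to a proportional partition. In this sketch the weighting `speed / (1 + load)` is an invented stand-in for NetSolve's actual server metrics:

```python
def partition_events(total_events, speeds, loads):
    """Split a total number of decay events across servers in
    proportion to each server's effective service rate
    (hypothetical weighting: compute speed discounted by workload)."""
    rates = [s / (1.0 + w) for s, w in zip(speeds, loads)]
    total_rate = sum(rates)
    shares = [int(total_events * r / total_rate) for r in rates]
    # Give any rounding remainder to the fastest server so the total
    # number of decay events is preserved exactly, as in the paper.
    shares[rates.index(max(rates))] += total_events - sum(shares)
    return shares

# One fast idle server, one slow idle server, one slow busy server.
shares = partition_events(1_000_000, speeds=[2.0, 1.0, 1.0], loads=[0.0, 0.0, 1.0])
print(shares)
```

Each server then runs the complete simulation code on its share independently, which is why the speed-up scales with the number of equivalent maximum-service processors.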

1. Multi-Rate Digital Control Systems with Simulation Applications. Volume II. Computer Algorithms

DTIC Science & Technology

1980-09-01

AFWAL-TR-80-3101, Volume II. Multi-Rate Digital Control Systems with Simulation Applications; Volume II: Computer Algorithms. … additional options. The analytical basis for the computer algorithms is discussed in Ref. 12. However, to provide a complete description of the program, some

2. Computer Simulation of Cellular Patterning Within the Drosophila Pupal Eye

PubMed Central

Swat, Maciej; Cordero, Julia B.; Glazier, James A.; Cagan, Ross L.

2010-01-01

We present a computer simulation and associated experimental validation of assembly of glial-like support cells into the interweaving hexagonal lattice that spans the Drosophila pupal eye. This process of cell movements organizes the ommatidial array into a functional pattern. Unlike earlier simulations that focused on the arrangements of cells within individual ommatidia, here we examine the local movements that lead to large-scale organization of the emerging eye field. Simulations based on our experimental observations of cell adhesion, cell death, and cell movement successfully patterned a tracing of an emerging wild-type pupal eye. Surprisingly, altering cell adhesion had only a mild effect on patterning, contradicting our previous hypothesis that the patterning was primarily the result of preferential adhesion between IRM-class surface proteins. Instead, our simulations highlighted the importance of programmed cell death (PCD) as well as a previously unappreciated variable: the expansion of cells' apical surface areas, which promoted rearrangement of neighboring cells. We tested this prediction experimentally by preventing expansion in the apical area of individual cells: patterning was disrupted in a manner predicted by our simulations. Our work demonstrates the value of combining computer simulation with in vivo experiments to uncover novel mechanisms that are perpetuated throughout the eye field. It also demonstrates the utility of the Glazier–Graner–Hogeweg model (GGH) for modeling the links between local cellular interactions and emergent properties of developing epithelia as well as predicting unanticipated results in vivo. PMID:20617161

3. An Efficient Multi-Scale Simulation Architecture for the Prediction of Performance Metrics of Parts Fabricated Using Additive Manufacturing

Pal, Deepankar; Patil, Nachiket; Zeng, Kai; Teng, Chong; Stucker, Brent

2015-09-01

In this study, an overview of the computational tools developed in the area of metal-based additive manufacturing (AM) to simulate performance metrics, along with their experimental validations, will be presented. The performance metrics of AM-fabricated parts, such as the inter- and intra-layer strengths, can be characterized in terms of the melt pool dimensions, solidification times, cooling rates, granular microstructure, and phase morphologies, along with defect distributions, which are a function of the energy source, scan pattern(s), and the material(s). The major areas of AM simulation included in this study are thermo-mechanical constitutive relationships during fabrication and in service, the use of Euler angles for gaging static and dynamic strengths, algorithms making intelligent use of matrix algebra and homogenization to extract the spatiotemporal nature of these processes, a fast GPU architecture, and specific challenges targeted toward attaining faster-than-real-time simulation efficiency and accuracy.

4. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

Sharma, Gulshan B.; Robertson, Douglas D.

2013-07-01

Shoulder arthroplasty success has been attributed to many factors including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design can withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study the stresses and strains produced in implants and bone. However, these static analyses capture only a moment in time, not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally by simulating bone remodeling using an intact human scapula: the scapular bone material properties were initially reset to be uniform, sequential loading was numerically simulated, and the bone remodeling simulation results were compared to the actual scapula's material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint loads and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element's remodeling stimulus was compared to its corresponding reference stimulus and its material properties were modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent densities were plotted and compared. The locations of high and low predicted bone density were comparable to the actual specimen. High predicted bone density was greater than actual
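The iterate-compare-update loop described above can be condensed to a few lines. The sketch below uses a Huiskes-style strain-energy-density rule as a stand-in: the paper's exact stimulus definition, rate constant, and density bounds are not stated here, so all values are illustrative.

```python
import numpy as np

def remodel(density, sed, k_ref, B=1.0, dt=1.0, rho_min=0.01, rho_max=1.74):
    """One iteration of a strain-energy-density driven remodeling rule
    (Huiskes-type sketch; illustrative, not the paper's exact formulation).
    density : apparent density per element [g/cm^3]
    sed     : strain-energy density per element from the FE solve
    k_ref   : reference (site-specific) stimulus per element
    """
    stimulus = sed / density                 # remodeling stimulus U / rho
    d_rho = B * (stimulus - k_ref) * dt      # drive density toward the reference
    # the elastic modulus for the next FE solve would follow, e.g. E = c * rho**gamma
    return np.clip(density + d_rho, rho_min, rho_max)
```

Running this update after each FE solve until the stimulus matches the reference everywhere is the convergence criterion the abstract describes.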

5. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

SciTech Connect

Sharma, Gulshan B.; Robertson, Douglas D.

2013-07-01

Shoulder arthroplasty success has been attributed to many factors including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design can withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study the stresses and strains produced in implants and bone. However, these static analyses capture only a moment in time, not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally by simulating bone remodeling using an intact human scapula: the scapular bone material properties were initially reset to be uniform, sequential loading was numerically simulated, and the bone remodeling simulation results were compared to the actual scapula’s material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint loads and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element’s remodeling stimulus was compared to its corresponding reference stimulus and its material properties were modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent densities were plotted and compared. The locations of high and low predicted bone density were comparable to the actual specimen. High predicted bone density was greater than

6. Additive Manufacturing Modeling and Simulation A Literature Review for Electron Beam Free Form Fabrication

NASA Technical Reports Server (NTRS)

Seufzer, William J.

2014-01-01

Additive manufacturing is coming into industrial use and has several desirable attributes. Control of the deposition remains a complex challenge, and so this literature review was initiated to capture current modeling efforts in the field of additive manufacturing. This paper summarizes about 10 years of modeling and simulation related to both welding and additive manufacturing. The goals were to learn who is doing what in modeling and simulation, to summarize various approaches taken to create models, and to identify research gaps. Later sections in the report summarize implications for closed-loop-control of the process, implications for local research efforts, and implications for local modeling efforts.

7. Computer-intensive simulation of solid-state NMR experiments using SIMPSON

Tošner, Zdeněk; Andersen, Rasmus; Stevensson, Baltzar; Edén, Mattias; Nielsen, Niels Chr.; Vosegaard, Thomas

2014-09-01

Conducting large-scale solid-state NMR simulations requires fast computer software, potentially in combination with efficient computational resources, to complete within a reasonable time frame. Such simulations may involve large spin systems, multiple-parameter fitting of experimental spectra, or multiple-pulse experiment design using parameter scans, non-linear optimization, or optimal control procedures. To efficiently accommodate such simulations, we here present an improved version of the widely distributed open-source SIMPSON NMR simulation software package adapted to contemporary high-performance hardware setups. The software is optimized for fast performance on standard stand-alone computers, multi-core processors, and large clusters of identical nodes. We describe the novel features for fast computation, including internal matrix manipulations, propagator setups, and acquisition strategies. For efficient calculation of powder averages, we implemented the interpolation method of Alderman, Solum, and Grant, as well as the recently introduced fast Wigner transform interpolation technique. The potential of the optimal control toolbox is greatly enhanced by higher-precision gradients in combination with the efficient optimization algorithm known as limited-memory Broyden-Fletcher-Goldfarb-Shanno. In addition, advanced parallelization can be used in all types of calculations, providing significant time reductions. SIMPSON thus reflects current knowledge in the field of numerical simulations of solid-state NMR experiments. The efficiency and novel features are demonstrated on representative simulations.

8. Computer-intensive simulation of solid-state NMR experiments using SIMPSON.

PubMed

Tošner, Zdeněk; Andersen, Rasmus; Stevensson, Baltzar; Edén, Mattias; Nielsen, Niels Chr; Vosegaard, Thomas

2014-09-01

Conducting large-scale solid-state NMR simulations requires fast computer software, potentially in combination with efficient computational resources, to complete within a reasonable time frame. Such simulations may involve large spin systems, multiple-parameter fitting of experimental spectra, or multiple-pulse experiment design using parameter scans, non-linear optimization, or optimal control procedures. To efficiently accommodate such simulations, we here present an improved version of the widely distributed open-source SIMPSON NMR simulation software package adapted to contemporary high-performance hardware setups. The software is optimized for fast performance on standard stand-alone computers, multi-core processors, and large clusters of identical nodes. We describe the novel features for fast computation, including internal matrix manipulations, propagator setups, and acquisition strategies. For efficient calculation of powder averages, we implemented the interpolation method of Alderman, Solum, and Grant, as well as the recently introduced fast Wigner transform interpolation technique. The potential of the optimal control toolbox is greatly enhanced by higher-precision gradients in combination with the efficient optimization algorithm known as limited-memory Broyden-Fletcher-Goldfarb-Shanno. In addition, advanced parallelization can be used in all types of calculations, providing significant time reductions. SIMPSON thus reflects current knowledge in the field of numerical simulations of solid-state NMR experiments. The efficiency and novel features are demonstrated on representative simulations.

9. Efficient simulation of open quantum system in duality quantum computing

Wei, Shi-Jie; Long, Gui-Lu

2016-11-01

Practical quantum systems are open systems due to interactions with their environment. Understanding the evolution of open-system dynamics is important for characterizing quantum noise processes, designing quantum error correcting codes, and performing simulations of open quantum systems. Here we propose an efficient quantum algorithm for simulating the evolution of an open quantum system on a duality quantum computer. In contrast to unitary evolution in a usual quantum computer, the evolution operator in a duality quantum computer is a linear combination of unitary operators. In this duality algorithm, the time evolution of the open quantum system is realized using Kraus operators, which are naturally implemented in duality quantum computing. Compared to Lloyd's quantum algorithm [Science 273, 1073 (1996)], the dependence on the dimension of the open quantum system in our algorithm is decreased. Moreover, our algorithm uses a truncated Taylor series of the evolution operators, exponentially improving the precision compared with existing quantum simulation algorithms that use unitary evolution operations.
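The Kraus form at the heart of the algorithm is easy to state concretely. The sketch below applies a standard amplitude-damping channel to a density matrix; it illustrates only the map ρ → Σₖ Kₖ ρ Kₖ†, not the duality-computer encoding itself, and the channel and decay probability are textbook examples rather than anything from the paper.

```python
import numpy as np

def apply_kraus(rho, kraus_ops):
    """Evolve a density matrix through a quantum channel in Kraus form:
    rho' = sum_k K_k rho K_k^dagger."""
    return sum(K @ rho @ K.conj().T for K in kraus_ops)

# amplitude-damping channel with decay probability gamma (textbook example)
gamma = 0.3
K0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1 - gamma)]])
K1 = np.array([[0.0, np.sqrt(gamma)], [0.0, 0.0]])

rho = np.array([[0.0, 0.0], [0.0, 1.0]])   # excited state |1><1|
rho_out = apply_kraus(rho, [K0, K1])       # population decays 1 -> 1-gamma
```

The completeness relation Σₖ Kₖ†Kₖ = I guarantees the output trace stays 1, which is the property the duality computer's linear combination of unitaries must preserve.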

10. Computational simulation for analysis and synthesis of impact resilient structure

Djojodihardjo, Harijono

2013-10-01

Impact resilient structures are of great interest in many engineering applications, varying from civil, land vehicle, aircraft, and space structures, to mention a few examples. To design such structures, one has to resort to fundamental principles and take into account progress in analytical and computational approaches as well as in materials science and technology. With such perspectives, this work looks at a generic beam and plate structure subject to impact loading and carries out analysis and numerical simulation. The first objective of the work is to develop a computational algorithm to analyze a flat plate, as a generic structure, subjected to impact loading for numerical simulation and parametric study. The analysis is based on dynamic response analysis. Consideration is given to the elastic-plastic region. The second objective is to utilize the computational algorithm for direct numerical simulation and, as a parallel scheme, a commercial off-the-shelf numerical code is utilized for parametric study, optimization, and synthesis. Through such analysis and numerical simulation, effort is devoted to arriving at an optimum configuration in terms of loading, structural dimensions, material properties, and composite lay-up, among others. Results are discussed in view of practical applications.

11. Cosmic reionization on computers. I. Design and calibration of simulations

SciTech Connect

Gnedin, Nickolay Y.

2014-09-20

Cosmic Reionization On Computers is a long-term program of numerical simulations of cosmic reionization. Its goal is to model fully self-consistently (albeit not necessarily from the first principles) all relevant physics, from radiative transfer to gas dynamics and star formation, in simulation volumes of up to 100 comoving Mpc, and with spatial resolution approaching 100 pc in physical units. In this method paper, we describe our numerical method, the design of simulations, and the calibration of numerical parameters. Using several sets (ensembles) of simulations in 20 h⁻¹ Mpc and 40 h⁻¹ Mpc boxes with spatial resolution reaching 125 pc at z = 6, we are able to match the observed galaxy UV luminosity functions at all redshifts between 6 and 10, as well as obtain reasonable agreement with the observational measurements of the Gunn-Peterson optical depth at z < 6.

12. 1988 Annual Summer Computer Simulation Conference, 20th, Seattle, WA, July 25-28, 1988, Proceedings

SciTech Connect

Barnett, C.C.; Holmes, W.M.

1988-01-01

The conference presents papers on simulation methods; computer systems; credibility and validation; simulation verification of computer performance; radar and communications systems simulation; physical, chemical, and engineering sciences; and parallel processing and concurrent simulation. Consideration is also given to biomedical simulation, energy and environmental sciences, training and research simulators, intelligent simulation environments, and simulation of discrete systems. Other topics include space and transportation systems, missile and aircraft systems simulation, and SDI.

13. Computational Aerodynamic Simulations of a Spacecraft Cabin Ventilation Fan Design

NASA Technical Reports Server (NTRS)

Tweedt, Daniel L.

2010-01-01

Quieter working environments for astronauts are needed if future long-duration space exploration missions are to be safe and productive. Ventilation and payload cooling fans are known to be dominant sources of noise, with the International Space Station being a good case in point. To address this issue cost effectively, early attention to fan design, selection, and installation has been recommended, leading to an effort by NASA to examine the potential for small-fan noise reduction by improving fan aerodynamic design. As a preliminary part of that effort, the aerodynamics of a cabin ventilation fan designed by Hamilton Sundstrand has been simulated using computational fluid dynamics codes, and the computed solutions analyzed to quantify various aspects of the fan aerodynamics and performance. Four simulations were performed at the design rotational speed: two at the design flow rate and two at off-design flow rates. Following a brief discussion of the computational codes, various aerodynamic- and performance-related quantities derived from the computed flow fields are presented along with relevant flow field details. The results show that the computed fan performance is in generally good agreement with stated design goals.

14. Development of magnetron sputtering simulator with GPU parallel computing

Sohn, Ilyoup; Kim, Jihun; Bae, Junkyeong; Lee, Jinpil

2014-12-01

Sputtering devices are widely used in the semiconductor and display panel manufacturing process. Currently, a number of surface treatment applications use magnetron sputtering techniques to improve the efficiency of the sputtering process through the installation of magnets outside the vacuum chamber. Within the internal space of the low-pressure chamber, plasma generated from the combination of a rarefied gas and an electric field evolves interactively. Since the quality of the sputtering and the deposition rate on the substrate are strongly dependent on the multi-physical phenomena of the plasma regime, numerical simulations using PIC-MCC (Particle In Cell, Monte Carlo Collision) should be employed to develop an efficient sputtering device. In this paper, the development of a magnetron sputtering simulator based on the PIC-MCC method and the associated numerical techniques are discussed. To solve the electric field equations in the 2-D Cartesian domain, a Poisson equation solver based on the FDM (Finite Difference Method) is developed and coupled with the Monte Carlo Collision method to simulate the motion of gas particles influenced by an electric field. The magnetic field created by the permanent magnet installed outside the vacuum chamber is also numerically calculated using the Biot-Savart law. All numerical methods employed in the present PIC code are validated by comparison with analytical and well-known commercial engineering software results, with all of the results showing good agreement. Finally, the developed PIC-MCC code is parallelized for general-purpose computing on graphics processing units (GPGPU) acceleration, so as to reduce the large computation time generally required for particle simulations. The efficiency and accuracy of the GPGPU-parallelized magnetron sputtering simulator are examined by comparison with the calculated results and computation times from the original serial code. It is found that
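The magnet-field step is a direct Biot-Savart integration. As a self-contained check of that kind of calculation, the sketch below discretizes a single circular current loop and evaluates the field on its axis, where a closed-form answer exists; the loop radius and current are arbitrary stand-ins for the actual magnet assembly.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability [T*m/A]

def loop_field_on_axis(I, R, z, n=2000):
    """Numerically integrate the Biot-Savart law for a circular current loop
    of radius R carrying current I, evaluated on the loop axis at height z.
    The discretization count n is arbitrary."""
    phi = np.linspace(0, 2 * np.pi, n, endpoint=False)
    # positions of the loop elements and their tangent vectors dl
    pts = np.stack([R * np.cos(phi), R * np.sin(phi), np.zeros(n)], axis=1)
    dl = np.stack([-R * np.sin(phi), R * np.cos(phi), np.zeros(n)], axis=1) * (2 * np.pi / n)
    r = np.array([0.0, 0.0, z]) - pts
    dB = MU0 * I / (4 * np.pi) * np.cross(dl, r) / np.linalg.norm(r, axis=1, keepdims=True) ** 3
    return dB.sum(axis=0)
```

On the axis the result should reproduce the textbook formula μ₀IR²/(2(R²+z²)^{3/2}), with the transverse components cancelling by symmetry.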

15. Performance predictions for solar-chemical convertors by computer simulation

Luttmer, J. D.; Trachtenberg, I.

1985-08-01

A computer model which simulates the operation of the Texas Instruments solar-chemical convertor (SCC) was developed. The model allows optimization of SCC processes, materials, and configuration by facilitating decisions on trade-offs among ease of manufacturing, power conversion efficiency, and cost effectiveness. The model includes various algorithms which define the electrical, electrochemical, and resistance parameters and which describe the operation of the discrete components of the SCC. Results of the model which depict the effect of material and geometric changes on various parameters are presented. The computer-calculated operation is compared with experimentally observed hydrobromic acid electrolysis rates.

16. Computational simulations and experimental validation of a furnace brazing process

SciTech Connect

Hosking, F.M.; Gianoulakis, S.E.; Malizia, L.A.

1998-12-31

Modeling of a furnace brazing process is described. The computational tools predict the thermal response of loaded hardware in a hydrogen brazing furnace to programmed furnace profiles. Experiments were conducted to validate the model and resolve computational uncertainties. Critical boundary conditions that affect materials and processing response to the furnace environment were determined. "Global" and local issues (i.e., at the furnace/hardware and joint levels, respectively) are discussed. The ability to accurately simulate and control furnace conditions is examined.

17. A distributed computing tool for generating neural simulation databases.

PubMed

Calin-Jageman, Robert J; Katz, Paul S

2006-12-01

After developing a model neuron or network, it is important to systematically explore its behavior across a wide range of parameter values or experimental conditions, or both. However, compiling a very large set of simulation runs is challenging because it typically requires both access to and expertise with high-performance computing facilities. To lower the barrier for large-scale model analysis, we have developed NeuronPM, a client/server application that creates a "screen-saver" cluster for running simulations in NEURON (Hines & Carnevale, 1997). NeuronPM provides a user-friendly way to use existing computing resources to catalog the performance of a neural simulation across a wide range of parameter values and experimental conditions. The NeuronPM client is a Windows-based screen saver, and the NeuronPM server can be hosted on any Apache/PHP/MySQL server. During idle time, the client retrieves model files and work assignments from the server, invokes NEURON to run the simulation, and returns results to the server. Administrative panels make it simple to upload model files, define the parameters and conditions to vary, and then monitor client status and work progress. NeuronPM is open-source freeware and is available for download at http://neuronpm.homeip.net. It is a useful entry-level tool for systematically analyzing complex neuron and network simulations.

18. Definition and computation of intermolecular contact in liquids using additively weighted Voronoi tessellation.

PubMed

Isele-Holder, Rolf E; Rabideau, Brooks D; Ismail, Ahmed E

2012-05-10

We present a definition of intermolecular surface contact by applying weighted Voronoi tessellations to configurations of various organic liquids and water obtained from molecular dynamics simulations. This definition of surface contact is used to link the COSMO-RS model and molecular dynamics simulations. We demonstrate that additively weighted tessellation is the superior tessellation type to define intermolecular surface contact. Furthermore, we fit a set of weights for the elements C, H, O, N, F, and S for this tessellation type to obtain optimal agreement between the models. We use these radii to successfully predict contact statistics for compounds that were excluded from the fit and mixtures. The observed agreement between contact statistics from COSMO-RS and molecular dynamics simulations confirms the capability of the presented method to describe intermolecular contact. Furthermore, we observe that increasing polarity of the surfaces of the examined molecules leads to weaker agreement in the contact statistics. This is especially pronounced for pure water.
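The additively weighted tessellation used here differs from the ordinary Voronoi diagram only in the distance it minimizes: |x − sᵢ| − wᵢ rather than |x − sᵢ|. A minimal sketch of assigning sample points to cells follows; the site positions and weights are made up, with the paper's fitted per-element radii playing the role of w.

```python
import numpy as np

def aw_voronoi_assign(points, sites, weights):
    """Assign each query point to its additively weighted Voronoi cell:
    site i wins point x when |x - s_i| - w_i is minimal.
    points  : (m, d) query coordinates
    sites   : (k, d) site coordinates (e.g. atom positions)
    weights : (k,) additive weights (e.g. fitted element radii)
    """
    d = np.linalg.norm(points[:, None, :] - sites[None, :, :], axis=2)
    return np.argmin(d - weights[None, :], axis=1)
```

A point near the midline between two sites is claimed by the more heavily weighted site, which is how larger atoms get their proportionally larger share of the contact surface.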

19. A computer simulation approach to measurement of human control strategy

NASA Technical Reports Server (NTRS)

Green, J.; Davenport, E. L.; Engler, H. F.; Sears, W. E., III

1982-01-01

Human control strategy is measured through use of a psychologically-based computer simulation which reflects a broader theory of control behavior. The simulation is called the human operator performance emulator, or HOPE. HOPE was designed to emulate control learning in a one-dimensional preview tracking task and to measure control strategy in that setting. When given a numerical representation of a track and information about current position in relation to that track, HOPE generates positions for a stick controlling the cursor to be moved along the track. In other words, HOPE generates control stick behavior corresponding to that which might be used by a person learning preview tracking.

20. The very local Hubble flow: Computer simulations of dynamical history

Chernin, A. D.; Karachentsev, I. D.; Valtonen, M. J.; Dolgachev, V. P.; Domozhilova, L. M.; Makarov, D. I.

2004-02-01

The phenomenon of the very local (≤3 Mpc) Hubble flow is studied on the basis of the data of recent precision observations. A set of computer simulations is performed to trace the trajectories of the flow galaxies back in time to the epoch of the formation of the Local Group. It is found that the ``initial conditions'' of the flow are drastically different from the linear velocity-distance relation. The simulations also enable one to recognize the major trends of the flow evolution and identify the dynamical role of universal antigravity produced by the cosmic vacuum.

1. Methodology for characterizing modeling and discretization uncertainties in computational simulation

SciTech Connect

ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

2000-03-01

This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

2. Faulting parameters derived from computer simulation of earthquakes

NASA Technical Reports Server (NTRS)

Cohen, S. C.

1977-01-01

Seismic source parameters, average displacement, rupture length, and strain energy release are investigated by computer simulation using a coupled massive block model of the sliding along an active fault. Average displacements and energy release vary considerably with the degree of heterogeneity in the friction and elastic parameters used in the model. Strain energy release is determined primarily by the product of dynamic friction, rupture length, and average displacement. Interrelationships among the faulting parameters are consistent with theoretical arguments and experimental data. The variation in the frequency of occurrence of simulation events with strain energy release is different from the variation in the frequency of naturally occurring events with seismic energy.

3. The transesophageal echocardiography simulator based on computed tomography images.

PubMed

2013-02-01

Simulators are a new tool in education in many fields, including medicine, where they greatly improve familiarity with medical procedures, reduce costs, and, importantly, cause no harm to patients. This is so in the case of transesophageal echocardiography (TEE), in which the use of a simulator facilitates spatial orientation and helps in case studies. The aim of the project described in this paper is to simulate an examination by TEE. This research makes use of available computed tomography data to simulate the corresponding echocardiographic view. This paper describes the essential characteristics that distinguish these two modalities and the key principles of the wave phenomena that should be considered in the simulation process, taking into account the conditions specific to the echocardiography. The construction of the CT2TEE (Web-based TEE simulator) is also presented. The considerations include ray-tracing and ray-casting techniques in the context of ultrasound beam and artifact simulation. An important aspect of the interaction with the user is raised.

4. [Economic benefits of overlapping induction: investigation using a computer simulation model].

PubMed

Hunziker, S; Baumgart, A; Denz, C; Schüpfer, G

2009-06-01

The aim of this study was to investigate the potential economic benefit of overlapping anaesthesia induction, given that all patient diagnosis-related groups (AP DRG) are used as the model for hospital reimbursement. A computer simulation model was used for this purpose. Due to the resource-intensive production process, the operating room (OR) environment is the most expensive part of the supply chain for surgical disciplines. The economic benefit of a parallel production process (additional personnel, adaptation of the process) as compared to a conventional serial layout was assessed. A computer-based simulation method was used with commercially available simulation software. Assumptions for revenues were made based on reimbursement under AP DRG. Based on a system analysis, a model for the computer simulation was designed in a step-by-step abstraction process. In the model, two operating rooms were used for parallel processing and two operating rooms for a serial production process. Six different types of surgical procedures based on historical case durations were investigated. The contribution margin was calculated as the increased revenues minus the cost of the additional anaesthesia personnel. Over a period of 5 weeks, 41 additional surgical cases were performed under the assumption of a surgery duration of 89±4 min (mean±SD). The additional contribution margin was CHF 104,588. In the case of longer surgical procedures of 103±25 min duration (mean±SD), an increase of 36 cases was possible in the same time period and the contribution margin increased by CHF 384,836. When surgical cases with a mean procedural time of 243±55 min were simulated, 15 additional cases were possible. Therefore, the additional contribution margin was CHF 321,278. Although costs increased in this simulation when a serial production process was changed to a parallel system layout due to more personnel, an increase of the contribution margin was possible, especially with

5. Using Computer Simulation for Neurolab 2 Mission Planning

NASA Technical Reports Server (NTRS)

Sanders, Betty M.

1997-01-01

This paper presents an overview of the procedure used in the creation of a computer simulation video generated by the Graphics Research and Analysis Facility at NASA/Johnson Space Center. The simulation was preceded by an analysis of anthropometric characteristics of crew members and workspace requirements for 13 experiments to be conducted on Neurolab 2 which is dedicated to neuroscience and behavioral research. Neurolab 2 is being carried out as a partnership among national domestic research institutes and international space agencies. The video is a tour of the Spacelab module as it will be configured for STS-90, scheduled for launch in the spring of 1998, and identifies experiments that can be conducted in parallel during that mission. Therefore, this paper will also address methods for using computer modeling to facilitate the mission planning activity.

6. Development of computer simulations for landfill methane recovery

SciTech Connect

Massmann, J.W.; Moore, C.A.; Sykes, R.M.

1981-12-01

Two- and three-dimensional finite-difference computer programs simulating methane recovery systems in landfills have been developed. These computer programs model multicomponent combined pressure and diffusional flow in porous media. Each program and the processes it models are described in this report. Examples of the capabilities of each program are also presented. The two-dimensional program was used to simulate methane recovery systems in a cylindrically shaped landfill. The effects of various pump locations, geometries, and extraction rates were determined. The three-dimensional program was used to model the Puente Hills landfill, a field test site in southern California. The biochemical and microbiological details of methane generation in landfills are also given. Effects of environmental factors, such as moisture, oxygen, temperature, and nutrients on methane generation are discussed and an analytical representation of the gas generation rate is developed.

7. A Computational Approach for Probabilistic Analysis of Water Impact Simulations

NASA Technical Reports Server (NTRS)

Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.

2009-01-01

NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many similar challenges to those worked in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.

8. How to simulate a universal quantum computer using negative probabilities

Hofmann, Holger F.

2009-07-01

The concept of negative probabilities can be used to decompose the interaction of two qubits mediated by a quantum controlled-NOT into three operations that require only classical interactions (that is, local operations and classical communication) between the qubits. For a single gate, the probabilities of the three operations are 1, 1 and -1. This decomposition can be applied in a probabilistic simulation of quantum computation by randomly choosing one of the three operations for each gate and assigning a negative statistical weight to the outcomes of sequences with an odd number of negative probability operations. The maximal exponential speed-up of a quantum computer can then be evaluated in terms of the increase in the number of sequences needed to simulate a single operation of the quantum circuit.
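The bookkeeping behind the decomposition can be checked deterministically on toy operators: summing over all signed operation sequences of an n-gate circuit must reproduce the n-th power of the composite map. The three matrices below are arbitrary stand-ins, not the actual LOCC operations of the paper; only the (1, 1, -1) weighting is taken from the abstract.

```python
import itertools
import numpy as np

# Placeholder single-gate decomposition: target = A + B - C, with signed
# weights (1, 1, -1). A, B, C are random stand-ins for the three operations.
rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((2, 2)) for _ in range(3))
ops, signs = [A, B, C], [1, 1, -1]

def signed_sum(n):
    """Sum over all 3^n signed operation sequences for an n-gate circuit.
    Sequences with an odd number of negative-weight operations enter with
    weight -1, so the total reproduces (A + B - C)^n exactly."""
    total = np.zeros((2, 2))
    for seq in itertools.product(range(3), repeat=n):
        mat = np.eye(2)
        for i in seq:
            mat = ops[i] @ mat
        total += np.prod([signs[i] for i in seq]) * mat
    return total
```

A Monte Carlo version samples sequences with the absolute weights and tracks the sign, which is exactly where the exponential sampling overhead quantified in the abstract comes from.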

9. Allosteric mechanisms of nuclear receptors: insights from computational simulations.

PubMed

Mackinnon, Jonathan A G; Gallastegui, Nerea; Osguthorpe, David J; Hagler, Arnold T; Estébanez-Perpiñá, Eva

2014-08-05

The traditional structural view of allostery defines this key regulatory mechanism as the ability of one conformational event (allosteric site) to initiate another in a separate location (active site). In recent years, computational simulations conducted to understand how this phenomenon occurs in nuclear receptors (NRs) have gained significant traction. These studies have yielded insights into the allosteric changes and communication mechanisms that underpin ligand binding, coactivator binding site formation, post-translational modifications, and oncogenic mutations. Moreover, substantial efforts have been made to understand the dynamic processes involved in ligand binding and coregulator recruitment to different NR conformations in order to predict cell/tissue-selective pharmacological outcomes of drugs. They have also improved the accuracy of in silico screening protocols, which are now becoming part of optimisation protocols for novel therapeutics. Here we summarise the important contributions that computational simulations have made towards understanding the structure/function relationships of NRs and how these can be exploited for rational drug design.

10. Para: a computer simulation code for plasma driven electromagnetic launchers

SciTech Connect

Thio, Y.-C.

1983-03-01

A computer code for simulation of rail-type accelerators utilizing a plasma armature has been developed and is described in detail. Some time-varying properties of the plasma are taken into account in this code, allowing the development of a dynamical model of the behavior of a plasma in a rail-type electromagnetic launcher. The code is being used successfully to predict and analyse experiments on small calibre rail-gun launchers.

11. Using Soft Computing Technologies for the Simulation of LCAC Dynamics

DTIC Science & Technology

2011-09-01

real-time, time-domain predictions of the vehicle's dynamics as a function of the control signals given by the driver. Results are presented... free-running LCAC model, faster-than-real-time simulation, soft computing technology 1.0 INTRODUCTION The Maneuvering and Control Division (MCD)... like all hovercraft, rides on a cushion of air. The air is supplied to the cushion by four centrifugal fans driven by the craft's gas turbine

12. The cortex transform - Rapid computation of simulated neural images

NASA Technical Reports Server (NTRS)

Watson, Andrew B.

1987-01-01

With a goal of providing means for accelerating the image processing, machine vision, and testing of human vision models, an image transform was designed, which makes it possible to map an image into a set of images that vary in resolution and orientation. Each pixel in the output may be regarded as the simulated response of a neuron in human visual cortex. The transform is amenable to a number of shortcuts that greatly reduce the amount of computation.

13. Few-Body Problem: Theory and Computer Simulations

Flynn, Chris

A conference held in honour of the 60th birthday of Professor Mauri Valtonen in Turku, Finland, 4th-9th July 2005. The conference's major themes were the few-body problem in celestial mechanics and its numerical study; the theory of few-body escape; dynamics of multiple stars; computer simulations versus observations; planetary systems and few-body dynamics, and chaos in the few-body problem.

14. pV3-Gold Visualization Environment for Computer Simulations

NASA Technical Reports Server (NTRS)

Babrauckas, Theresa L.

1997-01-01

A new visualization environment, pV3-Gold, can be used during and after a computer simulation to extract and visualize the physical features in the results. This environment, which is an extension of the pV3 visualization environment developed at the Massachusetts Institute of Technology with guidance and support by researchers at the NASA Lewis Research Center, features many tools that allow users to display data in various ways.

15. Cane Toad or Computer Mouse? Real and Computer-Simulated Laboratory Exercises in Physiology Classes

ERIC Educational Resources Information Center

West, Jan; Veenstra, Anneke

2012-01-01

Traditional practical classes in many countries are being rationalised to reduce costs. The challenge for university educators is to provide students with the opportunity to reinforce theoretical concepts by running something other than a traditional practical program. One alternative is to replace wet labs with comparable computer simulations.…

16. Scalable High Performance Computing: Direct and Large-Eddy Turbulent Flow Simulations Using Massively Parallel Computers

NASA Technical Reports Server (NTRS)

Morgan, Philip E.

2004-01-01

This final report contains reports of research related to the tasks "Scalable High Performance Computing: Direct and Large-Eddy Turbulent Flow Simulations Using Massively Parallel Computers" and "Develop High-Performance Time-Domain Computational Electromagnetics Capability for RCS Prediction, Wave Propagation in Dispersive Media, and Dual-Use Applications". The discussion of Scalable High Performance Computing reports on three objectives: validate, assess the scalability of, and apply two parallel flow solvers for three-dimensional Navier-Stokes flows; develop and validate a high-order parallel solver for Direct Numerical Simulation (DNS) and Large Eddy Simulation (LES) problems; and investigate and develop a high-order Reynolds-averaged Navier-Stokes turbulence model. The discussion of High-Performance Time-Domain Computational Electromagnetics reports on five objectives: enhancement of an electromagnetics code (CHARGE) to effectively model antenna problems; application of lessons learned from high-order/spectral solution of swirling 3D jets to the electromagnetics project; transition of a high-order fluids code, FDL3DI, to solve Maxwell's equations using compact differencing; development and demonstration of improved radiation-absorbing boundary conditions for high-order CEM; and extension of the high-order CEM solver to address variable material properties. The report also contains a review of work done by the systems engineer.

17. IMPROVING TACONITE PROCESSING PLANT EFFICIENCY BY COMPUTER SIMULATION, Final Report

SciTech Connect

William M. Bond; Salih Ersayin

2007-03-30

This project involved industrial scale testing of a mineral processing simulator to improve the efficiency of a taconite processing plant, namely the Minorca mine. The Concentrator Modeling Center at the Coleraine Minerals Research Laboratory, University of Minnesota Duluth, enhanced the capabilities of available software, Usim Pac, by developing mathematical models needed for accurate simulation of taconite plants. This project provided funding for this technology to prove itself in the industrial environment. As the first step, data representing existing plant conditions were collected by sampling and sample analysis. Data were then balanced and provided a basis for assessing the efficiency of individual devices and the plant, and also for performing simulations aimed at improving plant efficiency. Performance evaluation served as a guide in developing alternative process strategies for more efficient production. A large number of computer simulations were then performed to quantify the benefits and effects of implementing these alternative schemes. Modification of makeup ball size was selected as the most feasible option for the target performance improvement. This was combined with replacement of existing hydrocyclones with more efficient ones. After plant implementation of these modifications, plant sampling surveys were carried out to validate findings of the simulation-based study. Plant data showed very good agreement with the simulated data, confirming results of simulation. After the implementation of modifications in the plant, several upstream bottlenecks became visible. Despite these bottlenecks limiting full capacity, concentrator energy improvement of 7% was obtained. Further improvements in energy efficiency are expected in the near future. The success of this project demonstrated the feasibility of a simulation-based approach. Currently, the Center provides simulation-based service to all the iron ore mining companies operating in northern

18. An FPGA computing demo core for space charge simulation

SciTech Connect

Wu, Jinyuan; Huang, Yifei; /Fermilab

2009-01-01

In accelerator physics, space charge simulation requires a large amount of computing power. In a particle system, each calculation requires time- and resource-consuming operations such as multiplications, divisions, and square roots. Because of the flexibility of field programmable gate arrays (FPGAs), we implemented this task with efficient use of the available computing resources and completely eliminated the non-calculating operations that are indispensable in regular micro-processors (e.g. instruction fetch, instruction decoding, etc.). We designed and tested a 16-bit demo core for computing Coulomb's force in an Altera Cyclone II FPGA device. To save resources, the inverse square-root cube operation in our design is computed using a memory look-up table addressed with the nine to ten most significant non-zero bits. At a 200 MHz internal clock, our demo core reaches a throughput of 200 M pairs/s/core, faster than a typical 2 GHz micro-processor by about a factor of 10. Temperature and power consumption of FPGAs were also lower than those of micro-processors. Fast and convenient, FPGAs can serve as alternatives to time-consuming micro-processors for space charge simulation.
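The look-up-table idea above can be sketched in software. This is a hedged illustration only: it precomputes f(r²) = r²^(-3/2) (the "inverse square-root cube") on a coarse grid so the inner loop needs no square roots; the grid size, range, and linear indexing are illustrative, whereas the FPGA core indexes by the leading non-zero bits of r² instead.

```python
# Illustrative table parameters (not the FPGA core's addressing scheme).
N = 1024
R2_MIN, R2_MAX = 0.01, 100.0
STEP = (R2_MAX - R2_MIN) / N

# Precompute f(r2) = r2**(-3/2), i.e. 1/r**3, at each bin midpoint.
LUT = [(R2_MIN + (i + 0.5) * STEP) ** -1.5 for i in range(N)]

def coulomb_force(d, k=1.0):
    """Force on particle 1 from particle 2 separated by vector d (3-tuple):
    F = k * d * (1/r**3), with 1/r**3 read from the table."""
    r2 = sum(c * c for c in d)
    idx = min(max(int((r2 - R2_MIN) / STEP), 0), N - 1)  # clamp to table
    f = LUT[idx]  # approximates r2**-1.5
    return tuple(k * c * f for c in d)
```

With 1024 bins the relative error at r = 1 is a few percent; the hardware design trades this table resolution against FPGA memory in the same way.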

19. Numerical simulation of landfill aeration using computational fluid dynamics.

PubMed

Fytanidis, Dimitrios K; Voudrias, Evangelos A

2014-04-01

The present study is an application of Computational Fluid Dynamics (CFD) to the numerical simulation of landfill aeration systems. Specifically, the CFD algorithms provided by the commercial solver ANSYS Fluent 14.0, combined with an in-house source code developed to modify the main solver, were used. The unsaturated multiphase flow of air and liquid phases and the biochemical processes for aerobic biodegradation of the organic fraction of municipal solid waste were simulated taking into consideration their temporal and spatial evolution, as well as complex effects, such as oxygen mass transfer across phases, unsaturated flow effects (capillary suction and unsaturated hydraulic conductivity), temperature variations due to biochemical processes and environmental correction factors for the applied kinetics (Monod and 1st order kinetics). The developed model results were compared with literature experimental data. Also, pilot scale simulations and sensitivity analysis were implemented. Moreover, simulation results of a hypothetical single aeration well were shown, while its zone of influence was estimated using both the pressure and oxygen distribution. Finally, a case study was simulated for a hypothetical landfill aeration system. Both a static (steadily positive or negative relative pressure with time) and a hybrid (following a square wave pattern of positive and negative values of relative pressure with time) scenarios for the aeration wells were examined. The results showed that the present model is capable of simulating landfill aeration and the obtained results were in good agreement with corresponding previous experimental and numerical investigations.
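The kinetic building block named in the abstract (Monod kinetics with an environmental correction factor) can be sketched as follows. This is a hedged, minimal illustration: all parameter values, the oxygen yield, and the explicit Euler step are assumptions for demonstration, not the study's calibrated model.

```python
def monod_rate(S, O, mu_max=1e-6, Ks=5.0, Ko=0.5, f_env=1.0):
    """Aerobic degradation rate of substrate S limited by oxygen O
    (dual-Monod form), scaled by an environmental correction factor.
    Units and parameter values are illustrative only."""
    return f_env * mu_max * (S / (Ks + S)) * (O / (Ko + O))

def step(S, O, dt, yield_O2=1.5):
    """One explicit Euler step: substrate is consumed and oxygen is
    drawn down in proportion (illustrative stoichiometry)."""
    r = monod_rate(S, O)
    return S - r * dt, O - yield_O2 * r * dt
```

In the CFD model this rate term is evaluated per cell and coupled to the multiphase transport equations; the sketch shows only the local kinetics.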

20. Enabling Global Kinetic Simulations of the Magnetosphere via Petascale Computing

Karimabadi, H.; Vu, H. X.; Omelchenko, Y. A.; Tatineni, M.; Majumdar, A.; Catalyurek, U. V.; Saule, E.

2009-11-01

The ultimate goal in magnetospheric physics is to understand how the solar wind transfers its mass, momentum and energy to the magnetosphere. This problem has turned out to be much more complex intellectually than originally thought. MHD simulations have proven useful in predicting prominent features of substorms and other global events. Given the complexity of solar wind-magnetosphere interactions, hybrid (electron fluid, kinetic ion) simulations have recently been emerging in the studies of the global dynamics of the magnetosphere with the goal of accurately predicting the energetic particle transport and structure of plasma boundaries. We take advantage of our recent innovations in hybrid simulations and the power of massively parallel computers to make breakthrough 3D global kinetic simulations of the magnetosphere. The preliminary results reveal many major differences with global MHD simulations. For example, the hybrid simulations predict the formation of the quadrupole structure associated with reconnection events, ion/ion kink instability in the tail, turbulence in the magnetosheath, and formation of the ion foreshock region.

1. Dynamic computer simulations of electrophoresis: three decades of active research.

PubMed

Thormann, Wolfgang; Caslavska, Jitka; Breadmore, Michael C; Mosher, Richard A

2009-06-01

Dynamic models for electrophoresis are based upon model equations derived from the transport concepts in solution together with user-inputted conditions. They are able to predict theoretically the movement of ions and are as such the most versatile tool to explore the fundamentals of electrokinetic separations. Since its inception three decades ago, the state of dynamic computer simulation software and its use has progressed significantly and Electrophoresis played a pivotal role in that endeavor as a large proportion of the fundamental and application papers were published in this periodical. Software is available that simulates all basic electrophoretic systems, including moving boundary electrophoresis, zone electrophoresis, ITP, IEF and EKC, and their combinations under almost exactly the same conditions used in the laboratory. This has been employed to show the detailed mechanisms of many of the fundamental phenomena that occur in electrophoretic separations. Dynamic electrophoretic simulations are relevant for separations on any scale and instrumental format, including free-fluid preparative, gel, capillary and chip electrophoresis. This review includes a historical overview, a survey of current simulators, simulation examples and a discussion of the applications and achievements of dynamic simulation.

2. Advanced manned space flight simulation and training: An investigation of simulation host computer system concepts

NASA Technical Reports Server (NTRS)

Montag, Bruce C.; Bishop, Alfred M.; Redfield, Joe B.

1989-01-01

The findings of a preliminary investigation by Southwest Research Institute (SwRI) of simulation host computer concepts are presented. The investigation is designed to aid NASA in evaluating simulation technologies for use in spaceflight training. The focus of the investigation is on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe-based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time-critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.

3. Lightweight computational steering of very large scale molecular dynamics simulations

SciTech Connect

Beazley, D.M.; Lomdahl, P.S.

1996-09-01

We present a computational steering approach for controlling, analyzing, and visualizing very large scale molecular dynamics simulations involving tens to hundreds of millions of atoms. Our approach relies on extensible scripting languages and an easy to use tool for building extensions and modules. The system is extremely easy to modify, works with existing C code, is memory efficient, and can be used from inexpensive workstations and networks. We demonstrate how we have used this system to manipulate data from production MD simulations involving as many as 104 million atoms running on the CM-5 and Cray T3D. We also show how this approach can be used to build systems that integrate common scripting languages (including Tcl/Tk, Perl, and Python), simulation code, user extensions, and commercial data analysis packages.

4. Simulation of Tailrace Hydrodynamics Using Computational Fluid Dynamics Models

SciTech Connect

Cook, Christopher B.; Richmond, Marshall C.

2001-05-01

This report investigates the feasibility of using computational fluid dynamics (CFD) tools to investigate hydrodynamic flow fields surrounding the tailrace zone below large hydraulic structures. Previous and ongoing studies using CFD tools to simulate gradually varied flow with multiple constituents and forebay/intake hydrodynamics have shown that CFD tools can provide valuable information for hydraulic and biological evaluation of fish passage near hydraulic structures. These studies, however, are incapable of simulating the rapidly varying flow fields that involve breakup of the free surface, such as those through and below high-flow outfalls and spillways. Although the use of CFD tools for these types of flow is still an active area of research, initial applications discussed in this report show that these tools are capable of simulating the primary features of these highly transient flow fields.

5. Computer simulation of surface modification with ion beams

Insepov, Z.; Hassanein, A.; Swenson, D.; Terasawa, M.

2005-12-01

Interactions of energetic ions with various solid targets including silicon and a few metal surfaces were studied by computer simulation and verified by experiment. Surface sputtering and modification for collisions of Arn (n ∼ 100) cluster ions, with kinetic energies of 12-54 eV/atom, and slow highly charged ions (HCI), with potential energies of 80-3500 eV, have been simulated. Various energy transfer mechanisms of the ion energy into the solid target, such as shock wave generation, hollow atom formation, Coulomb explosion, charge screening and neutralization were studied. Atomistic molecular dynamics (MD), as well as a phenomenological surface dynamics methods were employed and the results of the simulations were compared with the experimental data.

6. Computer simulations of the mechanical properties of metals.

PubMed

Schiøtz, J; Vegge, T

1999-01-01

Atomic-scale computer simulations can be used to gain a better understanding of the mechanical properties of materials. In this paper we demonstrate how this can be done in the case of nanocrystalline copper, and give a brief overview of how simulations may be extended to larger length scales. Nanocrystalline metals are metals with grain sizes in the nanometre range; they have a number of technologically interesting properties, such as much increased hardness and yield strength. Our simulations show that the deformation mechanisms in these materials differ from those in coarse-grained materials. The main deformation occurs in the grain boundaries, and only little dislocation activity is seen inside the grains. This leads to a hardening of the material as the grain size is increased and the volume fraction of grain boundaries is decreased.

7. Feasibility Study of Computational Fluid Dynamics Simulation of Coronary Computed Tomography Angiography Based on Dual-Source Computed Tomography

PubMed Central

Lu, Jing; Yu, Jie; Shi, Heshui

2017-01-01

Background Adding functional features to morphological features offers a new method for non-invasive assessment of myocardial perfusion. This study aimed to explore technical routes for assessing the left coronary artery pressure gradient, wall shear stress distribution and blood flow velocity distribution, combining a three-dimensional coronary model based on high-resolution dual-source computed tomography (CT) with computational fluid dynamics (CFD) simulation. Methods Three cases of no obvious stenosis, mild stenosis and severe stenosis in the left anterior descending (LAD) artery were enrolled. Images acquired on dual-source CT were input into the software Mimics, ICEMCFD and FLUENT to simulate the pressure gradient, wall shear stress distribution and blood flow velocity distribution. The coronary enhancement ratio was measured for comparison with the pressure gradient. Results The results conformed to theoretical values and showed differences between normal and abnormal samples. Conclusions The study preliminarily verified the essential parameters and basic techniques in blood flow numerical simulation and proved the approach feasible. PMID:27924174

8. Computer assisted preoperative planning of bone fracture reduction: Simulation techniques and new trends.

PubMed

Jiménez-Delgado, Juan J; Paulano-Godino, Félix; PulidoRam-Ramírez, Rubén; Jiménez-Pérez, J Roberto

2016-05-01

The development of support systems for surgery significantly increases the likelihood of obtaining satisfactory results. In the case of fracture reduction interventions, these systems enable surgery planning, training, monitoring and assessment. They allow improvement of fracture stabilization, minimization of health risks and a reduction of surgery time. Planning a bone fracture reduction by means of a computer-assisted simulation involves several semiautomatic or automatic steps. The simulation deals with the correct position of osseous fragments and fixation devices for a fracture reduction. Currently, to the best of our knowledge, there are no computer-assisted methods to plan the entire fracture reduction process. This paper presents an overall scheme of the computer-based process for planning a bone fracture reduction, as described above, and details its main steps, the most commonly proposed techniques and their main shortcomings. In addition, challenges and new trends of this research field are depicted and analyzed.

9. Average-Case Complexity Versus Approximate Simulation of Commuting Quantum Computations

Bremner, Michael J.; Montanaro, Ashley; Shepherd, Dan J.

2016-08-01

We use the class of commuting quantum computations known as IQP (instantaneous quantum polynomial time) to strengthen the conjecture that quantum computers are hard to simulate classically. We show that, if either of two plausible average-case hardness conjectures holds, then IQP computations are hard to simulate classically up to constant additive error. One conjecture relates to the hardness of estimating the complex-temperature partition function for random instances of the Ising model; the other concerns approximating the number of zeroes of random low-degree polynomials. We observe that both conjectures can be shown to be valid in the setting of worst-case complexity. We arrive at these conjectures by deriving spin-based generalizations of the boson sampling problem that avoid the so-called permanent anticoncentration conjecture.

10. Accelerating Climate and Weather Simulations through Hybrid Computing

NASA Technical Reports Server (NTRS)

Zhou, Shujia; Cruz, Carlos; Duffy, Daniel; Tucker, Robert; Purcell, Mark

2011-01-01

Unconventional multi- and many-core processors (e.g. IBM (R) Cell B.E.(TM) and NVIDIA (R) GPU) have emerged as effective accelerators in trial climate and weather simulations. Yet these climate and weather models typically run on parallel computers with conventional processors (e.g. Intel, AMD, and IBM) using Message Passing Interface. To address challenges involved in efficiently and easily connecting accelerators to parallel computers, we investigated using IBM's Dynamic Application Virtualization (TM) (IBM DAV) software in a prototype hybrid computing system with representative climate and weather model components. The hybrid system comprises two Intel blades and two IBM QS22 Cell B.E. blades, connected with both InfiniBand(R) (IB) and 1-Gigabit Ethernet. The system significantly accelerates a solar radiation model component by offloading compute-intensive calculations to the Cell blades. Systematic tests show that IBM DAV can seamlessly offload compute-intensive calculations from Intel blades to Cell B.E. blades in a scalable, load-balanced manner. However, noticeable communication overhead was observed, mainly due to IP over the IB protocol. Full utilization of IB Sockets Direct Protocol and the lower latency production version of IBM DAV will reduce this overhead.

PubMed

Weaver, Ashley A; Stitzel, Sarah M; Stitzel, Joel D

2017-04-01

A predictive Lagrangian-Eulerian finite element eye model was used to analyze 2.27 and 0.45 kg trinitrotoluene equivalent blasts detonated from 24 different locations. Free air and ground level blasts were simulated directly in front of the eye and at lateral offset locations with box, average, less protective, and more protective orbital anthropometries, resulting in 96 simulations. Injury risk curves were developed for hyphema, lens dislocation, retinal damage, and globe rupture from experimental and computational data to compute risk from corneoscleral stress and intra-ocular pressure computational outputs. Corneoscleral stress, intra-ocular pressure, and injury risks increased when the blast size was larger and located nearer to the eye. Risks ranged from 20-100% for hyphema, 1-100% for lens dislocation, 2-100% for retinal damage, and 0-98% for globe rupture, depending on the blast condition. Orbital geometry affected the stresses, pressures, and associated ocular injury risks of the blast conditions simulated. Orbital geometries that more fully surrounded the eye, such as the more protective orbit, tended to produce higher corneoscleral stresses, with compression of the eye against the surrounding rigid orbit contributing to high stresses as the blast wave propagated. However, the more protective orbit tended to produce lower intra-ocular pressures than the other three orbital geometries, which may indicate that it inhibits propagation of the blast wave and reduces ocular loading. Results of this parametric computational study of ocular blast loading are valuable to the design of eye protection equipment and the mitigation of blast-related eye injuries.
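An injury risk curve of the kind described above maps a computed metric (e.g. peak intra-ocular pressure or corneoscleral stress) to a probability of a given injury, commonly via a logistic fit to experimental and computational data. A hedged sketch follows; the coefficients are illustrative placeholders, not the study's fitted values.

```python
import math

def injury_risk(metric, alpha=-5.0, beta=0.02):
    """Logistic injury risk curve: P(injury) = 1 / (1 + exp(-(a + b*x))).
    alpha and beta are illustrative, not fitted coefficients."""
    return 1.0 / (1.0 + math.exp(-(alpha + beta * metric)))
```

With a fitted curve per injury type, the four risk percentages reported in the abstract are obtained by evaluating each curve at the simulation's peak stress or pressure output.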

12. Explicit contact modeling for surgical computer guidance and simulation

Johnsen, S. F.; Taylor, Z. A.; Clarkson, M.; Thompson, S.; Hu, M.; Gurusamy, K.; Davidson, B.; Hawkes, D. J.; Ourselin, S.

2012-02-01

Realistic modelling of mechanical interactions between tissues is an important part of surgical simulation, and may become a valuable asset in surgical computer guidance. Unfortunately, it is also computationally very demanding. Explicit matrix-free FEM solvers have been shown to be a good choice for fast tissue simulation, however little work has been done on contact algorithms for such FEM solvers. This work introduces such an algorithm that is capable of handling both deformable-deformable (soft-tissue interacting with soft-tissue) and deformable-rigid (e.g. soft-tissue interacting with surgical instruments) contacts. The proposed algorithm employs responses computed with a fully matrix-free, virtual node-based version of the model first used by Taylor and Flanagan in PRONTO3D. For contact detection, a bounding-volume hierarchy (BVH) capable of identifying self collisions is introduced. The proposed BVH generation and update strategies comprise novel heuristics to minimise the number of bounding volumes visited in hierarchy update and collision detection. Aside from speed, stability was a major objective in the development of the algorithm, hence a novel method for computation of response forces from C0-continuous normals, and a gradual application of response forces from rate constraints has been devised and incorporated in the scheme. The continuity of the surface normals has advantages particularly in applications such as sliding over irregular surfaces, which occurs, e.g., in simulated breathing. The effectiveness of the scheme is demonstrated on a number of meshes derived from medical image data and artificial test cases.

13. Addition of flexible body option to the TOLA computer program, part 1

NASA Technical Reports Server (NTRS)

Dick, J. W.; Benda, B. J.

1975-01-01

This report describes a flexible body option that was developed and added to the Takeoff and Landing Analysis (TOLA) computer program. The addition of the flexible body option to TOLA allows it to be used to study essentially any conventional type of airplane in the ground operating environment. It provides the capability to predict the total motion of selected points on the airplane. The analytical methods incorporated in the program and operating instructions for the option are described. A program listing is included, along with several example problems to aid in interpretation of the operating instructions and to illustrate program usage.

14. Computational simulation of materials notes for lectures given at UCSB, May 1996--June 1996

SciTech Connect

LeSar, R.

1997-01-01

This report presents information from a lecture given on the computational simulation of materials. The purpose is to introduce modern computerized simulation methods for materials properties and response.

15. An analysis of the 70-meter antenna hydrostatic bearing by means of computer simulation

NASA Technical Reports Server (NTRS)

Bartos, R. D.

1993-01-01

Recently, the computer program 'A Computer Solution for Hydrostatic Bearings with Variable Film Thickness,' used to design the hydrostatic bearing of the 70-meter antennas, was modified to improve the accuracy with which it predicts the film height profile and oil pressure distribution between the hydrostatic bearing pad and the runner. This article presents a description of the modified program, the theory upon which its computations are based, computer simulation results, and a discussion of those results.

16. Fast computer simulation of reconstructed image from rainbow hologram based on GPU

Shuming, Jiao; Yoshikawa, Hiroshi

2015-10-01

A fast computer simulation solution for rainbow hologram reconstruction based on the GPU is proposed. In the commonly used segment Fourier transform method for rainbow hologram reconstruction, the computation of the 2D Fourier transform on each hologram segment is very time consuming. GPU-based parallel computing can be applied to improve the computing speed. Simulation results indicate that, compared with CPU computing, the proposed GPU implementation can reduce the computation time by as much as a factor of eight.
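The structure of the segment Fourier transform step can be sketched as follows. This is a hedged CPU stand-in: the hologram is split into tiles and a 2D FFT is run on each; on a GPU these per-tile transforms run in parallel (e.g. as a batched FFT). Tile size and the NumPy implementation are illustrative, not the paper's code.

```python
import numpy as np

def segment_ffts(hologram, seg=64):
    """Split a 2D hologram into seg x seg tiles and 2D-FFT each tile.
    Each tile's transform is independent, which is what makes the step
    a natural fit for GPU parallelism."""
    h, w = hologram.shape
    tiles = []
    for y in range(0, h, seg):
        for x in range(0, w, seg):
            tiles.append(np.fft.fft2(hologram[y:y + seg, x:x + seg]))
    return tiles
```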

17. Further developments in cloud statistics for computer simulations

NASA Technical Reports Server (NTRS)

Chang, D. T.; Willand, J. H.

1972-01-01

This study is a part of NASA's continued program to provide global statistics of cloud parameters for computer simulation. The primary emphasis was on the development of the data bank of the global statistical distributions of cloud types and cloud layers and their applications in the simulation of the vertical distributions of in-cloud parameters such as liquid water content. These statistics were compiled from actual surface observations as recorded in Standard WBAN forms. Data for a total of 19 stations were obtained and reduced. These stations were selected to be representative of the 19 primary cloud climatological regions defined in previous studies of cloud statistics. Using the data compiled in this study, a limited study was conducted of the homogeneity of cloud regions, the latitudinal dependence of cloud-type distributions, the dependence of these statistics on sample size, and other factors in the statistics which are of significance to the problem of simulation. The application of the statistics in cloud simulation was investigated. In particular, the inclusion of the new statistics in an expanded multi-step Monte Carlo simulation scheme is suggested and briefly outlined.

18. Design and evaluation of a computer controlled solar collector simulator

Kotas, J. F.; Wood, B. D.

1980-11-01

A computer-controlled system has been developed to simulate the thermal processes of a flat-plate solar collector. The simulator is based on four water heaters with capacities of 1.5, 2.5, 5.0, and 5.0 kW, providing a maximum design output of 14.0 kW; the heaters are controlled by a Nova 3 minicomputer, which also monitors temperatures in the fluid stream. Measurements have been obtained of the steady-state operating values and time constants of the individual heaters at different flow rates in order to utilize their thermal outputs effectively. Software was designed to control the heater system so that the total thermal output closely approximates that of an actual collector array, using steady-state or dynamic control modes. Simulation of the heat output of a previously tested collector yielded simulated values differing from actual output by at most 3% under identical operating conditions, indicating that the simulator is a viable alternative to testing a large field of collectors.

19. Three Dimensional Computer Graphics Federates for the 2012 Smackdown Simulation

NASA Technical Reports Server (NTRS)

Fordyce, Crystal; Govindaiah, Swetha; Muratet, Sean; O'Neil, Daniel A.; Schricker, Bradley C.

2012-01-01

The Simulation Interoperability Standards Organization (SISO) Smackdown is a two-year-old annual event held at the 2012 Spring Simulation Interoperability Workshop (SIW). A primary objective of the Smackdown event is to provide college students with hands-on experience in developing distributed simulations using the High Level Architecture (HLA). Participating for the second time, the University of Alabama in Huntsville (UAHuntsville) deployed four federates: two simulated a communications server and a lunar communications satellite with a radio; the other two generated 3D computer graphics displays for the communication satellite constellation and for the surface-based lunar resupply mission. Using the Light-Weight Java Graphics Library, the satellite display federate presented a lunar-texture-mapped sphere of the moon and four Telemetry Data Relay Satellites (TDRS), which received object attributes from the lunar communications satellite federate to drive their motion. The surface mission display federate was an enhanced version of the federate developed by ForwardSim, Inc. for the 2011 Smackdown simulation. Enhancements included a dead-reckoning algorithm and a visual indication of which communication satellite was in line of sight of Hadley Rille. This paper concentrates on these two federates by describing their functions, algorithms, HLA object attributes received from other federates, development experiences, and recommendations for future participating Smackdown teams.

20. Written debriefing: Evaluating the impact of the addition of a written component when debriefing simulations.

PubMed

Reed, Shelly J

2015-11-01

Debriefing, the reflective period following a simulation, is said to be where the bulk of simulation learning takes place. Many expert opinions regarding debriefing exist, but evidence-based best practices have yet to be identified. Written debriefing is one such practice; experts state that learning can be extended through the addition of a written component to the debriefing process, but no evidence exists to support this. This study compares three debriefing types: discussion alone, discussion followed by journaling, and discussion followed by blogging. Undergraduate nursing students participating in a simulation were randomized, by simulation group, to one of the three debriefing types. Following completion of the debriefing activities, students completed the Debriefing Experience Scale, a tool designed to evaluate the student experience during debriefing. Data from the completed scales were analyzed with ANOVA followed by Fisher LSD post hoc testing. The results showed that students preferred discussion-only debriefing over discussion debriefing with a written component added.

1. Use of a Computer Simulation To Develop Mental Simulations for Understanding Relative Motion Concepts.

ERIC Educational Resources Information Center

Monaghan, James M.; Clement, John

1999-01-01

Presents evidence for students' qualitative and quantitative difficulties with apparently simple one-dimensional relative-motion problems, students' spontaneous visualization of relative-motion problems, the visualizations facilitating solution of these problems, and students' memories of the online computer simulation used as a framework for…

2. Computer simulation of methanol exchange dynamics around cations and anions

SciTech Connect

Roy, Santanu; Dang, Liem X.

2016-03-03

In this paper, we present the first computer simulation of methanol exchange dynamics between the first and second solvation shells around different cations and anions. After water, methanol is the most frequently used solvent for ions. Methanol has different structural and dynamical properties from water, so its ion solvation process is also different. To this end, we performed molecular dynamics simulations using polarizable potential models to describe methanol-methanol and ion-methanol interactions. In particular, we computed methanol exchange rates by employing transition state theory, the Impey-Madden-McDonald method, the reactive flux approach, and the Grote-Hynes theory. We observed that methanol exchange occurs on a nanosecond time scale for Na+ and on a picosecond time scale for the other ions. We also observed a trend in which, for like charges, the exchange rate is slower for smaller ions because they are more strongly bound to methanol. This work was supported by the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. The calculations were carried out using computer resources provided by the Office of Basic Energy Sciences.
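
At its core, the Impey-Madden-McDonald analysis mentioned above is bookkeeping of how long a solvent molecule stays in the first shell. A bare-bones sketch (the occupancy series and function name are invented for illustration; the full method also forgives excursions shorter than a tolerance t*, which this sketch omits):

```python
def residence_times(in_shell):
    """Lengths of unbroken stays in the first solvation shell.

    The input is a per-frame boolean series indicating whether a given
    methanol molecule lies within the first-shell cutoff of an ion;
    each maximal run of True frames is one residence interval.
    """
    times, run = [], 0
    for inside in in_shell:
        if inside:
            run += 1
        elif run:
            times.append(run)
            run = 0
    if run:
        times.append(run)
    return times

print(residence_times([1, 1, 0, 1, 1, 1, 0, 0, 1]))  # [2, 3, 1]
```

Averaging such interval lengths over all solvent molecules and ions (in time units rather than frame counts) gives the mean residence time, whose inverse estimates the exchange rate.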

3. Numerical simulation of NQR/NMR: Applications in quantum computing.

PubMed

Possa, Denimar; Gaudio, Anderson C; Freitas, Jair C C

2011-04-01

A numerical simulation program able to simulate both nuclear quadrupole resonance (NQR) and nuclear magnetic resonance (NMR) experiments is presented, written using the Mathematica package and aimed especially at applications in quantum computing. The program makes use of the interaction picture to compute the effect of the relevant nuclear spin interactions, without any assumption about the relative size of each interaction. This makes the program flexible and versatile, useful in a wide range of experimental situations, from NQR (at zero or under a small applied magnetic field) to high-field NMR experiments. Some conditions specifically required for quantum computing applications are implemented in the program, such as the possibility of using elliptically polarized radiofrequency pulses and the inclusion of first- and second-order terms in the average Hamiltonian expansion. A number of examples dealing with simple NQR and quadrupole-perturbed NMR experiments are presented, along with proposed experiments to create quantum pseudopure states and logic gates using NQR. The program and the various application examples are freely available through the link http://www.profanderson.net/files/nmr_nqr.php.

4. [Computer simulation programs as an alternative for classical nerve, muscle and heart experiments using frog tissues].

PubMed

Breves, G; Schröder, B

2000-03-01

Courses in physiology include different methodical approaches such as exercises with living animals, experiments using organs or tissues from killed or slaughtered animals, application of diagnostic techniques in humans, and theoretical seminars. In addition to these classical approaches, computer programs for multimedia simulation of nerve, muscle, and heart physiology are now a regular component of physiology courses at the School of Veterinary Medicine in Hannover. The aim of the present paper is to report first experiences with these new components.

5. Overview of the Helios Version 2.0 Computational Platform for Rotorcraft Simulations

DTIC Science & Technology

2011-01-01

Field, CA This article summarizes the capabilities and development of the Helios version 2.0, or Shasta, software for rotary wing simulations. Specific...the fuselage, rotors, flaps and stores. In addition, a new run-mode to handle maneuvering flight has been added. Fundamental changes of the Helios...for the NACA0015 wing , TRAM rotor in hover and the UH-60A in forward flight. I. Introduction Rotorcraft computations are challenging because they are

6. Simulation of chaos-assisted tunneling in a semiclassical regime on existing quantum computers

SciTech Connect

Chepelianskii, A.D.; Shepelyansky, D.L.

2002-11-01

We present a quantum algorithm that allows one to simulate chaos-assisted tunneling in the deep semiclassical regime on existing quantum computers. This opens additional possibilities for the investigation of macroscopic quantum tunneling and the realization of semiclassical Schroedinger cat oscillations [E. Schroedinger, Naturwissenschaften 23, 807 (1935)]. Our numerical studies determine the decoherence rate induced by noisy gates for these oscillations and propose a suitable parameter regime for their experimental implementation.

7. Turbulence computations with 3-D small-scale additive turbulent decomposition and data-fitting using chaotic map combinations

SciTech Connect

Mukerji, Sudip

1997-01-01

Although the equations governing turbulent fluid flow, the Navier-Stokes (N.-S.) equations, have been known for well over a century and there is a clear technological necessity in obtaining solutions to these equations, turbulence remains one of the principal unsolved problems in physics today. It is still not possible to make accurate quantitative predictions about turbulent flows without relying heavily on empirical data. In principle, it is possible to obtain turbulent solutions from a direct numerical simulation (DNS) of the N.-S. equations. The author first provides a brief introduction to the dynamics of turbulent flows. The N.-S. equations, which govern fluid flow, are described thereafter. He then gives a brief overview of DNS calculations and where they stand at present. He next introduces the two most popular approaches currently used for turbulence computations, namely Reynolds averaging of the N.-S. equations (RANS) and large-eddy simulation (LES). Approximations, often ad hoc ones, are present in these methods because use is made of heuristic models for turbulence quantities (the Reynolds stresses) which are otherwise unknown. He then introduces a new computational method called additive turbulent decomposition (ATD), the small-scale version of which is the topic of this research. The rest of the thesis is organized as follows. In Chapter 2 he describes the ATD procedure in greater detail: how the dependent variables are split and decomposed into large- and small-scale sets of equations. In Chapter 3 the spectral projection of the small-scale momentum equations is derived in detail. In Chapter 4 results of the computations with the small-scale ATD equations are presented. In Chapter 5 he describes the data-fitting procedure which can be used to directly specify the parameters of a chaotic-map turbulence model.

8. Quantum game simulator, using the circuit model of quantum computation

Vlachos, Panagiotis; Karafyllidis, Ioannis G.

2009-10-01

We present a general two-player quantum game simulator that can simulate any two-player quantum game described by a 2×2 payoff matrix (two-strategy games). The user can determine the payoff matrices for both players, their strategies and the amount of entanglement between their initial strategies. The outputs of the simulator are the expected payoffs of each player as a function of the other player's strategy parameters and the amount of entanglement. The simulator also produces contour plots that divide the strategy spaces of the game in regions in which players can get larger payoffs if they choose to use a quantum strategy against any classical one. We also apply the simulator to two well-known quantum games, the Battle of the Sexes and the Chicken game. Program summaryProgram title: Quantum Game Simulator (QGS) Catalogue identifier: AEED_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEED_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 3416 No. of bytes in distributed program, including test data, etc.: 583 553 Distribution format: tar.gz Programming language: Matlab R2008a (C) Computer: Any computer that can sufficiently run Matlab R2008a Operating system: Any system that can sufficiently run Matlab R2008a Classification: 4.15 Nature of problem: Simulation of two player quantum games described by a payoff matrix. Solution method: The program calculates the matrices that comprise the Eisert setup for quantum games based on the quantum circuit model. There are 5 parameters that can be altered. We define 3 of them as constant. We play the quantum game for all possible values for the other 2 parameters and store the results in a matrix. Unusual features: The software provides an easy way of simulating any two-player quantum games. Running time: Approximately
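
The Eisert circuit that the simulator builds on is compact enough to sketch directly. The following NumPy version (an illustration, not the distributed Matlab code) uses the standard Prisoner's Dilemma payoffs as an example and exploits the fact that the entangling gate has a closed form:

```python
import numpy as np

D = np.array([[0., 1.], [-1., 0.]])      # "defect" operator (i * sigma_y)

def strategy(theta, phi):
    """Two-parameter strategy operator of the Eisert-Wilkens-Lewenstein scheme."""
    return np.array([[np.exp(1j*phi)*np.cos(theta/2),  np.sin(theta/2)],
                     [-np.sin(theta/2), np.exp(-1j*phi)*np.cos(theta/2)]])

def expected_payoffs(ua, ub, pay_a, pay_b, gamma):
    """Expected payoffs of the circuit J^dagger (Ua x Ub) J |00>.

    gamma in [0, pi/2] sets the entanglement; the four basis-state
    probabilities weight the payoff entries ordered (CC, CD, DC, DD).
    """
    dd = np.kron(D, D)
    # (D x D)^2 = I, so J = exp(i*gamma/2 * D x D) has a closed form:
    J = np.cos(gamma/2) * np.eye(4) + 1j * np.sin(gamma/2) * dd
    psi = J.conj().T @ np.kron(ua, ub) @ J @ np.array([1., 0., 0., 0.])
    probs = np.abs(psi)**2
    return float(probs @ np.array(pay_a)), float(probs @ np.array(pay_b))

# Prisoner's Dilemma payoffs over (CC, CD, DC, DD), maximal entanglement
pa, pb = [3, 0, 5, 1], [3, 5, 0, 1]
Q = strategy(0, np.pi/2)                 # the "quantum" strategy Q
print(expected_payoffs(Q, Q, pa, pb, np.pi/2))   # ~ (3.0, 3.0)
```

Sweeping two of the strategy parameters over a grid, as the Program summary describes, amounts to calling `expected_payoffs` at each grid point and contouring the result.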

9. Gastric flow and mixing studied using computer simulation.

PubMed Central

Pal, Anupam; Indireshkumar, Keshavamurthy; Schwizer, Werner; Abrahamsson, Bertil; Fried, Michael; Brasseur, James G.

2004-01-01

The fed human stomach displays regular peristaltic contraction waves that originate in the proximal antrum and propagate to the pylorus. High-resolution concurrent manometry and magnetic resonance imaging (MRI) studies of the stomach suggest a primary function of antral contraction wave (ACW) activity unrelated to gastric emptying. Detailed evaluation is difficult, however, in vivo. Here we analyse the role of ACW activity on intragastric fluid motions, pressure, and mixing with computer simulation. A two-dimensional computer model of the stomach was developed with the 'lattice-Boltzmann' numerical method from the laws of physics, and stomach geometry modelled from MRI. Time changes in gastric volume were specified to match global physiological rates of nutrient liquid emptying. The simulations predicted two basic fluid motions: retrograde 'jets' through ACWs, and circulatory flow between ACWs, both of which contribute to mixing. A well-defined 'zone of mixing', confined to the antrum, was created by the ACWs, with mixing motions enhanced by multiple and narrower ACWs. The simulations also predicted contraction-induced peristaltic pressure waves in the distal antrum consistent with manometric measurements, but with a much lower pressure amplitude than manometric data, indicating that manometric pressure amplitudes reflect direct contact of the catheter with the gastric wall. We conclude that the ACWs are central to gastric mixing, and may also play an indirect role in gastric emptying through local alterations in common cavity pressure. PMID:15615685

10. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

SciTech Connect

Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Zumao Chen; Temi Linjewile; Adel Sarofim; Bene Risio

2003-04-25

This is the tenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on all aspects of the project. Calculations for a full Vision 21 plant configuration have been performed for two gasifier types. An improved process model for simulating entrained flow gasifiers has been implemented into the workbench. Model development has focused on: a pre-processor module to compute global gasification parameters from standard fuel properties and intrinsic rate information; a membrane based water gas shift; and reactors to oxidize fuel cell exhaust gas. The data visualization capabilities of the workbench have been extended by implementing the VTK visualization software that supports advanced visualization methods, including inexpensive Virtual Reality techniques. The ease-of-use, functionality and plug-and-play features of the workbench were highlighted through demonstrations of the workbench at a DOE sponsored coal utilization conference. A white paper has been completed that contains recommendations on the use of component architectures, model interface protocols and software frameworks for developing a Vision 21 plant simulator.

11. Simulating Subsurface Reactive Flows on Ultrascale Computers with PFLOTRAN

Mills, R. T.; Hammond, G. E.; Lichtner, P. C.; Lu, C.; Smith, B. F.; Philip, B.

2009-12-01

To provide true predictive utility, subsurface simulations often must accurately resolve--in three dimensions--complicated, multi-phase flow fields in highly heterogeneous geology with numerous chemical species and complex chemistry. This task is especially daunting because of the wide range of spatial scales involved--from the pore scale to the field scale--ranging over six orders of magnitude, and the wide range of time scales ranging from seconds or less to millions of years. This represents a true "Grand Challenge" computational problem, requiring not only the largest-scale ("ultrascale") supercomputers, but accompanying advances in algorithms for the efficient numerical solution of systems of PDEs using these machines, and in mathematical modeling techniques that can adequately capture the truly multi-scale nature of these problems. We describe some of the specific challenges involved and present the software and algorithmic approaches that are being used in the computer code PFLOTRAN to provide scalable performance for such simulations on tens of thousands of processors. We focus particularly on scalable techniques for solving the large (up to billions of total degrees of freedom), sparse algebraic systems that arise. We also describe ongoing work to address disparate time and spatial scales by both the development of adaptive mesh refinement methods and the use of multiple continuum formulations. Finally, we present some examples from recent simulations conducted on Jaguar, the 150,152-processor-core Cray XT5 system at Oak Ridge National Laboratory that is currently one of the most powerful supercomputers in the world.

12. Benchmarking computational fluid dynamics models for lava flow simulation

Dietterich, Hannah; Lev, Einat; Chen, Jiangzhi

2016-04-01

Numerical simulations of lava flow emplacement are valuable for assessing lava flow hazards, forecasting active flows, interpreting past eruptions, and understanding the controls on lava flow behavior. Existing lava flow models vary in simplifying assumptions, physics, dimensionality, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess existing models and guide the development of new codes, we conduct a benchmarking study of computational fluid dynamics models for lava flow emplacement, including VolcFlow, OpenFOAM, FLOW-3D, and COMSOL. Using the new benchmark scenarios defined in Cordonnier et al. (Geol Soc SP, 2015) as a guide, we model viscous, cooling, and solidifying flows over horizontal and sloping surfaces, topographic obstacles, and digital elevation models of natural topography. We compare model results to analytical theory, analogue and molten basalt experiments, and measurements from natural lava flows. Overall, the models accurately simulate viscous flow with some variability in flow thickness where flows intersect obstacles. OpenFOAM, COMSOL, and FLOW-3D can each reproduce experimental measurements of cooling viscous flows, and FLOW-3D simulations with temperature-dependent rheology match results from molten basalt experiments. As natural test cases, we apply these models to reconstruct past lava flows in Hawai'i and Saudi Arabia, using parameters assembled from morphology, textural analysis, and eruption observations. Our study highlights the strengths and weaknesses of each code, including accuracy and computational costs, and provides insights regarding code selection.

13. Effects of isorhamnetin on tyrosinase: inhibition kinetics and computational simulation.

PubMed

Si, Yue-Xiu; Wang, Zhi-Jiang; Park, Daeui; Jeong, Hyoung Oh; Ye, Sen; Chung, Hae Young; Yang, Jun-Mo; Yin, Shang-Jun; Qian, Guo-Ying

2012-01-01

We studied the inhibitory effects of isorhamnetin on mushroom tyrosinase by inhibition kinetics and computational simulation. Isorhamnetin reversibly inhibited tyrosinase in a mixed-type manner with Ki = 0.235±0.013 mM. Measurements of intrinsic and 1-anilinonaphthalene-8-sulfonate (ANS) binding fluorescence showed that isorhamnetin did not induce significant changes in the tertiary structure of tyrosinase. To gain insight into the inactivation process, the kinetics were computed via time-interval measurements and continuous substrate reactions. The results indicated that inactivation induced by isorhamnetin was a first-order reaction with biphasic processes. To gain further insight, we simulated docking between tyrosinase and isorhamnetin. Simulation was successful (binding energies for Dock6.3: -32.58 kcal/mol, for AutoDock4.2: -5.66 kcal/mol, and for Fred2.2: -48.86 kcal/mol), suggesting that isorhamnetin interacts with several residues, such as HIS244 and MET280. This strategy of predicting tyrosinase interaction in combination with kinetics based on a flavanone compound might prove useful in screening for potential natural tyrosinase inhibitors.

14. Simulation of computed tomography dose based on voxel phantom

Liu, Chunyu; Lv, Xiangbo; Li, Zhaojun

2017-01-01

Computed tomography (CT) is one of the preferred and most valuable imaging tools used in diagnostic radiology, providing high-quality cross-sectional images of the body. It nevertheless delivers higher radiation doses to patients than other radiological procedures. The Monte Carlo method is appropriate for estimating the radiation dose during CT examinations. A simulation of the Computed Tomography Dose Index (CTDI) phantom was developed in this paper. Under conditions similar to those used in physical measurements, dose profiles were calculated and compared against reported measured values. The results demonstrate good agreement between the calculated and measured doses. In simulations of different CT exams using the voxel phantom, the highest absorbed doses were recorded for the lung, the brain, and the bone surface. A comparison between the different scan types shows that the effective dose for a chest scan is the highest, whereas the effective doses for abdomen and pelvis scans are very close; the lowest effective dose resulted from the head scan. Although the dose in CT depends on various parameters, such as tube current, exposure time, beam energy, slice thickness, and patient size, this study demonstrates that Monte Carlo simulation is a useful tool to accurately estimate the dose delivered to specific organs of patients undergoing CT exams, and it can also be a valuable technique for the design and optimization of the CT x-ray source.
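
As a much-reduced illustration of why Monte Carlo sampling suits dose estimation, the following toy model deposits photons along exponentially distributed free paths in a homogeneous slab. All parameter values are invented, and a real CT dose code adds scatter, an energy spectrum, and voxelized anatomy:

```python
import math
import random

def mc_depth_dose(mu, depth_bins, bin_w, n_photons=100_000, seed=2):
    """Toy Monte Carlo depth-dose estimate in a homogeneous slab.

    Each photon is absorbed at a free path sampled from the
    exponential attenuation law with linear attenuation coefficient
    mu (1/cm); counts per depth bin approximate the dose profile.
    """
    rng = random.Random(seed)
    dose = [0] * depth_bins
    for _ in range(n_photons):
        # inverse-transform sampling of the free path, x ~ Exp(mu)
        x = -math.log(1.0 - rng.random()) / mu
        b = int(x // bin_w)
        if b < depth_bins:
            dose[b] += 1          # energy deposited in bin b
    return dose

depth_dose = mc_depth_dose(mu=0.2, depth_bins=10, bin_w=1.0)
```

The statistical noise shrinks as the square root of `n_photons`, which is why production dose simulations track huge photon histories on dedicated codes such as EGSnrc or MCNP.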

15. Computer simulation of the rodent spermatogonial stem cell niche.

PubMed

de Rooij, Dirk G; van Beek, Maria E A B

2013-05-01

A computer program has been developed that simulates the behavior of spermatogonial stem cells (SSCs) and their offspring inside and outside of the stem cell niche. Various parameters derived from previous morphological and cell kinetic studies have been used to set up an Excel-based computer program that simulates the proliferative activity of SSCs during the seminiferous epithelial cycle. SSCs and their offspring are depicted in a virtual piece of seminiferous tubule in which the daughter cells of self-renewing divisions of SSCs migrate away from each other, while after SSC differentiation a pair of cells is formed. Those SSC daughter cells that migrate out of the niche will very likely differentiate at their next division. Putting in physiologically acceptable parameters, the program renders numbers of spermatogonial cell types similar to those previously counted in whole mounts of seminiferous tubules. In this model, SSC numbers and numbers of differentiating cells remain constant for more than 50 virtual epithelial cycles, i.e., more than 1 yr of a mouse life and 2 yr of that of a Chinese hamster. The program can simulate various recent cell kinetic experiments and confirms, or offers alternative explanations for, the results obtained, showing its usefulness in spermatogenesis research.
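The self-renewal versus differentiation bookkeeping described above can be caricatured in a few lines. This sketch uses invented parameter values and omits the niche geometry and daughter-cell migration modeled in the actual program:

```python
import random

def simulate_niche(n_ssc=100, p_self_renew=0.5, cycles=50, seed=1):
    """Toy stochastic sketch of SSC behavior over epithelial cycles.

    Each cycle every SSC divides: with probability p_self_renew both
    daughters remain stem cells (and would migrate apart in the niche);
    otherwise the division yields a differentiating pair that leaves
    the stem cell pool. All parameters here are illustrative, not
    those of the published model.
    """
    rng = random.Random(seed)
    ssc, differentiated = n_ssc, 0
    for _ in range(cycles):
        renewals = sum(rng.random() < p_self_renew for _ in range(ssc))
        differentiated += 2 * (ssc - renewals)   # each differentiation -> a pair
        ssc = 2 * renewals                       # each self-renewal -> 2 SSCs
    return ssc, differentiated

stem, diff = simulate_niche()
```

With `p_self_renew = 0.5` the branching process is critical, so the stem cell count fluctuates around its starting value over many virtual cycles, mirroring the long-term constancy reported for the published model.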

16. Trace contaminant control simulation computer program, version 8.1

NASA Technical Reports Server (NTRS)

Perry, J. L.

1994-01-01

The Trace Contaminant Control Simulation computer program is a tool for assessing the performance of various process technologies for removing trace chemical contamination from a spacecraft cabin atmosphere. Included in the simulation are chemical and physical adsorption by activated charcoal, chemical adsorption by lithium hydroxide, absorption by humidity condensate, and low- and high-temperature catalytic oxidation. Means are provided for simulating regenerable as well as nonregenerable systems. The program provides an overall mass balance of chemical contaminants in a spacecraft cabin given specified generation rates. Removal rates are based on device flow rates specified by the user and calculated removal efficiencies based on cabin concentration and removal technology experimental data. Versions 1.0 through 8.0 are documented in NASA TM-108409. TM-108409 also contains a source file listing for version 8.0. Changes to version 8.0 are documented in this technical memorandum and a source file listing for the modified version, version 8.1, is provided. Detailed descriptions for the computer program subprograms are extracted from TM-108409 and modified as necessary to reflect version 8.1. Version 8.1 supersedes version 8.0. Information on a separate user's guide is available from the author.

17. Textbook Multigrid Efficiency for Computational Fluid Dynamics Simulations

NASA Technical Reports Server (NTRS)

Brandt, Achi; Thomas, James L.; Diskin, Boris

2001-01-01

Considerable progress over the past thirty years has been made in the development of large-scale computational fluid dynamics (CFD) solvers for the Euler and Navier-Stokes equations. Computations are used routinely to design the cruise shapes of transport aircraft through complex-geometry simulations involving the solution of 25-100 million equations; in this arena the number of wind-tunnel tests for a new design has been substantially reduced. However, simulations of the entire flight envelope of the vehicle, including maximum lift, buffet onset, flutter, and control effectiveness have not been as successful in eliminating the reliance on wind-tunnel testing. These simulations involve unsteady flows with more separation and stronger shock waves than at cruise. The main reasons limiting further inroads of CFD into the design process are: (1) the reliability of turbulence models; and (2) the time and expense of the numerical simulation. Because of the prohibitive resolution requirements of direct simulations at high Reynolds numbers, transition and turbulence modeling is expected to remain an issue for the near term. The focus of this paper addresses the latter problem by attempting to attain optimal efficiencies in solving the governing equations. Typically current CFD codes based on the use of multigrid acceleration techniques and multistage Runge-Kutta time-stepping schemes are able to converge lift and drag values for cruise configurations within approximately 1000 residual evaluations. An optimally convergent method is defined as having textbook multigrid efficiency (TME), meaning the solutions to the governing system of equations are attained in a computational work which is a small (less than 10) multiple of the operation count in the discretized system of equations (residual equations). In this paper, a distributed relaxation approach to achieving TME for Reynolds-averaged Navier-Stokes (RANS) equations is discussed along with the foundations that form the
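
The multigrid idea behind TME (cheap smoothing on each grid plus coarse-grid correction) can be illustrated on a 1D Poisson model problem. This sketch (weighted Jacobi smoothing, full-weighting restriction, linear-interpolation prolongation) shows the mesh hierarchy at work; it is not the paper's distributed relaxation scheme for the RANS equations:

```python
import numpy as np

def apply_A(u, h):
    """Matrix-free 1D Poisson operator (-u'' with zero Dirichlet BCs)."""
    au = 2.0 * u
    au[:-1] -= u[1:]
    au[1:] -= u[:-1]
    return au / h**2

def smooth(u, f, h, sweeps=3, omega=2.0/3.0):
    """Weighted (damped) Jacobi sweeps: kill high-frequency error."""
    for _ in range(sweeps):
        u = u + omega * (h**2 / 2.0) * (f - apply_A(u, h))
    return u

def v_cycle(u, f, h):
    """One V-cycle: smooth, restrict residual, recurse, correct, smooth."""
    n = u.size
    if n <= 3:                                   # coarsest grid: solve directly
        A = (2*np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
        return np.linalg.solve(A, f)
    u = smooth(u, f, h)                          # pre-smoothing
    r = f - apply_A(u, h)
    rc = (r[0:-2:2] + 2*r[1:-1:2] + r[2::2]) / 4.0   # full weighting
    ec = v_cycle(np.zeros_like(rc), rc, 2*h)     # coarse-grid correction
    e = np.zeros(n)                              # linear interpolation
    e[1::2] = ec
    e[2:-1:2] = 0.5 * (ec[:-1] + ec[1:])
    e[0], e[-1] = 0.5 * ec[0], 0.5 * ec[-1]
    return smooth(u + e, f, h)                   # post-smoothing

n, h = 63, 1.0 / 64
grid = np.linspace(h, 1 - h, n)
f = np.sin(np.pi * grid)
u = np.zeros(n)
for _ in range(10):
    u = v_cycle(u, f, h)
```

Each V-cycle costs O(n) work yet reduces the residual by a roughly grid-independent factor, which is the property TME generalizes to full flow solvers.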

18. Relative performances of several scientific computers for a liquid molecular dynamics simulation. [Computers tested are: VAX 11/70, CDC 7600, CRAY-1, CRAY-1*, VAX-FPSAP

SciTech Connect

Ceperley, D.M.

1980-08-01

Some of the computational characteristics of simulations and the author's experience in using his standard simulation program called CLAMPS on several scientific computers are discussed. CLAMPS is capable of performing Metropolis Monte Carlo and Molecular Dynamics simulations of arbitrary mixtures of single atoms. The computational characteristics of simulations and what makes a good simulation computer are also summarized.

19. A Computer-Based Simulation of an Acid-Base Titration

ERIC Educational Resources Information Center

Boblick, John M.

1971-01-01

Reviews the advantages of computer simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)

20. A digital computer program for the dynamic interaction simulation of controls and structure (DISCOS), volume 1

NASA Technical Reports Server (NTRS)

Bodley, C. S.; Devers, A. D.; Park, A. C.; Frisch, H. P.

1978-01-01

A theoretical development and associated digital computer program system for the dynamic simulation and stability analysis of passive and actively controlled spacecraft are presented. The dynamic system (spacecraft) is modeled as an assembly of rigid and/or flexible bodies not necessarily in a topological tree configuration. The computer program system is used to investigate total system dynamic characteristics, including interaction effects between rigid and/or flexible bodies, control systems, and a wide range of environmental loadings. In addition, the program system is used for designing attitude control systems and for evaluating total dynamic system performance, including time domain response and frequency domain stability analyses.

1. Computer simulation of a backscattered X-ray fluorescence system.

PubMed

Al-Ghorabie, Fayez H H

2015-01-01

An EGSnrc user code was developed to simulate a backscatter-geometry in vivo x-ray fluorescence system for the measurement of platinum concentration in head and neck tumours. The user code is based on a previous study that used the EGS4 Monte Carlo code. The new user code developed in this study includes improvements that enable it to simulate photon transport through the different components of the modelled x-ray fluorescence system. The simulation included models of the photon source, collimators, phantoms, and detector. Simulation results were compared and evaluated against x-ray fluorescence data obtained experimentally from an existing system developed by the Swansea In Vivo Analysis and Cancer Research Group. In addition, the simulation results of this study were compared with those of our previous study, in which the EGS4 user code was used. The comparison shows that the new EGSnrc user code reproduces the spectral shape obtained with the experimental x-ray fluorescence system. The area under the Compton peak differs by 2.5% between the experimental measurement and the EGSnrc simulation; similarly, the areas under the two Pt Kα peaks differ by 2.3% and 2.2%.

2. Computational study of the rate constants and free energies of intramolecular radical addition to substituted anilines

PubMed Central

Seddiqzai, Meriam; Dahmen, Tobias; Sure, Rebecca

2013-01-01

The intramolecular radical addition to aniline derivatives was investigated by DFT calculations. The computational methods were benchmarked by comparing the calculated rate constant for the 5-exo cyclization of the hexenyl radical with the experimental value. The dispersion-corrected PW6B95-D3 functional reproduced the experimental free activation barrier to within about 0.5 kcal mol−1 and was therefore employed in the further calculations. Corrections for intramolecular London dispersion and solvation effects in the quantum chemical treatment are essential to obtain consistent and accurate theoretical data. For the investigated radical addition it turned out that the polarity of the molecules is important and that a combination of electrophilic radicals with preferably nucleophilic arenes gives the highest rate constants. This is opposite to the Minisci reaction, where the radical acts as nucleophile and the arene as electrophile. Substitution at the N-atom of the aniline is crucial: methyl substitution leads to slower addition than phenyl substitution, and carbamate substituents are suitable only when the radical center is not too electrophilic. No correlation between free reaction barriers and free reaction energies (ΔG‡ and ΔGR) is found. Addition reactions leading to indanes or dihydrobenzofurans are too slow to be synthetically useful. PMID:24062821
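The sensitivity of computed rate constants to the free activation barrier follows from the Eyring equation, k = (kB·T/h)·exp(−ΔG‡/RT): the ~0.5 kcal mol−1 accuracy quoted above corresponds to roughly a factor of two in k at room temperature. A minimal sketch (the barrier values are illustrative, not taken from the study):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
R = 8.314462618      # gas constant, J/(mol*K)

def eyring_rate(dg_activation_kcal, temp=298.15):
    """Unimolecular rate constant (s^-1) from a free activation barrier
    via the Eyring equation, k = (kB*T/h) * exp(-dG_act / (R*T))."""
    dg_j = dg_activation_kcal * 4184.0          # kcal/mol -> J/mol
    return (K_B * temp / H) * math.exp(-dg_j / (R * temp))

# a 0.5 kcal/mol shift in the barrier changes k by ~2.3x at 298 K
ratio = eyring_rate(10.0) / eyring_rate(10.5)
```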

3. Computational issues connected with 3D N-body simulations

Pfenniger, D.; Friedli, D.

1993-03-01

Computational problems related to modeling gravitational systems, and running and analyzing 3D N-body models are discussed. N-body simulations using Particle-Mesh techniques with polar grids are especially well-suited, and physically justified, when studying quiet evolutionary processes in disk galaxies. This technique allows large N, high central resolution, and is still the fastest one. Regardless of the method chosen to compute gravitation, softening is a compromise between HF amplification and resolution. Softened spherical and ellipsoidal kernels with variable resolution are set up. Detailed characteristics of the 3D polar grid, tests, code performances, and vectorization rates are also given. For integrating motion in rotating coordinates, a stable symplectic extension of the leap-frog algorithm is described. The technique used to search for periodic orbits in arbitrary N-body potentials and to determine their stability is explained.
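The leap-frog scheme mentioned above can be sketched in its standard kick-drift-kick form with Plummer softening; this is a direct-summation illustration only (particle counts, units with G = 1, and the softening value are assumptions, and the rotating-frame symplectic extension of the paper is not reproduced):

```python
import numpy as np

def accelerations(pos, mass, eps):
    """Pairwise gravitational accelerations with Plummer softening (G = 1)."""
    n = len(mass)
    acc = np.zeros_like(pos)
    for i in range(n):
        d = pos - pos[i]                      # vectors to all other bodies
        r2 = (d**2).sum(axis=1) + eps**2      # softened squared distances
        r2[i] = 1.0                           # placeholder to avoid divide-by-zero
        inv_r3 = r2**-1.5
        inv_r3[i] = 0.0                       # no self-force
        acc[i] = (mass[:, None] * d * inv_r3[:, None]).sum(axis=0)
    return acc

def leapfrog(pos, vel, mass, dt, steps, eps=0.05):
    """Kick-drift-kick leap-frog integration of the N-body equations."""
    acc = accelerations(pos, mass, eps)
    for _ in range(steps):
        vel += 0.5 * dt * acc                 # half kick
        pos += dt * vel                       # drift
        acc = accelerations(pos, mass, eps)
        vel += 0.5 * dt * acc                 # half kick
    return pos, vel
```

Because the scheme is symplectic, the energy error stays bounded over long integrations instead of drifting secularly.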

4. Computational strategies in the dynamic simulation of constrained flexible MBS

NASA Technical Reports Server (NTRS)

Amirouche, F. M. L.; Xie, M.

1993-01-01

This research focuses on the computational dynamics of flexible constrained multibody systems. At first a recursive mapping formulation of the kinematical expressions in a minimum dimension as well as the matrix representation of the equations of motion are presented. The method employs Kane's equation, FEM, and concepts of continuum mechanics. The generalized active forces are extended to include the effects of high temperature conditions, such as creep, thermal stress, and elastic-plastic deformation. The time variant constraint relations for rolling/contact conditions between two flexible bodies are also studied. The constraints for validation of MBS simulation of gear meshing contact using a modified Timoshenko beam theory are also presented. The last part deals with minimization of vibration/deformation of the elastic beam in multibody systems making use of time variant boundary conditions. The above methodologies and computational procedures developed are being implemented in a program called DYAMUS.

5. Computational simulation of intermingled-fiber hybrid composite behavior

NASA Technical Reports Server (NTRS)

Mital, Subodh K.; Chamis, Christos C.

1992-01-01

Three-dimensional finite-element analysis and a micromechanics-based computer code, ICAN (Integrated Composite Analyzer), are used to predict the composite properties and microstresses of a unidirectional graphite/epoxy primary composite with varying percentages of S-glass fibers used as hybridizing fibers at a total fiber volume of 0.54. The three-dimensional finite-element model used in the analyses consists of a group of nine fibers, all unidirectional, in a three-by-three unit cell array. There is generally good agreement between the composite properties and microstresses obtained from the two methods. The results indicate that finite-element methods and the micromechanics equations embedded in the ICAN computer code can be used to obtain the properties of intermingled-fiber hybrid composites needed for the analysis and design of hybrid composite structures. However, the finite-element model must be large enough to simulate the conditions assumed in the micromechanics equations.

6. Computer simulation of hypothetical criticality accidents in aqueous fissile solutions

SciTech Connect

Hetrick, D.L.

1991-01-01

The purpose of this paper is to describe recent developments in computer simulation of hypothetical criticality accidents in aqueous fissile solutions of uranium and plutonium such as might be encountered in fuel fabrication and reprocessing operations. Models for reactivity shutdown mechanisms and equations of state have been combined to permit estimates of fission yield, inertial pressure, and kinetic energy for a wide range of pulse sizes and time scales. Improvements to previously published models are reported along with some recent applications. Information obtained from pulsed solution assemblies (KEWB, CRAC, SILENE, and SHEBA) and from past criticality accidents was used in the development of computer models. Applications include slow events lasting many hours (hypothetical undetected laboratory accidents) and large-yield millisecond pulses in which evolution of radiolytic gas may be important (severe accidents and pulsed reactors).

7. Probabilistic lifetime strength of aerospace materials via computational simulation

NASA Technical Reports Server (NTRS)

Boyce, Lola; Keating, Jerome P.; Lovelace, Thomas B.; Bast, Callie C.

1991-01-01

The results of a second year effort of a research program are presented. The research included development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic phenomenological constitutive relationship, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects of primitive variables. These primitive variables often originate in the environment and may include stress from loading, temperature, chemical, or radiation attack. This multifactor interaction constitutive equation is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the constitutive equation using actual experimental materials data together with the multiple linear regression of that data.

8. Saturation in excitatory synapses of hippocampus investigated by computer simulations.

PubMed

Ventriglia, Francesco

2004-05-01

The standard view of synaptic function in excitatory synapses has been deeply questioned by recent experimental data on hippocampal glutamate synapses, both for possible receptor nonsaturation and for larger, non-Gaussian peak amplitude fluctuations. Our previous investigations of the mechanisms underlying the variability of the response of hippocampal glutamatergic synapses, carried out by computer simulation of simple Brownian models of glutamate diffusion, furnished initial evidence of their presynaptic character. A new, refined model, reported here, assumes a collision volume for the glutamate molecule and a more realistic description of receptors and their binding dynamics. Based on this model, conditions for AMPA and NMDA receptor saturation have been investigated and new miniature (or quantal) EPSC parameters have been computed. The results corroborate the hypothesis that the lack of AMPA and NMDA receptor saturation and the stochastic variability of the EPSC are attributable to the small volume of glutamatergic synaptic vesicles, and hence to the small number of glutamate molecules diffusing in the cleft after a vesicle release. The investigations better characterize some poorly understood elements of the synaptic structure, such as the fusion pore, and provide useful information on AMPA receptor dynamics. Indeed, a close fit between computed EPSCs and miniature EPSCs from the recent experimental literature allowed the computation of new transition times among the different AMPA receptor states through a trial-and-error optimization procedure. Moreover, the model has been used to evaluate two hypotheses on the genesis of the long-term potentiation phenomenon.
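A minimal Brownian model of the kind described can be sketched as follows: released molecules take Gaussian random steps, reflect off the presynaptic membrane, and bind on first contact with a receptor patch on the postsynaptic side. All geometry and parameter values here are illustrative assumptions, not those of the refined model in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_release(n_molecules=800, cleft_height=20e-9, d_coeff=3e-10,
                     dt=1e-9, steps=2000, receptor_radius=100e-9):
    """Brownian walk of glutamate molecules released at the presynaptic
    membrane of a synaptic cleft; returns how many reach the receptor
    patch on the postsynaptic membrane (all values are illustrative)."""
    sigma = np.sqrt(2 * d_coeff * dt)        # rms step per axis
    pos = np.zeros((n_molecules, 3))         # release site at the origin
    bound = np.zeros(n_molecules, bool)
    for _ in range(steps):
        free = ~bound
        pos[free] += rng.normal(0.0, sigma, (free.sum(), 3))
        # reflect at the presynaptic membrane (z = 0)
        pos[free, 2] = np.abs(pos[free, 2])
        # molecules crossing the postsynaptic membrane (z = cleft_height)
        hit = free & (pos[:, 2] >= cleft_height)
        # bind if inside the receptor patch, else reflect back into the cleft
        r = np.hypot(pos[:, 0], pos[:, 1])
        bound |= hit & (r <= receptor_radius)
        pos[hit & ~bound, 2] = 2 * cleft_height - pos[hit & ~bound, 2]
    return int(bound.sum())
```

Varying `n_molecules` (the vesicle content) in such a model is what probes the saturation question discussed above.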

9. Flight Simulation of Taketombo Based on Computational Fluid Dynamics and Computational Flight Dynamics

Kawamura, Kohei; Ueno, Yosuke; Nakamura, Yoshiaki

In the present study we have developed a numerical method to simulate the flight dynamics of a small flying body with unsteady motion, where both aerodynamics and flight dynamics are fully considered. A key point of this numerical code is to use computational fluid dynamics and computational flight dynamics at the same time, which is referred to as CFD2, or double CFDs, where several new ideas are adopted in the governing equations, the method to make each quantity nondimensional, and the coupling method between aerodynamics and flight dynamics. This numerical code can be applied to simulate the unsteady motion of small vehicles such as micro air vehicles (MAV). As a sample calculation, we take up Taketombo, or a bamboo dragonfly, and its free flight in the air is demonstrated. The eventual aim of this research is to virtually fly an aircraft with arbitrary motion to obtain aerodynamic and flight dynamic data, which cannot be taken in the conventional wind tunnel.

10. Computer aided analysis, simulation and optimisation of thermal sterilisation processes.

PubMed

Narayanan, C M; Banerjee, Arindam

2013-04-01

Although thermal sterilisation is a widely employed industrial process, little work has been reported in the available literature, including patents, on the mathematical analysis and simulation of these processes. In the present work, software packages have been developed for computer-aided optimum design of thermal sterilisation processes. Systems involving steam sparging, jacketed heating/cooling, helical coils submerged in agitated vessels, and systems that employ external heat exchangers (double pipe, shell and tube, and plate exchangers) have been considered. Both batch and continuous operations have been analysed and simulated. The dependence of the del factor on system and operating parameters, such as the mass or volume of substrate to be sterilised per batch, speed of agitation, helix diameter, substrate-to-steam ratio, and the rates of substrate circulation through the heat exchanger and through the holding tube, has been analysed separately for each mode of sterilisation. Axial dispersion in the holding tube has been accounted for through an appropriately defined axial dispersion coefficient, and the effect of exchanger characteristics and specifications on system performance has also been analysed. The multiparameter computer-aided design (CAD) software packages are thus highly versatile and permit the optimum choice of operating variables for the processes selected. The computed results have been compared with extensive data collected from a number of industries (distilleries, food processing and pharmaceutical industries) and pilot plants, and satisfactory agreement has been observed between the two, ascertaining the accuracy of the CAD software developed. No simplifying assumptions were made during the analysis, and the design of the associated heating/cooling equipment was performed using the most up-to-date design correlations and computer software.
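The del factor referred to above is the logarithmic reduction in viable organisms, ∇ = ln(N0/N); for first-order thermal death kinetics it is the time integral of an Arrhenius rate constant over the temperature history of the batch. A minimal sketch (the Arrhenius parameters are assumptions of the order reported for Bacillus stearothermophilus spores, not values from the paper):

```python
import math

def arrhenius_k(temp_k, a=1e36, e_a=2.8e5, r_gas=8.314):
    """First-order thermal death rate constant in 1/s (illustrative
    Arrhenius pre-exponential factor and activation energy)."""
    return a * math.exp(-e_a / (r_gas * temp_k))

def del_factor(temp_profile, dt):
    """del factor = ln(N0/N) = integral of k(T(t)) dt over the cycle,
    approximated as a Riemann sum over a sampled temperature history."""
    return sum(arrhenius_k(t) for t in temp_profile) * dt

# simple batch cycle: 10 min linear heat-up from 100 C to 121 C,
# then a 15 min hold at 121 C, sampled once per second
heat_up = [373.15 + 21.0 * i / 600 for i in range(600)]
hold = [394.15] * 900
nabla = del_factor(heat_up + hold, dt=1.0)
```

Because k is exponential in temperature, the hold phase dominates the integral, which is why the heat-up and cool-down contributions are often treated separately in design calculations.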

11. Software Development Processes Applied to Computational Icing Simulation

NASA Technical Reports Server (NTRS)

Levinson, Laurie H.; Potapczuk, Mark G.; Mellor, Pamela A.

1999-01-01

The development of computational icing simulation methods is making the transition from research to commonplace use in design and certification efforts. As such, standards of code management, design validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. This paper discusses these concepts with regard to current and future icing simulation code development efforts as implemented by the Icing Branch of the NASA Lewis Research Center in collaboration with the NASA Lewis Engineering Design and Analysis Division. With the application of the techniques outlined in this paper, the LEWICE ice accretion code has become a more stable and reliable software product.

12. Simulation of computed radiography with imaging plate detectors

SciTech Connect

Tisseur, D.; Costin, M.; Mathy, F.; Schumm, A.

2014-02-18

Computed radiography (CR) using phosphor imaging plate detectors is taking an increasingly important place in radiographic testing. CR uses equipment similar to that of conventional radiography, except that the classical X-ray film is replaced by a digital detector, called an imaging plate (IP), which is made of a photostimulable layer and is read by a scanning device through photostimulated luminescence. Such digital radiography has already demonstrated important benefits in terms of exposure time, lower source energies and thus a reduced radioprotection area, besides being a solution without effluents. This paper presents a model for the simulation of radiography with imaging plate detectors in CIVA, together with examples of validation of the model. The study consists of a cross comparison between experimental and simulation results obtained on a step wedge with a classical X-ray tube. Results are presented in particular for wire image quality indicators (IQIs) and duplex IQIs.

13. Adaptive quantum computation in changing environments using projective simulation

PubMed Central

Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

2015-01-01

Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks. PMID:26260263

14. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

SciTech Connect

Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Adel Sarofim; Bene Risio

2002-07-28

This is the seventh Quarterly Technical Report for DOE Cooperative Agreement No. DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on the development of the IGCC workbench. A series of parametric CFD simulations for single-stage and two-stage generic gasifier configurations has been performed. An advanced flowing-slag model has been implemented in the CFD-based gasifier model. A literature review of published gasification kinetics has been performed. Reactor models have been developed and implemented in the workbench for the majority of the heat exchangers, the gas clean-up system and the power generation system of the Vision 21 reference configuration. Modifications to the software infrastructure of the workbench have begun, to allow interfacing to workbench reactor models that use the CAPE-Open software interface protocol.

15. Developing adaptive QM/MM computer simulations for electrochemistry.

PubMed

Dohm, Sebastian; Spohr, Eckhard; Korth, Martin

2017-01-05

We report the development of adaptive QM/MM computer simulations for electrochemistry, providing public access to all sources via the free and open-source software development model. We present a modular, workflow-based MD simulation code as a platform for algorithms that partition space into different regions, which can be treated at different levels of theory on a per-timestep basis. The currently implemented algorithms focus on targeting molecules and their solvation layers relevant to electrochemistry. Instead of using built-in force fields and quantum mechanical methods, the code features a universal interface that allows extension to a range of external force-field and quantum chemistry programs, enabling the user to readily implement interfaces to those programs. The purpose of this article is to describe our code and illustrate its usage. © 2016 Wiley Periodicals, Inc.

16. Computer simulations of heterologous immunity: highlights of an interdisciplinary cooperation.

PubMed

Calcagno, Claudia; Puzone, Roberto; Pearson, Yanthe E; Cheng, Yiming; Ghersi, Dario; Selin, Liisa K; Welsh, Raymond M; Celada, Franco

2011-06-01

The relationship between biological research and mathematical modeling is complex, critical, and vital. In this review, we summarize the results of the collaboration between two laboratories, exploring the interaction between mathematical modeling and wet-lab immunology. During this collaboration several aspects of the immune defence against viral infections were investigated, focusing primarily on the subject of heterologous immunity. In this manuscript, we emphasize the topics where computational simulations were applied in conjunction with experiments, such as immune attrition, the growing and shrinking of cross-reactive T cell repertoires following repeated infections, the short- and long-term effects of cross-reactive immunological memory, and the factors influencing the appearance of new clonal specificities. For each topic, we describe how the mathematical model used was adapted to answer specific biological questions, and we discuss the hypotheses that were generated by simulations. Finally, we propose rules for testing hypotheses that emerge from model experimentation in the wet lab, and vice versa.

17. Using computer simulations to study relativistic heavy ion collisions

Murray, Joelle Lynn

1998-12-01

One of the most exciting topics in high-energy nuclear physics is the study of the potential phase transition between hadronic and partonic matter. Information about this transition, if it exists and can be experimentally determined, would be vital to understanding the confinement of quarks and gluons inside hadrons. New accelerators, RHIC and the LHC, will come online in the next few years and will focus on finding evidence for this transition. RHIC will collide Au on Au at center-of-mass energies of 200 GeV/nucleon and create a high-density, high-temperature state of matter. To study the large particle multiplicities that will occur in these experiments, computer simulations are being developed. Within this thesis, one type of simulation is detailed and used to study the invariant mass spectrum of lepton pairs measured at the CERN SPS and several hadronic observables that could be measured at RHIC.

18. Adaptive quantum computation in changing environments using projective simulation.

PubMed

Tiersch, M; Ganahl, E J; Briegel, H J

2015-08-11

Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent's learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent's performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover's search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks.

19. Adaptive quantum computation in changing environments using projective simulation

Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

2015-08-01

Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks.

20. Computational Strategies for Polymer Coated Steel Sheet Forming Simulations

SciTech Connect

Owen, D. R. J.; Andrade Pires, F. M.; Dutko, M.

2007-05-17

This contribution discusses current issues involved in the numerical simulation of large scale industrial forming processes that employ polymer coated steel sheet. The need for rigorous consideration of both theoretical and algorithmic issues is emphasized, particularly in relation to the computational treatment of finite strain deformation of polymer coated steel sheet in the presence of internal degradation. Other issues relevant to the effective treatment of the problem, including the modelling of frictional contact between the work piece and tools, low order element technology capable of dealing with plastic incompressibility and thermo mechanical coupling, are also addressed. The suitability of the overall approach is illustrated by the solution of an industrially relevant problem.

1. Computational Materials: Modeling and Simulation of Nanostructured Materials and Systems

NASA Technical Reports Server (NTRS)

Gates, Thomas S.; Hinkley, Jeffrey A.

2003-01-01

The paper provides details on the structure and implementation of the Computational Materials program at the NASA Langley Research Center. Examples are given that illustrate the suggested approaches to predicting the behavior and influencing the design of nanostructured materials such as high-performance polymers, composites, and nanotube-reinforced polymers. Primary simulation and measurement methods applicable to multi-scale modeling are outlined. Key challenges including verification and validation of models are highlighted and discussed within the context of NASA's broad mission objectives.

2. Tracking Non-rigid Structures in Computer Simulations

SciTech Connect

Gezahegne, A; Kamath, C

2008-01-10

A key challenge in tracking moving objects is the correspondence problem, that is, the correct propagation of object labels from one time step to another. This is especially true when the objects are non-rigid structures, changing shape, and merging and splitting over time. In this work, we describe a general approach to tracking thousands of non-rigid structures in an image sequence. We show how we can minimize memory requirements and generate accurate results while working with only two frames of the sequence at a time. We demonstrate our results using data from computer simulations of a fluid mix problem.
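The two-frame idea can be sketched as follows: label each connected structure in the current frame and reuse the label of the maximally overlapping structure from the previous frame, assigning fresh labels to unmatched structures. This is an illustrative simplification that holds only two frames in memory but ignores the merge/split bookkeeping a full tracker needs:

```python
from collections import deque

import numpy as np

def propagate_labels(prev_labels, curr_mask):
    """Label 4-connected structures in curr_mask via flood fill, reusing
    the label of the maximally overlapping structure in prev_labels;
    unmatched structures get fresh labels."""
    h, w = curr_mask.shape
    curr_labels = np.zeros((h, w), int)
    next_label = prev_labels.max() + 1
    for sy in range(h):
        for sx in range(w):
            if curr_mask[sy, sx] and curr_labels[sy, sx] == 0:
                # flood-fill one connected component
                pixels, queue = [], deque([(sy, sx)])
                curr_labels[sy, sx] = -1          # mark visited
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and curr_mask[ny, nx] and curr_labels[ny, nx] == 0):
                            curr_labels[ny, nx] = -1
                            queue.append((ny, nx))
                # pick the previous-frame label with the largest pixel overlap
                overlaps = {}
                for y, x in pixels:
                    p = prev_labels[y, x]
                    if p:
                        overlaps[p] = overlaps.get(p, 0) + 1
                if overlaps:
                    label = max(overlaps, key=overlaps.get)
                else:
                    label, next_label = next_label, next_label + 1
                for y, x in pixels:
                    curr_labels[y, x] = label
    return curr_labels
```

Maximum-overlap matching works well for slowly deforming structures; two structures mapping to the same previous label signal a split, which is where the paper's bookkeeping takes over.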

3. Computer simulation of combination extrusion of ENAW-1050A aluminum

Thomas, P.

2017-02-01

Computer simulation of the combination extrusion process for ENAW-1050A aluminum alloy is presented. The tests were carried out for three values of relative strain in forward direction ε1: 0.77, 0.69 and 0.59. For each value of relative strain ε1, three different values of strain in backward direction, ε2, were taken: 0.41, 0.52, 0.64. The effect of the relative strain degree on the development and values of the punch force was determined. It was demonstrated that the punch force increases with the increasing degree of relative strain in both forward and backward directions.

4. Computational Strategies for Polymer Coated Steel Sheet Forming Simulations

Owen, D. R. J.; Andrade Pires, F. M.; Dutko, M.

2007-05-01

This contribution discusses current issues involved in the numerical simulation of large scale industrial forming processes that employ polymer coated steel sheet. The need for rigorous consideration of both theoretical and algorithmic issues is emphasized, particularly in relation to the computational treatment of finite strain deformation of polymer coated steel sheet in the presence of internal degradation. Other issues relevant to the effective treatment of the problem, including the modelling of frictional contact between the work piece and tools, low order element technology capable of dealing with plastic incompressibility and thermo mechanical coupling, are also addressed. The suitability of the overall approach is illustrated by the solution of an industrially relevant problem.

5. COMPUTATIONAL SIMULATION OF REFRIGERATION PROCESS FOR BEPC II SUPERCONDUCTING FACILITIES.

SciTech Connect

Wang, L.; Jia, L. X.; Du, H. P.; Yang, G. D.

2003-09-22

The main challenge in building the cryogenic system for the Beijing Electron-Positron Collider upgrade is to accommodate the strong differences among three types of superconducting devices with regard to their structure, location, and cryogenic operating requirements. Three cooling methods are applied in the overall cryogenic system: saturated liquid helium cooling for the SRF cavities, single-phase helium cooling for the SCQ magnets, and two-phase helium cooling for the SSM solenoid. The BEPCII cryogenic system was optimized using a large-scale computational simulation package. This paper presents thermal parameters and numerical analyses for the BEPCII cryogenic system.

6. Molecular Dynamics Computer Simulations of Multidrug RND Efflux Pumps.

PubMed

Ruggerone, Paolo; Vargiu, Attilio V; Collu, Francesca; Fischer, Nadine; Kandt, Christian

2013-01-01

Over-expression of multidrug efflux pumps of the Resistance Nodulation Division (RND) protein super family counts among the main causes for microbial resistance against pharmaceuticals. Understanding the molecular basis of this process is one of the major challenges of modern biomedical research, involving a broad range of experimental and computational techniques. Here we review the current state of RND transporter investigation employing molecular dynamics simulations providing conformational samples of transporter components to obtain insights into the functional mechanism underlying efflux pump-mediated antibiotic resistance in Escherichia coli and Pseudomonas aeruginosa.

7. Computer Simulation of Einstein-Podolsky-Rosen-Bohm Experiments

de Raedt, H.; Michielsen, K.

2016-07-01

We review an event-based simulation approach which reproduces the statistical distributions of quantum physics experiments by generating detection events one-by-one according to an unknown distribution and without solving a wave equation. Einstein-Podolsky-Rosen-Bohm laboratory experiments are used as an example to illustrate the applicability of this approach. It is shown that computer experiments that employ the same post-selection procedure as the one used in laboratory experiments produce data that is in excellent agreement with quantum theory.

8. Molecular Dynamics Computer Simulations of Multidrug RND Efflux Pumps

PubMed Central

Ruggerone, Paolo; Vargiu, Attilio V.; Collu, Francesca; Fischer, Nadine; Kandt, Christian

2013-01-01

Over-expression of multidrug efflux pumps of the Resistance Nodulation Division (RND) protein super family counts among the main causes for microbial resistance against pharmaceuticals. Understanding the molecular basis of this process is one of the major challenges of modern biomedical research, involving a broad range of experimental and computational techniques. Here we review the current state of RND transporter investigation employing molecular dynamics simulations providing conformational samples of transporter components to obtain insights into the functional mechanism underlying efflux pump-mediated antibiotic resistance in Escherichia coli and Pseudomonas aeruginosa. PMID:24688701

9. Computational strategies for three-dimensional flow simulations on distributed computer systems

NASA Technical Reports Server (NTRS)

Sankar, Lakshmi N.; Weed, Richard A.

1995-01-01

This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.
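Of the load-balancing strategies listed, the task-queue approach can be sketched with a thread pool: each worker repeatedly pulls the next unprocessed grid block, so faster workers automatically take on more work. This is only an illustration of the scheduling pattern; the original work distributed flow-solver blocks across workstation clusters with PVM rather than threads:

```python
import queue
import threading

def run_task_queue(tasks, n_workers):
    """Dynamic (task-queue) load balancing: workers pull blocks until the
    shared queue is empty, so no static partition of work is needed."""
    work = queue.Queue()
    for task in tasks:
        work.put(task)
    results = []
    lock = threading.Lock()

    def worker():
        while True:
            try:
                block = work.get_nowait()
            except queue.Empty:
                return
            value = sum(block)            # stand-in for a solver sweep on one block
            with lock:
                results.append(value)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

# sixteen "grid blocks" of uneven size, four workers
blocks = [list(range(n)) for n in range(1, 17)]
totals = run_task_queue(blocks, n_workers=4)
```

In contrast, static balancing assigns blocks to workers up front, which performs poorly when block costs are uneven, exactly the situation the task-queue strategy addresses.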

10. Computational strategies for three-dimensional flow simulations on distributed computer systems

Sankar, Lakshmi N.; Weed, Richard A.

1995-08-01

This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.

11. Extending a Flight Management Computer for Simulation and Flight Experiments

NASA Technical Reports Server (NTRS)

Madden, Michael M.; Sugden, Paul C.

2005-01-01

In modern transport aircraft, the flight management computer (FMC) has evolved from a flight planning aid into an important hub for pilot information and for origin-to-destination optimization of flight performance. Current trends indicate increasing roles for the FMC in aviation safety, aviation security, increasing airport capacity, and reducing the environmental impact of aircraft. Related research conducted at the Langley Research Center (LaRC) often requires functional extension of a modern, full-featured FMC. Ideally, transport simulations would include an FMC simulation that could be tailored and extended for experiments. However, due to the complexity of a modern FMC, a large investment (millions of dollars over several years) and scarce domain knowledge would be needed to create such a simulation for transport aircraft. As an intermediate alternative, the Flight Research Services Directorate (FRSD) at LaRC created a set of reusable software products that extend flight management functionality upstream of a Boeing 757 FMC, transparently simulating or sharing its operator interfaces. The paper details the design of these products and highlights their use on NASA projects.

12. A Framework to Simulate Semiconductor Devices Using Parallel Computer Architecture

Kumar, Gaurav; Singh, Mandeep; Bulusu, Anand; Trivedi, Gaurav

2016-10-01

Device simulations have become an integral part of semiconductor technology for addressing the many issues (short-channel effects, narrow-width effects, the hot-electron effect) that arise as devices enter the nano regime, helping the industry continue along Moore's Law. TCAD provides a simulation environment in which to design and develop novel devices and study their electrical behaviour in advance. In this paper, a parallel 2D simulator for semiconductor devices using the Discontinuous Galerkin Finite Element Method (DG-FEM) is presented. The Discontinuous Galerkin (DG) method is used to discretize the essential device equations, which are then analyzed with a suitable methodology to find the solution. The DG method provides more accurate solutions because it efficiently conserves flux and easily handles complex geometries. OpenMP is used to parallelize the solution of the device equations on many-core processors, and a speedup of 1.4x is achieved during the assembly stage of the discretization. This study is a step toward more accurate analysis of novel devices (such as FinFETs and GAAFETs) on a parallel computing platform and toward a parallel device simulator that can perform such analyses efficiently. A case study of a PN junction diode is presented to show the effectiveness of the proposed approach.
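As a minimal companion to the PN-junction case study, the sketch below computes the built-in potential of an abrupt silicon junction, one of the first quantities any drift-diffusion device simulator is checked against. The doping levels and 300 K silicon constants are assumed values for illustration, not figures from the paper.

```python
# Built-in potential of an abrupt PN junction in silicon at 300 K.
# Doping levels are illustrative assumptions.
import math

KT_Q = 0.02585  # thermal voltage kT/q at 300 K, V
NI = 1.5e10     # intrinsic carrier concentration of Si, cm^-3

def built_in_potential(na, nd):
    """Vbi = (kT/q) * ln(Na*Nd / ni^2) for an abrupt junction."""
    return KT_Q * math.log(na * nd / NI**2)

vbi = built_in_potential(1e17, 1e17)
print(round(vbi, 3), "V")  # ~0.81 V for symmetric 1e17 cm^-3 doping
```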

13. The operant reserve: a computer simulation in (accelerated) real time.

PubMed

Catania, A Charles

2005-05-31

In Skinner's Reflex Reserve theory, reinforced responses added to a reserve depleted by responding. It could not handle the finding that partial reinforcement generated more responding than continuous reinforcement, but it would have worked if its growth had depended not just on the last response but also on earlier responses preceding a reinforcer, each weighted by delay. In that case, partial reinforcement generates steady states in which reserve decrements produced by responding balance increments produced when reinforcers follow responding. A computer simulation arranged schedules for responses produced with probabilities proportional to reserve size. Each response subtracted a fixed amount from the reserve and added an amount weighted by the reciprocal of the time to the next reinforcer. Simulated cumulative records and quantitative data for extinction, random-ratio, random-interval, and other schedules were consistent with those of real performances, including some effects of history. The model also simulated rapid performance transitions with changed contingencies that did not depend on molar variables or on differential reinforcement of inter-response times. The simulation can be extended to inhomogeneous contingencies by way of continua of reserves arrayed along response and time dimensions, and to concurrent performances and stimulus control by way of different reserves created for different response classes.
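The reserve dynamics described above can be sketched directly: responses are emitted with probability equal to the reserve, each response subtracts a fixed amount, and each reinforcer adds increments weighted by the reciprocal of the delay from earlier responses. All parameter values below are illustrative assumptions, not Catania's.

```python
# Minimal sketch of the reserve dynamics on a random-ratio schedule.
# Parameter values are illustrative, not from the original simulation.
import random

random.seed(1)
reserve = 0.5          # response probability per time step
decrement = 0.01       # fixed loss per emitted response
ratio = 10             # random-ratio schedule: p(reinforcer) = 1/ratio
response_times = []    # times of responses since the last reinforcer

for t in range(10000):
    if random.random() < reserve:          # emit a response
        reserve = max(0.0, reserve - decrement)
        response_times.append(t)
        if random.random() < 1.0 / ratio:  # response is reinforced
            # each earlier response adds an amount weighted by the
            # reciprocal of its delay to this reinforcer
            for rt in response_times:
                reserve += 0.02 / (t - rt + 1)
            reserve = min(1.0, reserve)
            response_times.clear()

print(round(reserve, 3))  # steady-state reserve under this schedule
```

With these dynamics the reserve settles where response-produced decrements balance reinforcer-produced increments, which is the steady state the abstract describes for partial reinforcement.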

14. Using Microcomputer Simulations in the Classroom: Examples from Undergraduate and Faculty Computer Literacy Courses.

ERIC Educational Resources Information Center

Hart, Jeffrey A.

Examples of the use of computer simulations in two undergraduate courses (American Foreign Policy and Introduction to International Politics) and in a faculty computer literacy course on simulations and artificial intelligence are provided in this compilation of various instructional items. A list of computer simulations available for various…

15. PAH growth initiated by propargyl addition: mechanism development and computational kinetics.

PubMed

Raj, Abhijeet; Al Rashidi, Mariam J; Chung, Suk Ho; Sarathy, S Mani

2014-04-24

Polycyclic aromatic hydrocarbon (PAH) growth is known to be the principal pathway to soot formation during fuel combustion; as such, a physical understanding of the PAH growth mechanism is needed to effectively assess, predict, and control soot formation in flames. Although the hydrogen-abstraction-C2H2-addition (HACA) mechanism is believed to be the main contributor to PAH growth, it has been shown to under-predict some of the experimental data on PAH and soot concentrations in flames. This article presents a submechanism of PAH growth that is initiated by propargyl (C3H3) addition onto naphthalene (A2) and the naphthyl radical. C3H3 has been chosen since it is known to be a precursor of benzene in combustion and has appreciable concentrations in flames. This mechanism has been developed up to the formation of pyrene (A4), and the temperature-dependent kinetics of each elementary reaction has been determined using density functional theory (DFT) computations at the B3LYP/6-311++G(d,p) level of theory and transition state theory (TST). H-abstraction, H-addition, H-migration, β-scission, and intramolecular addition reactions have been taken into account. The energy barriers of the two main pathways (H-abstraction and H-addition) were found to be relatively small if not negative, whereas the energy barriers of the other pathways were in the range of 6-89 kcal·mol⁻¹. The rates reported in this study may be extrapolated to larger PAH molecules that have a zigzag site similar to that in naphthalene, and the mechanism presented herein may be used as a complement to the HACA mechanism to improve prediction of PAH and soot formation.
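To show how computed barriers translate into temperature-dependent kinetics, here is a minimal sketch of the conventional TST rate expression k(T) = (kB·T/h)·exp(−Ea/RT), with a unit transmission coefficient and unit partition-function ratio. The 30 kcal/mol barrier is merely an illustrative value within the 6-89 kcal/mol range quoted above, not a rate from the paper.

```python
# Conventional transition-state-theory rate constant, simplified to
# k(T) = (kB*T/h) * exp(-Ea/(R*T)). The barrier value is illustrative.
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
R = 1.987204e-3      # gas constant, kcal/(mol*K)

def tst_rate(barrier_kcal, temp_k):
    """Simplified TST rate constant in s^-1 (unit prefactor ratio)."""
    return (KB * temp_k / H) * math.exp(-barrier_kcal / (R * temp_k))

# Rates rise steeply with flame temperature for a 30 kcal/mol barrier.
for t in (1000.0, 1500.0, 2000.0):
    print(t, tst_rate(30.0, t))
```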

16. Computer simulation of coal preparation plants. Part 2. User's manual. Final report

SciTech Connect

Gottfried, B.S.; Tierney, J.W.

1985-12-01

This report describes a comprehensive computer program that allows the user to simulate the performance of realistic coal preparation plants. The program is flexible in the sense that it can accommodate any particular plant configuration of interest, allowing the user to compare the performance of different plant configurations and to determine the impact of various modes of operation with the same configuration. In addition, the program can be used to assess the degree of cleaning obtained with different coal feeds for a given plant configuration and a given mode of operation. Use of the simulator requires that the user specify the plant configuration, the plant operating conditions, and a description of the coal feed. The simulator then determines the flowrates within the plant and the characteristics of each stream (i.e., the weight distribution, percent ash, pyritic and total sulfur, moisture, and Btu content). The simulation program has been written in modular form in Fortran. It can be implemented on many different types of computers, ranging from large scientific mainframes to IBM-type personal computers with a fixed disk; some customization may be required, however, to ensure compatibility with the features of Fortran available on a particular computer. Part I of this report contains a general description of the methods used to carry out the simulation: each of the major types of units is described separately, along with the overall system analysis. Part II is intended as a user's manual. It contains a listing of the mainframe version of the program, instructions for its use (on both a mainframe and a microcomputer), and output for a representative sample problem.
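As a toy illustration of the stream descriptions such a simulator reports, the sketch below closes a two-product mass and ash balance around a hypothetical cleaning circuit. All numbers are invented for illustration and are not taken from the program's sample problem.

```python
# Hypothetical two-stream mass balance: a feed split into clean coal
# and refuse, with yield and refuse ash recovered from the balance.
feed_tph = 100.0   # feed rate, tons per hour (assumed)
feed_ash = 20.0    # feed ash, percent (assumed)
clean_tph = 75.0   # clean-coal product rate (assumed)
clean_ash = 8.0    # clean-coal ash, percent (assumed)

refuse_tph = feed_tph - clean_tph
# ash balance: ash mass in feed = ash mass in clean + ash mass in refuse
refuse_ash = (feed_tph * feed_ash - clean_tph * clean_ash) / refuse_tph

yield_pct = 100.0 * clean_tph / feed_tph
print("yield:", yield_pct, "%  refuse ash:", refuse_ash, "%")
# -> yield: 75.0 %  refuse ash: 56.0 %
```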

17. Enhancing simulation of efficiency with analytical tools. [combining computer simulation and analytical techniques for cost reduction

NASA Technical Reports Server (NTRS)

Seltzer, S. M.

1974-01-01

Some means of combining computer simulation and analytical techniques are indicated in order to mutually enhance their efficiency as design tools and to motivate those involved in engineering design to consider using such combinations. While the idea is not new, heavy reliance on computers often seems to overshadow the potential utility of analytical tools. Although the example used is drawn from the area of dynamics and control, the principles espoused are applicable to other fields. In the example, the parameter-plane stability analysis technique is described briefly and extended beyond that reported in the literature to increase its utility (through a simple set of recursive formulas) and its applicability (through portrayal of the effect of varying the sampling period of the computer). The numerical values that were rapidly selected by analysis were found to be correct for the hybrid computer simulation for which they were needed. This obviated the need for cut-and-try methods to choose the numerical values, thereby saving both time and computer utilization.

18. Laser Additive Melting and Solidification of Inconel 718: Finite Element Simulation and Experiment

2016-03-01

The field of powdered-metal additive manufacturing is experiencing a surge in public interest and is finding uses in the aerospace, defense, and biomedical industries. The relative youth of the technology, coupled with this interest, makes the field a vibrant research topic. The authors have expanded upon previously published finite element models used to analyze the processing of novel engineering materials by laser- and electron-beam-based additive manufacturing. In this work, the authors present a model for simulating the fabrication of Inconel 718 using laser melting processes. Thermal transport phenomena and melt pool geometries are discussed, and validation against experimental findings is presented. After comparing experimental and simulation results, the authors present two correction correlations that transform the modeling results into meaningful predictions of actual laser-melting melt pool geometries in Inconel 718.
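A common analytical cross-check for such finite element melt-pool models (though not a method named in the abstract) is the classical Rosenthal moving-point-source solution for the quasi-steady temperature field around a laser. The Inconel 718 property values below are rough literature figures, assumed here for illustration.

```python
# Rosenthal moving-point-source temperature field, a classical
# analytical cross-check for laser melt-pool simulations.
# Material properties are rough, assumed Inconel 718 figures.
import math

def rosenthal_temp(x, y, z, power_w, speed, t0, k, alpha):
    """Quasi-steady temperature at (x, y, z) relative to the spot.

    x lies along the travel direction; k is thermal conductivity
    (W/m-K), alpha is thermal diffusivity (m^2/s). All of the laser
    power is assumed absorbed.
    """
    r = math.sqrt(x * x + y * y + z * z)
    return t0 + power_w / (2.0 * math.pi * k * r) * math.exp(
        -speed * (r + x) / (2.0 * alpha))

# Rough Inconel 718 values: k ~ 11.4 W/m-K, alpha ~ 3.2e-6 m^2/s.
T = lambda x: rosenthal_temp(x, 0.0, 50e-6, 200.0, 0.5, 293.0, 11.4, 3.2e-6)
assert T(-100e-6) > T(100e-6)  # trailing side hotter than leading side
```

The asymmetry captured by the assert (a hot trailing tail, a steep leading front) is the qualitative melt-pool shape a finite element model should reproduce before quantitative correction correlations are applied.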

19. Deviated nasal septum hinders intranasal sprays: A computer simulation study

PubMed Central

Frank, Dennis O.; Kimbell, Julia S.; Cannon, Daniel; Pawar, Sachin S.; Rhee, John S.

2013-01-01

Background This study investigates how a deviated nasal septum affects the quantity and distribution of spray particles, and examines the effects of inspiratory airflow and head position on particle transport. Methods Deposition of spray particles was analysed using a three-dimensional computational fluid dynamics model created from a computed tomography scan of a human nose with leftward septal deviation and right inferior turbinate hypertrophy. Five simulations were conducted using Fluent™ software, with particle sizes ranging from 20 to 110 μm, a spray speed of 3 m/s, a plume angle of 68°, and with steady-state inspiratory airflow either present (15.7 L/min) or absent, at varying head positions. Results With inspiratory airflow present, posterior deposition on the obstructed side was approximately four times less than on the contralateral side, regardless of head position, and the difference was statistically significant (p<0.05). When airflow was absent, predicted deposition beyond the nasal valve on the left and right sides was between 16% and 69% lower and was positively influenced by a dependent head position. Conclusions Simulations predicted that septal deviation significantly diminished drug delivery on the obstructed side. Furthermore, increased particle penetration was associated with the presence of nasal airflow. Head position is an important factor in particle deposition patterns when inspiratory airflow is absent. PMID:22888490
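A back-of-envelope Stokes-number estimate suggests why the larger droplets in the simulated 20-110 μm range deposit anteriorly by inertial impaction. The air properties and the ~1 cm airway scale below are assumptions for illustration, not values from the model.

```python
# Stokes numbers for spray droplets in a nasal-scale airway.
# Stk = rho_p * d^2 * U / (18 * mu * D); Stk >> 1 implies impaction.
RHO_P = 1000.0    # droplet density, kg/m^3 (aqueous spray, assumed)
MU_AIR = 1.8e-5   # dynamic viscosity of air, Pa*s
U = 3.0           # spray speed from the abstract, m/s
D_AIRWAY = 0.01   # characteristic nasal passage width, m (assumed)

def stokes_number(d_particle_m):
    """Ratio of particle stopping distance to the airway scale."""
    return RHO_P * d_particle_m**2 * U / (18.0 * MU_AIR * D_AIRWAY)

for d_um in (20, 65, 110):
    print(d_um, "um -> Stk =", round(stokes_number(d_um * 1e-6), 3))
```

On these assumptions the 110 μm droplets have Stk well above 1 while the 20 μm droplets sit below it, consistent with only the smaller particles following the inspiratory airflow deeper into the passage.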

20. A dual-frequency applied potential tomography technique: computer simulations.

PubMed

Griffiths, H; Ahmed, A

1987-01-01

Applied potential tomography has been discussed in relation to both static and dynamic imaging. We have investigated the feasibility of obtaining static images by measuring profiles at two frequencies of drive current to exploit the differing gradients of electrical conductivity with frequency for different tissues. This method has the advantages that no profile for the homogeneous medium is then needed, and the electrodes can be coupled directly to the skin. To demonstrate the principle, computer simulations have been carried out using published electrical parameters for mammalian tissues at frequencies of 100 and 150 kHz. The distribution of complex electric potentials was calculated by the successive over-relaxation method in two dimensions for an abdominal cross-section with 16 electrodes equally spaced around the surface. From the computed electrode potentials, images were reconstructed using a back-projection method (neglecting phase information). Liver and kidney appeared most distinctly on the image because of their comparatively large conductivity gradients. The perturbations in the electrode potential differences between the two frequencies had a mean value of 5%, requiring accurate measurement in a practical system, compared with 150% when the 100 kHz values were related to a simulation of homogeneous saline equal in conductivity to muscle. The perturbations could be increased by widening the separation of the frequencies. Static imaging using a dual-frequency technique appears to be feasible, but a more detailed consideration of the electrical properties of tissues is needed to determine the optimum choice of frequencies.
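The successive over-relaxation step used above to compute the potential distribution can be sketched on a toy grid. The minimal SOR solver below treats the 2D Laplace equation on an illustrative square domain with one fixed-potential edge standing in for an electrode; the abstract's actual model used an abdominal cross-section with 16 electrodes and complex-valued potentials.

```python
# Minimal successive over-relaxation (SOR) solver for the 2D Laplace
# equation. Grid size, boundary values, and omega are illustrative.
def sor_laplace(n=20, omega=1.7, iters=2000):
    """Solve Laplace's equation on an n x n grid, left edge held at 1 V."""
    v = [[0.0] * n for _ in range(n)]
    for i in range(n):
        v[i][0] = 1.0  # fixed-potential (electrode) boundary
    for _ in range(iters):
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                gauss = 0.25 * (v[i+1][j] + v[i-1][j]
                                + v[i][j+1] + v[i][j-1])
                v[i][j] += omega * (gauss - v[i][j])  # over-relaxed update
    return v

v = sor_laplace()
assert 0.0 < v[10][10] < 1.0  # interior potential between boundary values
```

With 1 < omega < 2, each sweep overshoots the plain Gauss-Seidel update, which is what accelerates convergence relative to simple relaxation.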