Science.gov

Sample records for addition computer simulations

  1. Calculators and Computers: Graphical Addition.

    ERIC Educational Resources Information Center

    Spero, Samuel W.

    1978-01-01

    A computer program is presented that generates problem sets involving sketching graphs of trigonometric functions using graphical addition. The students use calculators to sketch the graphs, and a computer solution is used to check them. (MP)

  2. Probabilistic Fatigue: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2002-01-01

    Fatigue is a primary consideration in the design of aerospace structures for long-term durability and reliability. There are several types of fatigue that must be considered in the design, including low cycle, high cycle, and combined, for different cyclic loading conditions (for example, mechanical, thermal, and erosion). The traditional approach to evaluating fatigue has been to conduct many tests in the various service-environment conditions that the component will be subjected to in a specific design. This approach is reasonable and robust for that specific design. However, it is time consuming, costly, and in general must be repeated for designs in different operating conditions. Recent research has demonstrated that fatigue of structural components/structures can be evaluated by computational simulation based on a novel paradigm. The main features of this novel paradigm are progressive telescoping scale mechanics, progressive scale substructuring and progressive structural fracture, encompassed with probabilistic simulation. These generic features of the approach are to probabilistically telescope local material-point damage all the way up to the structural component and to probabilistically decompose structural loads and boundary conditions all the way down to the material point. Additional features include a multifactor interaction model that probabilistically describes the evolution of material properties and any changes due to various cyclic loads and other mutually interacting effects. The objective of the proposed paper is to describe this novel paradigm of computational simulation and present typical fatigue results for structural components. Additionally, the advantages, versatility and inclusiveness of computational simulation versus testing are discussed. Guidelines for complementing simulated results with strategic testing are outlined. Typical results are shown for computational simulation of fatigue in metallic composite structures to demonstrate the

  3. Computationally efficient multibody simulations

    NASA Technical Reports Server (NTRS)

    Ramakrishnan, Jayant; Kumar, Manoj

    1994-01-01

    Computationally efficient approaches to the solution of the dynamics of multibody systems are presented in this work. The computational efficiency is derived from both the algorithmic and implementational standpoint. Order(n) approaches provide a new formulation of the equations of motion eliminating the assembly and numerical inversion of a system mass matrix as required by conventional algorithms. Computational efficiency is also gained in the implementation phase by the symbolic processing and parallel implementation of these equations. Comparison of this algorithm with existing multibody simulation programs illustrates the increased computational efficiency.
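
    The order(n) saving comes from exploiting the chain structure of the equations of motion rather than assembling and inverting a dense system mass matrix. As a loose illustration of the idea (not the authors' recursive algorithm), the mass matrix of a serial-chain system is banded, so the accelerations in M a = f can be recovered in linear time, e.g. with the Thomas algorithm for a tridiagonal band:

```python
def thomas_solve(lower, diag, upper, rhs):
    """Solve a tridiagonal system A x = rhs in O(n) by forward elimination
    and back substitution (lower/upper are the off-diagonal bands)."""
    n = len(diag)
    d, r = diag[:], rhs[:]
    for i in range(1, n):
        w = lower[i - 1] / d[i - 1]
        d[i] -= w * upper[i - 1]
        r[i] -= w * r[i - 1]
    x = [0.0] * n
    x[-1] = r[-1] / d[-1]
    for i in range(n - 2, -1, -1):
        x[i] = (r[i] - upper[i] * x[i + 1]) / d[i]
    return x
```

    A dense factorization of the same system would cost O(n^3); this is the kind of algorithmic gain the abstract refers to, independent of symbolic processing or parallelism.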

  4. Computer Modeling and Simulation

    SciTech Connect

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model’s relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model’s general inadequacy to its target is to blame when the model fails. I argue that in order to disentangle verification and validation, a clear distinction needs to be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them). Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes

  5. Accelerator simulation using computers

    SciTech Connect

    Lee, M.; Zambre, Y.; Corbett, W.

    1992-01-01

    Every accelerator or storage ring system consists of a charged particle beam propagating through a beam line. Although a number of computer programs exist that simulate the propagation of a beam in a given beam line, only a few provide the capabilities for designing, commissioning and operating the beam line. This paper shows how a "multi-track" simulation and analysis code can be used for these applications.

  7. Computer Simulation of Diffraction Patterns.

    ERIC Educational Resources Information Center

    Dodd, N. A.

    1983-01-01

    Describes an Apple computer program (listing available from author) which simulates Fraunhofer and Fresnel diffraction using vector addition techniques (vector chaining) and allows the user to experiment with different shaped multiple apertures. Graphics output includes vector resultants, phase difference, diffraction patterns, and the Cornu spiral…

  8. Computer-simulated phacoemulsification

    NASA Astrophysics Data System (ADS)

    Laurell, Carl-Gustaf; Nordh, Leif; Skarman, Eva; Andersson, Mats; Nordqvist, Per

    2001-06-01

    Phacoemulsification makes the cataract operation easier for the patient but involves a demanding technique for the surgeon. It is therefore important to increase the quality of surgical training in order to shorten the learning period for the beginner. This should diminish the risks to the patient. We are developing a computer-based simulator for training of phacoemulsification. The simulator is built on a platform that can be used as a basis for several different training simulators. A prototype has been made and partly tested by experienced surgeons.

  9. Computer simulation of earthquakes

    NASA Technical Reports Server (NTRS)

    Cohen, S. C.

    1977-01-01

    In a computer simulation study of earthquakes a seismically active strike slip fault is represented by coupled mechanical blocks which are driven by a moving plate and which slide on a friction surface. Elastic forces and time independent friction are used to generate main shock events, while viscoelastic forces and time dependent friction add aftershock features. The study reveals that the size, length, and time and place of event occurrence are strongly influenced by the magnitude and degree of homogeneity in the elastic, viscous, and friction parameters of the fault region. For example, periodically reoccurring similar events are observed in simulations with near-homogeneous parameters along the fault, whereas seismic gaps are a common feature of simulations employing large variations in the fault parameters. The study also reveals correlations between strain energy release and fault length and average displacement and between main shock and aftershock displacements.
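
    A minimal sketch of a coupled-block stick-slip model in this spirit (the parameters and the stress-transfer rule are illustrative choices of mine, not the paper's formulation): blocks are loaded uniformly until the most stressed one reaches its friction threshold; a slipping block resets its stress and passes a fraction to its neighbours, which can cascade:

```python
import random

def stick_slip(n_blocks=20, threshold=1.0, coupling=0.2, events=200, seed=1):
    """Quasi-static coupled-block model: load all blocks uniformly until the
    most stressed one reaches its friction threshold; a slipping block resets
    to zero stress and transfers a fraction `coupling` of its stress to each
    neighbour, possibly triggering an avalanche.  Returns event sizes."""
    random.seed(seed)
    stress = [random.uniform(0.0, threshold) for _ in range(n_blocks)]
    sizes = []
    for _ in range(events):
        trigger = max(range(n_blocks), key=lambda i: stress[i])
        load = threshold - stress[trigger]
        stress = [s + load for s in stress]   # uniform driving-plate loading
        stress[trigger] = threshold           # guard against round-off
        queue, size = [trigger], 0
        while queue:
            i = queue.pop()
            if stress[i] < threshold:
                continue
            size += 1
            s, stress[i] = stress[i], 0.0
            for j in (i - 1, i + 1):          # stress transfer to neighbours
                if 0 <= j < n_blocks:
                    stress[j] += coupling * s
                    if stress[j] >= threshold:
                        queue.append(j)
        sizes.append(size)
    return sizes
```

    Making the thresholds or couplings inhomogeneous along the chain is the knob the abstract describes: near-uniform parameters give quasi-periodic similar events, while strong variation produces gaps and irregular sequences.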

  10. Computer simulation of earthquakes

    NASA Technical Reports Server (NTRS)

    Cohen, S. C.

    1976-01-01

    Two computer simulation models of earthquakes were studied for the dependence of the pattern of events on the model assumptions and input parameters. Both models represent the seismically active region by mechanical blocks which are connected to one another and to a driving plate. The blocks slide on a friction surface. The first model employed elastic forces and time independent friction to simulate main shock events. The size, length, and time and place of event occurrence were influenced strongly by the magnitude and degree of homogeneity in the elastic and friction parameters of the fault region. Periodically reoccurring similar events were frequently observed in simulations with near homogeneous parameters along the fault, whereas seismic gaps were a common feature of simulations employing large variations in the fault parameters. The second model incorporated viscoelastic forces and time-dependent friction to account for aftershock sequences. The periods between aftershock events increased with time, and the aftershock region was confined to that which moved in the main event.

  11. Computer Maintenance Operations Center (CMOC), additional computer support equipment ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Computer Maintenance Operations Center (CMOC), additional computer support equipment - Beale Air Force Base, Perimeter Acquisition Vehicle Entry Phased-Array Warning System, Technical Equipment Building, End of Spencer Paul Road, north of Warren Shingle Road (14th Street), Marysville, Yuba County, CA

  12. Intelligence Assessment with Computer Simulations

    ERIC Educational Resources Information Center

    Kroner, S.; Plass, J.L.; Leutner, D.

    2005-01-01

    It has been suggested that computer simulations may be used for intelligence assessment. This study investigates what relationships exist between intelligence and computer-simulated tasks that mimic real-world problem-solving behavior, and discusses design requirements that simulations have to meet in order to be suitable for intelligence…

  13. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    Computational Process and Material Modeling of Powder Bed additive manufacturing of IN 718. Optimize material build parameters with reduced time and cost through modeling. Increase understanding of build properties. Increase reliability of builds. Decrease time to adoption of process for critical hardware. Potential to decrease post-build heat treatments. Conduct single-track and coupon builds at various build parameters. Record build parameter information and QM Meltpool data. Refine Applied Optimization powder bed AM process model using data. Report thermal modeling results. Conduct metallography of build samples. Calibrate STK models using metallography findings. Run STK models using AO thermal profiles and report STK modeling results. Validate modeling with additional build. Photodiode Intensity measurements highly linear with power input. Melt Pool Intensity highly correlated to Melt Pool Size. Melt Pool size and intensity increase with power. Applied Optimization will use data to develop powder bed additive manufacturing process model.

  14. Massively parallel quantum computer simulator

    NASA Astrophysics Data System (ADS)

    De Raedt, K.; Michielsen, K.; De Raedt, H.; Trieu, B.; Arnold, G.; Richter, M.; Lippert, Th.; Watanabe, H.; Ito, N.

    2007-01-01

    We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, a Cray X1E, an SGI Altix 3700 and clusters of PCs running Windows XP. We study the performance of the software by simulating quantum computers containing up to 36 qubits, using up to 4096 processors and up to 1 TB of memory. Our results demonstrate that the simulator exhibits nearly ideal scaling as a function of the number of processors and suggest that the simulation software described in this paper may also serve as a benchmark for testing high-end parallel computers.
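
    The memory figure follows from storing the full state vector: an n-qubit state has 2^n complex amplitudes, so 36 qubits at 16 bytes per double-precision complex amplitude is about 1 TB, matching the abstract. A minimal serial sketch (my own illustration, not the paper's parallel code) of the core update, applying a 2x2 gate to one qubit of the state vector:

```python
import math

def apply_single_qubit_gate(state, gate, target, n_qubits):
    """Apply a 2x2 gate to qubit `target` of a state vector of
    2**n_qubits amplitudes (qubit 0 = least significant bit)."""
    new = state[:]
    step = 1 << target
    for i in range(1 << n_qubits):
        if i & step == 0:              # pair (i, j): target bit 0 and 1
            j = i | step
            a0, a1 = state[i], state[j]
            new[i] = gate[0][0] * a0 + gate[0][1] * a1
            new[j] = gate[1][0] * a0 + gate[1][1] * a1
    return new

H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

# Hadamard on qubit 0 of |00> yields (|00> + |01>)/sqrt(2)
state = apply_single_qubit_gate([1.0, 0.0, 0.0, 0.0], H, 0, 2)
```

    Parallelizing this update across thousands of processors, where the paired amplitudes may live on different nodes, is precisely what makes such a simulator a demanding benchmark for communication bandwidth.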

  15. Software simulator for multiple computer simulation system

    NASA Technical Reports Server (NTRS)

    Ogrady, E. P.

    1983-01-01

    A description is given of the structure and use of a computer program that simulates the operation of a parallel processor simulation system. The program is part of an investigation to determine algorithms that are suitable for simulating continuous systems on a parallel processor configuration. The simulator is designed to accurately simulate the problem-solving phase of a simulation study. Care has been taken to ensure the integrity and correctness of data exchanges and to correctly sequence periods of computation and periods of data exchange. It is pointed out that the functions performed during a problem-setup phase or a reset phase are not simulated. In particular, there is no attempt to simulate the downloading process that loads object code into the local, transfer, and mapping memories of processing elements or the memories of the run control processor and the system control processor. The main program of the simulator carries out some problem-setup functions of the system control processor in that it requests the user to enter values for simulation system parameters and problem parameters. The method by which these values are transferred to the other processors, however, is not simulated.

  16. Computer simulation of space charge

    NASA Astrophysics Data System (ADS)

    Yu, K. W.; Chung, W. K.; Mak, S. S.

    1991-05-01

    Using the particle-mesh (PM) method, a one-dimensional simulation of the well-known Langmuir-Child law is performed on an INTEL 80386-based personal computer system. The program is coded in Turbo Basic (trademark of Borland International, Inc.). The numerical results obtained are in excellent agreement with theoretical predictions, and the computational time required is quite modest. This simulation exercise demonstrates that simple computer simulations using particles may be implemented successfully on the PCs available today, and hopefully this will provide the necessary incentive for newcomers to the field who wish to acquire a flavor of the elementary aspects of the practice.
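
    The benchmark reproduced by such a simulation is the space-charge-limited current of a planar vacuum diode. The analytic law can be evaluated directly for comparison (the constants below are standard SI values; this is a check of the formula, not the PM code itself):

```python
import math

EPS0 = 8.8541878128e-12        # vacuum permittivity, F/m
E_CHARGE = 1.602176634e-19     # elementary charge, C
M_ELECTRON = 9.1093837015e-31  # electron mass, kg

def child_langmuir_current_density(voltage, gap):
    """Space-charge-limited current density of a planar vacuum diode,
    J = (4*eps0/9) * sqrt(2*e/m) * V**1.5 / d**2  (SI units)."""
    return (4.0 * EPS0 / 9.0) * math.sqrt(2.0 * E_CHARGE / M_ELECTRON) \
        * voltage ** 1.5 / gap ** 2
```

    The characteristic V^(3/2) and 1/d^2 scalings are what the particle simulation should recover once the sheath reaches steady state.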

  17. Computer simulations of particle packing

    SciTech Connect

    Cesarano, J. III; McEuen, M.J.; Swiler, T.

    1996-09-01

    Computer code has been developed to rapidly simulate the random packing of disks and spheres in two and three dimensions. Any size distribution may be packed. The code simulates varying degrees of interparticle conditions ranging from sticky to free flowing. It will also calculate the overall packing density, density distributions, and void size distributions (in two dimensions). An important aspect of the code is that it is written in C++ and incorporates a user-friendly graphical interface for standard Macintosh and PowerPC platforms. Investigations of how well the code simulates realistic random packing have begun. The code has been developed in consideration of the problem of filling a container (or die) with spray-dried granules of ceramic powder (represented by spheres). Although not presented here, the long-term goal of this work is to give users the ability to predict the homogeneity of filled dies prior to dry pressing. Additionally, this software has educational utility for studying relationships between particle size distributions and macrostructures.
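
    One simple member of this family of methods is random sequential addition: propose random disk centres and reject any proposal that overlaps an already placed disk. The sketch below (my own illustration, not the C++ code described above) packs equal disks in the unit square and reports the packing fraction:

```python
import math
import random

def pack_disks(radius=0.05, attempts=2000, seed=0):
    """Random sequential addition: propose disk centres uniformly in the
    unit square and keep a disk only if it overlaps no placed disk.
    Returns the centres and the area packing fraction."""
    random.seed(seed)
    centres = []
    min_d2 = (2.0 * radius) ** 2       # squared minimum centre separation
    for _ in range(attempts):
        x = random.uniform(radius, 1.0 - radius)
        y = random.uniform(radius, 1.0 - radius)
        if all((x - cx) ** 2 + (y - cy) ** 2 >= min_d2 for cx, cy in centres):
            centres.append((x, y))
    return centres, len(centres) * math.pi * radius ** 2
```

    For equal disks this process saturates well below close packing (around 0.55 area fraction); modelling sticky versus free-flowing behaviour, as the abstract describes, requires letting placed particles rearrange rather than freezing them.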

  18. Computer Simulation of Mutagenesis.

    ERIC Educational Resources Information Center

    North, J. C.; Dent, M. T.

    1978-01-01

    A FORTRAN program is described which simulates point-substitution mutations in the DNA strands of typical organisms. Its objective is to help students to understand the significance and structure of the genetic code, and the mechanisms and effect of mutagenesis. (Author/BB)

  19. Computational Process Modeling for Additive Manufacturing (OSU)

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the Aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  20. Computer Simulation and Library Management.

    ERIC Educational Resources Information Center

    Main, Linda

    1992-01-01

    Reviews the literature on computer simulation modeling for library management and examines whether simulation is underutilized because the models are too complex and mathematical. Other problems with implementation are considered, including incomplete problem definition, lack of a conceptual framework, system constraints, lack of interaction…

  1. Plasma physics via computer simulation

    SciTech Connect

    Birdsall, C.K.; Langdon, A.B.

    1985-01-01

    This book describes the computerized simulation of plasma kinetics. Topics considered include why attempting to do plasma physics via computer simulation using particles makes good physical sense; overall view of a one-dimensional electrostatic program; a one-dimensional electrostatic program; introduction to the numerical methods used; a 1d electromagnetic program; projects for EM1; effects of the spatial grid; effects of the finite time step; energy-conserving simulation models; multipole models; kinetic theory for fluctuations and noise; collisions; statistical mechanics of a sheet plasma; electrostatic programs in two and three dimensions; electromagnetic programs in 2D and 3D; design of computer experiments; and the choice of parameters.

  2. Composite Erosion by Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2006-01-01

    Composite degradation is evaluated by computational simulation when the erosion degradation occurs on a ply-by-ply basis and the degrading medium (device) is normal to the ply. The computational simulation is performed with a multifactor interaction model and an available multi-scale, multi-physics computer code. The erosion process degrades both the fiber and the matrix simultaneously in the same slice (ply). Both the fiber volume ratio and the matrix volume ratio approach zero, while the void volume ratio increases as the ply degrades. The multifactor interaction model simulates the erosion degradation, provided that the exponents and factor ratios are selected judiciously. Results obtained by computational composite mechanics show that most composite characterization properties degrade monotonically and approach "zero" as the ply degrades completely.
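
    A commonly published form of the multifactor interaction model expresses a property ratio as a product of factor ratios raised to empirical exponents. The sketch below assumes that form; the symbols (A, A0, AF, n) and values are generic illustrations, not taken from this paper:

```python
def mfim_property_ratio(factors):
    """Multifactor interaction model in a common published form:
        P/P0 = product_i [(A_iF - A_i) / (A_iF - A_i0)] ** n_i
    where each factor is (current A, reference A0, final AF, exponent n).
    The ratio is 1 at the reference state and falls to 0 as any factor
    reaches its final value (full degradation)."""
    ratio = 1.0
    for a, a0, af, n in factors:
        ratio *= ((af - a) / (af - a0)) ** n
    return ratio
```

    The "judicious" choice of exponents the abstract mentions corresponds to the n_i values, which control how sharply the property drops as each factor approaches its final value.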

  3. Computer simulation for the growing probability of additional offspring with an advantageous reversal allele in the decoupled continuous-time mutation-selection model

    NASA Astrophysics Data System (ADS)

    Gill, Wonpyong

    2016-01-01

    This study calculated the growing probability of additional offspring with the advantageous reversal allele in an asymmetric sharply-peaked landscape using the decoupled continuous-time mutation-selection model. The growing probability was calculated for various population sizes, N, sequence lengths, L, selective advantages, s, fitness parameters, k, and measuring parameters, C. The saturated growing probability in the stochastic region was approximately the effective selective advantage, s*, when C≫1/Ns* and s*≪1. The present study suggests that the growing probability in the stochastic region in the decoupled continuous-time mutation-selection model can be described using the theoretical formula for the growing probability in the Moran two-allele model. The selective advantage ratio, which represents the ratio of the effective selective advantage to the selective advantage, does not depend on the population size, selective advantage, measuring parameter, or fitness parameter; instead, the selective advantage ratio decreases with increasing sequence length.
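
    For reference, the closed-form fixation probability of a single advantageous mutant in the Moran two-allele model, the formula the abstract compares against, is easy to evaluate (a standard result, not code from the paper):

```python
def moran_fixation_probability(s, N):
    """Fixation probability of a single mutant of relative fitness r = 1 + s
    in the Moran two-allele model with population size N:
        P = (1 - 1/r) / (1 - r**(-N))
    P -> 1/N as s -> 0, and P ~ s/(1+s) when N*s >> 1."""
    if s == 0.0:
        return 1.0 / N
    r = 1.0 + s
    return (1.0 - 1.0 / r) / (1.0 - r ** (-N))
```

    Replacing s by the effective selective advantage s* in this formula is, as the abstract suggests, what describes the growing probability in the stochastic region.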

  4. Computer Simulation of Aircraft Aerodynamics

    NASA Technical Reports Server (NTRS)

    Inouye, Mamoru

    1989-01-01

    The role of Ames Research Center in conducting basic aerodynamics research through computer simulations is described. The computer facilities, including supercomputers and peripheral equipment that represent the state of the art, are described. The methodology of computational fluid dynamics is explained briefly. Fundamental studies of turbulence and transition are being pursued to understand these phenomena and to develop models that can be used in the solution of the Reynolds-averaged Navier-Stokes equations. Four applications of computer simulations for aerodynamics problems are described: subsonic flow around a fuselage at high angle of attack, subsonic flow through a turbine stator-rotor stage, transonic flow around a flexible swept wing, and transonic flow around a wing-body configuration that includes an inlet and a tail.

  5. Taxis through Computer Simulation Programs.

    ERIC Educational Resources Information Center

    Park, David

    1983-01-01

    Describes a sequence of five computer programs (listings for Apple II available from author) on tactic responses (oriented movement of a cell, cell group, or whole organism in response to stimuli). The simulation programs are useful in helping students examine mechanisms at work in real organisms. (JN)

  6. Computer simulation: A modern day crystal ball?

    NASA Technical Reports Server (NTRS)

    Sham, Michael; Siprelle, Andrew

    1994-01-01

    It has long been the desire of managers to be able to look into the future and predict the outcome of decisions. With the advent of computer simulation and the tremendous capability provided by personal computers, that desire can now be realized. This paper presents an overview of computer simulation and modeling, and discusses the capabilities of Extend. Extend is an icon-driven Macintosh-based software tool that brings the power of simulation to the average computer user. An example of an Extend based model is presented in the form of the Space Transportation System (STS) Processing Model. The STS Processing Model produces eight shuttle launches per year, yet it takes only about ten minutes to run. In addition, statistical data such as facility utilization, wait times, and processing bottlenecks are produced. The addition or deletion of resources, such as orbiters or facilities, can be easily modeled and their impact analyzed. Through the use of computer simulation, it is possible to look into the future to see the impact of today's decisions.
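
    Facility utilization and wait-time statistics of the kind mentioned above are the staple outputs of discrete-event simulation. A minimal single-facility sketch (plain Python, not Extend; all parameter values are invented) shows how such numbers are produced:

```python
import random

def simulate_facility(n_jobs=100, mean_interarrival=5.0, service_time=4.0,
                      seed=7):
    """Minimal discrete-event sketch of one processing facility: jobs arrive
    at exponentially distributed intervals and are served first-come
    first-served.  Returns facility utilisation and the mean wait in queue."""
    random.seed(seed)
    t, arrivals = 0.0, []
    for _ in range(n_jobs):
        t += random.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)
    free_at, busy, waits = 0.0, 0.0, []
    for t_arr in arrivals:
        start = max(t_arr, free_at)     # wait if the facility is busy
        waits.append(start - t_arr)
        free_at = start + service_time
        busy += service_time
    return busy / free_at, sum(waits) / n_jobs
```

    Adding or removing a resource in a tool like Extend amounts to changing the number of such servers and re-running, which is why what-if analyses take minutes rather than months.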

  7. Enabling Computational Technologies for Terascale Scientific Simulations

    SciTech Connect

    Ashby, S.F.

    2000-08-24

    We develop scalable algorithms and object-oriented code frameworks for terascale scientific simulations on massively parallel processors (MPPs). Our research in multigrid-based linear solvers and adaptive mesh refinement enables Laboratory programs to use MPPs to explore important physical phenomena. For example, our research aids stockpile stewardship by making practical detailed 3D simulations of radiation transport. The need to solve large linear systems arises in many applications, including radiation transport, structural dynamics, combustion, and flow in porous media. These systems result from discretizations of partial differential equations on computational meshes. Our first research objective is to develop multigrid preconditioned iterative methods for such problems and to demonstrate their scalability on MPPs. Scalability describes how total computational work grows with problem size; it measures how effectively additional resources can help solve increasingly larger problems. Many factors contribute to scalability: computer architecture, parallel implementation, and choice of algorithm. Scalable algorithms have been shown to decrease simulation times by several orders of magnitude.
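
    A compact way to see why multigrid-style methods scale is a two-grid cycle for the 1-D Poisson problem: cheap smoothing damps the oscillatory error components, and a small coarse-grid solve removes the smooth remainder. The sketch below is a textbook two-grid cycle (weighted Jacobi smoothing, full-weighting restriction, linear-interpolation prolongation), not the Laboratory's solver:

```python
def solve_poisson_two_grid(f, cycles=10):
    """Two-grid solver for -u'' = f on (0,1), u(0) = u(1) = 0, with an odd
    number n of interior fine-grid points."""
    n = len(f)
    h = 1.0 / (n + 1)
    u = [0.0] * n

    def smooth(u, f, h, sweeps=3, w=2.0 / 3.0):   # weighted Jacobi
        for _ in range(sweeps):
            v = u[:]
            for i in range(len(u)):
                left = u[i - 1] if i > 0 else 0.0
                right = u[i + 1] if i < len(u) - 1 else 0.0
                v[i] = (1 - w) * u[i] + 0.5 * w * (left + right + h * h * f[i])
            u = v
        return u

    def resid(u, f, h):                            # r = f - A u
        out = []
        for i in range(len(u)):
            left = u[i - 1] if i > 0 else 0.0
            right = u[i + 1] if i < len(u) - 1 else 0.0
            out.append(f[i] - (2 * u[i] - left - right) / (h * h))
        return out

    def solve_direct(f, h):                        # Thomas algorithm
        m = len(f)
        d, r = [2.0 / (h * h)] * m, f[:]
        off = -1.0 / (h * h)
        for i in range(1, m):
            w = off / d[i - 1]
            d[i] -= w * off
            r[i] -= w * r[i - 1]
        x = [0.0] * m
        x[-1] = r[-1] / d[-1]
        for i in range(m - 2, -1, -1):
            x[i] = (r[i] - off * x[i + 1]) / d[i]
        return x

    for _ in range(cycles):
        u = smooth(u, f, h)                        # pre-smooth
        r = resid(u, f, h)
        nc = (n - 1) // 2                          # coarse grid: every other point
        rc = [0.25 * r[2 * i] + 0.5 * r[2 * i + 1] + 0.25 * r[2 * i + 2]
              for i in range(nc)]
        ec = solve_direct(rc, 2 * h)               # coarse-grid correction
        e = [0.0] * n
        for i in range(nc):
            e[2 * i + 1] = ec[i]
        for i in range(0, n, 2):                   # linear interpolation
            left = e[i - 1] if i > 0 else 0.0
            right = e[i + 1] if i < n - 1 else 0.0
            e[i] = 0.5 * (left + right)
        u = [ui + ei for ui, ei in zip(u, e)]
        u = smooth(u, f, h)                        # post-smooth
    return u
```

    The convergence rate per cycle is independent of n, which is the essence of the scalability claim: total work grows only linearly with problem size when the two-grid idea is applied recursively.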

  8. Computer simulation of martensitic transformations

    SciTech Connect

    Xu, Ping

    1993-11-01

    The characteristics of martensitic transformations in solids are largely determined by the elastic strain that develops as martensite particles grow and interact. To study the development of microstructure, a finite-element computer simulation model was constructed to mimic the transformation process. The transformation is athermal and simulated at each incremental step by transforming the cell which maximizes the decrease in the free energy. To determine the free energy change, the elastic energy developed during martensite growth is calculated from the theory of linear elasticity for elastically homogeneous media, and updated as the transformation proceeds.

  9. Temperature estimators in computer simulation

    NASA Astrophysics Data System (ADS)

    Jara, César; González-Cataldo, Felipe; Davis, Sergio; Gutiérrez, Gonzalo

    2016-05-01

    Temperature is a key physical quantity that is used to describe equilibrium between two bodies in thermal contact. In computer simulations, the temperature is usually estimated by means of the equipartition theorem, as an average over the kinetic energy. However, recent studies have shown that the temperature can also be estimated using only the particle positions, yielding what has been called the configurational temperature. Through classical molecular dynamics simulations of a 108-argon-atom system, we compare the performance of four different temperature estimators: the usual kinetic temperature and three configurational temperatures. Our results show that the different estimators converge to the same value, but their fluctuations differ.
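
    A common form of the configurational estimator is T_conf = <|grad U|^2> / (k_B <laplacian U>). A one-dimensional harmonic-well check (my own toy example, not the paper's argon system): positions drawn from the Boltzmann distribution at temperature T should return T from positions alone:

```python
import random

def configurational_temperature(positions, spring_k, kB=1.0):
    """Configurational estimator T_conf = <(dU/dx)**2> / (kB * <d2U/dx2>)
    for the 1-D harmonic potential U(x) = spring_k * x**2 / 2, where
    dU/dx = spring_k * x and d2U/dx2 = spring_k (constant)."""
    grad_sq = sum((spring_k * x) ** 2 for x in positions) / len(positions)
    return grad_sq / (kB * spring_k)

# positions sampled from the Boltzmann distribution at T = 2.0 (kB = 1):
# p(x) ~ exp(-k x^2 / (2 kB T)), i.e. Gaussian with variance kB*T/k
random.seed(3)
T_true, k_spring = 2.0, 5.0
xs = [random.gauss(0.0, (T_true / k_spring) ** 0.5) for _ in range(200000)]
T_est = configurational_temperature(xs, k_spring)
```

    In a real molecular dynamics run the gradient and Laplacian of the full interaction potential replace the harmonic expressions, but the estimator still uses positions only, never velocities.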

  10. Using Computational Simulations to Confront Students' Mental Models

    ERIC Educational Resources Information Center

    Rodrigues, R.; Carvalho, P. Simeão

    2014-01-01

    In this paper we show an example of how to use a computational simulation to obtain visual feedback for students' mental models, and compare their predictions with the simulated system's behaviour. Additionally, we use the computational simulation to incrementally modify the students' mental models in order to accommodate new data,…

  11. Computer Simulations of Space Plasmas

    NASA Astrophysics Data System (ADS)

    Goertz, C. K.

    Even a superficial scanning of the latest issues of the Journal of Geophysical Research reveals that numerical simulation of space plasma processes is an active and growing field. The complexity and sophistication of numerically produced “data” rivals that of the real stuff. Sometimes numerical results need interpretation in terms of a simple “theory,” very much as the results of real experiments and observations do. Numerical simulation has indeed become a third independent tool of space physics, somewhere between observations and analytic theory. There is thus a strong need for textbooks and monographs that report the latest techniques and results in an easily accessible form. This book is an attempt to satisfy this need. The editors want it not only to be “proceedings of selected lectures (given) at the first ISSS (International School of Space Simulations in Kyoto, Japan, November 1-2, 1982) but rather…a form of textbook of computer simulations of space plasmas.” This is, of course, a difficult task when many authors are involved. Unavoidable redundancies and differences in notation may confuse the beginner. Some important questions, like numerical stability, are not discussed in sufficient detail. The recent book by C.K. Birdsall and A.B. Langdon (Plasma Physics via Computer Simulations, McGraw-Hill, New York, 1985) is more complete and detailed and seems more suitable as a textbook for simulations. Nevertheless, this book is useful to the beginner and the specialist because it contains not only descriptions of various numerical techniques but also many applications of simulations to space physics phenomena.

  12. Biomes computed from simulated climatologies

    SciTech Connect

    Claussen, M.; Esch, M.

    1994-01-01

    The biome model of Prentice et al. is used to predict global patterns of potential natural plant formations, or biomes, from climatologies simulated by ECHAM, a model used for climate simulations at the Max-Planck-Institut fuer Meteorologie. This study was undertaken in order to show the advantage of this biome model in diagnosing the performance of a climate model and assessing effects of past and future climate changes predicted by a climate model. Good overall agreement is found between global patterns of biomes computed from observed and simulated data of present climate. But there are also major discrepancies, indicated by a difference in biomes in Australia, in the Kalahari Desert, and in the Middle West of North America. These discrepancies can be traced back to errors in simulated rainfall as well as summer or winter temperatures. Global patterns of biomes computed from an ice age simulation reveal that North America, Europe, and Siberia should have been covered largely by tundra and taiga, whereas only small differences are found for the tropical rain forests. A potential northeast shift of biomes is expected from a simulation with enhanced CO{sub 2} concentration according to the IPCC Scenario A. Little change is seen in the tropical rain forest and the Sahara. Since the biome model used is not capable of predicting changes in vegetation patterns due to a rapid climate change, the latter simulation is to be taken as a prediction of changes in conditions favourable for the existence of certain biomes, not as a prediction of a future distribution of biomes. 15 refs., 8 figs., 2 tabs.
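
    The flavour of such a diagnostic can be conveyed by a toy rule-based classifier mapping climate variables to a biome (thresholds invented for illustration; the Prentice et al. model applies many more bioclimatic constraints, e.g. coldest-month temperature and moisture indices, per plant functional type):

```python
def classify_biome(mean_annual_temp_c, annual_precip_mm):
    """Toy rule-based biome classifier from two climate variables.
    Thresholds are illustrative only, not the published model's."""
    if mean_annual_temp_c < -5.0:
        return "tundra"
    if mean_annual_temp_c < 5.0:
        return "taiga"
    if annual_precip_mm < 250.0:
        return "desert"
    if mean_annual_temp_c > 20.0 and annual_precip_mm > 1500.0:
        return "tropical rain forest"
    return "temperate forest/grassland"
```

    Running such a classifier once on observed and once on simulated climatologies, then comparing the two biome maps cell by cell, is exactly the diagnostic use described in the abstract: a biome disagreement points back at an error in the underlying simulated rainfall or temperature.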

  13. Computer simulation of nonequilibrium processes

    SciTech Connect

    Wallace, D.C.

    1985-07-01

    The underlying concepts of nonequilibrium statistical mechanics, and of irreversible thermodynamics, will be described. The question at hand is then how these concepts are to be realized in computer simulations of many-particle systems. The answer will be given for dissipative deformation processes in solids, on three hierarchical levels: heterogeneous plastic flow, dislocation dynamics, and molecular dynamics. Application to the shock process will be discussed.

  14. Computer simulation of engine systems

    NASA Technical Reports Server (NTRS)

    Fishbach, L. H.

    1980-01-01

    The use of computerized simulations of the steady state and transient performance of jet engines throughout the flight regime is discussed. In addition, installation effects on thrust and specific fuel consumption are accounted for, as well as engine weight, dimensions, and cost. The availability throughout the government and industry of analytical methods for calculating these quantities is pointed out.

  15. Inversion based on computational simulations

    SciTech Connect

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-09-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling including the optimization of models to achieve a desired design goal.
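    The adjoint idea described in this abstract can be sketched in a few lines. The example below is a minimal illustration (not the authors' code): a linear 1-D diffusion simulation is run forward, and the gradient of a least-squares objective with respect to every initial value is obtained from a single reverse pass. Because the step operator here is symmetric, the adjoint of one step is the step itself.

```python
import numpy as np

def step(u, alpha=0.1):
    # one explicit diffusion step on a periodic 1-D grid
    return u + alpha * (np.roll(u, 1) - 2 * u + np.roll(u, -1))

def simulate(u0, nsteps=20):
    u = u0
    for _ in range(nsteps):
        u = step(u)
    return u

def objective(u0, data):
    r = simulate(u0) - data
    return float(r @ r)

def adjoint_gradient(u0, data, nsteps=20):
    # reverse pass: the step operator is symmetric here, so the
    # adjoint of one step is the step itself, applied to the costate
    lam = 2.0 * (simulate(u0, nsteps) - data)
    for _ in range(nsteps):
        lam = step(lam)
    return lam  # dJ/du0 for every parameter, from one forward/reverse pair

rng = np.random.default_rng(0)
u0, data = rng.normal(size=16), rng.normal(size=16)
grad = adjoint_gradient(u0, data)

# spot-check one component against a central finite difference
e3, eps = np.eye(16)[3], 1e-6
fd = (objective(u0 + eps * e3, data) - objective(u0 - eps * e3, data)) / (2 * eps)
assert abs(grad[3] - fd) < 1e-5
```

    The point of the abstract is exactly this scaling: the adjoint pass costs about one extra simulation regardless of how many parameters there are, whereas finite differences would require two simulations per parameter.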

  16. Computer Simulations Improve University Instructional Laboratories

    PubMed Central

    2004-01-01

    Laboratory classes are commonplace and essential in biology departments but can sometimes be cumbersome, unreliable, and a drain on time and resources. As university intakes increase, pressure on budgets and staff time can often lead to reduction in practical class provision. Frequently, the abilities to use laboratory equipment, mix solutions, and manipulate test animals are essential learning outcomes, and “wet” laboratory classes are thus appropriate. In other classes, however, interpretation and manipulation of the data are the primary learning outcomes, and here, computer-based simulations can provide a cheaper, easier, and less time- and labor-intensive alternative. We report the evaluation of two computer-based simulations of practical exercises: the first in chromosome analysis, the second in bioinformatics. Simulations can provide significant time savings to students (by a factor of four in our first case study) without affecting learning, as measured by performance in assessment. Moreover, under certain circumstances, performance can be improved by the use of simulations (by 7% in our second case study). We concluded that the introduction of these simulations can significantly enhance student learning where consideration of the learning outcomes indicates that it might be appropriate. In addition, they can offer significant benefits to teaching staff. PMID:15592599

  17. Computer Simulation for Emergency Incident Management

    SciTech Connect

    Brown, D L

    2004-12-03

    This report describes the findings and recommendations resulting from the Department of Homeland Security (DHS) Incident Management Simulation Workshop held by the DHS Advanced Scientific Computing Program in May 2004. This workshop brought senior representatives of the emergency response and incident-management communities together with modeling and simulation technologists from Department of Energy laboratories. The workshop provided an opportunity for incident responders to describe the nature and substance of the primary personnel roles in an incident response, to identify current and anticipated roles of modeling and simulation in support of incident response, and to begin a dialog between the incident response and simulation technology communities that will guide and inform planned modeling and simulation development for incident response. This report provides a summary of the discussions at the workshop as well as a summary of simulation capabilities that are relevant to incident-management training, and recommendations for the use of simulation in both incident management and in incident management training, based on the discussions at the workshop. In addition, the report discusses areas where further research and development will be required to support future needs in this area.

  18. Computer simulation of microstructural dynamics

    SciTech Connect

    Grest, G.S.; Anderson, M.P.; Srolovitz, D.J.

    1985-01-01

    Since many of the physical properties of materials are determined by their microstructure, it is important to be able to predict and control microstructural development. A number of approaches have been taken to study this problem, but they assume that the grains can be described as spherical or hexagonal and that growth occurs in an average environment. We have developed a new technique to bridge the gap between the atomistic interactions and the macroscopic scale by discretizing the continuum system such that the microstructure retains its topological connectedness, yet is amenable to computer simulations. Using this technique, we have studied grain growth in polycrystalline aggregates. The temporal evolution and grain morphology of our model are in excellent agreement with experimental results for metals and ceramics.

  19. Priority Queues for Computer Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1998-01-01

    The present invention is embodied in new priority queue data structures for event list management of computer simulations, and includes a new priority queue data structure and an improved event horizon applied to priority queue data structures. The new priority queue data structure is a Qheap and is made out of linked lists for robust, fast, reliable, and stable event list management; it uses a temporary unsorted list to store all items until one of the items is needed. The list is then sorted, the highest-priority item is removed, and the rest of the list is inserted in the Qheap. Also, an event horizon is applied to binary tree and splay tree priority queue data structures to form the improved event horizon for event management.
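    The deferred-sorting behavior described in the abstract can be sketched as follows. This is a toy illustration of the idea, not the patented Qheap implementation: insertions go into an unsorted buffer, which is sorted only when the next event is actually needed and then kept as one more sorted sublist.

```python
class LazyPriorityQueue:
    """Toy sketch of the deferred-sorting idea behind the Qheap
    (illustrative only, not the patented implementation)."""

    def __init__(self):
        self.sorted_lists = []  # sorted sublists of (priority, item) pairs
        self.buffer = []        # unsorted incoming events

    def insert(self, priority, item):
        # insertion is O(1): just append to the unsorted buffer
        self.buffer.append((priority, item))

    def pop_next(self):
        # sorting is deferred until an event is actually needed
        if self.buffer:
            self.buffer.sort()
            self.sorted_lists.append(self.buffer)
            self.buffer = []
        # take the smallest head among the sorted sublists
        best = min(range(len(self.sorted_lists)),
                   key=lambda i: self.sorted_lists[i][0])
        event = self.sorted_lists[best].pop(0)
        if not self.sorted_lists[best]:
            del self.sorted_lists[best]
        return event
```

    Interleaving inserts and pops exercises the deferred sort: a pop that follows new insertions sorts only the fresh buffer rather than re-sorting the whole queue, which is what makes this attractive for event-list management.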

  20. Computer simulations of liquid crystals

    NASA Astrophysics Data System (ADS)

    Smondyrev, Alexander M.

    Liquid crystal physics is an exciting interdisciplinary field of research with important practical applications. The complexity of liquid crystals and the presence of strong translational and orientational fluctuations require a computational approach, especially in studies of nonequilibrium phenomena. In this dissertation we present the results of computer simulation studies of liquid crystals using the molecular dynamics technique. We employed the Gay-Berne phenomenological model of liquid crystals to describe the interaction between the molecules. Both equilibrium and non-equilibrium phenomena were studied. In the first case we studied the flow properties of the liquid crystal system in equilibrium as well as the dynamics of the director. We measured the viscosities of the Gay-Berne model in the nematic and isotropic phases. The temperature-dependence of the rotational and shear viscosities, including the nonmonotonic behavior of one shear viscosity, are in good agreement with experimental data. The bulk viscosities are significantly larger than the shear viscosities, again in agreement with experiment. The director motion was found to be ballistic at short times and diffusive at longer times. The second class of problems we focused on concerns the properties of a system rapidly quenched to very low temperatures from the nematic phase. We find a glass transition to a metastable phase with nematic order and frozen translational and orientational degrees of freedom. For fast quench rates the local structure is nematic-like, while for slower quench rates smectic order is present as well. Finally, we considered a system in the isotropic phase which is then cooled to temperatures below the isotropic-nematic transition temperature. We expect topological defects to play a central role in the subsequent equilibration of the system. To identify and study these defects we require a simulation of a system with several thousand particles. We present the results of large

  1. The Shuttle Mission Simulator computer generated imagery

    NASA Technical Reports Server (NTRS)

    Henderson, T. H.

    1984-01-01

    Equipment available in the primary training facility for the Space Transportation System (STS) flight crews includes the Fixed Base Simulator, the Motion Base Simulator, the Spacelab Simulator, and the Guidance and Navigation Simulator. The Shuttle Mission Simulator (SMS) consists of the Fixed Base Simulator and the Motion Base Simulator. The SMS utilizes four visual Computer Generated Image (CGI) systems. The Motion Base Simulator has a forward crew station with six-degree-of-freedom motion simulation. Operation of the Spacelab Simulator is planned for the spring of 1983. The Guidance and Navigation Simulator went into operation in 1982. Aspects of orbital visual simulation are discussed, taking into account the earth scene, payload simulation, the generation and display of 1079 stars, the simulation of sun glare, and Reaction Control System jet firing plumes. Attention is also given to landing site visual simulation, and night launch and landing simulation.

  2. Development of simulation computer complex specification

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The Training Simulation Computer Complex Study was one of three studies contracted in support of preparations for procurement of a shuttle mission simulator for shuttle crew training. The subject study was concerned with definition of the software loads to be imposed on the computer complex to be associated with the shuttle mission simulator and the development of procurement specifications based on the resulting computer requirements. These procurement specifications cover the computer hardware and system software as well as the data conversion equipment required to interface the computer to the simulator hardware. The development of the necessary hardware and software specifications required the execution of a number of related tasks which included: (1) simulation software sizing, (2) computer requirements definition, (3) data conversion equipment requirements definition, (4) system software requirements definition, (5) a simulation management plan, (6) a background survey, and (7) preparation of the specifications.

  3. Space Ultrareliable Modular Computer (SUMC) instruction simulator

    NASA Technical Reports Server (NTRS)

    Curran, R. T.

    1972-01-01

    The design principles, description, functional operation, and recommended expansion and enhancements are presented for the Space Ultrareliable Modular Computer interpretive simulator. Included as appendices are the user's manual, program module descriptions, target instruction descriptions, simulator source program listing, and a sample program printout. In discussing the design and operation of the simulator, the key problems involving host computer independence and target computer architectural scope are brought into focus.

  4. Simulating Drosophila Genetics with the Computer.

    ERIC Educational Resources Information Center

    Small, James W., Jr.; Edwards, Kathryn L.

    1979-01-01

    Presents some techniques developed to help improve student understanding of Mendelian principles through the use of a computer simulation model of the genetic system of the fruit fly. Includes discussion and evaluation of this computer assisted program. (MA)

  5. Protocols for Handling Messages Between Simulation Computers

    NASA Technical Reports Server (NTRS)

    Balcerowski, John P.; Dunnam, Milton

    2006-01-01

    Practical Simulator Network (PSimNet) is a set of data-communication protocols designed especially for use in handling messages between computers that are engaging cooperatively in real-time or nearly-real-time training simulations. In a typical application, computers that provide individualized training at widely dispersed locations would communicate, by use of PSimNet, with a central host computer that would provide a common computational- simulation environment and common data. Originally intended for use in supporting interfaces between training computers and computers that simulate the responses of spacecraft scientific payloads, PSimNet could be especially well suited for a variety of other applications -- for example, group automobile-driver training in a classroom. Another potential application might lie in networking of automobile-diagnostic computers at repair facilities to a central computer that would compile the expertise of numerous technicians and engineers and act as an expert consulting technician.

  6. Chip level simulation of fault tolerant computers

    NASA Technical Reports Server (NTRS)

    Armstrong, J. R.

    1982-01-01

    Chip-level modeling techniques in the evaluation of fault tolerant systems were researched. A fault tolerant computer was modeled. An efficient approach to functional fault simulation was developed. Simulation software was also developed.

  7. Monte Carlo Computer Simulation of a Rainbow.

    ERIC Educational Resources Information Center

    Olson, Donald; And Others

    1990-01-01

    Discusses making a computer-simulated rainbow using principles of physics, such as reflection and refraction. Provides BASIC program for the simulation. Appends a program illustrating the effects of dispersion of the colors. (YP)

  8. Computer Based Simulation of Laboratory Experiments.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…

  9. Constructivist Design of Graphic Computer Simulations.

    ERIC Educational Resources Information Center

    Black, John B.; And Others

    Two graphic computer simulations have been prepared for teaching high school and middle school students about how business organizations and financial systems work: "Parkside," which simulates managing a hotel; and "Guestwear," which simulates managing a clothing manufacturer. Both simulations are based on six principles of constructivist design…

  10. Parallel Proximity Detection for Computer Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)

    1998-01-01

    The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.
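    The check-in/check-out scheme described above can be sketched as follows. This is an illustrative toy, not the patented system, and the cell size and sensor names are made up: sensors register their coverage with grid cells, and a mover checking into a cell receives the set of sensors covering it, avoiding all-pairs mover-to-sensor distance tests.

```python
from collections import defaultdict

CELL = 10.0  # grid cell size, made up for illustration

def cell_of(pos):
    # map a 2-D position to its integer grid cell
    return (int(pos[0] // CELL), int(pos[1] // CELL))

class Grid:
    """Sensors register coverage with grid cells; a mover checking
    into a cell learns which sensors can see it, with no all-pairs
    mover-to-sensor distance computations."""

    def __init__(self):
        self.sensors_in_cell = defaultdict(set)

    def register_sensor(self, sensor_id, cells):
        # a sensor periodically informs the grid of its coverage
        for c in cells:
            self.sensors_in_cell[c].add(sensor_id)

    def check_in(self, pos):
        # a mover entering a cell receives the sensors covering that cell
        return set(self.sensors_in_cell[cell_of(pos)])

grid = Grid()
grid.register_sensor("radar-1", [(0, 0), (0, 1), (1, 0), (1, 1)])
assert grid.check_in((5.0, 12.0)) == {"radar-1"}   # position falls in cell (0, 1)
assert grid.check_in((25.0, 25.0)) == set()        # uncovered cell
```

    The fuzzy-grid refinement in the patent relaxes this picture further, letting movers and coverages check in and out without computing exact grid crossings.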

  11. Parallel Proximity Detection for Computer Simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)

    1997-01-01

    The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.

  12. Computer Simulation in Chemical Kinetics

    ERIC Educational Resources Information Center

    Anderson, Jay Martin

    1976-01-01

    Discusses the use of the System Dynamics technique in simulating a chemical reaction for kinetic analysis. Also discusses the use of simulation modelling in biology, ecology, and the social sciences, where experimentation may be impractical or impossible. (MLH)

  13. Additional development of the XTRAN3S computer program

    NASA Technical Reports Server (NTRS)

    Borland, C. J.

    1989-01-01

    Additional developments and enhancements to the XTRAN3S computer program, a code for calculation of steady and unsteady aerodynamics, and associated aeroelastic solutions, for 3-D wings in the transonic flow regime are described. Algorithm improvements for the XTRAN3S program were provided including an implicit finite difference scheme to enhance the allowable time step and vectorization for improved computational efficiency. The code was modified to treat configurations with a fuselage, multiple stores/nacelles/pylons, and winglets. Computer program changes (updates) for error corrections and updates for version control are provided.

  14. Computer simulation of CPM dye lasers

    SciTech Connect

    Wang Qingyue; Zhao Xingjun

    1990-01-01

    Quantitative analysis of the laser pulses of various intracavity elements in a CPM dye laser is carried out in this study. The pulse formation is simulated with a computer, resulting in an asymmetric numerical solution for the pulse shape. The mechanisms of pulse formation are also discussed based on the results of computer simulation.

  15. Computed Tomography Inspection and Analysis for Additive Manufacturing Components

    NASA Technical Reports Server (NTRS)

    Beshears, Ronald D.

    2016-01-01

    Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws were inspected using a 2MeV linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on inspectability of objects with complex geometries.

  16. A computer simulation of chromosomal instability

    NASA Astrophysics Data System (ADS)

    Goodwin, E.; Cornforth, M.

    The transformation of a normal cell into a cancerous growth can be described as a process of mutation and selection occurring within the context of clonal expansion. Radiation, in addition to initial DNA damage, induces a persistent and still poorly understood genomic instability process that contributes to the mutational burden. It will be essential to include a quantitative description of this phenomenon in any attempt at science-based risk assessment. Monte Carlo computer simulations are a relatively simple way to model processes that are characterized by an element of randomness. A properly constructed simulation can capture the essence of a phenomenon that, as is often the case in biology, can be extraordinarily complex, and can do so even though the phenomenon itself is incompletely understood. A simple computer simulation of one manifestation of genomic instability known as chromosomal instability will be presented. The model simulates clonal expansion of a single chromosomally unstable cell into a colony. Instability is characterized by a single parameter, the rate of chromosomal rearrangement. With each new chromosome aberration, a unique subclone arises (subclones are defined as having a unique karyotype). The subclone initially has just one cell, but it can expand with cell division if the aberration is not lethal. The computer program automatically keeps track of the number of subclones within the expanding colony, and the number of cells within each subclone. Because chromosome aberrations kill some cells during colony growth, colonies arising from unstable cells tend to be smaller than those arising from stable cells. For any chosen level of instability, the computer program calculates the mean number of cells per colony averaged over many runs. These outputs should prove useful for investigating how such radiobiological phenomena as slow-growth colonies, increased doubling time, and delayed cell death depend on chromosomal instability. Also of
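    A simulation of the kind described can be sketched in a few lines. This is an illustrative toy with made-up parameter values, not the authors' program: starting from one cell, each daughter acquires a new rearrangement with a fixed probability, founding a new subclone, or dies if the aberration happens to be lethal.

```python
import random

def grow_colony(rate, divisions=10, lethal_frac=0.5, seed=1):
    """Grow a colony from one chromosomally unstable cell.
    `rate` is the per-daughter probability of a new rearrangement;
    all parameter values here are made up for illustration."""
    rng = random.Random(seed)
    next_id = 1
    cells = [0]  # each cell carries the id of its subclone (karyotype)
    for _ in range(divisions):
        daughters = []
        for clone in cells:
            for _ in range(2):  # each cell divides into two daughters
                if rng.random() < rate:          # new aberration arises
                    if rng.random() < lethal_frac:
                        continue                 # lethal aberration: daughter dies
                    clone_id, next_id = next_id, next_id + 1
                else:
                    clone_id = clone             # inherits the parent karyotype
                daughters.append(clone_id)
        cells = daughters
    return len(cells), len(set(cells))  # colony size, subclone count

stable_size, stable_clones = grow_colony(rate=0.0)      # 1024 cells, 1 subclone
unstable_size, unstable_clones = grow_colony(rate=0.2)  # deaths shrink the colony
```

    Averaging colony size over many seeded runs for each `rate` reproduces the qualitative effect noted in the abstract: unstable colonies are smaller and carry multiple subclones.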

  17. VLSI circuit simulation using a vector computer

    NASA Technical Reports Server (NTRS)

    Mcgrogan, S. K.

    1984-01-01

    Simulation of circuits having more than 2000 active devices requires the largest, fastest computers available. A vector computer, such as the CYBER 205, can yield great speed and cost advantages if efforts are made to adapt the simulation program to the strengths of the computer. ASPEC and SPICE (1), two widely used circuit simulation programs, are discussed. ASPECV and VAMOS (5) are respectively vector adaptations of these two simulators. They demonstrate the substantial performance enhancements possible for this class of algorithm on the CYBER 205.

  18. Multiscale Computer Simulation of Failure in Aerogels

    NASA Technical Reports Server (NTRS)

    Good, Brian S.

    2008-01-01

    Aerogels have been of interest to the aerospace community primarily for their thermal properties, notably their low thermal conductivities. While such gels are typically fragile, recent advances in the application of conformal polymer layers to these gels have made them potentially useful as lightweight structural materials as well. We have previously performed computer simulations of aerogel thermal conductivity and tensile and compressive failure, with results that are in qualitative, and sometimes quantitative, agreement with experiment. However, recent experiments in our laboratory suggest that gels having similar densities may exhibit substantially different properties. In this work, we extend our original diffusion limited cluster aggregation (DLCA) model for gel structure to incorporate additional variation in DLCA simulation parameters, with the aim of producing DLCA clusters of similar densities that nevertheless have different fractal dimension and secondary particle coordination. We perform particle statics simulations of gel strain on these clusters, and consider the effects of differing DLCA simulation conditions, and the resultant differences in fractal dimension and coordination, on gel strain properties.

  19. Optimum spaceborne computer system design by simulation

    NASA Technical Reports Server (NTRS)

    Williams, T.; Kerner, H.; Weatherbee, J. E.; Taylor, D. S.; Hodges, B.

    1973-01-01

    A deterministic simulator is described which models the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. Its use as a tool to study and determine the minimum computer system configuration necessary to satisfy the on-board computational requirements of a typical mission is presented. The paper describes how the computer system configuration is determined in order to satisfy the data processing demand of the various shuttle booster subsystems. The configuration which is developed as a result of studies with the simulator is optimal with respect to the efficient use of computer system resources.

  20. Analyzing Robotic Kinematics Via Computed Simulations

    NASA Technical Reports Server (NTRS)

    Carnahan, Timothy M.

    1992-01-01

    Computing system assists in evaluation of kinematics of conceptual robot. Displays positions and motions of robotic manipulator within work cell. Also displays interactions between robotic manipulator and other objects. Results of simulation displayed on graphical computer workstation. System includes both off-the-shelf software originally developed for automotive industry and specially developed software. Simulation system also used to design human-equivalent hand, to model optical train in infrared system, and to develop graphical interface for teleoperator simulation system.

  1. Computationally Lightweight Air-Traffic-Control Simulation

    NASA Technical Reports Server (NTRS)

    Knight, Russell

    2005-01-01

    An algorithm for computationally lightweight simulation of automated air traffic control (ATC) at a busy airport has been derived. The algorithm is expected to serve as the basis for development of software that would be incorporated into flight-simulator software, the ATC component of which is not yet capable of handling realistic airport loads. Software based on this algorithm could also be incorporated into other computer programs that simulate a variety of scenarios for purposes of training or amusement.

  2. Frontiers in the Teaching of Physiology. Computer Literacy and Simulation.

    ERIC Educational Resources Information Center

    Tidball, Charles S., Ed.; Shelesnyak, M. C., Ed.

    Provided is a collection of papers on computer literacy and simulation originally published in The Physiology Teacher, supplemented by additional papers and a glossary of terms relevant to the field. The 12 papers are presented in five sections. An affirmation of conventional physiology laboratory exercises, coping with computer terminology, and…

  3. Computer Clinical Simulations in Health Sciences.

    ERIC Educational Resources Information Center

    Jones, Gary L; Keith, Kenneth D.

    1983-01-01

    Discusses the key characteristics of clinical simulation, some developmental foundations, two current research studies, and some implications for the future of health science education. Investigations of the effects of computer-based simulation indicate that acquisition of decision-making skills is greater than with noncomputerized simulations.…

  4. Computer simulation of nonequilibrium processes

    SciTech Connect

    Hoover, W.G.; Moran, B.; Holian, B.L.; Posch, H.A.; Bestiale, S.

    1987-01-01

    Recent atomistic simulations of irreversible macroscopic hydrodynamic flows are illustrated. An extension of Nose's reversible atomistic mechanics makes it possible to simulate such non-equilibrium systems with completely reversible equations of motion. The new techniques show that macroscopic irreversibility is a natural inevitable consequence of time-reversible Lyapunov-unstable microscopic equations of motion.

  5. Filtration theory using computer simulations

    SciTech Connect

    Bergman, W.; Corey, I.

    1997-08-01

    We have used commercially available fluid dynamics codes based on Navier-Stokes theory and the Langevin particle equation of motion to compute the particle capture efficiency and pressure drop through selected two- and three-dimensional fiber arrays. The approach we used was to first compute the air velocity vector field throughout a defined region containing the fiber matrix. The particle capture in the fiber matrix is then computed by superimposing the Langevin particle equation of motion over the flow velocity field. Using the Langevin equation combines the particle Brownian motion, inertia, and interception mechanisms in a single equation. In contrast, most previous investigations treat the different capture mechanisms separately. We have computed the particle capture efficiency and the pressure drop through one 2-D and two 3-D fiber matrix elements. 5 refs., 11 figs.
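    The superposition step described above can be sketched as follows. This is an illustrative toy with a uniform stand-in flow field and made-up constants, not the authors' CFD-based model: each particle is dragged toward the local fluid velocity, kicked by a Brownian term, and counted as captured if it strikes a fiber placed at the origin.

```python
import math
import random

def track_particle(seed, steps=5000, dt=1e-3):
    """Langevin tracking of one particle through a flow field.
    The uniform flow (ux, uy) stands in for a precomputed CFD
    solution; all constants are made up for illustration."""
    rng = random.Random(seed)
    beta = 5.0    # drag coefficient divided by particle mass
    kick = 0.05   # Brownian force amplitude
    x, y = -2.0, rng.uniform(-0.3, 0.3)  # released upstream of the fiber
    vx, vy = 1.0, 0.0
    for _ in range(steps):
        ux, uy = 1.0, 0.0  # local fluid velocity at (x, y)
        # Langevin update: drag toward the fluid velocity plus a random kick
        vx += beta * (ux - vx) * dt + kick * rng.gauss(0, 1) * math.sqrt(dt)
        vy += beta * (uy - vy) * dt + kick * rng.gauss(0, 1) * math.sqrt(dt)
        x += vx * dt
        y += vy * dt
        if x * x + y * y < 0.1 ** 2:
            return True   # intercepted by a fiber of radius 0.1 at the origin
        if x > 2.0:
            return False  # penetrated the filter region
    return False

captured = sum(track_particle(s) for s in range(100))
efficiency = captured / 100  # capture efficiency over 100 tracked particles
```

    Repeating this over many particles and fiber geometries is what yields the capture efficiency; the single stochastic equation combines Brownian motion, inertia, and interception, as the abstract notes.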

  6. Evaluation of Visual Computer Simulator for Computer Architecture Education

    ERIC Educational Resources Information Center

    Imai, Yoshiro; Imai, Masatoshi; Moritoh, Yoshio

    2013-01-01

    This paper presents trial evaluation of a visual computer simulator in 2009-2011, which has been developed to play some roles of both instruction facility and learning tool simultaneously. And it illustrates an example of Computer Architecture education for University students and usage of e-Learning tool for Assembly Programming in order to…

  7. Computational simulation methods for composite fracture mechanics

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.

    1988-01-01

    Structural integrity, durability, and damage tolerance of advanced composites are assessed by studying damage initiation at various scales (micro, macro, and global) and accumulation and growth leading to global failure, quantitatively and qualitatively. In addition, various fracture toughness parameters associated with a typical damage and its growth must be determined. Computational structural analysis codes to aid the composite design engineer in performing these tasks were developed. CODSTRAN (COmposite Durability STRuctural ANalysis) is used to qualitatively and quantitatively assess the progressive damage occurring in composite structures due to mechanical and environmental loads. Next, methods are covered that are currently being developed and used at Lewis to predict interlaminar fracture toughness and related parameters of fiber composites given a prescribed damage. The general purpose finite element code MSC/NASTRAN was used to simulate the interlaminar fracture and the associated individual as well as mixed-mode strain energy release rates in fiber composites.

  8. Computer simulation of upset welding

    SciTech Connect

    Spingarn, J R; Mason, W E; Swearengen, J C

    1982-04-01

    Useful process modeling of upset welding requires contributions from metallurgy, welding engineering, thermal analysis and experimental mechanics. In this report, the significant milestones for such an effort are outlined and probable difficult areas are pointed out. Progress to date is summarized and directions for future research are offered. With regard to the computational aspects of this problem, a 2-D heat conduction computer code has been modified to incorporate electrical heating, and computations have been run for an axisymmetric problem with simple viscous material laws and d.c. electrical boundary conditions. In the experimental endeavor, the boundary conditions have been measured during the welding process, although interpretation of voltage drop measurements is not straightforward. The ranges of strain, strain rate and temperature encountered during upset welding have been measured or calculated, and the need for a unifying constitutive law is described. Finally, the possible complications of microstructure and interfaces are clarified.

  9. Computational Spectrum of Agent Model Simulation

    SciTech Connect

    Perumalla, Kalyan S

    2010-01-01

    The study of human social behavioral systems is finding renewed interest in military, homeland security and other applications. Simulation is the most generally applied approach to studying complex scenarios in such systems. Here, we outline some of the important considerations that underlie the computational aspects of simulation-based study of human social systems. The fundamental imprecision underlying questions and answers in social science makes it necessary to carefully distinguish among different simulation problem classes and to identify the most pertinent set of computational dimensions associated with those classes. We identify a few such classes and present their computational implications. The focus is then shifted to the most challenging combinations in the computational spectrum, namely, large-scale entity counts at moderate to high levels of fidelity. Recent developments in furthering the state-of-the-art in these challenging cases are outlined. A case study of large-scale agent simulation is provided in simulating large numbers (millions) of social entities at real-time speeds on inexpensive hardware. Recent computational results are identified that highlight the potential of modern high-end computing platforms to push the envelope with respect to speed, scale and fidelity of social system simulations. Finally, the problem of shielding the modeler or domain expert from the complex computational aspects is discussed and a few potential solution approaches are identified.

  10. Economic Analysis. Computer Simulation Models.

    ERIC Educational Resources Information Center

    Sterling Inst., Washington, DC. Educational Technology Center.

    A multimedia course in economic analysis was developed and used in conjunction with the United States Naval Academy. (See ED 043 790 and ED 043 791 for final reports of the project evaluation and development model.) This volume of the text discusses the simulation of behavioral relationships among variable elements in an economy and presents…

  11. Astronomy Simulation with Computer Graphics.

    ERIC Educational Resources Information Center

    Thomas, William E.

    1982-01-01

    "Planetary Motion Simulations" is a system of programs designed for students to observe motions of a superior planet (one whose orbit lies outside the orbit of the earth). Programs run on the Apple II microcomputer and employ high-resolution graphics to present the motions of Saturn. (Author/JN)

  12. Fiber Composite Sandwich Thermostructural Behavior: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Aiello, R. A.; Murthy, P. L. N.

    1986-01-01

    Several computational levels of progressive sophistication/simplification are described to computationally simulate composite sandwich hygral, thermal, and structural behavior. The computational levels of sophistication include: (1) three-dimensional detailed finite element modeling of the honeycomb, the adhesive and the composite faces; (2) three-dimensional finite element modeling of the honeycomb assumed to be an equivalent continuous, homogeneous medium, the adhesive and the composite faces; (3) laminate theory simulation where the honeycomb (metal or composite) is assumed to consist of plies with equivalent properties; and (4) derivations of approximate, simplified equations for thermal and mechanical properties by simulating the honeycomb as an equivalent homogeneous medium. The approximate equations are combined with composite hygrothermomechanical and laminate theories to provide a simple and effective computational procedure for simulating the thermomechanical/thermostructural behavior of fiber composite sandwich structures.
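
The fourth computational level above, treating the honeycomb as an equivalent homogeneous medium, can be illustrated with the standard Gibson-Ashby estimates for a regular hexagonal honeycomb. These are textbook approximations, not the paper's actual derived equations, and the function name and inputs are illustrative.

```python
import math

def honeycomb_equivalent(t, l, Es, rho_s):
    """Equivalent homogeneous properties of a regular hexagonal honeycomb.

    t: wall thickness, l: cell wall length (same units, t << l)
    Es, rho_s: modulus and density of the solid wall material
    Relative density: rho_rel ~ (2/sqrt(3)) * (t/l)
    Through-thickness stiffness: E3 ~ Es * rho_rel
    """
    rho_rel = (2.0 / math.sqrt(3.0)) * (t / l)
    return {"density": rho_s * rho_rel, "E3": Es * rho_rel}
```

Such closed-form estimates play the same role as the paper's simplified equations: they feed an equivalent ply description into laminate theory without detailed finite element modeling of the cells.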

  13. Augmented Reality Simulations on Handheld Computers

    ERIC Educational Resources Information Center

    Squire, Kurt; Klopfer, Eric

    2007-01-01

    Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

  14. Computer Simulation of F = ma.

    ERIC Educational Resources Information Center

    Hayden, Howard C.

    1984-01-01

    Discusses a computer simulation which: (1) describes an experiment investigating F = ma; (2) generates data; (3) allows students to see the data; and (4) generates the equation with a least-squares fit. (JN)
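
The exercise described, generating data and fitting the equation by least squares, can be sketched as follows. The mass, noise level, and acceleration range are illustrative assumptions, not values from the record.

```python
import numpy as np

rng = np.random.default_rng(0)
m_true = 2.5                                     # kg (assumed)
a = np.linspace(0.5, 5.0, 20)                    # applied accelerations, m/s^2
F = m_true * a + rng.normal(0.0, 0.05, a.size)   # "measured" forces with noise

# least-squares slope through the origin: m = sum(a*F) / sum(a*a)
m_fit = np.sum(a * F) / np.sum(a * a)
print(f"fitted mass = {m_fit:.3f} kg")
```

The fitted slope recovers the mass to within the noise level, which is the point of the classroom comparison between simulated data and the fitted equation.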

  15. Computer simulation and optimization of radioelectronic devices

    NASA Astrophysics Data System (ADS)

    Benenson, Z. M.; Elistratov, M. R.; Ilin, L. K.; Kravchenko, S. V.; Sukhov, D. M.; Udler, M. A.

    Methods of simulating and optimizing radioelectronic devices in an automated design system are discussed, along with algorithms used in the computer-aided design of these devices. A special description language for these devices is also presented.

  16. Computer-simulated phacoemulsification improvements

    NASA Astrophysics Data System (ADS)

    Soederberg, Per G.; Laurell, Carl-Gustaf; Artzen, D.; Nordh, Leif; Skarman, Eva; Nordqvist, P.; Andersson, Mats

    2002-06-01

    A simulator for phacoemulsification cataract extraction is developed. A three-dimensional visual interface and foot pedals for phacoemulsification power, x-y positioning, zoom and focus were established. An algorithm that allows real-time visual feedback of the surgical field was developed. Cataract surgery is the most common surgical procedure. The operation requires input from both feet and both hands and provides visual feedback through the operation microscope, essentially without tactile feedback. Experience demonstrates that the number of complications for an experienced surgeon learning phacoemulsification decreases exponentially, reaching close to the asymptote after the first 500 procedures despite initial wet lab training on animal eyes. Simulator training is anticipated to decrease training time, decrease the complication rate for the beginner, and reduce expensive supervision by a high-volume surgeon.

  17. Teaching Environmental Systems Modelling Using Computer Simulation.

    ERIC Educational Resources Information Center

    Moffatt, Ian

    1986-01-01

    A computer modeling course in environmental systems and dynamics is presented. The course teaches senior undergraduates to analyze a system of interest, construct a system flow chart, and write computer programs to simulate real world environmental processes. An example is presented along with a course evaluation, figures, tables, and references.…

  18. Psychology on Computers: Simulations, Experiments and Projects.

    ERIC Educational Resources Information Center

    Belcher, Duane M.; Smith, Stephen D.

    PSYCOM is a unique mixed media package which combines high interest projects on the computer with a written text of expository material. It goes beyond most computer-assisted instruction which emphasizes drill and practice and testing of knowledge. A project might consist of a simulation or an actual experiment, or it might be a demonstration, a…

  19. Computer Simulation Of A Small Turboshaft Engine

    NASA Technical Reports Server (NTRS)

    Ballin, Mark G.

    1991-01-01

    Component-type mathematical model of small turboshaft engine developed for use in real-time computer simulations of dynamics of helicopter flight. Yields shaft speeds, torques, fuel-consumption rates, and other operating parameters with sufficient accuracy for use in real-time simulation of maneuvers involving large transients in power and/or severe accelerations.

  20. Simulations of Probabilities for Quantum Computing

    NASA Technical Reports Server (NTRS)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.
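
A standard toy illustration of the idea that deterministic chaos can stand in for a random number generator is thresholding the logistic map. This is only a generic chaos demonstration, not the paper's specific non-Lipschitz construction.

```python
def chaotic_bits(x0=0.1234567, n=10000):
    """Generate pseudo-random bits by thresholding the fully chaotic
    logistic map x -> 4x(1-x); no random number generator is used."""
    x, bits = x0, []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        bits.append(1 if x > 0.5 else 0)
    return bits

bits = chaotic_bits()
freq = sum(bits) / len(bits)
print(f"fraction of ones = {freq:.3f}")   # close to 0.5 for a generic seed
```

The invariant density of the map is symmetric about 0.5, so the bit stream behaves like a near-fair coin, deterministically generated.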

  1. Criterion Standards for Evaluating Computer Simulation Courseware.

    ERIC Educational Resources Information Center

    Wholeben, Brent Edward

    This paper explores the role of computerized simulations as a decision-modeling intervention strategy, and views the strategy's different attribute biases based upon the varying primary missions of instruction versus application. The common goals associated with computer simulations as a training technique are discussed and compared with goals of…

  2. Salesperson Ethics: An Interactive Computer Simulation

    ERIC Educational Resources Information Center

    Castleberry, Stephen

    2014-01-01

    A new interactive computer simulation designed to teach sales ethics is described. Simulation learner objectives include gaining a better understanding of legal issues in selling; realizing that ethical dilemmas do arise in selling; realizing the need to be honest when selling; seeing that there are conflicting demands from a salesperson's…

  3. Understanding Islamist political violence through computational social simulation

    SciTech Connect

    Watkins, Jennifer H; Mackerrow, Edward P; Patelli, Paolo G; Eberhardt, Ariane; Stradling, Seth G

    2008-01-01

    Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.

  4. Computer simulation of gear tooth manufacturing processes

    NASA Technical Reports Server (NTRS)

    Mavriplis, Dimitri; Huston, Ronald L.

    1990-01-01

    The use of computer graphics to simulate gear tooth manufacturing procedures is discussed. An analytical basis for the simulation is established for spur gears. The simulation itself, however, is developed not only for spur gears, but for straight bevel gears as well. The applications of the developed procedure extend from the development of finite element models of heretofore intractable geometrical forms, to exploring the fabrication of nonstandard tooth forms.

  5. Simulation of Laser Additive Manufacturing and its Applications

    NASA Astrophysics Data System (ADS)

    Lee, Yousub

    Laser and metal powder based additive manufacturing (AM), a key category of advanced Direct Digital Manufacturing (DDM), produces metallic components directly from a digital representation of the part, such as a CAD file. It is well suited for the production of high-value, customizable components with complex geometry and for the repair of damaged components. Currently, the main challenges for laser and metal powder based AM include the formation of defects (e.g., porosity), low surface finish quality, and spatially non-uniform material properties. Such challenges stem largely from limited knowledge of the complex physical processes in AM, especially the molten pool physics: melting, molten metal flow, heat conduction, vaporization of alloying elements, and solidification. Direct experimental measurement of melt pool phenomena is highly difficult since the process is localized (melt pool sizes on the order of 0.1 mm to 1 mm) and transient (scanning speeds on the order of 1 m/s). Furthermore, current optical and infrared cameras are limited to observing the melt pool surface. As a result, fluid flow in the melt pool, the melt pool shape, and the formation of sub-surface defects are difficult to visualize experimentally. On the other hand, numerical simulation, based on rigorous solution of the mass, momentum, and energy transport equations, can provide important quantitative knowledge of the complex transport phenomena taking place in AM. The overarching goal of this dissertation research is to develop an analytical foundation for fundamental understanding of heat transfer, molten metal flow, and free surface evolution. Two key types of laser AM processes are studied: (a) powder injection, commonly used for the repair of turbine blades, and (b) powder bed, commonly used for manufacturing new parts with complex geometry. In the powder injection simulation, fluid convection, temperature gradient (G), solidification rate (R), and melt pool shape are calculated using a heat transfer
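
The melt-pool heat transfer described above is usually solved numerically, but the classical Rosenthal moving point-source solution is the standard closed-form starting point and illustrates the temperature field a scanning laser produces. The parameter values below are assumptions for illustration, not the dissertation's data.

```python
import math

def rosenthal_T(x, y, z, Q=300.0, v=0.5, k=20.0, alpha=5e-6, T0=300.0):
    """Quasi-steady temperature at (x, y, z) in the frame moving with the
    laser; x is along the scan direction (x > 0 ahead of the beam), R > 0.

    Q: absorbed power (W), v: scan speed (m/s),
    k: conductivity (W/m-K), alpha: thermal diffusivity (m^2/s).
    """
    R = math.sqrt(x * x + y * y + z * z)
    return T0 + Q / (2.0 * math.pi * k * R) * math.exp(-v * (x + R) / (2.0 * alpha))
```

The exponential factor makes temperature decay far faster ahead of the beam than behind it, producing the characteristic elongated (comet-shaped) melt pool.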

  6. Optimum spaceborne computer system design by simulation

    NASA Technical Reports Server (NTRS)

    Williams, T.; Weatherbee, J. E.; Taylor, D. S.

    1972-01-01

    A deterministic digital simulation model is described which models the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. Use of the model as a tool in configuring a minimum computer system for a typical mission is demonstrated. The configuration which is developed as a result of studies with the simulator is optimal with respect to the efficient use of computer system resources, i.e., the configuration derived is a minimal one. Other considerations such as increased reliability through the use of standby spares would be taken into account in the definition of a practical system for a given mission.

  7. Computer simulation of bubble formation.

    SciTech Connect

    Insepov, Z.; Bazhirov, T.; Norman, G.; Stegailov, V.; Mathematics and Computer Science; Institute for High Energy Densities of Joint Institute for High Temperatures of RAS

    2007-01-01

    Properties of liquid metals (Li, Pb, Na) containing nanoscale cavities were studied by atomistic Molecular Dynamics (MD). Two atomistic models of cavity simulation were developed that cover a wide area in the phase diagram with negative pressure. In the first model, the thermodynamics of cavity formation, stability, and the dynamics of cavity evolution in bulk liquid metals were studied. Radial densities, pressures, surface tensions, and work functions of nanoscale cavities of various radii were calculated for liquid Li, Na, and Pb at various temperatures and densities, and at small negative pressures near the liquid-gas spinodal; the work functions for cavity formation in liquid Li were calculated and compared with the available experimental data. The cavitation rate can further be obtained by using classical nucleation theory (CNT). The second model is based on the stability study and on the kinetics of cavitation of the stretched liquid metals. An MD method was used to simulate cavitation in metastable Pb and Li melts and to determine the stability limits. States at temperatures below critical (T < 0.5Tc) and large negative pressures were considered. The kinetic boundary of liquid-phase stability was shown to be different from the spinodal. The kinetics and dynamics of cavitation were studied. The pressure dependences of cavitation frequencies were obtained for several temperatures. The results of MD calculations were compared with estimates based on classical nucleation theory.
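
The classical nucleation theory (CNT) estimate mentioned above takes a standard Arrhenius form: the work of forming a critical cavity is W* = 16*pi*sigma^3 / (3*dP^2) and the rate is J ~ J0 * exp(-W*/kT). The sketch below uses that textbook formula with illustrative numbers, not the paper's MD results; the prefactor J0 is an assumed placeholder.

```python
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def cnt_rate(sigma, dP, T, J0=1e38):
    """CNT cavitation rate estimate.

    sigma: surface tension (N/m); dP: magnitude of the negative pressure
    driving cavitation (Pa); T: temperature (K); J0: kinetic prefactor
    (per m^3 per s, assumed).
    """
    W_star = 16.0 * math.pi * sigma**3 / (3.0 * dP**2)  # critical work (J)
    return J0 * math.exp(-W_star / (KB * T))
```

Because W* scales as 1/dP^2 inside an exponential, the predicted rate rises extremely steeply with applied tension, which is why MD-determined kinetic stability limits can differ from the spinodal.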

  8. Computer Code for Nanostructure Simulation

    NASA Technical Reports Server (NTRS)

    Filikhin, Igor; Vlahovic, Branislav

    2009-01-01

    Due to their small size, nanostructures can have stress and thermal gradients that are larger than any macroscopic analogue. These gradients can lead to specific regions that are susceptible to failure via processes such as plastic deformation by dislocation emission, chemical debonding, and interfacial alloying. A program has been developed that rigorously simulates and predicts optoelectronic properties of nanostructures of virtually any geometrical complexity and material composition. It can be used in simulations of energy level structure, wave functions, density of states of spatially configured phonon-coupled electrons, excitons in quantum dots, quantum rings, quantum ring complexes, and more. The code can be used to calculate stress distributions and thermal transport properties for a variety of nanostructures and interfaces, transport and scattering at nanoscale interfaces and surfaces under various stress states, and alloy compositional gradients. The code allows users to perform modeling of charge transport processes through quantum-dot (QD) arrays as functions of inter-dot distance, array order versus disorder, QD orientation, shape, size, and chemical composition for applications in photovoltaics and physical properties of QD-based biochemical sensors. The code can be used to study the hot exciton formation/relaxation dynamics in arrays of QDs of different shapes and sizes at different temperatures. It also can be used to understand the relation among the deposition parameters and inherent stresses, strain deformation, heat flow, and failure of nanostructures.
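
The NASA code is far more general, but its core task, computing energy levels and wave functions of a confined carrier, can be illustrated with a minimal 1-D finite-difference Schrodinger solver for an infinite square well. Units are chosen so that hbar^2/(2m) = 1; everything here is an illustrative sketch, not the actual code's method.

```python
import numpy as np

def well_levels(n_grid=400, L=1.0, n_levels=3):
    """Lowest energy levels of an infinite square well of width L,
    via finite differences with Dirichlet (hard-wall) boundaries."""
    dx = L / (n_grid + 1)
    # Hamiltonian -d2/dx2 as a tridiagonal matrix
    main = np.full(n_grid, 2.0 / dx**2)
    off = np.full(n_grid - 1, -1.0 / dx**2)
    H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    E = np.linalg.eigvalsh(H)        # ascending eigenvalues
    return E[:n_levels]

E = well_levels()
# analytic result in these units: E_n = (n*pi/L)^2, so ratios 1 : 4 : 9
```

The same discretize-and-diagonalize pattern, extended to 3-D geometry, strain, and material composition, underlies codes of the kind described in the record.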

  9. Pseudospark discharges via computer simulation

    SciTech Connect

    Boeuf, J.P.; Pitchford, L.C.

    1991-04-01

    The authors of this paper developed a hybrid fluid-particle (Monte Carlo) model to describe the initiation phase of pseudospark discharges. In this model, time-dependent fluid equations for the electrons and positive ions are solved self-consistently with Poisson's equation for the electric field in a two-dimensional, cylindrically symmetrical geometry. The Monte Carlo simulation is used to determine the ionization source term in the fluid equations. This model has been used to study the evolution of a discharge in helium at 0.5 torr, with an applied voltage of 2 kV and in a typical pseudospark geometry. From the numerical results, the authors have identified a sequence of physical events that lead to the rapid rise in current associated with the onset of the pseudospark discharge mode. For the conditions the authors have simulated, they find that there is a maximum in the electron multiplication at the time which corresponds to the onset of the hollow cathode effect, and although the multiplication later decreases, it is always greater than needed for a steady-state discharge. Thus the sheaths inside the hollow cathode tend to collapse against the walls, and eventually cathode emission mechanisms (such as field-enhanced thermionic emission) which the authors have not included will start to play a role. In spite of the approximation in this model, the picture which has emerged provides insight into the mechanisms controlling the onset of this potentially important discharge mode.

  10. Rotorcraft Damage Tolerance Evaluated by Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon; Abdi, Frank

    2000-01-01

    An integrally stiffened graphite/epoxy composite rotorcraft structure is evaluated via computational simulation. A computer code that scales up constituent micromechanics level material properties to the structure level and accounts for all possible failure modes is used for the simulation of composite degradation under loading. Damage initiation, growth, accumulation, and propagation to fracture are included in the simulation. Design implications with regard to defect and damage tolerance of integrally stiffened composite structures are examined. A procedure is outlined regarding the use of this type of information for setting quality acceptance criteria, design allowables, damage tolerance, and retirement-for-cause criteria.

  11. Cluster computing software for GATE simulations

    SciTech Connect

    Beenhouwer, Jan de; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R.

    2007-06-15

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied with an on-the-fly generated cluster specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values.

  12. Cluster computing software for GATE simulations.

    PubMed

    De Beenhouwer, Jan; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R

    2007-06-01

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied with an on-the-fly generated cluster specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values. PMID:17654895
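
The split-and-merge approach described above can be sketched generically: partition a fixed number of Monte Carlo events across cluster jobs, then merge the per-job outputs in job order. This is not GATE's actual jobsplitter or merger; the function names and structure are illustrative assumptions.

```python
def split_events(total_events, n_jobs):
    """Partition total_events as evenly as possible across n_jobs."""
    base, extra = divmod(total_events, n_jobs)
    return [base + (1 if i < extra else 0) for i in range(n_jobs)]

def merge_outputs(per_job_results):
    """Fast merger: concatenate per-job event lists in job order."""
    merged = []
    for result in per_job_results:
        merged.extend(result)
    return merged

chunks = split_events(1_000_000, 64)
assert sum(chunks) == 1_000_000 and max(chunks) - min(chunks) <= 1
```

Since each job is independent, the ideal speedup is linear in the number of CPUs; as the abstract notes, fixed per-job costs (collimator setup) and output handling (merging large files) are what erode that scalability in practice.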

  13. Biocellion: accelerating computer simulation of multicellular biological system models

    PubMed Central

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-01-01

    Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Availability and implementation: Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572

  14. Polymer Composites Corrosive Degradation: A Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon

    2007-01-01

    A computational simulation of the corrosive durability of polymer composites is presented. The corrosive environment is assumed to govern the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature, and moisture, with voids varying parabolically and temperature and moisture varying linearly through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined-stress failure, and laminate theories. This allows the simulation to start from constitutive material properties and proceed up to the laminate scale, at which the laminate is exposed to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply degradation degrades the laminate down to the last ply or last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.
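
The assumed through-thickness profiles described above (parabolic for voids, linear for temperature and moisture) can be sketched directly. The endpoint values, the orientation of the parabola, and the function name are illustrative placeholders, not the paper's measured data.

```python
import numpy as np

def degradation_profiles(n_plies=16, void_surface=0.05, void_core=0.0,
                         T_surface=350.0, T_core=300.0,
                         M_surface=0.02, M_core=0.0):
    """Per-ply degradation fields; z runs from the exposed surface (z = 0)
    to the far face (z = 1)."""
    z = np.linspace(0.0, 1.0, n_plies)
    voids = void_core + (void_surface - void_core) * (1.0 - z) ** 2  # parabolic
    temp = T_surface + (T_core - T_surface) * z                      # linear
    moist = M_surface + (M_core - M_surface) * z                     # linear
    return voids, temp, moist
```

Each ply's entry in these arrays would then feed the micromechanics step that degrades that ply's constitutive properties before reassembling the laminate.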

  15. Computer simulations of lung surfactant.

    PubMed

    Baoukina, Svetlana; Tieleman, D Peter

    2016-10-01

    Lung surfactant lines the gas-exchange interface in the lungs and reduces the surface tension, which is necessary for breathing. Lung surfactant consists mainly of lipids with a small amount of proteins and forms a monolayer at the air-water interface connected to bilayer reservoirs. Lung surfactant function involves transfer of material between the monolayer and bilayers during the breathing cycle. Lipids and proteins are organized laterally in the monolayer; selected species are possibly preferentially transferred to bilayers. The complex 3D structure of lung surfactant and the exact roles of lipid organization and proteins remain important goals for research. We review recent simulation studies on the properties of lipid monolayers, monolayers with phase coexistence, monolayer-bilayer transformations, lipid-protein interactions, and effects of nanoparticles on lung surfactant. This article is part of a Special Issue entitled: Biosimulations edited by Ilpo Vattulainen and Tomasz Róg. PMID:26922885

  16. Creating science simulations through Computational Thinking Patterns

    NASA Astrophysics Data System (ADS)

    Basawapatna, Ashok Ram

    Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction. One aim of the NSF is to integrate these and other computational thinking concepts into the classroom. End-user programming tools offer a unique opportunity to accomplish this goal. An end-user programming tool that allows students with little or no prior experience the ability to create simulations based on phenomena they see in class could be a first step towards meeting most, if not all, of the above computational thinking goals. This thesis describes the creation, implementation and initial testing of a programming tool, called the Simulation Creation Toolkit, with which users apply high-level agent interactions called Computational Thinking Patterns (CTPs) to create simulations. Employing Computational Thinking Patterns obviates lower behavior-level programming and allows users to directly create agent interactions in a simulation by making an analogy with real world phenomena they are trying to represent. Data collected from 21 sixth grade students with no prior programming experience and 45 seventh grade students with minimal programming experience indicates that this is an effective first step towards enabling students to create simulations in the classroom environment. Furthermore, an analogical reasoning study that looked at how users might apply patterns to create simulations from high-level descriptions with little guidance shows promising results. These initial results indicate that the high-level strategy employed by the Simulation Creation Toolkit is a promising strategy towards incorporating Computational Thinking concepts in the classroom environment.

  17. Computer simulation of space station computer steered high gain antenna

    NASA Technical Reports Server (NTRS)

    Beach, S. W.

    1973-01-01

    The mathematical modeling and programming of a complete simulation program for a space station computer-steered high gain antenna are described. The program provides for reading input data cards, numerically integrating up to 50 first order differential equations, and monitoring up to 48 variables on printed output and on plots. The program system consists of a high gain antenna, an antenna gimbal control system, an onboard computer, and the environment in which all are to operate.

  18. Airport Simulations Using Distributed Computational Resources

    NASA Technical Reports Server (NTRS)

    McDermott, William J.; Maluf, David A.; Gawdiak, Yuri; Tran, Peter; Clancy, Daniel (Technical Monitor)

    2002-01-01

    The Virtual National Airspace Simulation (VNAS) will improve the safety of Air Transportation. In 2001, using simulation and information management software running over a distributed network of supercomputers, researchers at NASA Ames, Glenn, and Langley Research Centers developed a working prototype of a virtual airspace. This VNAS prototype modeled daily operations of the Atlanta airport by integrating measured operational data and simulation data on up to 2,000 flights a day. The concepts and architecture developed by NASA for this prototype are integral to the National Airspace Simulation, supporting the development of strategies for improving aviation safety and identifying precursors to component failure.

  19. Computational methods for coupling microstructural and micromechanical materials response simulations

    SciTech Connect

    Holm, Elizabeth A.; Battaile, Corbett C.; Buchheit, Thomas E.; Fang, Huei Eliot; Rintoul, Mark Daniel; Vedula, Venkata R.; Glass, S. Jill; Knorovsky, Gerald A.; Neilsen, Michael K.; Wellman, Gerald W.; Sulsky, Deborah; Shen, Yu-Lin; Schreyer, H. Buck

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

  20. Symbolic computation in system simulation and design

    NASA Astrophysics Data System (ADS)

    Evans, Brian L.; Gu, Steve X.; Kalavade, Asa; Lee, Edward A.

    1995-06-01

    This paper examines some of the roles that symbolic computation plays in assisting system-level simulation and design. By symbolic computation, we mean programs like Mathematica that perform symbolic algebra and apply transformation rules based on algebraic identities. At a behavioral level, symbolic computation can compute parameters, generate new models, and optimize parameter settings. At the synthesis level, symbolic computation can work in tandem with synthesis tools to rewrite cascade and parallel combinations of components in subsystems to meet design constraints. Symbolic computation represents one type of tool that may be invoked in the complex flow of the system design process. The paper discusses the qualities that a formal infrastructure for managing system design should have. The paper also describes an implementation of this infrastructure called DesignMaker, implemented in the Ptolemy environment, which manages the flow of tool invocations in an efficient manner using a graphical file dependency mechanism.

  1. Computer Series, 108. Computer Simulation of Chemical Equilibrium.

    ERIC Educational Resources Information Center

    Cullen, John F., Jr.

    1989-01-01

    Presented is a computer simulation called "The Great Chemical Bead Game" which can be used to teach the concepts of equilibrium and kinetics to introductory chemistry students more clearly than through an experiment. Discussed are the rules of the game, the application of rate laws and graphical analysis. (CW)
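The bead-game approach to equilibrium can be illustrated with a short stochastic sketch. The rates, counts, and rules below are illustrative stand-ins, not the published game's actual rules: each round, every "A" bead converts with probability k_f and every "B" bead reverts with probability k_r, so the population settles where forward and reverse conversions balance:

```python
import random

def bead_game(n_beads=1000, k_f=0.3, k_r=0.1, n_rounds=200, seed=42):
    """Simulate the reversible reaction A <=> B with beads.
    Each round, every A bead becomes B with probability k_f and
    every B bead becomes A with probability k_r."""
    random.seed(seed)
    n_a = n_beads
    for _ in range(n_rounds):
        a_to_b = sum(1 for _ in range(n_a) if random.random() < k_f)
        b_to_a = sum(1 for _ in range(n_beads - n_a) if random.random() < k_r)
        n_a += b_to_a - a_to_b
    return n_a

# At equilibrium n_a * k_f ~ (N - n_a) * k_r, so the A fraction settles
# near k_r / (k_f + k_r) = 0.25 here, i.e. roughly 250 of 1000 beads.
n_a = bead_game()
```

Plotting n_a round by round shows the same exponential approach to equilibrium that students would otherwise infer from rate laws and graphical analysis.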

  2. Computer simulation of breathing systems for divers

    SciTech Connect

    Sexton, P.G.; Nuckols, M.L.

    1983-02-01

    A powerful new tool for the analysis and design of underwater breathing gas systems is being developed. A versatile computer simulator is described which makes possible the modular "construction" of any conceivable breathing gas system from computer memory-resident components. The analysis of a typical breathing gas system is demonstrated using this simulation technique, and the effects of system modifications on performance of the breathing system are shown. This modeling technique will ultimately serve as the foundation for a proposed breathing system simulator under development by the Navy. The marriage of this computer modeling technique with an interactive graphics system will provide the designer with an efficient, cost-effective tool for the development of new and improved diving systems.

  3. Software Engineering for Scientific Computer Simulations

    NASA Astrophysics Data System (ADS)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.

  4. Structural Composites Corrosive Management by Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon

    2006-01-01

    A simulation of corrosive management on polymer composites durability is presented. The corrosive environment is assumed to manage the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature, and moisture, which vary parabolically (voids) and linearly (temperature and moisture) through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined stress failure, and laminate theories. The simulation thus starts from constitutive material properties and proceeds up to the laminate scale, at which the laminate is exposed to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply managed degradation degrades the laminate down to the last ply or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.
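The assumed through-thickness variation, parabolic for voids and linear for temperature and moisture, can be sketched as ply-by-ply environment profiles. This is one plausible reading of the abstract (a parabola symmetric about the mid-plane); the function name, parameters, and values are all invented for illustration:

```python
def ply_environment(n_plies, void_surface, void_mid,
                    t_surface, t_back, m_surface, m_back):
    """Return per-ply (void fraction, temperature, moisture) values.
    Voids vary parabolically through the thickness (here modeled as
    symmetric about the mid-plane, an assumption); temperature and
    moisture vary linearly from the exposed surface to the back face."""
    profiles = []
    for i in range(n_plies):
        z = i / (n_plies - 1)            # 0 = exposed surface, 1 = back face
        void = void_mid + (void_surface - void_mid) * (2 * z - 1) ** 2
        temp = t_surface + (t_back - t_surface) * z
        moist = m_surface + (m_back - m_surface) * z
        profiles.append((void, temp, moist))
    return profiles

# Illustrative numbers only: 8 plies, hotter and wetter at the exposed face.
plies = ply_environment(8, void_surface=0.05, void_mid=0.01,
                        t_surface=350.0, t_back=300.0,
                        m_surface=0.02, m_back=0.005)
```

A degradation model would then reduce each ply's stiffness and strength as a function of its (void, temperature, moisture) triple before reassembling the laminate response.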

  5. Learning features in computer simulation skills training.

    PubMed

    Johannesson, Eva; Olsson, Mats; Petersson, Göran; Silén, Charlotte

    2010-09-01

    New simulation tools imply new opportunities to teach skills and train health care professionals. The aim of this study was to investigate the learning gained from computer simulation skills training. The study was designed for optimal educational settings, which benefit student-centred learning. Twenty-four second year undergraduate nursing students practised intravenous catheterization with the computer simulation program CathSim. Questionnaires were answered before and after the skills training, and after the skills examination. When using CathSim, the students appreciated the variation in patient cases, the immediate feedback, and a better understanding of anatomy, but they missed having an arm model to hold. We concluded that CathSim was useful in the students' learning process and skills training when appropriately integrated into the curriculum. Learning features to be aware of when organizing curricula with simulators are motivation, realism, variation, meaningfulness and feedback. PMID:20015690

  6. Task simulation in computer-based training

    SciTech Connect

    Gardner, P.R.

    1988-02-01

    Westinghouse Hanford Company (WHC) makes extensive use of job-task simulations in company-developed computer-based training (CBT) courseware. This courseware differs from most others in that it does not simulate process control machinery or other computer programs; instead, the WHC Exercises model day-to-day tasks such as physical work preparations, progress, and incident handling. These Exercises provide a higher level of motivation and enable the testing of more complex patterns of behavior than those typically measured by multiple-choice and short-answer questions. Examples from the WHC Radiation Safety and Crane Safety courses are used as illustrations. 3 refs.

  7. Student Choices when Learning with Computer Simulations

    NASA Astrophysics Data System (ADS)

    Podolefsky, Noah S.; Adams, Wendy K.; Wieman, Carl E.

    2009-11-01

    We examine student choices while using PhET computer simulations (sims) to learn physics content. In interviews, students were given questions from the Force Concept Inventory (FCI) and were allowed to choose from 12 computer simulations in order to answer these questions. We investigate students' choices when answering FCI questions with sims. We find that while students initially choose sims that match problem situations at a surface level, deeper connections may be noticed by students later on. These results inform how people may choose educational resources when learning on their own.

  8. Virtual ambulatory care. Computer simulation applications.

    PubMed

    Zilm, Frank; Culp, Kristyna; Dorney, Beverley

    2003-01-01

    Computer simulation modeling has evolved during the past twenty years into an effective tool for analyzing and planning ambulatory care facilities. This article explains the use of this tool in three ambulatory care case studies: a GI lab, holding beds for a cardiac catheterization laboratory, and emergency services. These examples also illustrate the use of three software packages currently available: MedModel, Simul8, and WITNESS. PMID:12545512
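The kind of patient-flow analysis performed in packages like MedModel, Simul8, or WITNESS can be illustrated, much simplified, with a single-server queueing sketch. This is a generic discrete-event-style model, not any of those packages' APIs, and all parameters are invented:

```python
import random

def simulate_clinic(n_patients=200, arrival_mean=10.0, service_mean=8.0, seed=1):
    """Minimal model of a single-room ambulatory unit: patients arrive at
    exponentially distributed intervals and are served first-in-first-out
    by one room (a Lindley-recursion sketch). Returns the average wait
    before service, in the same time units as the inputs (minutes here)."""
    random.seed(seed)
    t_arrival, room_free, waits = 0.0, 0.0, []
    for _ in range(n_patients):
        t_arrival += random.expovariate(1.0 / arrival_mean)
        start = max(t_arrival, room_free)          # wait if the room is busy
        waits.append(start - t_arrival)
        room_free = start + random.expovariate(1.0 / service_mean)
    return sum(waits) / len(waits)

avg_wait = simulate_clinic()
```

Varying the number of rooms, arrival rates, or service-time distributions in such a model is exactly the sort of what-if analysis the case studies describe, just at far greater fidelity.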

  9. CHARMM additive and polarizable force fields for biophysics and computer-aided drug design

    PubMed Central

    Vanommeslaeghe, K.

    2014-01-01

    Background Molecular Mechanics (MM) is the method of choice for computational studies of biomolecular systems owing to its modest computational cost, which makes it possible to routinely perform molecular dynamics (MD) simulations on chemical systems of biophysical and biomedical relevance. Scope of Review As one of the main factors limiting the accuracy of MD results is the empirical force field used, the present paper offers a review of recent developments in the CHARMM additive force field, one of the most popular biomolecular force fields. Additionally, we present a detailed discussion of the CHARMM Drude polarizable force field, anticipating a growth in the importance and utilization of polarizable force fields in the near future. Throughout the discussion emphasis is placed on the force fields’ parametrization philosophy and methodology. Major Conclusions Recent improvements in the CHARMM additive force field are mostly related to newly found weaknesses in the previous generation of additive force fields. Beyond the additive approximation is the newly available CHARMM Drude polarizable force field, which allows for MD simulations of up to 1 microsecond on proteins, DNA, lipids and carbohydrates. General Significance Addressing the limitations ensures the reliability of the new CHARMM36 additive force field for the types of calculations that are presently coming into routine computational reach, while the availability of the Drude polarizable force fields offers an inherently more accurate model of the underlying physical forces driving macromolecular structures and dynamics. PMID:25149274

  10. Traffic simulations on parallel computers using domain decomposition techniques

    SciTech Connect

    Hanebutte, U.R.; Tentner, A.M.

    1995-12-31

    Large scale simulations of Intelligent Transportation Systems (ITS) can only be achieved by using the computing resources offered by parallel computing architectures. Domain decomposition techniques are proposed which allow the performance of traffic simulations with the standard simulation package TRAF-NETSIM on a 128-node IBM SPx parallel supercomputer as well as on a cluster of SUN workstations. Whilst this particular parallel implementation is based on NETSIM, a microscopic traffic simulation model, the presented strategy is applicable to a broad class of traffic simulations. An outer iteration loop must be introduced in order to converge to a global solution. A performance study that utilizes a scalable test network consisting of square grids is presented, which addresses the performance penalty introduced by the additional iteration loop.
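The outer-iteration idea, solving each subdomain independently and exchanging boundary values until the pieces agree on a global solution, is easiest to see on a toy problem. The sketch below applies a Schwarz-style overlapping decomposition to a 1-D Laplace equation rather than a traffic network; all names and sizes are illustrative:

```python
def jacobi_sweep(u, left_bc, right_bc, n_iter=200):
    """Relax the interior of a 1-D Laplace problem with fixed endpoint values."""
    u = [left_bc] + u[1:-1] + [right_bc]
    for _ in range(n_iter):
        u = [u[0]] + [(u[i - 1] + u[i + 1]) / 2
                      for i in range(1, len(u) - 1)] + [u[-1]]
    return u

def schwarz_solve(n=21, bc_left=0.0, bc_right=1.0, overlap=2, outer_iters=40):
    """Overlapping domain decomposition with an outer iteration loop:
    each subdomain is solved with boundary data taken from the current
    global iterate, the overlap is exchanged, and the loop repeats
    until the subdomains converge to one global solution."""
    u = [0.0] * n
    mid = n // 2
    for _ in range(outer_iters):
        left_dom = jacobi_sweep(u[:mid + overlap], bc_left, u[mid + overlap - 1])
        right_dom = jacobi_sweep(u[mid - overlap:], u[mid - overlap], bc_right)
        u = left_dom[:mid] + right_dom[overlap:]   # exchange via the overlap
    return u

u = schwarz_solve()
# The exact solution of u'' = 0 with these boundary values is the
# straight line u[i] = i / (n - 1), which the outer loop converges to.
```

Dropping `outer_iters` to 1 leaves a visible mismatch at the interface, which is precisely the performance/accuracy penalty of the extra loop that the abstract's study quantifies, transposed to this toy setting.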

  11. Multiscale simulation process and application to additives in porous composite battery electrodes

    NASA Astrophysics Data System (ADS)

    Wieser, Christian; Prill, Torben; Schladitz, Katja

    2015-03-01

    Structure-resolving simulation of porous materials in electrochemical cells such as fuel cells and lithium ion batteries allows for correlating electrical performance with material morphology. In lithium ion batteries characteristic length scales of active material particles and additives range several orders of magnitude. Hence, providing a computational mesh resolving all length scales is not reasonably feasible and requires alternative approaches. In the work presented here a virtual process to simulate lithium ion batteries by bridging the scales is introduced. Representative lithium ion battery electrode coatings comprised of μm-scale graphite particles as active material and a nm-scale carbon/polymeric binder mixture as an additive are imaged with synchrotron radiation computed tomography (SR-CT) and sequential focused ion beam/scanning electron microscopy (FIB/SEM), respectively. Applying novel image processing methodologies for the FIB/SEM images, data sets are binarized to provide a computational grid for calculating the effective mass transport properties of the electrolyte phase in the nanoporous additive. Afterwards, the homogenized additive is virtually added to the micropores of the binarized SR-CT data set representing the active particle structure, and the resulting electrode structure is assembled to a virtual half-cell for electrochemical microheterogeneous simulation. Preliminary battery performance simulations indicate non-negligible impact of the consideration of the additive.

  12. Perspective: Computer simulations of long time dynamics.

    PubMed

    Elber, Ron

    2016-02-14

    Atomically detailed computer simulations of complex molecular events attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is that of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade significant advances were made in different directions (theory, software, and hardware) that significantly expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances. PMID:26874473

  13. Perspective: Computer simulations of long time dynamics

    PubMed Central

    Elber, Ron

    2016-01-01

    Atomically detailed computer simulations of complex molecular events attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is that of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade significant advances were made in different directions (theory, software, and hardware) that significantly expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances. PMID:26874473

  14. Uncertainty and error in computational simulations

    SciTech Connect

    Oberkampf, W.L.; Diegert, K.V.; Alvin, K.F.; Rutherford, B.M.

    1997-10-01

    The present paper addresses the question: "What are the general classes of uncertainty and error sources in complex, computational simulations?" This is the first step of a two-step process to develop a general methodology for quantitatively estimating the global modeling and simulation uncertainty in computational modeling and simulation. The second step is to develop a general mathematical procedure for representing, combining, and propagating all of the individual sources through the simulation. The authors develop a comprehensive view of the general phases of modeling and simulation. The phases proposed are: conceptual modeling of the physical system, mathematical modeling of the system, discretization of the mathematical model, computer programming of the discrete model, numerical solution of the model, and interpretation of the results. This new view is built by combining phases recognized in the disciplines of operations research and numerical solution methods for partial differential equations. The characteristics and activities of each of these phases are discussed in general, with examples given for the fields of computational fluid dynamics and heat transfer. They argue that a clear distinction should be made between uncertainty and error that can arise in each of these phases. The present definitions for uncertainty and error are inadequate and, therefore, they propose comprehensive definitions for these terms. Specific classes of uncertainty and error sources are then defined that can occur in each phase of modeling and simulation. The numerical sources of error considered apply regardless of whether the discretization procedure is based on finite elements, finite volumes, or finite differences. To better explain the broad types of sources of uncertainty and error, and the utility of their categorization, they discuss a coupled-physics example simulation.
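The second step the authors outline, representing and propagating individual uncertainty sources through a simulation, is commonly realized by sampling. A minimal Monte Carlo sketch follows; the toy model (a drag-force formula) and the input distributions are invented for illustration and are not from the paper:

```python
import random
import statistics

def propagate(model, input_dists, n_samples=20000, seed=0):
    """Monte Carlo propagation of input uncertainty: sample each
    uncertain input, evaluate the model, summarize the output."""
    rng = random.Random(seed)
    outputs = [model(*(draw(rng) for draw in input_dists))
               for _ in range(n_samples)]
    return statistics.mean(outputs), statistics.stdev(outputs)

# Toy model: drag force F = 0.5 * rho * Cd * A * v**2 with fixed
# rho = 1.2 kg/m^3 and A = 0.01 m^2, and uncertain Cd and v.
model = lambda cd, v: 0.5 * 1.2 * cd * 0.01 * v ** 2
dists = [lambda r: r.gauss(0.47, 0.02),   # drag coefficient
         lambda r: r.gauss(30.0, 1.0)]    # velocity, m/s
mean_f, std_f = propagate(model, dists)
```

This captures only parameter uncertainty; the paper's point is that discretization, programming, and solution errors in the other phases require separate, distinct treatment.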

  15. Simulating physical phenomena with a quantum computer

    NASA Astrophysics Data System (ADS)

    Ortiz, Gerardo

    2003-03-01

    In a keynote speech at MIT in 1981 Richard Feynman raised some provocative questions in connection to the exact simulation of physical systems using a special device named a ``quantum computer'' (QC). At the time it was known that deterministic simulations of quantum phenomena in classical computers required a number of resources that scaled exponentially with the number of degrees of freedom, and also that the probabilistic simulation of certain quantum problems was limited by the so-called sign or phase problem, a problem believed to be of exponential complexity. Such a QC was intended to mimic physical processes exactly as Nature performs them. Certainly, remarks coming from such an influential figure generated widespread interest in these ideas, and today after 21 years there are still some open questions. What kind of physical phenomena can be simulated with a QC? How? And what are its limitations? Addressing and attempting to answer these questions is what this talk is about. Definitively, the goal of physics simulation using controllable quantum systems (``physics imitation'') is to exploit quantum laws to advantage, and thus accomplish efficient imitation. Fundamental is the connection between a quantum computational model and a physical system by transformations of operator algebras. This concept is a necessary one because in Quantum Mechanics each physical system is naturally associated with a language of operators and thus can be considered as a possible model of quantum computation. The remarkable result is that an arbitrary physical system is naturally simulatable by another physical system (or QC) whenever a ``dictionary'' between the two operator algebras exists. I will explain these concepts and address some of Feynman's concerns regarding the simulation of fermionic systems. Finally, I will illustrate the main ideas by imitating simple physical phenomena borrowed from condensed matter physics using quantum algorithms, and present experimental

  16. Quantitative computer simulations of extraterrestrial processing operations

    NASA Technical Reports Server (NTRS)

    Vincent, T. L.; Nikravesh, P. E.

    1989-01-01

    The automation of a small, solid propellant mixer was studied. Temperature control is under investigation. A numerical simulation of the system is under development and will be tested using different control options. Control system hardware is currently being put into place. The construction of mathematical models and simulation techniques for understanding various engineering processes is also studied. Computer graphics packages were utilized for better visualization of the simulation results. The mechanical mixing of propellants is examined. Simulation of the mixing process is being done to study how one can control for chaotic behavior to meet specified mixing requirements. An experimental mixing chamber is also being built. It will allow visual tracking of particles under mixing. The experimental unit will be used to test ideas from chaos theory, as well as to verify simulation results. This project has applications to extraterrestrial propellant quality and reliability.

  17. Computer simulation of screw dislocation in aluminum

    NASA Technical Reports Server (NTRS)

    Esterling, D. M.

    1976-01-01

    The atomic structure in a 110 screw dislocation core for aluminum is obtained by computer simulation. The lattice statics technique is employed since it entails no artificially imposed elastic boundary around the defect. The interatomic potential has no adjustable parameters and was derived from pseudopotential theory. The resulting atomic displacements were allowed to relax in all three dimensions.

  18. Eliminating Computational Instability In Multibody Simulations

    NASA Technical Reports Server (NTRS)

    Watts, Gaines L.

    1994-01-01

    TWOBODY implements an improved version of the Lagrange multiplier method. The program utilizes a programming technique that eliminates computational instability in multibody simulations in which Lagrange multipliers are used. In this technique, one uses constraint equations, instead of integration, to determine the coordinates that are not independent. To illustrate the technique, the program includes a simple mathematical model of a solid rocket booster and parachute connected by a frictionless swivel. Written in FORTRAN 77.

  19. Macromod: Computer Simulation For Introductory Economics

    ERIC Educational Resources Information Center

    Ross, Thomas

    1977-01-01

    The Macroeconomic model (Macromod) is a computer assisted instruction simulation model designed for introductory economics courses. An evaluation of its utilization at a community college indicates that it yielded a 10 percent to 13 percent greater economic comprehension than lecture classes and that it met with high student approval. (DC)

  20. Designing Online Scaffolds for Interactive Computer Simulation

    ERIC Educational Resources Information Center

    Chen, Ching-Huei; Wu, I-Chia; Jen, Fen-Lan

    2013-01-01

    The purpose of this study was to examine the effectiveness of online scaffolds in computer simulation to facilitate students' science learning. We first introduced online scaffolds to assist and model students' science learning and to demonstrate how a system embedded with online scaffolds can be designed and implemented to help high…

  1. Assessing Moderator Variables: Two Computer Simulation Studies.

    ERIC Educational Resources Information Center

    Mason, Craig A.; And Others

    1996-01-01

    A strategy is proposed for conceptualizing moderating relationships based on their type (strictly correlational and classically correlational) and form, whether continuous, noncontinuous, logistic, or quantum. Results of computer simulations comparing three statistical approaches for assessing moderator variables are presented, and advantages of…

  2. Decision Making in Computer-Simulated Experiments.

    ERIC Educational Resources Information Center

    Suits, J. P.; Lagowski, J. J.

    A set of interactive, computer-simulated experiments was designed to respond to the large range of individual differences in aptitude and reasoning ability generally exhibited by students enrolled in first-semester general chemistry. These experiments give students direct experience in the type of decision making needed in an experimental setting.…

  3. Progress in Computational Simulation of Earthquakes

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay; Lyzenga, Gregory; Judd, Michele; Li, P. Peggy; Norton, Charles; Tisdale, Edwin; Granat, Robert

    2006-01-01

    GeoFEST(P) is a computer program written for use in the QuakeSim project, which is devoted to development and improvement of means of computational simulation of earthquakes. GeoFEST(P) models interacting earthquake fault systems from the fault-nucleation to the tectonic scale. The development of GeoFEST(P) has involved coupling of two programs: GeoFEST and the Pyramid Adaptive Mesh Refinement Library. GeoFEST is a message-passing-interface-parallel code that utilizes a finite-element technique to simulate evolution of stress, fault slip, and plastic/elastic deformation in realistic materials like those of faulted regions of the crust of the Earth. The products of such simulations are synthetic observable time-dependent surface deformations on time scales from days to decades. Pyramid Adaptive Mesh Refinement Library is a software library that facilitates the generation of computational meshes for solving physical problems. In an application of GeoFEST(P), a computational grid can be dynamically adapted as stress grows on a fault. Simulations on workstations using a few tens of thousands of stress and displacement finite elements can now be expanded to multiple millions of elements with greater than 98-percent scaled efficiency on many hundreds of parallel processors (see figure).

  4. Computation applied to particle accelerator simulations

    SciTech Connect

    Herrmannsfeldt, W.B.; Yan, Y.T.

    1991-07-01

    The rapid growth in the power of large-scale computers has had a revolutionary effect on the study of charged-particle accelerators that is similar to the impact of smaller computers on everyday life. Before an accelerator is built, it is now the absolute rule to simulate every component and subsystem by computer to establish modes of operation and tolerances. We will bypass the important and fruitful areas of control and operation and consider only application to design and diagnostic interpretation. Applications of computers can be divided into separate categories including: component design, system design, stability studies, cost optimization, and operating condition simulation. For the purposes of this report, we will choose a few examples taken from the above categories to illustrate the methods and we will discuss the significance of the work to the project, and also briefly discuss the accelerator project itself. The examples that will be discussed are: (1) the tracking analysis done for the main ring of the Superconducting Supercollider, which contributed to the analysis which ultimately resulted in changing the dipole coil diameter to 5 cm from the earlier design for a 4-cm coil-diameter dipole magnet; (2) the design of accelerator structures for electron-positron linear colliders and circular colliding beam systems (B-factories); (3) simulation of the wake fields from multibunch electron beams for linear colliders; and (4) particle-in-cell simulation of space-charge dominated beams for an experimental liner induction accelerator for Heavy Ion Fusion. 8 refs., 9 figs.

  5. Factors Promoting Engaged Exploration with Computer Simulations

    ERIC Educational Resources Information Center

    Podolefsky, Noah S.; Perkins, Katherine K.; Adams, Wendy K.

    2010-01-01

    This paper extends prior research on student use of computer simulations (sims) to engage with and explore science topics, in this case wave interference. We describe engaged exploration: a process that involves students actively interacting with educational materials, sense making, and exploring primarily via their own questioning. We analyze…

  6. Additive Manufacturing of Anatomical Models from Computed Tomography Scan Data.

    PubMed

    Gür, Y

    2014-12-01

    The purpose of the study presented here was to investigate the manufacturability of human anatomical models from Computed Tomography (CT) scan data via a 3D desktop printer which uses fused deposition modelling (FDM) technology. First, Digital Imaging and Communications in Medicine (DICOM) CT scan data were converted to 3D Standard Triangle Language (STL) format by using the InVesalius digital imaging program. Once this STL file is obtained, a 3D physical version of the anatomical model can be fabricated by a desktop 3D FDM printer. As a case study, a patient's skull CT scan data was considered, and a tangible version of the skull was manufactured by a 3D FDM desktop printer. During the 3D printing process, the skull was built using acrylonitrile-butadiene-styrene (ABS) co-polymer plastic. The printed model showed that the 3D FDM printing technology is able to fabricate anatomical models with high accuracy. As a result, the skull model can be used for preoperative surgical planning, medical training activities, implant design and simulation to show the potential of the FDM technology in the medical field. It will also improve communication between medical staff and patients. The current results indicate that a 3D desktop printer which uses FDM technology can be used to obtain accurate anatomical models. PMID:26336695

  7. Spiking network simulation code for petascale computers.

    PubMed

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M; Plesser, Hans E; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today. PMID:25346682
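The distribution principle the abstract describes, storing a synapse only on the compute node that harbors its target neuron, can be sketched in a few lines. This is not the actual NEST data structure (which the paper builds with metaprogramming); it is a toy illustration with invented names and a simple round-robin neuron placement:

```python
from collections import defaultdict

class ComputeNode:
    """Sketch of distributed synapse storage: each node stores only the
    synapses whose *target* neuron lives on that node, keyed by source,
    so memory per synapse is consumed on exactly one node."""
    def __init__(self, node_id, n_nodes):
        self.node_id, self.n_nodes = node_id, n_nodes
        self.incoming = defaultdict(list)   # source neuron -> local synapses

    def owns(self, neuron_id):
        # Round-robin placement of neurons onto nodes (an assumption here).
        return neuron_id % self.n_nodes == self.node_id

    def add_synapse(self, source, target, weight):
        if self.owns(target):
            self.incoming[source].append((target, weight))

nodes = [ComputeNode(i, 4) for i in range(4)]
connections = [(0, 5, 0.5), (0, 9, 0.2), (1, 5, -0.3)]
for src, tgt, w in connections:
    for node in nodes:
        node.add_synapse(src, tgt, w)   # each synapse lands on one node only
```

From any single node's viewpoint, most sources contribute zero or one local synapse, which is the "double collapse" in type and count that the paper's metaprogrammed structure exploits to save memory at petascale.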

  9. Computer simulations of WIGWAM underwater experiment

    SciTech Connect

    Kamegai, Minao; White, J.W.

    1993-11-01

    We performed computer simulations of the WIGWAM underwater experiment with a 2-D hydro-code, CALE. First, we calculated the bubble pulse and the signal strength at the closest gauge in one-dimensional geometry. The calculation shows excellent agreement with the measured data. Next, we made two-dimensional simulations of WIGWAM applying the gravity over-pressure, and calculated the signals at three selected gauge locations where measurements were recorded. The computed peak pressures at those gauge locations come well within the 15% experimental error bars. The signal at the farthest gauge is of the order of 200 bars. This is significant, because at this pressure the CALE output can be linked to a hydro-acoustics computer program, NPE Code (Nonlinear Progressive Wave-equation Code), to analyze the long distance propagation of acoustical signals from the underwater explosions on a global scale.

  10. Computer simulation of underwater nuclear effects

    SciTech Connect

    Kamegai, M.

    1987-01-30

    We investigated underwater nuclear effects by computer simulations. First, we computed a long distance wave propagation in water by the 1-D LASNEX code by modeling the energy source and the underwater environment. The pressure-distance data were calculated for two quite different yields; pressures range from 300 GPa to 15 MPa. They were found to be in good agreement with Snay's theoretical points and the Wigwam measurements. The computed data also agree with the similarity solution at high pressures and the empirical equation at low pressures. After completion of the 1-D study, we investigated a free surface effect commonly referred to as irregular surface rarefaction by applying two hydrocodes (LASNEX and ALE), linked at the appropriate time. Using these codes, we simulated near-surface explosions for three depths of burst (3 m, 21 m and 66.5 m), which represent the strong, intermediate, and weak surface shocks, respectively.

  11. Computational algorithms for simulations in atmospheric optics.

    PubMed

    Konyaev, P A; Lukin, V P

    2016-04-20

    A computer simulation technique for atmospheric and adaptive optics based on parallel programming is discussed. A parallel propagation algorithm is designed and a modified spectral-phase method for computer generation of 2D time-variant random fields is developed. Temporal power spectra of Laguerre-Gaussian beam fluctuations are considered as an example to illustrate the applications discussed. Implementation of the proposed algorithms using Intel MKL and IPP libraries and NVIDIA CUDA technology is shown to be very fast and accurate. The hardware system for the computer simulation is an off-the-shelf desktop with an Intel Core i7-4790K CPU operating at a turbo-speed frequency up to 5 GHz and an NVIDIA GeForce GTX-960 graphics accelerator with 1024 processors at 1.5 GHz. PMID:27140113
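    The spectral-phase family of methods mentioned above generates a random field by filtering white noise in the Fourier domain. A minimal sketch, with an illustrative Kolmogorov-like power law and grid size (not the paper's modified method or parameters):

```python
# Minimal sketch of a spectral (Fourier-filter) method for generating a
# 2D random field: white complex noise is shaped by a power-law filter
# and transformed back to the spatial domain.
import numpy as np

def random_field_2d(n, power=-11.0 / 6.0, seed=0):
    rng = np.random.default_rng(seed)
    fx = np.fft.fftfreq(n)
    kx, ky = np.meshgrid(fx, fx)
    k = np.hypot(kx, ky)
    k[0, 0] = np.inf                      # suppress the zero-frequency mode
    amp = k ** power                      # Kolmogorov-like power-law filter
    noise = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return np.real(np.fft.ifft2(amp * noise))

field = random_field_2d(128)
print(field.shape)  # -> (128, 128)
```

    A time-variant screen can then be produced by evolving the random Fourier coefficients between frames rather than regenerating them from scratch.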

  12. Computer simulation of surface and film processes

    NASA Technical Reports Server (NTRS)

    Tiller, W. A.; Halicioglu, M. T.

    1983-01-01

    Adequate computer methods, based on interactions between discrete particles, provide information leading to an atomic-level understanding of various physical processes. The success of these simulation methods, however, is related to the accuracy of the potential energy function representing the interactions among the particles. The development of a potential energy function for crystalline SiO2 forms that can be employed in lengthy computer modelling procedures was investigated. In many of the simulation methods which deal with discrete particles, semiempirical two-body potentials were employed to analyze energy- and structure-related properties of the system. Many-body interactions are required for a proper representation of the total energy for many systems. Many-body interactions for simulations based on discrete particles are discussed.

  13. Computational Simulation of Composite Structural Fatigue

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon

    2004-01-01

    Progressive damage and fracture of composite structures subjected to monotonically increasing static, tension-tension cyclic, pressurization, and flexural cyclic loading are evaluated via computational simulation. Constituent material properties, stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for composites. Damage initiation, growth, accumulation, and propagation to fracture due to monotonically increasing static and cyclic loads are included in the simulations. Results show the number of cycles to failure at different temperatures and the damage progression sequence during different degradation stages. A procedure is outlined for use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of results with insight for design decisions.

  14. Computational Simulation of Composite Structural Fatigue

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Chamis, Christos C. (Technical Monitor)

    2005-01-01

    Progressive damage and fracture of composite structures subjected to monotonically increasing static, tension-tension cyclic, pressurization, and flexural cyclic loading are evaluated via computational simulation. Constituent material properties, stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for composites. Damage initiation, growth, accumulation, and propagation to fracture due to monotonically increasing static and cyclic loads are included in the simulations. Results show the number of cycles to failure at different temperatures and the damage progression sequence during different degradation stages. A procedure is outlined for use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of results with insight for design decisions.

  15. Additional support for the TDK/MABL computer program

    NASA Technical Reports Server (NTRS)

    Nickerson, G. R.; Dunn, Stuart S.

    1993-01-01

    An advanced version of the Two-Dimensional Kinetics (TDK) computer program was developed under contract and released to the propulsion community in early 1989. Exposure of the code to this community indicated a need for improvements in certain areas. In particular, the TDK code needed to be adapted to the special requirements imposed by the Space Transportation Main Engine (STME) development program. This engine utilizes injection of the gas generator exhaust into the primary nozzle by means of a set of slots. The subsequent mixing of this secondary stream with the primary stream with finite rate chemical reaction can have a major impact on the engine performance and the thermal protection of the nozzle wall. In attempting to calculate this reacting boundary layer problem, the Mass Addition Boundary Layer (MABL) module of TDK was found to be deficient in several respects. For example, when finite rate chemistry was used to determine gas properties, (MABL-K option) the program run times became excessive because extremely small step sizes were required to maintain numerical stability. A robust solution algorithm was required so that the MABL-K option could be viable as a rocket propulsion industry design tool. Solving this problem was a primary goal of the phase 1 work effort.

  16. X-ray computed tomography for additive manufacturing: a review

    NASA Astrophysics Data System (ADS)

    Thompson, A.; Maskery, I.; Leach, R. K.

    2016-07-01

    In this review, the use of x-ray computed tomography (XCT) is examined, identifying the requirement for volumetric dimensional measurements in industrial verification of additively manufactured (AM) parts. The XCT technology and AM processes are summarised, and their historical use is documented. The use of XCT and AM as tools for medical reverse engineering is discussed, and the transition of XCT from a tool used solely for imaging to a vital metrological instrument is documented. The current states of the combined technologies are then examined in detail, separated into porosity measurements and general dimensional measurements. In the conclusions of this review, the limitation of resolution on improvement of porosity measurements and the lack of research regarding the measurement of surface texture are identified as the primary barriers to ongoing adoption of XCT in AM. The limitations of both AM and XCT regarding slow speeds and high costs, when compared to other manufacturing and measurement techniques, are also noted as general barriers to continued adoption of XCT and AM.

  17. Computer simulation of underwater nuclear events

    SciTech Connect

    Kamegai, M.

    1986-09-01

    This report describes the computer simulation of two underwater nuclear explosions, Operation Wigwam and a modern hypothetical explosion of greater yield. The computer simulations were done in spherical geometry with the LASNEX computer code. Comparison of the LASNEX calculation with Snay's analytical results and the Wigwam measurements shows that agreement in the shock pressure versus range in water is better than 5%. The results of the calculations are also consistent with the cube root scaling law for an underwater blast wave. The time constant of the wave front was determined from the wave profiles taken at several points. The LASNEX time-constant calculation and Snay's theoretical results agree to within 20%. A time-constant-versus-range relation empirically fitted by Snay is valid only within a limited range at low pressures, whereas a time-constant formula based on Sedov's similarity solution holds at very high pressures. This leaves the intermediate pressure range with neither an empirical nor a theoretical formula for the time constant. These one-dimensional simulations demonstrate applicability of the computer code to investigations of this nature, and justify the use of this technique for more complex two-dimensional problems, namely, surface effects on underwater nuclear explosions. 16 refs., 8 figs., 2 tabs.
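    The cube-root scaling law cited above says that peak pressure depends on range and yield only through the scaled range R / W^(1/3). A short numeric illustration, with placeholder similitude constants (not Snay's fitted values):

```python
# Illustration of cube-root scaling for an underwater blast wave: scaling
# the yield by 8 and the range by 2 leaves the scaled range, and hence
# the peak pressure, unchanged. Constants k and n are placeholders.

def peak_pressure(R_m, W_kt, k=1.0e3, n=1.13):
    """Similitude form p = k * (W**(1/3) / R)**n (illustrative constants)."""
    return k * (W_kt ** (1.0 / 3.0) / R_m) ** n

p1 = peak_pressure(R_m=100.0, W_kt=1.0)
p2 = peak_pressure(R_m=200.0, W_kt=8.0)
print(abs(p1 - p2) < 1e-9)  # -> True
```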

  18. Computational Challenges in Nuclear Weapons Simulation

    SciTech Connect

    McMillain, C F; Adams, T F; McCoy, M G; Christensen, R B; Pudliner, B S; Zika, M R; Brantley, P S; Vetter, J S; May, J M

    2003-08-29

    After a decade of experience, the Stockpile Stewardship Program continues to ensure the safety, security and reliability of the nation's nuclear weapons. The Advanced Simulation and Computing (ASCI) program was established to provide leading edge, high-end simulation capabilities needed to meet the program's assessment and certification requirements. The great challenge of this program lies in developing the tools and resources necessary for the complex, highly coupled, multi-physics calculations required to simulate nuclear weapons. This paper describes the hardware and software environment we have applied to fulfill our nuclear weapons responsibilities. It also presents the characteristics of our algorithms and codes, especially as they relate to supercomputing resource capabilities and requirements. It then addresses impediments to the development and application of nuclear weapon simulation software and hardware and concludes with a summary of observations and recommendations on an approach for working with industry and government agencies to address these impediments.

  19. Application of computer simulators in population genetics.

    PubMed

    Feng, Gao; Haipeng, Li

    2016-08-01

    The genomes of more and more organisms have been sequenced due to advances in next-generation sequencing technologies. As a powerful tool, computer simulators play a critical role in studying genome-wide DNA polymorphism patterns. Simulations can be performed both forwards-in-time and backwards-in-time; the two approaches complement each other and suit different needs, such as studying the effects of evolutionary dynamics, estimating parameters, and validating evolutionary hypotheses as well as new methods. In this review, we briefly introduce the theoretical framework of population genetics and provide a detailed comparison of 32 simulators published over the last ten years. The future development of new simulators is also discussed. PMID:27531609
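    The backwards-in-time simulators surveyed above are built on the coalescent. A minimal sketch of the core recursion, assuming the standard neutral model (sample sizes and seeds are illustrative): with k lineages, the next merger occurs after an exponential waiting time with rate k(k-1)/2, in units of 2N generations.

```python
# Minimal backwards-in-time (coalescent) sketch: pairs of lineages merge
# at rate k(k-1)/2 until a single common ancestor remains.
import random

def coalescent_tree_height(sample_size, seed=42):
    random.seed(seed)
    k, t = sample_size, 0.0
    while k > 1:
        rate = k * (k - 1) / 2.0           # pairwise coalescence rate
        t += random.expovariate(rate)      # waiting time to the next merger
        k -= 1                             # two lineages merge into one
    return t                               # time to the MRCA (in 2N units)

heights = [coalescent_tree_height(10, seed=s) for s in range(200)]
mean = sum(heights) / len(heights)
# Theory: expected height is 2 * (1 - 1/n) = 1.8 for n = 10
print(round(mean, 1))
```

    Forwards-in-time simulators instead iterate the whole population generation by generation, which is slower but allows arbitrary selection and demography.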

  20. Computational simulation of Faraday probe measurements

    NASA Astrophysics Data System (ADS)

    Boerner, Jeremiah J.

    Electric propulsion devices, including ion thrusters and Hall thrusters, are becoming increasingly popular for long duration space missions. Ground-based experimental testing of such devices is performed in vacuum chambers, which develop an unavoidable background gas due to pumping limitations and facility leakage. Besides directly altering the operating environment, the background gas may indirectly affect the performance of immersed plasma probe diagnostics. This work focuses on computational modeling research conducted to evaluate the performance of a current-collecting Faraday probe. Initial findings from one-dimensional analytical models of plasma sheaths are used as reference cases for subsequent modeling. A two-dimensional, axisymmetric, hybrid electron fluid and Particle In Cell computational code is used for extensive simulation of the plasma flow around a representative Faraday probe geometry. The hybrid fluid PIC code is used to simulate a range of inflowing plasma conditions, from a simple ion beam consistent with one-dimensional models to a multiple component plasma representative of a low-power Hall thruster plume. These simulations produce profiles of plasma properties and simulated current measurements at the probe surface. Interpretation of the simulation results leads to recommendations for probe design and experimental techniques. Significant contributions of this work include the development and use of two new non-neutral detailed electron fluid models and the recent incorporation of multigrid capabilities.
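    The one-dimensional sheath estimates used as reference cases above typically reduce to an ion saturation current collected at the Bohm velocity. A hedged sketch of that estimate, with illustrative plasma parameters (not values from the dissertation):

```python
# Sketch of a 1D sheath reference estimate: ion current to a planar probe
# at the Bohm velocity, I = e * n_i * sqrt(e*Te/m_i) * A. The parameter
# values below are illustrative only.
import math

E_CHARGE = 1.602e-19   # elementary charge, C
XE_MASS = 2.18e-25     # xenon ion mass, kg (common Hall-thruster propellant)

def bohm_current(n_i, Te_eV, area_m2):
    u_bohm = math.sqrt(E_CHARGE * Te_eV / XE_MASS)  # Bohm speed, m/s
    return E_CHARGE * n_i * u_bohm * area_m2        # ion saturation current, A

I = bohm_current(n_i=1e17, Te_eV=2.0, area_m2=1e-4)
print(f"{I * 1e3:.2f} mA")
```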

  1. Memory interface simulator: A computer design aid

    NASA Technical Reports Server (NTRS)

    Taylor, D. S.; Williams, T.; Weatherbee, J. E.

    1972-01-01

    Results are presented of a study conducted with a digital simulation model being used in the design of the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. The model simulates the activity involved as instructions are fetched from random access memory for execution in one of the system's central processing units. A series of model runs measured instruction execution time under various assumptions pertaining to the CPUs and the interface between the CPUs and RAM. Design tradeoffs are presented in the following areas: bus widths, CPU microprogram read only memory cycle time, multiple instruction fetch, and instruction mix.

  2. Computer Simulation of the VASIMR Engine

    NASA Technical Reports Server (NTRS)

    Garrison, David

    2005-01-01

    The goal of this project is to develop a magneto-hydrodynamic (MHD) computer code for simulation of the VASIMR engine. This code is designed to be easy to modify and use. We achieve this using the Cactus framework, a system originally developed for research in numerical relativity. Since its release, Cactus has become an extremely powerful and flexible open source framework. The development of the code will be done in stages, starting with a basic fluid dynamic simulation and working towards a more complex MHD code. Once developed, this code can be used by students and researchers in order to further test and improve the VASIMR engine.

  3. Computer Simulation For Design Of TWT's

    NASA Technical Reports Server (NTRS)

    Bartos, Karen F.; Fite, E. Brian; Shalkhauser, Kurt A.; Sharp, G. Richard

    1992-01-01

    A three-dimensional finite-element analytical technique facilitates design and fabrication of traveling-wave-tube (TWT) slow-wave structures. Used to perform thermal and mechanical analyses of TWTs designed with a variety of configurations, geometries, and materials. Using three-dimensional computer analysis, the designer is able to simulate the building and testing of a TWT, with consequent substantial savings of time and money. Technique enables detailed look into operation of traveling-wave tubes to help improve performance for future communications systems.

  4. Integrated computer simulation on FIR FEL dynamics

    SciTech Connect

    Furukawa, H.; Kuruma, S.; Imasaki, K.

    1995-12-31

    An integrated computer simulation code has been developed to analyze RF-Linac FEL dynamics. First, a simulation code for the electron beam acceleration and transport processes in the RF-Linac (LUNA) was developed to analyze the characteristics of the electron beam in the RF-Linac and to optimize its parameters. Second, a space-time-dependent 3D FEL simulation code (Shipout) was developed. Total RF-Linac FEL simulations were performed by feeding the electron beam data from LUNA into Shipout. The number of particles used in an RF-Linac FEL total simulation is approximately 1000, and the CPU time for simulating one round trip is about 1.5 minutes. At ILT/ILE, Osaka, an 8.5 MeV RF-Linac with a photo-cathode RF gun is used for FEL oscillation experiments. Using a 2 cm wiggler, FEL oscillation at a wavelength of approximately 46 μm is investigated. Simulations using LUNA with the parameters of an ILT/ILE experiment estimate the pulse shape and energy spectra of the electron beam at the end of the linac; the pulse shape shows a sharp rise followed by a slow decay as a function of time. The total RF-Linac FEL simulations with the parameters of an ILT/ILE experiment estimate how the start-up of the FEL oscillations depends on the pulse shape of the electron beam at the end of the linac. Coherent spontaneous emission effects and a quick start-up of FEL oscillations have been observed in these total simulations.

  5. Computer simulations in the science classroom

    NASA Astrophysics Data System (ADS)

    Richards, John; Barowy, William; Levin, Dov

    1992-03-01

    In this paper we describe software for science instruction that is based upon a constructivist epistemology of learning. From a constructivist perspective, the process of learning is viewed as an active construction of knowledge, rather than a passive reception of information. The computer has the potential to provide an environment in which students can explore their understanding and better construct scientific knowledge. The Explorer is an interactive environment that integrates animated computer models with analytic capabilities for learning and teaching science. The system includes graphs, a spreadsheet, scripting, and interactive tools. During formative evaluation of Explorer in the classroom, we have focused on learning the function and effectiveness of computer models in teaching science. Models have helped students relate theory to experiment when used in conjunction with hands-on activities and when the simulation addressed students' naive understanding of the phenomena. Two classroom examples illustrate our findings. The first is based on the dynamics of colliding objects. The second describes a class modeling the function of simple electric circuits. The simulations bridge between phenomena and theory by providing an abstract representation on which students may make measurements. Simulations based on scientific theory help to provide a set of interrelated experiences that challenge students' informal understanding of the science.

  6. Metal matrix composites microfracture: Computational simulation

    NASA Technical Reports Server (NTRS)

    Mital, Subodh K.; Caruso, John J.; Chamis, Christos C.

    1990-01-01

    Fiber/matrix fracture and fiber-matrix interface debonding in a metal matrix composite (MMC) are computationally simulated. These simulations are part of a research activity to develop computational methods for microfracture, microfracture propagation and fracture toughness of metal matrix composites. The three-dimensional finite element model used in the simulation consists of a group of nine unidirectional fibers in a three-by-three unit cell array of SiC/Ti15 metal matrix composite with a fiber volume ratio of 0.35. This computational procedure is used to predict the fracture process and establish the hierarchy of fracture modes based on strain energy release rate. It is also used to predict stress redistribution to surrounding matrix-fibers due to initial and progressive fracture of fiber/matrix and due to debonding of the fiber-matrix interface. Microfracture results for various loading cases such as longitudinal, transverse, shear and bending are presented and discussed. Step-by-step procedures are outlined to evaluate composite microfracture for a given composite system.

  7. Additional Developments in Atmosphere Revitalization Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Coker, Robert F.; Knox, James C.; Cummings, Ramona; Brooks, Thomas; Schunk, Richard G.

    2013-01-01

    NASA's Advanced Exploration Systems (AES) program is developing prototype systems, demonstrating key capabilities, and validating operational concepts for future human missions beyond Earth orbit. These forays beyond the confines of earth's gravity will place unprecedented demands on launch systems. They must launch the supplies needed to sustain a crew over longer periods for exploration missions beyond earth's moon. Thus all spacecraft systems, including those for the separation of metabolic carbon dioxide and water from a crewed vehicle, must be minimized with respect to mass, power, and volume. Emphasis is also placed on system robustness both to minimize replacement parts and ensure crew safety when a quick return to earth is not possible. Current efforts are focused on improving the current state-of-the-art systems utilizing fixed beds of sorbent pellets by evaluating structured sorbents, seeking more robust pelletized sorbents, and examining alternate bed configurations to improve system efficiency and reliability. These development efforts combine testing of sub-scale systems and multi-physics computer simulations to evaluate candidate approaches, select the best performing options, and optimize the configuration of the selected approach. This paper describes the continuing development of atmosphere revitalization models and simulations in support of the Atmosphere Revitalization Recovery and Environmental Monitoring (ARREM) project within the AES program.

  8. Additional Developments in Atmosphere Revitalization Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Coker, Robert F.; Knox, James C.; Cummings, Ramona; Brooks, Thomas; Schunk, Richard G.; Gomez, Carlos

    2013-01-01

    NASA's Advanced Exploration Systems (AES) program is developing prototype systems, demonstrating key capabilities, and validating operational concepts for future human missions beyond Earth orbit. These forays beyond the confines of earth's gravity will place unprecedented demands on launch systems. They must launch the supplies needed to sustain a crew over longer periods for exploration missions beyond earth's moon. Thus all spacecraft systems, including those for the separation of metabolic carbon dioxide and water from a crewed vehicle, must be minimized with respect to mass, power, and volume. Emphasis is also placed on system robustness both to minimize replacement parts and ensure crew safety when a quick return to earth is not possible. Current efforts are focused on improving the current state-of-the-art systems utilizing fixed beds of sorbent pellets by evaluating structured sorbents, seeking more robust pelletized sorbents, and examining alternate bed configurations to improve system efficiency and reliability. These development efforts combine testing of sub-scale systems and multi-physics computer simulations to evaluate candidate approaches, select the best performing options, and optimize the configuration of the selected approach. This paper describes the continuing development of atmosphere revitalization models and simulations in support of the Atmosphere Revitalization Recovery and Environmental Monitoring (ARREM) project within the AES program.

  9. Accelerating Climate Simulations Through Hybrid Computing

    NASA Technical Reports Server (NTRS)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

    Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPU) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for connection, we identified two challenges: (1) identical MPI implementation is required in both systems, and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors, two IBM QS22 Cell blades, connected with Infiniband), allowing for seamlessly offloading compute-intensive functions to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.

  10. Computer model to simulate testing at the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Owens, Lewis R., Jr.; Wahls, Richard A.; Hannon, Judith A.

    1995-01-01

    A computer model has been developed to simulate the processes involved in the operation of the National Transonic Facility (NTF), a large cryogenic wind tunnel at the Langley Research Center. The simulation was verified by comparing the simulated results with previously acquired data from three experimental wind tunnel test programs in the NTF. The comparisons suggest that the computer model simulates reasonably well the processes that determine the liquid nitrogen (LN2) consumption, electrical consumption, fan-on time, and the test time required to complete a test plan at the NTF. From these limited comparisons, it appears that the results from the simulation model are generally within about 10 percent of the actual NTF test results. The use of actual data acquisition times in the simulation produced better estimates of the LN2 usage, as expected. Additional comparisons are needed to refine the model constants. The model will typically produce optimistic results since the times and rates included in the model are typically the optimum values. Any deviation from the optimum values will lead to longer times or increased LN2 and electrical consumption for the proposed test plan. Computer code operating instructions and listings of sample input and output files have been included.

  11. Computational aeroacoustics and numerical simulation of supersonic jets

    NASA Technical Reports Server (NTRS)

    Morris, Philip J.; Long, Lyle N.

    1996-01-01

    The research project has been a computational study of computational aeroacoustics algorithms and numerical simulations of the flow and noise of supersonic jets. During this study a new method for the implementation of solid wall boundary conditions for complex geometries in three dimensions has been developed. In addition, a detailed study of the simulation of the flow in and noise from supersonic circular and rectangular jets has been conducted. Extensive comparisons have been made with experimental measurements. A summary of the results of the research program are attached as the main body of this report in the form of two publications. Also, the report lists the names of the students who were supported by this grant, their degrees, and the titles of their dissertations. In addition, a list of presentations and publications made by the Principal Investigators and the research students is also included.

  12. New Computer Simulations of Macular Neural Functioning

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.; Doshay, D.; Linton, S.; Parnas, B.; Montgomery, K.; Chimento, T.

    1994-01-01

    We use high performance graphics workstations and supercomputers to study the functional significance of the three-dimensional (3-D) organization of gravity sensors. These sensors have a prototypic architecture foreshadowing more complex systems. Scaled-down simulations run on a Silicon Graphics workstation and scaled-up, 3-D versions run on a Cray Y-MP supercomputer. A semi-automated method of reconstruction of neural tissue from serial sections studied in a transmission electron microscope has been developed to eliminate tedious conventional photography. The reconstructions use a mesh as a step in generating a neural surface for visualization. Two meshes are required to model calyx surfaces. The meshes are connected and the resulting prisms represent the cytoplasm and the bounding membranes. A finite volume analysis method is employed to simulate voltage changes along the calyx in response to synapse activation on the calyx or on calyceal processes. The finite volume method insures that charge is conserved at the calyx-process junction. These and other models indicate that efferent processes act as voltage followers, and that the morphology of some afferent processes affects their functioning. In a final application, morphological information is symbolically represented in three dimensions in a computer. The possible functioning of the connectivities is tested using mathematical interpretations of physiological parameters taken from the literature. Symbolic, 3-D simulations are in progress to probe the functional significance of the connectivities. This research is expected to advance computer-based studies of macular functioning and of synaptic plasticity.

  13. Computational Methods for Jet Noise Simulation

    NASA Technical Reports Server (NTRS)

    Goodrich, John W. (Technical Monitor); Hagstrom, Thomas

    2003-01-01

    The purpose of our project is to develop, analyze, and test novel numerical technologies central to the long term goal of direct simulations of subsonic jet noise. Our current focus is on two issues: accurate, near-field domain truncations and high-order, single-step discretizations of the governing equations. The Direct Numerical Simulation (DNS) of jet noise poses a number of extreme challenges to computational techniques. In particular, the problem involves multiple temporal and spatial scales as well as flow instabilities and is posed on an unbounded spatial domain. Moreover, the basic phenomenon of interest, the radiation of acoustic waves to the far field, involves only a minuscule fraction of the total energy. The best current simulations of jet noise are at low Reynolds number. It is likely that an increase of one to two orders of magnitude will be necessary to reach a regime where the separation between the energy-containing and dissipation scales is sufficient to make the radiated noise essentially independent of the Reynolds number. Such an increase in resolution cannot be obtained in the near future solely through increases in computing power. Therefore, new numerical methodologies of maximal efficiency and accuracy are required.

  14. Fast computation algorithms for speckle pattern simulation

    SciTech Connect

    Nascov, Victor; Samoilă, Cornel; Ursuţiu, Doru

    2013-11-13

    We present our development of a series of efficient computation algorithms, generally applicable to calculating light diffraction and particularly to speckle pattern simulation. We use mainly the scalar diffraction theory in the form of the Rayleigh-Sommerfeld diffraction formula and its Fresnel approximation. Our algorithms are based on a special form of the convolution theorem and the Fast Fourier Transform. They evaluate the diffraction formula much faster than direct computation, and we have circumvented the restrictions on the relative sizes of the input and output domains encountered in commonly used procedures. Moreover, the input and output planes can be tilted with respect to each other, and the output domain can be shifted off-axis.
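As a rough illustration of the convolution-theorem approach (a textbook 1-D Fresnel propagation, not the authors' algorithm; the sampling pitch, wavelength, and distance in the usage below are assumed values, and the zero-padding a production code would add to suppress wraparound is omitted):

```python
import numpy as np

def fresnel_1d(u0, dx, wavelength, z):
    """Fresnel diffraction of a 1-D complex field u0 sampled at pitch dx,
    propagated a distance z, via the convolution theorem and the FFT:
    u_z = IFFT( FFT(u0) * FFT(h) ), with h the quadratic-phase kernel."""
    n = u0.size
    k = 2 * np.pi / wavelength
    x = (np.arange(n) - n // 2) * dx
    # Fresnel impulse response (quadratic-phase kernel), centered then
    # shifted so its origin sits at index 0 for the FFT
    h = (np.exp(1j * k * z) / np.sqrt(1j * wavelength * z)
         * np.exp(1j * k * x**2 / (2 * z)) * dx)
    return np.fft.fftshift(
        np.fft.ifft(np.fft.fft(np.fft.ifftshift(u0))
                    * np.fft.fft(np.fft.ifftshift(h))))

# usage sketch: a slit illuminated by a unit plane wave
slit = np.zeros(1024, dtype=complex)
slit[448:576] = 1.0
field = fresnel_1d(slit, dx=1e-5, wavelength=633e-9, z=0.5)
```

A single FFT-based evaluation replaces the O(N^2) direct summation of the diffraction integral with O(N log N) work, which is the source of the speed-up the abstract refers to.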

  15. Investigation of Carbohydrate Recognition via Computer Simulation

    SciTech Connect

    Johnson, Quentin R.; Lindsay, Richard J.; Petridis, Loukas; Shen, Tongye

    2015-04-28

    Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Interest has grown because of potential applications in protein and drug design and in bioengineering. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. In this review, we focus on the use of computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition. With increasing computational resources, better algorithms, and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years.

  16. Computer simulation of spacecraft/environment interaction.

    PubMed

    Krupnikov, K K; Makletsov, A A; Mileev, V N; Novikov, L S; Sinolits, V V

    1999-10-01

    This report presents some examples of a computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and of device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for estimating spacecraft potentials in LEO. Effects of various particle flux impacts and of spacecraft orientation are discussed. A computer engineering model for the calculation of space radiation is presented. The model is implemented as a client/server application with a WWW interface, including the spacecraft model description and the representation of results, based on the virtual reality markup language. PMID:11542669

  17. Investigation of Carbohydrate Recognition via Computer Simulation

    DOE PAGESBeta

    Johnson, Quentin R.; Lindsay, Richard J.; Petridis, Loukas; Shen, Tongye

    2015-04-28

    Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Interest has grown because of potential applications in protein and drug design and in bioengineering. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. In this review, we focus on the use of computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition. With increasing computational resources, better algorithms, and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years.

  18. Molecular physiology of rhodopsin: Computer simulation

    NASA Astrophysics Data System (ADS)

    Fel'Dman, T. B.; Kholmurodov, Kh. T.; Ostrovsky, M. A.

    2008-03-01

    Computer simulation is used for a comparative investigation of the molecular dynamics of rhodopsin containing the chromophore group (11-cis-retinal) and of free opsin. Molecular dynamics is traced over a time interval of 3000 ps; 3 × 10^6 discrete conformational states of rhodopsin and opsin are obtained and analyzed. It is demonstrated that the presence of the chromophore group in the chromophore center of opsin considerably influences the nearest protein environment of 11-cis-retinal, both in the region of the β-ionone ring and in the region of the protonated Schiff base bond. Based on the simulation results, a possible intramolecular mechanism for keeping rhodopsin, as a G-protein-coupled receptor, in the inactive state, i.e., the chromophore functioning as an efficient ligand antagonist, is discussed.

  19. Computer Simulation Studies of Gramicidin Channel

    NASA Astrophysics Data System (ADS)

    Song, Hyundeok; Beck, Thomas

    2009-04-01

    Ion channels are large membrane proteins whose function is to facilitate the passage of ions across biological membranes. Recently, Dr. John Cuppoletti's group at UC showed that the gramicidin channel could function at high temperatures (360-390 K) with significant currents. This finding may have large implications for fuel cell technology. In this project, we will examine the experimental system by computer simulation. We will investigate how temperature affects the current, and the difference in current magnitude between the two forms of gramicidin, A and D. This research will help to elucidate the underlying molecular mechanism in this promising new technology.

  20. Determining Peptide Partitioning Properties via Computer Simulation

    PubMed Central

    Ulmschneider, Jakob P.; Ulmschneider, Martin B.

    2010-01-01

    The transfer of polypeptide segments into lipid bilayers to form transmembrane helices represents the crucial first step in cellular membrane protein folding and assembly. This process is driven by complex and poorly understood atomic interactions of peptides with the lipid bilayer environment. The lack of suitable experimental techniques that can resolve these processes both at atomic resolution and nanosecond timescales has spurred the development of computational techniques. In this review, we summarize the significant progress achieved in the last few years in elucidating the partitioning of peptides into lipid bilayer membranes using atomic detail molecular dynamics simulations. Indeed, partitioning simulations can now provide a wealth of structural and dynamic information. Furthermore, we show that peptide-induced bilayer distortions, insertion pathways, transfer free energies, and kinetic insertion barriers are now accurate enough to complement experiments. Further advances in simulation methods and force field parameter accuracy promise to turn molecular dynamics simulations into a powerful tool for investigating a wide range of membrane active peptide phenomena. PMID:21107546
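The transfer free energies mentioned above are commonly obtained from equilibrium populations sampled in partitioning simulations. A minimal sketch under that assumption (ΔG = -RT ln K, with K the ratio of occupancies of the two states; the state definitions and temperature are illustrative, not tied to any specific study in this review):

```python
import math

R = 8.314462618e-3  # gas constant in kJ/(mol*K)

def transfer_free_energy(p_inserted, p_water, temperature=300.0):
    """Water-to-bilayer transfer free energy from the equilibrium
    occupancy probabilities of the inserted and aqueous states:
    dG = -R T ln(p_inserted / p_water), in kJ/mol."""
    return -R * temperature * math.log(p_inserted / p_water)
```

Equal populations give ΔG = 0; a tenfold preference for the inserted state at 300 K corresponds to roughly -5.7 kJ/mol.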

  1. Computer Simulation of Fracture in Aerogels

    NASA Technical Reports Server (NTRS)

    Good, Brian S.

    2006-01-01

    Aerogels are of interest to the aerospace community primarily for their thermal properties, notably their low thermal conductivities. While the gels are typically fragile, recent advances in the application of conformal polymer layers to these gels have made them potentially useful as lightweight structural materials as well. In this work, we investigate the strength and fracture behavior of silica aerogels using a molecular statics-based computer simulation technique. The gels' structure is simulated via a Diffusion Limited Cluster Aggregation (DLCA) algorithm, which produces fractal structures representing experimentally observed aggregates of so-called secondary particles, themselves composed of amorphous silica primary particles an order of magnitude smaller. We have performed multi-length-scale simulations of fracture in silica aerogels, in which the interaction between two secondary particles is assumed to be described by a Morse pair potential parameterized such that the potential range is much smaller than the secondary particle size. These Morse parameters are obtained by atomistic simulation of models of the experimentally observed amorphous silica "bridges," with the fracture behavior of these bridges modeled via molecular statics using a Morse/Coulomb potential for silica. We consider the energetics of the fracture, and compare qualitative features of low- and high-density gel fracture.
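A minimal sketch of the Morse pair potential form used for the secondary-particle interaction (the well depth, stiffness, and equilibrium distance below are placeholder values; the actual parameters were fit to the atomistic bridge simulations):

```python
import math

def morse(r, d_e=1.0, a=6.0, r0=1.0):
    """Morse pair potential, V(r) = D_e*(1 - exp(-a*(r - r0)))**2 - D_e,
    with minimum -D_e at r = r0.  A large stiffness `a` makes the well
    narrow, i.e. the interaction range short relative to the particle
    size, as assumed in the multi-scale model above."""
    return d_e * (1.0 - math.exp(-a * (r - r0)))**2 - d_e
```

The potential is steeply repulsive inside r0, has its minimum at r0, and decays toward zero at large separation, which is what lets a bond be considered "broken" once two secondary particles move apart by more than a few times 1/a.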

  2. Computational model for protein unfolding simulation

    NASA Astrophysics Data System (ADS)

    Tian, Xu-Hong; Zheng, Ye-Han; Jiao, Xiong; Liu, Cai-Xing; Chang, Shan

    2011-06-01

    The protein folding problem is one of the fundamental and important questions in molecular biology. However, all-atom molecular dynamics studies of protein folding and unfolding are still computationally expensive and severely limited by the time scale of simulation. In this paper, a simple and fast protein unfolding method is proposed based on conformational stability analyses and structure modeling. In this method, two structure-based conditions are considered to identify the unstable regions of proteins during the unfolding processes. The protein unfolding trajectories are mimicked through iterative structure modeling according to the conformational stability analyses. Two proteins, chymotrypsin inhibitor 2 (CI2) and the α-spectrin SH3 domain (SH3), were simulated by this method. Their unfolding pathways are consistent with previous molecular dynamics simulations. Furthermore, the transition states of the two proteins were identified in the unfolding processes, and the theoretical Φ values of these transition states showed significant correlations with the experimental data (correlation coefficients >0.8). The results indicate that this method is effective in studying protein unfolding. Moreover, we analyzed and discussed the influence of parameters on the unfolding simulation. This simple coarse-grained model may provide a general and fast approach for mechanism studies of protein folding.

  3. Computer simulation of metal-organic materials

    NASA Astrophysics Data System (ADS)

    Stern, Abraham C.

    Computer simulations of metal-organic frameworks are conducted to both investigate the mechanism of hydrogen sorption and to elucidate a detailed, molecular-level understanding of the physical interactions that can lead to successful material design strategies. To this end, important intermolecular interactions are identified and individually parameterized to yield a highly accurate representation of the potential energy landscape. Polarization, one such interaction found to play a significant role in H 2 sorption, is included explicitly for the first time in simulations of metal-organic frameworks. Permanent electrostatics are usually accounted for by means of an approximate fit to model compounds. The application of this method to simulations involving metal-organic frameworks introduces several substantial problems that are characterized in this work. To circumvent this, a method is developed and tested in which atomic point partial charges are computed more directly, fit to the fully periodic electrostatic potential. In this manner, long-range electrostatics are explicitly accounted for via Ewald summation. Grand canonical Monte Carlo simulations are conducted employing the force field parameterization developed here. Several of the major findings of this work are: Polarization is found to play a critical role in determining the overall structure of H2 sorbed in metal-organic frameworks, although not always the determining factor in uptake. The parameterization of atomic point charges by means of a fit to the periodic electrostatic potential is a robust, efficient method and consistently results in a reliable description of Coulombic interactions without introducing ambiguity associated with other procedures. After careful development of both hydrogen and framework potential energy functions, quantitatively accurate results have been obtained. 
Such predictive accuracy will aid greatly in the rational, iterative design cycle between experimental and theoretical
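The charge-fitting step can be sketched, in simplified non-periodic form, as a constrained least-squares problem: choose point charges that best reproduce sampled electrostatic potential (ESP) values while summing to the required net charge. The geometry below is illustrative only; a real framework calculation would build the design matrix from an Ewald sum over lattice images rather than bare 1/r.

```python
import numpy as np

def fit_charges(atom_xyz, grid_xyz, esp, total_charge=0.0):
    """Fit atomic point charges q to ESP samples by least squares,
    constraining sum(q) = total_charge with a Lagrange multiplier.
    Non-periodic sketch: the design matrix is the bare Coulomb 1/r
    (atomic units); a periodic code would use the Ewald-summed potential."""
    d = np.linalg.norm(grid_xyz[:, None, :] - atom_xyz[None, :, :], axis=2)
    A = 1.0 / d                      # potential at grid point m from unit charge on atom n
    n = atom_xyz.shape[0]
    # KKT system: [[2 A^T A, 1], [1^T, 0]] [q, lam] = [2 A^T esp, Q]
    kkt = np.zeros((n + 1, n + 1))
    kkt[:n, :n] = 2.0 * A.T @ A
    kkt[:n, n] = 1.0
    kkt[n, :n] = 1.0
    rhs = np.concatenate([2.0 * A.T @ esp, [total_charge]])
    return np.linalg.solve(kkt, rhs)[:n]
```

On a synthetic ESP generated from known charges, the fit recovers them exactly, a useful sanity check before applying the machinery to quantum-mechanically computed potentials.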

  4. Computer simulation of solder joint failure

    SciTech Connect

    Burchett, S.N.; Frear, D.R.; Rashid, M.M.

    1997-04-01

    The thermomechanical fatigue failure of solder joints is increasingly becoming an important reliability issue for electronic packages. The purpose of this Laboratory Directed Research and Development (LDRD) project was to develop computational tools for simulating the behavior of solder joints under strain and temperature cycling, taking into account the microstructural heterogeneities that exist in as-solidified near-eutectic Sn-Pb joints, as well as subsequent microstructural evolution. The authors present two computational constitutive models, a two-phase model and a single-phase model, that were developed to predict the behavior of near-eutectic Sn-Pb solder joints under fatigue conditions. Unique metallurgical tests provide the fundamental input for the constitutive relations. The two-phase model mathematically predicts the heterogeneous coarsening behavior of near-eutectic Sn-Pb solder. The finite element simulations with this model agree qualitatively with experimental thermomechanical fatigue tests. The simulations show that the presence of an initial heterogeneity in the solder microstructure could significantly degrade the fatigue lifetime. The single-phase model was developed to predict solder joint behavior using materials data for constitutive relation constants that could be determined through straightforward metallurgical experiments. Special thermomechanical fatigue tests were developed to give fundamental materials input to the models, and an in situ SEM thermomechanical fatigue test system was developed to characterize microstructural evolution and the mechanical behavior of solder joints during the test. A shear/torsion test sample was developed to impose strain in two different orientations. Materials constants were derived from these tests. The simulation results from the two-phase model showed good fit to the experimental test results.

  5. A Generic Scheduling Simulator for High Performance Parallel Computers

    SciTech Connect

    Yoo, B S; Choi, G S; Jette, M A

    2001-08-01

    It is well known that efficient job scheduling plays a crucial role in achieving high system utilization in large-scale high performance computing environments. A good scheduling algorithm should schedule jobs to achieve high system utilization while satisfying various user demands in an equitable fashion. Designing such a scheduling algorithm is a non-trivial task even in a static environment. In practice, the computing environment and workload are constantly changing. There are several reasons for this. First, computing platforms constantly evolve as technology advances. For example, the availability of relatively powerful commodity off-the-shelf (COTS) components at steadily diminishing prices has made it feasible to construct ever larger massively parallel computers in recent years [1, 4]. Second, the workload imposed on the system also changes constantly. Rapidly increasing compute resources have given many application developers the opportunity to radically alter program characteristics and take advantage of these additional resources. New developments in software technology may also trigger changes in user applications. Finally, changes in the political climate may alter user priorities or the mission of the organization. System designers in such dynamic environments must be able to accurately forecast the effect of changes in the hardware, software, and/or policies under consideration. If the environmental changes are significant, one must also reassess scheduling algorithms. Simulation has frequently been relied upon for this analysis, because other methods such as analytical modeling or actual measurements are usually too difficult or costly. A drawback of the simulation approach, however, is that developing a simulator is a time-consuming process. Furthermore, an existing simulator cannot be easily adapted to a new environment. In this research, we attempt to develop a generic job-scheduling simulator, which facilitates the evaluation of
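A minimal sketch of the kind of job-scheduling simulation being described: a strict first-come-first-served policy on a fixed pool of identical nodes, reporting makespan and utilization. The workload format and metrics are assumptions for illustration, not the actual interface of the simulator under development.

```python
import heapq

def simulate_fcfs(jobs, total_nodes):
    """Strict first-come-first-served scheduling on `total_nodes`
    identical nodes.  jobs: list of (submit_time, nodes, runtime).
    Returns (makespan, utilization) so that candidate policies or
    workloads can be compared quantitatively."""
    jobs = sorted(jobs)            # FCFS order by submit time
    running = []                   # min-heap of (finish_time, nodes)
    t, free, busy = 0.0, total_nodes, 0.0
    for submit, nodes, runtime in jobs:
        t = max(t, submit)
        # no backfilling: the job at the head of the queue waits
        # until enough nodes free up
        while nodes > free:
            finish, released = heapq.heappop(running)
            t = max(t, finish)
            free += released
        heapq.heappush(running, (t + runtime, nodes))
        free -= nodes
        busy += nodes * runtime
    if not running:
        return 0.0, 0.0
    makespan = max(f for f, _ in running)
    return makespan, busy / (total_nodes * makespan)
```

Swapping in a different dispatch rule (priority, backfill, fair share) while keeping the same event loop and metrics is exactly the kind of experiment a generic scheduling simulator is meant to support.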

  6. Computational performance of a smoothed particle hydrodynamics simulation for shared-memory parallel computing

    NASA Astrophysics Data System (ADS)

    Nishiura, Daisuke; Furuichi, Mikito; Sakaguchi, Hide

    2015-09-01

    The computational performance of a smoothed particle hydrodynamics (SPH) simulation is investigated for three types of current shared-memory parallel computer devices: many integrated core (MIC) processors, graphics processing units (GPUs), and multi-core CPUs. We are especially interested in efficient shared-memory allocation methods for each chipset, because the efficient data access patterns differ between compute unified device architecture (CUDA) programming for GPUs and OpenMP programming for MIC processors and multi-core CPUs. We first introduce several parallel implementation techniques for the SPH code, and then examine these on our target computer architectures to determine the most effective algorithms for each processor unit. In addition, we evaluate the effective computing performance and power efficiency of the SPH simulation on each architecture, as these are critical metrics for overall performance in a multi-device environment. In our benchmark test, the GPU is found to produce the best arithmetic performance as a standalone device unit, and gives the most efficient power consumption. The multi-core CPU obtains the most effective computing performance. The computational speed of the MIC processor on Xeon Phi approached that of two Xeon CPUs. This indicates that using MICs is an attractive choice for existing SPH codes on multi-core CPUs parallelized by OpenMP, as it gains computational acceleration without the need for significant changes to the source code.
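The per-particle density summation that such SPH codes parallelize (over CUDA threads on GPUs or OpenMP loop iterations on MICs and CPUs) can be sketched in serial 1-D form. The cubic-spline kernel is a standard SPH choice; the particle spacing and smoothing length below are arbitrary, and a production code would use a neighbor list instead of the all-pairs sum.

```python
def w_cubic(r, h):
    """Cubic-spline SPH smoothing kernel (1-D normalization 2/(3h)),
    with compact support of radius 2h."""
    q = r / h
    sigma = 2.0 / (3.0 * h)
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q)**3
    return 0.0

def densities(x, mass, h):
    """Density at each particle: rho_i = sum_j m * W(|x_i - x_j|, h).
    The outer loop over i is the one distributed across threads in a
    shared-memory parallel implementation."""
    return [sum(mass * w_cubic(abs(xi - xj), h) for xj in x) for xi in x]
```

Because each output density depends only on read-only particle positions, the outer loop has no write conflicts, which is what makes this kernel so amenable to OpenMP and CUDA parallelization.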

  7. Computational Fluid Dynamics (CFD) simulation of the Madison Dynamo Experiment.

    NASA Astrophysics Data System (ADS)

    Haehn, N. S.; Forest, C. B.; Weber, C. R.; Kendrick, R. D.; Taylor, N. Z.; Oakley, J. G.; Bonazza, R.; Spence, Erik

    2007-11-01

    The Madison Dynamo Experiment is designed to study a self-generated magnetic field, called a dynamo. The flow characteristics of a water experiment that is dimensionally similar to the liquid sodium experiment have been modeled using the Computational Fluid Dynamics (CFD) software Fluent. Results from the CFD simulations are used to confirm flow characteristics measured experimentally by both Laser Doppler Velocimetry (LDV) and Particle Imaging Velocimetry (PIV). Simulations can also give insight into the flow characteristics in regions of the experiment that are not accessible to the LDV and PIV systems. The results from the simulations are also used as input for an MHD code to predict the threshold for dynamo onset. The CFD simulations - in conjunction with the MHD dynamo prediction code - can be used to design modifications to the experiment to minimize costly changes. The CFD code has shown that the addition of an equatorial baffle along with several poloidal baffles can lower the threshold for dynamo onset.

  8. Computer Simulations in Science Education: Implications for Distance Education

    ERIC Educational Resources Information Center

    Sahin, Sami

    2006-01-01

    This paper is a review of literature about the use of computer simulations in science education. This review examines types and examples of computer simulations. The literature review indicated that although computer simulations cannot replace science classroom and laboratory activities completely, they offer various advantages both for classroom…

  9. The Learning Effects of Computer Simulations in Science Education

    ERIC Educational Resources Information Center

    Rutten, Nico; van Joolingen, Wouter R.; van der Veen, Jan T.

    2012-01-01

    This article reviews the (quasi)experimental research of the past decade on the learning effects of computer simulations in science education. The focus is on two questions: how use of computer simulations can enhance traditional education, and how computer simulations are best used in order to improve learning processes and outcomes. We report on…

  10. A Computational Framework for Bioimaging Simulation

    PubMed Central

    Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in living cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and at understanding how higher-level functions emerge from the combined action of biomolecules. However, formidable challenges remain in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are present in all bioimaging systems and hinder quantitative comparison between cell models and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508

  11. Neural network computer simulation of medical aerosols.

    PubMed

    Richardson, C J; Barlow, D J

    1996-06-01

    Preliminary investigations have been conducted to assess the potential for using artificial neural networks to simulate aerosol behaviour, with a view to employing this type of methodology in the evaluation and design of pulmonary drug-delivery systems. Details are presented of the general-purpose software developed for these tasks; it implements a feed-forward back-propagation algorithm with weight decay and connection pruning, the user having complete run-time control of the network architecture and mode of training. A series of exploratory investigations is then reported in which different network structures and training strategies are assessed in terms of their ability to simulate known patterns of fluid flow in simple model systems. The first of these involves simulations of cellular automata-generated data for fluid flow through a partially obstructed two-dimensional pipe. The artificial neural networks are shown to be highly successful in simulating the behaviour of this simple linear system, but with important provisos relating to the information content of the training data and the criteria used to judge when the network is properly trained. A second set of investigations is then reported in which similar networks are used to simulate patterns of fluid flow through aerosol generation devices, using training data furnished through rigorous computational fluid dynamics modelling. These more complex three-dimensional systems are modelled with equal success. It is concluded that carefully tailored, well-trained networks could provide valuable tools not just for predicting but also for analysing the spatial dynamics of pharmaceutical aerosols. PMID:8832491
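A toy sketch of the core scheme the software implements, feed-forward training by back-propagation with L2 weight decay. The architecture, learning rate, and decay constant here are arbitrary choices, the connection-pruning feature is omitted, and the XOR task in the usage is a stand-in for the fluid-flow training data, not anything from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def train(x, y, hidden=8, lr=0.1, decay=1e-4, epochs=3000):
    """One-hidden-layer (tanh) feed-forward network trained by full-batch
    back-propagation on mean squared error, with L2 weight decay added
    to each gradient (no biases, for brevity)."""
    w1 = rng.normal(0.0, 0.5, (x.shape[1], hidden))
    w2 = rng.normal(0.0, 0.5, (hidden, y.shape[1]))
    for _ in range(epochs):
        h = np.tanh(x @ w1)                 # hidden activations
        err = h @ w2 - y                    # linear output layer
        g2 = h.T @ err / len(x) + decay * w2
        g1 = x.T @ ((err @ w2.T) * (1.0 - h**2)) / len(x) + decay * w1
        w1 -= lr * g1
        w2 -= lr * g2
    return w1, w2

def predict(x, w1, w2):
    return np.tanh(x @ w1) @ w2

# usage sketch on a stand-in nonlinear task (XOR)
x = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([[0.0], [1.0], [1.0], [0.0]])
w1, w2 = train(x, y)
```

The decay term shrinks weights toward zero each step, which is what makes subsequent pruning of near-zero connections (as in the paper's software) meaningful.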

  12. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure fundamental to developing such methods - multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties - is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  13. Computational simulation of concurrent engineering for aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    Results are summarized of an investigation to assess the infrastructure available and the technology readiness to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure fundamental to developing such methods - multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties - is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  14. Ku-Band rendezvous radar performance computer simulation model

    NASA Technical Reports Server (NTRS)

    Magnusson, H. G.; Goff, M. F.

    1984-01-01

    All work performed on the Ku-band rendezvous radar performance computer simulation model program since the release of the preliminary final report is summarized. Developments on the program fall into three distinct categories: (1) modifications to the existing Ku-band radar tracking performance computer model; (2) the addition of a highly accurate, nonrealtime search and acquisition performance computer model to the total software package developed on this program; and (3) development of radar cross section (RCS) computation models for three additional satellites. All changes in the tracking model involved improvements in the automatic gain control (AGC) and the radar signal strength (RSS) computer models. Although the search and acquisition computer models were developed under the auspices of the Hughes Aircraft Company Ku-Band Integrated Radar and Communications Subsystem program office, they have been supplied to NASA as part of the Ku-band radar performance computer model package. Their purpose is to predict Ku-band acquisition performance for specific satellite targets on specific missions. The RCS models were developed for three satellites: the Long Duration Exposure Facility (LDEF) spacecraft, the Solar Maximum Mission (SMM) spacecraft, and the Space Telescope.

  15. Computer simulation of surface and film processes

    NASA Technical Reports Server (NTRS)

    Tiller, W. A.; Halicioglu, M. T.

    1984-01-01

    All of the investigations performed employed, in one way or another, a computer simulation technique based on atomistic-level considerations. In general, three types of simulation methods were used for modeling systems of discrete particles that interact via well-defined potential functions: molecular dynamics (a general method for solving the classical equations of motion of a model system); Monte Carlo (the use of a Markov-chain ensemble-averaging technique to model equilibrium properties of a system); and molecular statics (which provides properties of a system at T = 0 K). The effects of three-body forces on the vibrational frequencies of triatomic clusters were investigated. The multilayer relaxation phenomena for low-index planes of an fcc crystal were also analyzed as a function of the three-body interactions. Various surface properties for the Si and SiC systems were calculated. Results obtained from static simulation calculations of slip formation were presented. The more elaborate molecular dynamics calculations on the propagation of cracks in two-dimensional systems were outlined.

  16. Computer simulation of fatigue under diametrical compression

    SciTech Connect

    Carmona, H. A.; Kun, F.; Andrade, J. S. Jr.; Herrmann, H. J.

    2007-04-15

    We study the fatigue fracture of disordered materials by means of computer simulations of a discrete element model. We extend a two-dimensional fracture model to capture the microscopic mechanisms relevant for fatigue, and we simulate the diametric compression of a disc-shaped specimen under a constant external force. The model allows us to follow the development of the fracture process at the macro- and microlevels, varying the relative influence of the mechanisms of damage accumulation over the load history and healing of microcracks. As a specific example we consider recent experimental results on the fatigue fracture of asphalt. Our numerical simulations show that for intermediate applied loads the lifetime of the specimen exhibits power-law behavior. Under the effect of healing, which is more prominent for loads small compared to the tensile strength of the material, the lifetime of the sample increases and a fatigue limit emerges below which no macroscopic failure occurs. The numerical results are in good qualitative agreement with the experimental findings.
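The reported power-law dependence of lifetime on load can be characterized by a log-log least-squares fit, a generic Basquin-type analysis rather than the authors' code; the load values and exponent in the test data below are synthetic.

```python
import math

def basquin_fit(loads, lifetimes):
    """Least-squares fit of log N = log C - b log F, i.e. the power law
    N = C * F**(-b) for lifetime N versus applied load F.  Returns
    (C, b); a well-defined b only holds in the intermediate-load regime,
    below which a fatigue limit (diverging N) appears."""
    xs = [math.log(f) for f in loads]
    ys = [math.log(n) for n in lifetimes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return math.exp(my - slope * mx), -slope
```

Fitting in log space weights the decades of lifetime equally, which is the standard way such fatigue-life power laws are extracted from simulation or experiment.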

  17. Additional extensions to the NASCAP computer code, volume 3

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Cooke, D. L.

    1981-01-01

    The ION computer code is designed to calculate charge exchange ion densities, electric potentials, plasma temperatures, and current densities external to a neutralized ion engine in R-Z geometry. The present version assumes the beam ion current and density to be known and specified, and the neutralizing electrons to originate from a hot-wire ring surrounding the beam orifice. The plasma is treated as being resistive, with an electron relaxation time comparable to the plasma frequency. Together with the thermal and electrical boundary conditions described below and other straightforward engine parameters, these assumptions suffice to determine the required quantities. The ION code, written in ASCII FORTRAN for UNIVAC 1100 series computers, is designed to be run interactively, although it can also be run in batch mode. The input is free-format, and the output is mainly graphical, using the machine-independent graphics developed for the NASCAP code. The executive routine calls the code's major subroutines in user-specified order, and the code allows great latitude for restart and parameter change.

  18. Adv. Simulation for Additive Manufacturing: 11/2014 Wkshp. Report for U.S. DOE/EERE/AMO

    SciTech Connect

    Turner, John A.; Babu, Sudarsanam Suresh; Blue, Craig A.

    2015-07-01

    The overarching question for the workshop was as follows: How do we best utilize advanced modeling and high-performance computing (HPC) to address key challenges and opportunities in order to realize the full potential of additive manufacturing? What are the key challenges of additive manufacturing to which modeling and simulation can contribute solutions, and what will it take to meet those challenges?

  19. Additional extensions to the NASCAP computer code, volume 1

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Katz, I.; Stannard, P. R.

    1981-01-01

    Extensions and revisions to a computer code that comprehensively analyzes problems of spacecraft charging (NASCAP) are documented. Using a fully three dimensional approach, it can accurately predict spacecraft potentials under a variety of conditions. Among the extensions are a multiple electron/ion gun test tank capability, and the ability to model anisotropic and time dependent space environments. Also documented are a greatly extended MATCHG program and the preliminary version of NASCAP/LEO. The interactive MATCHG code was developed into an extremely powerful tool for the study of material-environment interactions. The NASCAP/LEO, a three dimensional code to study current collection under conditions of high voltages and short Debye lengths, was distributed for preliminary testing.

  20. Chip level simulation of fault tolerant computers

    NASA Technical Reports Server (NTRS)

    Armstrong, J. R.

    1983-01-01

    Chip level modeling techniques, functional fault simulation, simulation software development, a more efficient, high level version of GSP, and a parallel architecture for functional simulation are discussed.

  1. Simulation of Powder Layer Deposition in Additive Manufacturing Processes Using the Discrete Element Method

    SciTech Connect

    Herbold, E. B.; Walton, O.; Homel, M. A.

    2015-10-26

    This document serves as a final report on a small effort in which several improvements were added to the LLNL code GEODYN-L to develop Discrete Element Method (DEM) algorithms coupled to Lagrangian Finite Element (FE) solvers to investigate powder-bed formation problems for additive manufacturing. The results from these simulations will be assessed for inclusion as the initial conditions for Direct Metal Laser Sintering (DMLS) simulations performed with ALE3D. The algorithms were written and run on parallel computing platforms at LLNL. The total funding level was 3-4 weeks of an FTE split between two staff scientists and one post-doc. The DEM simulations emulated, as far as was feasible, the physical process of depositing a new layer of powder over a bed of existing powder. The DEM simulations used truncated size distributions spanning realistic size ranges, with a size distribution profile consistent with a realistic sample set. A minimum simulation sample size on the order of 40 particles square by 10 particles deep was used in these scoping studies to evaluate the potential effects of size-segregation variation with distance displaced in front of a screed blade. A reasonable method for evaluating the problem was developed and validated, and several simulations were performed to show the viability of the approach. Future investigations will focus on simulations of various powder particle sizings and screed geometries.
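    The normal-contact law at the core of most DEM codes is a spring-dashpot; a minimal one-particle sketch (all stiffness, damping, and particle parameters hypothetical, unrelated to GEODYN-L's actual material models) might look like:

```python
# 1-D spring-dashpot normal contact, the basic DEM force law: a linear
# spring on the overlap plus viscous damping on the approach velocity.
# A single particle is dropped onto a floor at x = 0 and allowed to settle.

def contact_force(overlap, approach_vel, k=1e4, c=5.0):
    """Linear spring-dashpot normal force; zero when out of contact."""
    if overlap <= 0.0:
        return 0.0
    return k * overlap + c * approach_vel

def settle(x=0.02, v=0.0, radius=0.01, mass=1e-3, g=9.81,
           dt=1e-4, steps=20000):
    """Semi-implicit Euler integration of one particle falling onto a floor."""
    for _ in range(steps):
        overlap = radius - x                       # penetration depth
        f = contact_force(overlap, -v) - mass * g  # contact plus gravity
        v += dt * f / mass
        x += dt * v
    return x, v
```

At rest the spring balances gravity, so the particle settles at an overlap of mg/k below its radius; a powder-layer simulation repeats this force evaluation over every contacting pair.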

  2. Space radiator simulation manual for computer code

    NASA Technical Reports Server (NTRS)

    Black, W. Z.; Wulff, W.

    1972-01-01

    A computer program that simulates the performance of a space radiator is presented. The program basically consists of a rigorous analysis, which analyzes a symmetrical fin panel, and an approximate analysis that predicts system characteristics for cases of non-symmetrical operation. The rigorous analysis accounts for both transient and steady state performance, including aerodynamic and radiant heating of the radiator system. The approximate analysis considers only steady state operation with no aerodynamic heating. A description of the radiator system and instructions to the user for program operation are included. The input required for the execution of all program options is described, and several examples of program output are included; the sample output covers radiator performance during ascent, reentry, and orbit.
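    The steady-state side of such a fin analysis reduces to a nonlinear conduction-radiation balance, k t T'' = eps sigma T^4; a minimal finite-difference sketch with hypothetical material properties:

```python
# Steady temperature profile of a radiating fin (one-sided radiation,
# isothermal base, adiabatic tip), solved by Gauss-Seidel iteration on
# k*thick*T'' = emiss*sigma*T^4. All properties are illustrative only.
SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W/(m^2 K^4)

def fin_profile(t_base=400.0, length=0.1, k=200.0, thick=1e-3,
                emiss=0.8, n=21, sweeps=5000):
    """Return the nodal temperatures T[0..n-1] along the fin (K)."""
    dx = length / (n - 1)
    beta = emiss * SIGMA / (k * thick)
    T = [t_base] * n
    for _ in range(sweeps):
        for i in range(1, n - 1):
            # discretized T'' = beta*T^4, lagging the nonlinear term
            T[i] = 0.5 * (T[i-1] + T[i+1] - dx * dx * beta * T[i] ** 4)
        T[-1] = T[-2]                 # adiabatic (insulated) tip
    return T
```

The profile decreases monotonically from the base toward the tip, with the drop set by the ratio of radiative loss to conductive supply.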

  3. Computational simulation of hot composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Singhal, S. N.

    1991-01-01

    Three different computer codes developed in-house are described for application to hot composite structures. These codes include capabilities for: (1) laminate behavior (METCAN); (2) thermal/structural analysis of hot structures made from high temperature metal matrix composites (HITCAN); and (3) laminate tailoring (MMLT). Results for select sample cases are described to demonstrate the versatility as well as the application of these codes to specific situations. The sample case results show that METCAN can be used to simulate cyclic life in high temperature metal matrix composites; HITCAN can be used to evaluate the structural performance of curved panels as well as respective sensitivities of various nonlinearities, and MMLT can be used to tailor the fabrication process in order to reduce residual stresses in the matrix upon cool-down.

  4. Computational simulation of hot composites structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Murthy, P. L. N.; Singhal, S. N.

    1991-01-01

    Three different computer codes developed in-house are described for application to hot composite structures. These codes include capabilities for: (1) laminate behavior (METCAN); (2) thermal/structural analysis of hot structures made from high temperature metal matrix composites (HITCAN); and (3) laminate tailoring (MMLT). Results for select sample cases are described to demonstrate the versatility as well as the application of these codes to specific situations. The sample case results show that METCAN can be used to simulate cyclic life in high temperature metal matrix composites; HITCAN can be used to evaluate the structural performance of curved panels as well as respective sensitivities of various nonlinearities, and MMLT can be used to tailor the fabrication process in order to reduce residual stresses in the matrix upon cool-down.

  5. Miller experiments in atomistic computer simulations

    PubMed Central

    Saitta, Antonino Marco; Saija, Franz

    2014-01-01

    The celebrated Miller experiments reported on the spontaneous formation of amino acids from a mixture of simple molecules reacting under an electric discharge, giving birth to the research field of prebiotic chemistry. However, the chemical reactions involved in those experiments have never been studied at the atomic level. Here we report on, to our knowledge, the first ab initio computer simulations of Miller-like experiments in the condensed phase. Our study, based on the recent method of treatment of aqueous systems under electric fields and on metadynamics analysis of chemical reactions, shows that glycine spontaneously forms from mixtures of simple molecules once an electric field is switched on and identifies formic acid and formamide as key intermediate products of the early steps of the Miller reactions, and the crucible of formation of complex biological molecules. PMID:25201948

  6. A Mass Spectrometer Simulator in Your Computer

    NASA Astrophysics Data System (ADS)

    Gagnon, Michel

    2012-12-01

    Introduced to study components of ionized gas, the mass spectrometer has evolved into a highly accurate device now used in many undergraduate and research laboratories. Unfortunately, despite their importance in the formation of future scientists, mass spectrometers remain beyond the financial reach of many high schools and colleges. As a result, it is not possible for instructors to take full advantage of this equipment. Therefore, to facilitate accessibility to this tool, we have developed a realistic computer-based simulator. Using this software, students are able to practice their ability to identify the components of the original gas, thereby gaining a better understanding of the underlying physical laws. The software is available as a free download.
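    The physics such a simulator rests on is compact: acceleration through a potential (qV = mv^2/2) followed by magnetic deflection (r = mv/qB), so the mass-to-charge ratio can be recovered from the measured radius as m/q = B^2 r^2 / (2V). A sketch with illustrative instrument settings:

```python
# Forward and inverse sector-field mass spectrometer relations. The
# accelerating voltage and field strength are hypothetical settings,
# not values from the simulator described above.
import math

E_CHARGE = 1.602176634e-19   # elementary charge, C
AMU = 1.66053906660e-27      # atomic mass unit, kg

def radius(mass_amu, charge=1, accel_volts=2000.0, b_field=0.5):
    """Deflection radius (m) of an ion accelerated through accel_volts."""
    m = mass_amu * AMU
    q = charge * E_CHARGE
    v = math.sqrt(2.0 * q * accel_volts / m)   # qV = m v^2 / 2
    return m * v / (q * b_field)               # r = m v / (q B)

def mass_from_radius(r, charge=1, accel_volts=2000.0, b_field=0.5):
    """Invert the deflection to recover the ion mass in amu."""
    q = charge * E_CHARGE
    return b_field ** 2 * r ** 2 * q / (2.0 * accel_volts) / AMU
```

Identifying a gas component, as the students do, amounts to running the inverse relation on each detected radius.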

  7. Computer-based simulator for catheter insertion training.

    PubMed

    Aloisio, Giovanni; Barone, Luigi; Bergamasco, Massimo; Avizzano, Carlo Alberto; De Paolis, Lucio Tommaso; Franceschini, Marco; Mongelli, Antonio; Pantile, Gianluca; Provenzano, Luciana; Raspolli, Mirko

    2004-01-01

    Minimally invasive surgery procedures are becoming common in surgical practice; however, these interventional procedures require different skills than conventional surgical techniques. A training process is therefore very important for executing a surgical procedure successfully and safely. Computer-based simulators, with an appropriate tactile feedback device, can be an efficient means of facilitating education and training. In addition, virtual reality surgical simulators can reduce the costs of education and provide realism with regard to tissue behaviour and real-time interaction. This work takes into account the results of the HERMES Project (HEmatology Research virtual MEdical System), conceived and managed by Consorzio CETMA-Research Centre; the aim of this project is to build an integrated system to simulate a coronary angioplasty intervention. PMID:15544228

  8. Experiential Learning through Computer-Based Simulations.

    ERIC Educational Resources Information Center

    Maynes, Bill; And Others

    1992-01-01

    Describes experiential learning instructional model and simulation for student principals. Describes interactive laser videodisc simulation. Reports preliminary findings about student principal learning from simulation. Examines learning approaches by unsuccessful and successful students and learning levels of model learners. Simulation's success…

  9. Simulating heat addition via mass addition in constant area compressible flows

    NASA Astrophysics Data System (ADS)

    Heiser, W. H.; McClure, W. B.; Wood, C. W.

    1995-01-01

    The study demonstrated the striking similarity between the influence of heat addition and of mass addition on compressible flows. These results encourage the belief that relatively modest laboratory experiments employing mass addition can be devised that will reproduce the leading phenomena of heat addition, such as the axial variation of properties, choking, and wall-boundary-layer separation. They suggest that some aspects of the complex behavior of dual-mode ramjet/scramjet combustors could be experimentally evaluated or demonstrated by replacing combustion with less expensive, more easily controlled, and safer mass addition.
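    The heat-addition side of the analogy is classical Rayleigh flow; a one-line sketch of the stagnation-temperature ratio shows why both heat and mass addition drive a constant-area flow toward choking at M = 1:

```python
# Rayleigh-flow stagnation-temperature ratio T0/T0* for a calorically
# perfect gas. The ratio peaks at exactly 1 at M = 1, so adding heat
# (raising T0 toward T0*) pushes the flow toward the sonic condition
# from either side -- the choking behavior cited in the abstract.

def t0_ratio(mach, gamma=1.4):
    """Rayleigh flow T0/T0* at Mach number `mach`."""
    m2 = mach * mach
    num = (gamma + 1.0) * m2 * (2.0 + (gamma - 1.0) * m2)
    return num / (1.0 + gamma * m2) ** 2
```

The same qualitative driving-toward-M=1 behavior appears in constant-area mass addition, which is the basis of the proposed experimental substitution.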

  10. Engineering Fracking Fluids with Computer Simulation

    NASA Astrophysics Data System (ADS)

    Shaqfeh, Eric

    2015-11-01

    There are no comprehensive simulation-based tools for engineering the flows of viscoelastic fluid-particle suspensions in fully three-dimensional geometries. On the other hand, the need for such a tool in engineering applications is immense. Suspensions of rigid particles in viscoelastic fluids play key roles in many energy applications. For example, in oil drilling the ``drilling mud'' is a very viscous, viscoelastic fluid designed to shear-thin during drilling, but thicken at stoppage so that the ``cuttings'' can remain suspended. In a related application known as hydraulic fracturing suspensions of solids called ``proppant'' are used to prop open the fracture by pumping them into the well. It is well-known that particle flow and settling in a viscoelastic fluid can be quite different from that which is observed in Newtonian fluids. First, it is now well known that the ``fluid particle split'' at bifurcation cracks is controlled by fluid rheology in a manner that is not understood. Second, in Newtonian fluids, the presence of an imposed shear flow in the direction perpendicular to gravity (which we term a cross or orthogonal shear flow) has no effect on the settling of a spherical particle in Stokes flow (i.e. at vanishingly small Reynolds number). By contrast, in a non-Newtonian liquid, the complex rheological properties induce a nonlinear coupling between the sedimentation and shear flow. Recent experimental data have shown both the shear thinning and the elasticity of the suspending polymeric solutions significantly affects the fluid-particle split at bifurcations, as well as the settling rate of the solids. In the present work, we use the Immersed Boundary Method to develop computer simulations of viscoelastic flow in suspensions of spheres to study these problems. These simulations allow us to understand the detailed physical mechanisms for the remarkable physical behavior seen in practice, and actually suggest design rules for creating new fluid recipes.

  11. COMPUTER MODEL AND SIMULATION OF A GLOVE BOX PROCESS

    SciTech Connect

    C. FOSTER; ET AL

    2001-01-01

    The development of facilities to deal with the disposition of nuclear materials at an acceptable level of Occupational Radiation Exposure (ORE) is a significant issue facing the nuclear community. One solution is to minimize the worker's exposure through the use of automated systems. However, the adoption of automated systems for these tasks is hampered by the challenging requirements that these systems must meet in order to be cost-effective solutions in the hazardous nuclear materials processing environment. Retrofitting current glove box technologies with automation systems represents a potential near-term technology that can be applied to reduce worker ORE associated with work in nuclear materials processing facilities. Successful deployment of automation systems for these applications requires the development of testing and deployment strategies to ensure the highest level of safety and effectiveness. Historically, safety tests are conducted with glove box mock-ups around the finished design. This late detection of problems leads to expensive redesigns and costly deployment delays. With the widespread availability of computers and cost-effective simulation software, it is possible to discover and fix problems early in the design stages. Computer simulators can easily create a complete model of the system, providing a safe medium for testing potential failures and design shortcomings. The majority of design specification is now done on computer, and moving that information to a model is relatively straightforward. With a complete model and results from a Failure Mode Effect Analysis (FMEA), redesigns can be worked early. Additional issues such as user accessibility, component replacement, and alignment problems can be tackled early in the virtual environment provided by computer simulation. In this case, a commercial simulation package is used to simulate a lathe process operation at the Los Alamos National Laboratory (LANL). The lathe process operation is indicative of

  12. Duality quantum computer and the efficient quantum simulations

    NASA Astrophysics Data System (ADS)

    Wei, Shi-Jie; Long, Gui-Lu

    2016-03-01

    Duality quantum computing is a new mode of a quantum computer to simulate a moving quantum computer passing through a multi-slit. It exploits the particle wave duality property for computing. A quantum computer with n qubits and a qudit simulates a moving quantum computer with n qubits passing through a d-slit. Duality quantum computing can realize an arbitrary sum of unitaries and therefore a general quantum operator, which is called a generalized quantum gate. All linear bounded operators can be realized by the generalized quantum gates, and unitary operators are just the extreme points of the set of generalized quantum gates. Duality quantum computing provides flexibility and a clear physical picture in designing quantum algorithms, and serves as a powerful bridge between quantum and classical algorithms. In this paper, after a brief review of the theory of duality quantum computing, we will concentrate on the applications of duality quantum computing in simulations of Hamiltonian systems. We will show that duality quantum computing can efficiently simulate quantum systems by providing descriptions of the recent efficient quantum simulation algorithm of Childs and Wiebe (Quantum Inf Comput 12(11-12):901-924, 2012) for the fast simulation of quantum systems with a sparse Hamiltonian, and the quantum simulation algorithm by Berry et al. (Phys Rev Lett 114:090502, 2015), which provides exponential improvement in precision for simulating systems with a sparse Hamiltonian.
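    The "arbitrary sum of unitaries" idea can be illustrated classically: applying A = sum_k w_k U_k to a state vector yields a generally non-unitary map whose norm relates to the protocol's success probability. A toy 2x2 sketch in plain Python (no quantum library; the example operators are just the identity and Pauli-X):

```python
# Classical illustration of a "generalized quantum gate": a weighted sum
# of unitaries applied to a state vector. In duality quantum computing
# this map is realized probabilistically; here we only compute the
# resulting (unnormalized) vector and its squared norm.

def matvec(m, v):
    """Multiply a small matrix (list of rows) by a vector."""
    return [sum(m[i][j] * v[j] for j in range(len(v))) for i in range(len(m))]

def lcu_apply(unitaries, weights, state):
    """Apply sum_k w_k U_k to `state`; return (vector, squared norm)."""
    out = [0j] * len(state)
    for w, u in zip(weights, unitaries):
        uv = matvec(u, state)
        out = [a + w * b for a, b in zip(out, uv)]
    norm2 = sum(abs(a) ** 2 for a in out)
    return out, norm2

# example: A = (I + X)/2 acting on |0> gives the unnormalized state
# (|0> + |1>)/2, with squared norm 1/2
I = [[1, 0], [0, 1]]
X = [[0, 1], [1, 0]]
vec, p = lcu_apply([I, X], [0.5, 0.5], [1, 0])
```

A is non-unitary here (its norm is less than one), which is exactly why, on a physical duality quantum computer, the operation succeeds only with some probability.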

  13. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    SciTech Connect

    Mike Bockelie; Dave Swensen; Martin Denison; Zumao Chen; Temi Linjewile; Mike Maguire; Adel Sarofim; Connie Senior; Changguan Yang; Hong-Shig Shim

    2004-04-28

    This is the fourteenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a Virtual Engineering-based framework for simulating the performance of Advanced Power Systems. Within the last quarter, good progress has been made on all aspects of the project. Software development efforts have focused primarily on completing a prototype detachable user interface for the framework and on integrating Carnegie Mellon University's IECM model core with the computational engine. In addition to this work, progress has been made on several other development and modeling tasks for the program. These include: (1) improvements to the infrastructure code of the computational engine, (2) enhancements to the model interfacing specifications, (3) additional development to increase the robustness of all framework components, (4) enhanced coupling of the computational and visualization engine components, (5) a series of detailed simulations studying the effects of gasifier inlet conditions on the heat flux to the gasifier injector, and (6) detailed plans for implementing models for mercury capture for both warm and cold gas cleanup.

  14. Computer-aided simulation study of photomultiplier tubes

    NASA Technical Reports Server (NTRS)

    Zaghloul, Mona E.; Rhee, Do Jun

    1989-01-01

    A computer model that simulates the response of photomultiplier tubes (PMTs) and the associated voltage divider circuit is developed. An equivalent circuit that approximates the operation of the device is derived and then used to develop a computer simulation of the PMT. Simulation results are presented and discussed.
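    The heart of such a PMT model is the multiplier chain: each dynode's secondary-emission factor grows with the inter-dynode voltage set by the divider. A sketch with hypothetical emission constants (k and alpha below are illustrative material parameters, not values from the paper's equivalent circuit):

```python
# Overall gain of a PMT with an equal-split resistive voltage divider:
# each of `stages` dynodes multiplies the electron count by a secondary
# emission factor delta = k * V_stage**alpha, so the gain is delta**stages.

def pmt_gain(total_volts=1000.0, stages=10, k=0.16, alpha=0.75):
    """Electron gain of the full dynode chain (dimensionless)."""
    v_stage = total_volts / stages      # equal resistor divider
    delta = k * v_stage ** alpha        # secondary emission per dynode
    return delta ** stages
```

Because the gain is delta raised to the number of stages, small changes in supply voltage produce large gain swings, which is the main behavior a divider-circuit simulation has to capture.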

  15. Computer simulation of vasectomy for wolf control

    USGS Publications Warehouse

    Haight, R.G.; Mech, L.D.

    1997-01-01

    Recovering gray wolf (Canis lupus) populations in the Lake Superior region of the United States are prompting state management agencies to consider strategies to control population growth. In addition to wolf removal, vasectomy has been proposed. To predict the population effects of different sterilization and removal strategies, we developed a simulation model of wolf dynamics using simple rules for demography and dispersal. Simulations suggested that the effects of vasectomy and removal in a disjunct population depend largely on the degree of annual immigration. With low immigration, periodic sterilization reduced pup production and resulted in lower rates of territory recolonization. Consequently, average pack size, number of packs, and population size were significantly less than those for an untreated population. Periodically removing a proportion of the population produced roughly the same trends as did sterilization; however, more than twice as many wolves had to be removed than sterilized. With high immigration, periodic sterilization reduced pup production but not territory recolonization and produced only moderate reductions in population size relative to an untreated population. Similar reductions in population size were obtained by periodically removing large numbers of wolves. Our analysis does not address the possible effects of vasectomy on larger wolf populations, but it suggests that the subject should be considered through modeling or field testing.
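    A drastically simplified, deterministic version of the dynamics described (all demographic rates hypothetical, not the paper's calibrated values) already reproduces the qualitative sterilization and immigration effects:

```python
# Toy wolf-population recursion: each pack's breeders produce pups unless
# sterilized, a fixed fraction of animals survive each year, and a constant
# number of immigrants arrives. Rates are illustrative placeholders only.

def simulate(years=20, packs=20, pups_per_pack=4.0, survival=0.7,
             immigration=2.0, sterilized_frac=0.0):
    """Return the population size after `years` annual time steps."""
    pop = packs * 5.0                       # initial pack size of 5
    for _ in range(years):
        births = packs * pups_per_pack * (1.0 - sterilized_frac)
        pop = pop * survival + births + immigration
    return pop
```

As in the paper's simulations, sterilizing half the breeding pairs lowers the equilibrium population, and a larger immigration term blunts the effect of any treatment.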

  16. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  17. Computer simulation and the features of novel empirical data.

    PubMed

    Lusk, Greg

    2016-04-01

    In an attempt to determine the epistemic status of computer simulation results, philosophers of science have recently explored the similarities and differences between computer simulations and experiments. One question that arises is whether and, if so, when, simulation results constitute novel empirical data. It is often supposed that computer simulation results could never be empirical or novel because simulations never interact with their targets, and cannot go beyond their programming. This paper argues against this position by examining whether, and under what conditions, the features of empiricality and novelty could be displayed by computer simulation data. I show that, to the extent that certain familiar measurement results have these features, so can some computer simulation results. PMID:27083094

  18. Advanced ERS design using computer simulation

    SciTech Connect

    Melhem, G.A.

    1995-12-31

    There are two schools of thought regarding pressure relief design: shortcut/simplified methods and detailed methods. The shortcut/simplified methods are mostly applicable to non-reactive systems. These methods use direct scale-up techniques to obtain a vent size. Little useful information can be obtained about reaction data such as onset temperatures, activation energy, decomposition stoichiometry, etc. In addition, this approach does not readily provide the ability to perform what-if and sensitivity analysis, or data that can be used for post-release mitigation design. The detailed approach advocates a more fundamental approach to pressure relief design, especially for reactive systems. First, the reaction chemistry is qualified using small-scale experiments, and then these data are coupled with fluid dynamics to design the emergency relief system. In addition to vent sizing information, this approach provides insights into process modification and refinement as well as the establishment of a safe operating envelope. This approach provides the necessary flow data for vent containment design (if required), structural support, etc. It also allows the direct evaluation of design sensitivity to variables such as temperature, pressure, composition, fill level, etc. on vent sizing, while the shortcut approach requires an additional experiment per what-if scenario. This approach meets DIERS technology requirements for two-phase flow and vapor/liquid disengagement and exceeds them in many key areas for reacting systems, such as stoichiometry estimation for decomposition reactions, non-ideal solution effects, continuing reactions in piping and vent containment systems, etc. This paper provides an overview of our proposed equation-of-state-based modeling approach and its computer code implementation. Numerous examples and model validations are also described. 42 refs., 23 figs., 9 tabs.

  19. Computer simulation of FCC riser reactors.

    SciTech Connect

    Chang, S. L.; Golchert, B.; Lottes, S. A.; Petrick, M.; Zhou, C. Q.

    1999-04-20

    A three-dimensional computational fluid dynamics (CFD) code, ICRKFLO, was developed to simulate the multiphase reacting flow system in a fluid catalytic cracking (FCC) riser reactor. The code solves flow properties based on fundamental conservation laws of mass, momentum, and energy for gas, liquid, and solid phases. Useful phenomenological models were developed to represent the controlling FCC processes, including droplet dispersion and evaporation, particle-solid interactions, and interfacial heat transfer between gas, droplets, and particles. Techniques were also developed to facilitate numerical calculations. These techniques include a hybrid flow-kinetic treatment to include detailed kinetic calculations, a time-integral approach to overcome numerical stiffness problems of chemical reactions, and a sectional coupling and blocked-cell technique for handling complex geometry. The copyrighted ICRKFLO software has been validated with experimental data from pilot- and commercial-scale FCC units. The code can be used to evaluate the impacts of design and operating conditions on the production of gasoline and other oil products.
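    The kind of lumped cracking kinetics such codes couple to the flow solution can be sketched with the classic 3-lump scheme (gas oil to gasoline to light gas/coke) integrated along the riser as a plug-flow reactor; the rate constants and residence time below are illustrative, not ICRKFLO's phenomenological models:

```python
# 3-lump cracking kinetics in a plug-flow riser, integrated with forward
# Euler over residence time tau. k1: gas oil -> gasoline, k2: gasoline ->
# light gas/coke (overcracking), k3: gas oil -> light gas/coke directly.

def riser_lumps(k1=0.6, k2=0.3, k3=0.1, steps=1000, tau=5.0):
    """Return (gas_oil, gasoline, light_gas) mass fractions at riser exit."""
    dt = tau / steps
    y_go, y_gl, y_lg = 1.0, 0.0, 0.0
    for _ in range(steps):
        r_go = (k1 + k3) * y_go          # gas oil consumed by both paths
        r_gl = k1 * y_go - k2 * y_gl     # gasoline forms, then overcracks
        r_lg = k3 * y_go + k2 * y_gl
        y_go += dt * (-r_go)
        y_gl += dt * r_gl
        y_lg += dt * r_lg
    return y_go, y_gl, y_lg
```

Mass is conserved exactly by construction (the three rates sum to zero), and varying k1-k3 or tau mimics the design-and-operating-condition studies the abstract mentions.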

  20. Computational simulation of liquid rocket injector anomalies

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Singhal, A. K.; Tam, L. T.; Davidian, K.

    1986-01-01

    A computer model has been developed to analyze the three-dimensional two-phase reactive flows in liquid fueled rocket combustors. The model is designed to study the influence of liquid propellant injection nonuniformities on the flow pattern, combustion and heat transfer within the combustor. The Eulerian-Lagrangian approach for simulating polydisperse spray flow, evaporation and combustion has been used. Full coupling between the phases is accounted for. A nonorthogonal, body fitted coordinate system along with a conservative control volume formulation is employed. The physical models built into the model include a kappa-epsilon turbulence model, a two-step chemical reaction, and the six-flux radiation model. Semiempirical models are used to describe all interphase coupling terms as well as chemical reaction rates. The purpose of this study was to demonstrate an analytical capability to predict the effects of reactant injection nonuniformities (injection anomalies) on combustion and heat transfer within the rocket combustion chamber. The results show promising application of the model to comprehensive modeling of liquid propellant rocket engines.
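    The droplet-evaporation leg of an Eulerian-Lagrangian spray model is commonly built on the d-squared law, under which the droplet surface area shrinks linearly in time; a sketch with a hypothetical evaporation constant:

```python
# d-squared evaporation law: d^2(t) = d0^2 - K*t, where K is the
# evaporation constant (m^2/s). K and the droplet size below are
# illustrative values, not parameters of the model in the abstract.

def droplet_diameter(d0, k_evap, t):
    """Diameter (m) at time t; zero once the droplet has fully evaporated."""
    d2 = d0 * d0 - k_evap * t
    return d2 ** 0.5 if d2 > 0.0 else 0.0

def droplet_lifetime(d0, k_evap):
    """Time (s) for a droplet of initial diameter d0 to evaporate."""
    return d0 * d0 / k_evap
```

Each Lagrangian parcel carries this law (with K depending on local gas conditions), which is what couples the spray back to the gas-phase energy and species equations.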

  1. Quantitative and Qualitative Simulation in Computer Based Training.

    ERIC Educational Resources Information Center

    Stevens, Albert; Roberts, Bruce

    1983-01-01

    Computer-based systems combining quantitative simulation with qualitative tutorial techniques provide learners with sophisticated individualized training. The teaching capabilities and operating procedures of Steamer, a simulated steam plant, are described. (Author/MBR)

  2. Simulating complex intracellular processes using object-oriented computational modelling.

    PubMed

    Johnson, Colin G; Goldman, Jacki P; Gullick, William J

    2004-11-01

    The aim of this paper is to give an overview of computer modelling and simulation in cellular biology, in particular as applied to complex biochemical processes within the cell. This is illustrated by the use of the techniques of object-oriented modelling, where the computer is used to construct abstractions of objects in the domain being modelled, and these objects then interact within the computer to simulate the system and allow emergent properties to be observed. The paper also discusses the role of computer simulation in understanding complexity in biological systems, and the kinds of information which can be obtained about biology via simulation. PMID:15302205
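    The object-oriented style described, with molecules as objects interacting so that system-level behavior emerges, can be sketched in a few lines (the species, rate constant, and integrator are all hypothetical, not taken from the paper):

```python
# Object-oriented toy of an intracellular binding process: Species objects
# hold molecule counts, and a BindingReaction object couples them via
# mass-action kinetics integrated with forward Euler.

class Species:
    def __init__(self, name, count):
        self.name, self.count = name, count

class BindingReaction:
    """Mass-action A + B -> AB."""
    def __init__(self, a, b, ab, rate):
        self.a, self.b, self.ab, self.rate = a, b, ab, rate

    def step(self, dt):
        flux = self.rate * self.a.count * self.b.count * dt
        flux = min(flux, self.a.count, self.b.count)  # no negative counts
        self.a.count -= flux
        self.b.count -= flux
        self.ab.count += flux

# hypothetical ligand-receptor example
ligand = Species("L", 100.0)
receptor = Species("R", 50.0)
complex_ = Species("LR", 0.0)
rxn = BindingReaction(ligand, receptor, complex_, rate=1e-3)
for _ in range(1000):
    rxn.step(0.01)
```

The "emergent" observation here is simply the bound fraction over time; in a full model many such reaction objects would share species and be stepped together.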

  3. A computationally efficient particle-simulation method suited to vector-computer architectures

    SciTech Connect

    McDonald, J.D.

    1990-01-01

    Recent interest in a National Aero-Space Plane (NASP) and various Aero-assisted Space Transfer Vehicles (ASTVs) presents the need for a greater understanding of high-speed rarefied flight conditions. Particle simulation techniques such as the Direct Simulation Monte Carlo (DSMC) method are well suited to such problems, but the high cost of computation limits the application of the methods to two-dimensional or very simple three-dimensional problems. This research re-examines the algorithmic structure of existing particle simulation methods and re-structures them to allow efficient implementation on vector-oriented supercomputers. A brief overview of the DSMC method and the Cray-2 vector computer architecture are provided, and the elements of the DSMC method that inhibit substantial vectorization are identified. One such element is the collision selection algorithm. A complete reformulation of underlying kinetic theory shows that this may be efficiently vectorized for general gas mixtures. The mechanics of collisions are vectorizable in the DSMC method, but several optimizations are suggested that greatly enhance performance. This thesis also proposes a new mechanism for the exchange of energy between vibration and other energy modes. The developed scheme makes use of quantized vibrational states and is used in place of the Borgnakke-Larsen model. Finally, a simplified representation of physical space and boundary conditions is utilized to further reduce the computational cost of the developed method. Comparisons to solutions obtained from the DSMC method for the relaxation of internal energy modes in a homogeneous gas, as well as single and multiple species shock wave profiles, are presented. Additionally, a large scale simulation of the flow about the proposed Aeroassisted Flight Experiment (AFE) vehicle is included as an example of the new computational capability of the developed particle simulation method.
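    The collision-selection step singled out above is, in standard DSMC, the no-time-counter (NTC) scheme: a candidate-pair count per cell followed by an acceptance-rejection test. A sketch of those two pieces (parameter values in the usage below are arbitrary test numbers, not flow conditions from the thesis):

```python
# No-time-counter (NTC) collision selection, the standard DSMC scheme:
# per cell and per time step, test N(N-1)/2 * F_num * (sigma*g)_max * dt / V
# candidate pairs, accepting each with probability sigma*g / (sigma*g)_max.
import random

def ntc_candidates(n_sim, f_num, sigma_g_max, dt, cell_vol):
    """Number of candidate collision pairs to test in one cell this step."""
    return int(0.5 * n_sim * (n_sim - 1) * f_num * sigma_g_max * dt / cell_vol)

def accept(sigma_g, sigma_g_max, rng=random.random):
    """Acceptance-rejection test for one candidate pair."""
    return sigma_g / sigma_g_max > rng()
```

The acceptance loop is what the thesis restructures: on a vector machine the candidate pairs for all cells can be gathered and tested as long vectors rather than one pair at a time.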

  4. Computer Simulation Methods for Defect Configurations and Nanoscale Structures

    SciTech Connect

    Gao, Fei

    2010-01-01

    This chapter will describe general computer simulation methods, including ab initio calculations, molecular dynamics and the kinetic Monte-Carlo method, and their applications to the calculation of defect configurations in various materials (metals, ceramics and oxides) and the simulation of nanoscale structures due to ion-solid interactions. The multiscale theory, modeling, and simulation techniques (both time scale and space scale) will be emphasized, and comparisons between computer simulation results and experimental observations will be made.
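    The kinetic Monte-Carlo ingredient can be sketched in its simplest residence-time (Gillespie/BKL) form: choose an event with probability proportional to its rate, then advance the clock by an exponentially distributed increment (the rate list in the usage below is an arbitrary example, not from the chapter):

```python
# One step of residence-time kinetic Monte Carlo: given the rates of all
# possible events (e.g. defect hops), pick one in proportion to its rate
# and draw the waiting time dt = -ln(u) / R_total.
import math
import random

def kmc_step(rates, rng=random.random):
    """Return (event_index, dt) for one KMC step."""
    r_total = sum(rates)
    target = rng() * r_total
    acc = 0.0
    for i, r in enumerate(rates):
        acc += r
        if target < acc:
            break
    dt = -math.log(rng()) / r_total
    return i, dt
```

Repeating this step while updating the rate list after each executed event gives the coarse-grained time evolution that connects atomistic barriers to experimental time scales.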

  5. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    SciTech Connect

    Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Zumao Chen; Temi Linjewile; Adel Sarofim; Bene Risio

    2003-01-25

    This is the eighth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on all aspects of the project. Calculations for a full Vision 21 plant configuration have been performed for two coal types and two gasifier types. Good agreement with DOE computed values has been obtained for the Vision 21 configuration under ''baseline'' conditions. Additional model verification has been performed for the flowing slag model that has been implemented into the CFD based gasifier model. Comparisons for the slag, wall and syngas conditions predicted by our model versus values from predictive models that have been published by other researchers show good agreement. The software infrastructure of the Vision 21 workbench has been modified to use a recently released, upgraded version of SCIRun.

  6. Additive Manufacturing and High-Performance Computing: a Disruptive Latent Technology

    NASA Astrophysics Data System (ADS)

    Goodwin, Bruce

    2015-03-01

    This presentation will discuss the relationship between recent advances in Additive Manufacturing (AM) technology, High-Performance Computing (HPC) simulation and design capabilities, and related advances in Uncertainty Quantification (UQ), and then examine their impacts upon national and international security. The presentation surveys how AM accelerates the fabrication process, while HPC combined with UQ provides a fast track for the engineering design cycle. The combination of AM and HPC/UQ almost eliminates the engineering design and prototype iterative cycle, thereby dramatically reducing cost of production and time-to-market. These methods thereby present significant benefits for US national interests, both civilian and military, in an age of austerity. Finally, considering cyber security issues and the advent of the "cloud," these disruptive, currently latent technologies may well enable proliferation and so challenge both nuclear and non-nuclear aspects of international security.

  7. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    SciTech Connect

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-29

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this study, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  8. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    SciTech Connect

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Khairallah, S. A.; Kamath, C.; Rubenchik, A. M.

    2015-12-15

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  9. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    NASA Astrophysics Data System (ADS)

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-01

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  10. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    DOE PAGESBeta

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-29

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this study, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  11. Simulation of reliability in multiserver computer networks

    NASA Astrophysics Data System (ADS)

    Minkevičius, Saulius

    2012-11-01

    This paper is motivated by the reliability of multiserver computer networks. A probability limit theorem on the extreme queue length in open multiserver queueing networks in heavy traffic is derived and applied to a reliability model for multiserver computer networks, in which the time to failure of the network is related to the system parameters.

  12. Computational simulations of vorticity enhanced diffusion

    NASA Astrophysics Data System (ADS)

    Vold, Erik L.

    1999-11-01

    Computer simulations are used to investigate a phenomenon of vorticity enhanced diffusion (VED), a net transport and mixing of a passive scalar across a prescribed vortex flow field driven by a background gradient in the scalar quantity. The central issue under study here is the increase in scalar flux down the gradient and across the vortex field. The numerical scheme uses cylindrical coordinates centered with the vortex flow which allows an exact advective solution and 1D or 2D diffusion using simple numerical methods. In the results, the ratio of transport across a localized vortex region in the presence of the vortex flow over that expected for diffusion alone is evaluated as a measure of VED. This ratio is seen to increase dramatically while the absolute flux across the vortex decreases slowly as the diffusion coefficient is decreased. Similar results are found and compared for varying diffusion coefficient, D, or vortex rotation time, τv, for a constant background gradient in the transported scalar vs an interface in the transported quantity, and for vortex flow fields constant in time vs flow which evolves in time from an initial state and with a Schmidt number of order unity. A simple analysis shows that for a small diffusion coefficient, the flux ratio measure of VED scales as the vortex radius over the thickness for mass diffusion in a viscous shear layer within the vortex characterized by (Dτv)1/2. The phenomenon is linear as investigated here and suggests that a significant enhancement of mixing in fluids may be a relatively simple linear process. Discussion touches on how this vorticity enhanced diffusion may be related to mixing in nonlinear turbulent flows.

  13. Computer simulator for a mobile telephone system

    NASA Technical Reports Server (NTRS)

    Schilling, D. L.; Ziegler, C.

    1983-01-01

    A software simulator was developed to help NASA in the design of the LMSS. The simulator will be used to study the characteristics and implementation requirements of the LMSS configuration, with specifications as outlined by NASA.

  14. Matching of additive and polarizable force fields for multiscale condensed phase simulations

    PubMed Central

    Baker, Christopher M.; Best, Robert B.

    2013-01-01

    Inclusion of electronic polarization effects is one of the key aspects in which the accuracy of current biomolecular force fields may be improved. The principal drawback of such approaches is the computational cost, which typically ranges from 3 – 10 times that of the equivalent additive model, and may be greater for more sophisticated treatments of polarization or other many-body effects. Here, we present a multiscale approach which may be used to enhance the sampling in simulations with polarizable models, by using the additive model as a tool to explore configuration space. We use a method based on information theory to determine the charges for an additive model that has optimal overlap with the polarizable one, and we demonstrate the feasibility of enhancing sampling via a hybrid replica exchange scheme for several model systems. An additional advantage is that, in the process, we obtain a systematic method for deriving charges for an additive model that will be the natural complement to its polarizable parent. The additive charges are found by an effective coarse-graining of the polarizable force field, rather than by ad hoc procedures. PMID:23997691
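    The idea of deriving additive point charges by coarse-graining a more detailed electrostatic description can be illustrated with a simpler least-squares flavor of the approach (this is ESP-style fitting, not the information-theoretic method of the paper): fit charges at fixed sites so that their potential best matches a reference potential on surrounding grid points. All function and variable names here are illustrative.

```python
import numpy as np

def esp_fit(sites, grid, phi_ref):
    """Least-squares fit of point charges at fixed sites to a reference
    electrostatic potential phi_ref sampled on grid points (Gaussian units).

    Solves min_q || A q - phi_ref ||^2 with A[g, i] = 1 / |grid[g] - sites[i]|
    (no net-charge constraint, for brevity).
    """
    A = 1.0 / np.linalg.norm(grid[:, None, :] - sites[None, :, :], axis=-1)
    q, *_ = np.linalg.lstsq(A, phi_ref, rcond=None)
    return q
```

    In the multiscale scheme described above, the reference potential would come from the polarizable model; here any reference charge distribution serves to show the mechanics of the fit.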

  15. Simulation of the stress computation in shells

    NASA Technical Reports Server (NTRS)

    Salama, M.; Utku, S.

    1978-01-01

    A self-teaching computer program is described, whereby the stresses in thin shells can be computed with good accuracy using the best fit approach. The program is designed for use in interactive game mode to allow the structural engineer to learn about (1) the major sources of difficulties and associated errors in the computation of stresses in thin shells, (2) possible ways to reduce the errors, and (3) trade-off between computational cost and accuracy. Included are derivation of the computational approach, program description, and several examples illustrating the program usage.

  16. Cognitive Effects from Process Learning with Computer-Based Simulations.

    ERIC Educational Resources Information Center

    Breuer, Klaus; Kummer, Ruediger

    1990-01-01

    Discusses content learning versus process learning, describes process learning with computer-based simulations, and highlights an empirical study on the effects of process learning with problem-oriented, computer-managed simulations in technical vocational education classes in West Germany. Process learning within a model of the cognitive system…

  17. Computer Simulation Models of Economic Systems in Higher Education.

    ERIC Educational Resources Information Center

    Smith, Lester Sanford

    The increasing complexity of educational operations makes analytical tools, such as computer simulation models, especially desirable for educational administrators. This MA thesis examined the feasibility of developing computer simulation models of economic systems in higher education to assist decision makers in allocating resources. The report…

  18. Explore Effective Use of Computer Simulations for Physics Education

    ERIC Educational Resources Information Center

    Lee, Yu-Fen; Guo, Yuying

    2008-01-01

    The dual purpose of this article is to provide a synthesis of the findings related to the use of computer simulations in physics education and to present implications for teachers and researchers in science education. We try to establish a conceptual framework for the utilization of computer simulations as a tool for learning and instruction in…

  19. The Link between Computer Simulations and Social Studies Learning: Debriefing.

    ERIC Educational Resources Information Center

    Chiodo, John J.; Flaim, Mary L.

    1993-01-01

    Asserts that debriefing is the missing link between learning achievement and simulations in social studies. Maintains that teachers who employ computer-assisted instruction must utilize effective debriefing activities. Provides a four-step debriefing model using the computer simulation, Oregon Trail. (CFR)

  20. The Role of Computer Simulations in Engineering Education.

    ERIC Educational Resources Information Center

    Smith, P. R.; Pollard, D.

    1986-01-01

    Discusses role of computer simulation in complementing and extending conventional components of undergraduate engineering education process in United Kingdom universities and polytechnics. Aspects of computer-based learning are reviewed (laboratory simulation, lecture and tutorial support, inservice teacher education) with reference to programs in…

  1. How Effective Is Instructional Support for Learning with Computer Simulations?

    ERIC Educational Resources Information Center

    Eckhardt, Marc; Urhahne, Detlef; Conrad, Olaf; Harms, Ute

    2013-01-01

    The study examined the effects of two different instructional interventions as support for scientific discovery learning using computer simulations. In two well-known categories of difficulty, data interpretation and self-regulation, instructional interventions for learning with computer simulations on the topic "ecosystem water" were developed…

  2. Evaluation of Computer Simulations for Teaching Apparel Merchandising Concepts.

    ERIC Educational Resources Information Center

    Jolly, Laura D.; Sisler, Grovalynn

    1988-01-01

    The study developed and evaluated computer simulations for teaching apparel merchandising concepts. Evaluation results indicated that teaching method (computer simulation versus case study) does not significantly affect cognitive learning. Student attitudes varied, however, according to topic (profitable merchandising analysis versus retailing…

  3. New Pedagogies on Teaching Science with Computer Simulations

    ERIC Educational Resources Information Center

    Khan, Samia

    2011-01-01

    Teaching science with computer simulations is a complex undertaking. This case study examines how an experienced science teacher taught chemistry using computer simulations and the impact of his teaching on his students. Classroom observations over 3 semesters, teacher interviews, and student surveys were collected. The data was analyzed for (1)…

  4. Nonlinear simulations with and computational issues for NIMROD

    SciTech Connect

    Sovinec, C.R.

    1998-12-31

    The NIMROD (Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion) code development project was commissioned by the US Department of Energy in February, 1996 to provide the fusion research community with a computational tool for studying low-frequency behavior in experiments. Specific problems of interest include the neoclassical evolution of magnetic islands and the nonlinear behavior of tearing modes in the presence of rotation and nonideal walls in tokamaks; they also include topics relevant to innovative confinement concepts such as magnetic turbulence. Besides having physics models appropriate for these phenomena, an additional requirement is the ability to perform the computations in realistic geometries. The NIMROD Team is using contemporary management and computational methods to develop a computational tool for investigating low-frequency behavior in plasma fusion experiments. The authors intend to make the code freely available, and are taking steps to make it as easy to learn and use as possible. An example application for NIMROD is the nonlinear toroidal RFP simulation--the first in a series to investigate how toroidal geometry affects MHD activity in RFPs. Finally, the most important issue facing the project is execution time, and they are exploring better matrix solvers and a better parallel decomposition to address this.

  5. Simulation models for computational plasma physics: Concluding report

    SciTech Connect

    Hewett, D.W.

    1994-03-05

    In this project, the authors enhanced their ability to numerically simulate bounded plasmas that are dominated by low-frequency electric and magnetic fields. They moved towards this goal in several ways; they are now in a position to play significant roles in the modeling of low-frequency electromagnetic plasmas in several new industrial applications. They have significantly increased their facility with the computational methods invented to solve the low-frequency limit of Maxwell's equations (DiPeso, Hewett, accepted, J. Comp. Phys., 1993). This low-frequency model, called the Streamlined Darwin Field (SDF) model (Hewett, Larson, and Doss, J. Comp. Phys., 1992), has now been implemented in a fully non-neutral SDF code, BEAGLE (Larson, Ph.D. dissertation, 1993), and has been further extended to the quasi-neutral limit (DiPeso, Hewett, Comp. Phys. Comm., 1993). In addition, they have resurrected the quasi-neutral, zero-electron-inertia model (ZMR) and began the task of incorporating into this model internal boundary conditions that have the flexibility of those in GYMNOS, a magnetostatic code now used in ion source work (Hewett, Chen, ICF Quarterly Report, July--September, 1993). Finally, near the end of this project, they invented a new type of banded matrix solver that can be implemented on a massively parallel computer -- thus opening the door for the use of all their ADI schemes on these new computer architectures (Mattor, Williams, Hewett, submitted to Parallel Computing, 1993).

  6. Computers for real time flight simulation: A market survey

    NASA Technical Reports Server (NTRS)

    Bekey, G. A.; Karplus, W. J.

    1977-01-01

    An extensive computer market survey was made to determine those available systems suitable for current and future flight simulation studies at Ames Research Center. The primary requirement is for the computation of relatively high frequency content (5 Hz) math models representing powered lift flight vehicles. The Rotor Systems Research Aircraft (RSRA) was used as a benchmark vehicle for computation comparison studies. The general nature of helicopter simulations and a description of the benchmark model are presented, and some of the sources of simulation difficulties are examined. A description of various applicable computer architectures is presented, along with detailed discussions of leading candidate systems and comparisons between them.

  7. Computer simulations of solvation dynamics in lithium clay materials

    SciTech Connect

    Greathouse, J.A.

    1997-12-31

    Monte Carlo and molecular dynamics computer simulations were performed concurrently for the first time on lithium-smectites (montmorillonite, beidellite, and hectorite) at very low water content (about 0.5 monolayer). These simulation conditions were selected to provide a comparison with existing NMR, IR, neutron scattering, and X-ray diffraction data which have been interpreted in terms of inner-sphere (IS) Li surface complexes solvated by three water molecules exhibiting hindered rotational degrees of freedom. Our simulations predicted layer spacings (c-axis direction) ranging from 10.32 Å (hectorite) to 11.93 Å (beidellite). Both IS and outer-sphere (OS) Li surface complexes formed in the interlayers of montmorillonite, whereas only one type of surface complex formed in the interlayers of beidellite (IS) and hectorite (OS). Lithium ions were solvated by 2, 3, or 4 water molecules. Some evidence of Li-hydrate rotation was seen for beidellite, but the other smectites showed no Li-hydrate rotational motion. However, water molecules were observed to rotate about their C₂ axes of symmetry in montmorillonite. Additional spectroscopic data are needed to resolve the differences between the simulation predictions and current experimental interpretations.

  8. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
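    The kind of two-part model described here, stochastic demand contending for a fixed resource pool, can be sketched with a minimal event-heap simulation. This is an illustrative M/M/c-style example with assumed parameter names, not the model from the paper:

```python
import heapq, random

def simulate(n_servers, arrival_rate, service_rate, n_requests, seed=0):
    """Discrete event simulation of requests contending for n_servers.

    Events are (time, kind, id) tuples kept in a heap; kind 0 is an arrival,
    kind 1 a departure.  Returns the mean time a request waits in queue.
    """
    rng = random.Random(seed)
    events, queue, waits = [], [], []
    busy, t = 0, 0.0
    for i in range(n_requests):                 # pre-generate Poisson arrivals
        t += rng.expovariate(arrival_rate)
        heapq.heappush(events, (t, 0, i))
    while events:
        now, kind, i = heapq.heappop(events)
        if kind == 0:                           # arrival
            if busy < n_servers:
                busy += 1
                waits.append(0.0)
                heapq.heappush(events, (now + rng.expovariate(service_rate), 1, i))
            else:
                queue.append((now, i))
        else:                                   # departure frees a server
            if queue:
                arrived, j = queue.pop(0)
                waits.append(now - arrived)
                heapq.heappush(events, (now + rng.expovariate(service_rate), 1, j))
            else:
                busy -= 1
    return sum(waits) / len(waits)
```

    With a generous server pool the mean wait is essentially zero; shrinking the pool makes queueing delay dominate, which is the provisioning trade-off the paper's framework quantifies.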

  9. Atomistic Simulations of Ti Additions to NiAl

    NASA Technical Reports Server (NTRS)

    Bozzolo, Guillermo; Noebe, Ronald D.; Garg, Anita; Ferrante, John; Amador, Carlos

    1997-01-01

    The development of more efficient engines and power plants for future supersonic transports depends on the advancement of new high-temperature materials with temperature capabilities exceeding those of Ni-based superalloys. Having theoretical modelling techniques to aid in the design of these alloys would greatly facilitate this development. The present paper discusses a successful attempt to correlate theoretical predictions of alloy properties with experimental confirmation for ternary NiAl-Ti alloys. The B.F.S. (Bozzolo-Ferrante-Smith) method for alloys is used to predict the solubility limit and site preference energies for Ti additions of 1 to 25 at.% to NiAl. The results show the solubility limit to be around 5% Ti, above which the formation of Heusler precipitates is favored. These results were confirmed by transmission electron microscopy performed on a series of NiAl-Ti alloys.

  10. Atomistic simulations of Ti additions to NiAl

    SciTech Connect

    Bozzolo, G.; Noebe, R.D.; Garg, A.; Ferrante, J.; Amador, C.

    1997-12-31

    The development of more efficient engines and power plants for future supersonic transports depends on the advancement of new high-temperature materials with temperature capabilities exceeding those of Ni-based superalloys. Having theoretical modelling techniques to aid in the design of these alloys would greatly facilitate this development. The present paper discusses a successful attempt to correlate theoretical predictions of alloy properties with experimental confirmation for ternary NiAl-Ti alloys. The B.F.S. (Bozzolo-Ferrante-Smith) method for alloys is used to predict the solubility limit and site preference energies for Ti additions of 1 to 25 at.% to NiAl. The results show the solubility limit to be around 5% Ti, above which the formation of Heusler precipitates is favored. These results were confirmed by transmission electron microscopy performed on a series of NiAl-Ti alloys.

  11. GPU-accelerated micromagnetic simulations using cloud computing

    NASA Astrophysics Data System (ADS)

    Jermain, C. L.; Rowlands, G. E.; Buhrman, R. A.; Ralph, D. C.

    2016-03-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics.

  12. CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.

    ERIC Educational Resources Information Center

    Skrein, Dale

    1994-01-01

    CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)

  13. Computational Electromagnetics (CEM) Laboratory: Simulation Planning Guide

    NASA Technical Reports Server (NTRS)

    Khayat, Michael A.

    2011-01-01

    The simulation process, milestones and inputs are unknowns to first-time users of the CEM Laboratory. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  14. Computer simulated plant design for waste minimization/pollution prevention

    SciTech Connect

    Bumble, S.

    2000-07-01

    The book discusses several paths to pollution prevention and waste minimization by using computer simulation programs. It explains new computer technologies used in the field of pollution prevention and waste management; provides information on overcoming technical, economic, and environmental barriers to waste reduction; gives case studies from industry; and covers computer-aided flow sheet design and analysis for nuclear fuel reprocessing.

  15. Creating Science Simulations through Computational Thinking Patterns

    ERIC Educational Resources Information Center

    Basawapatna, Ashok Ram

    2012-01-01

    Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction.…

  16. Computer Simulations as an Integral Part of Intermediate Macroeconomics.

    ERIC Educational Resources Information Center

    Millerd, Frank W.; Robertson, Alastair R.

    1987-01-01

    Describes the development of two interactive computer simulations which were fully integrated with other course materials. The simulations illustrate the effects of various real and monetary "demand shocks" on aggregate income, interest rates, and components of spending and economic output. Includes an evaluation of the simulations' effects on…

  17. Genetic Crossing vs Cloning by Computer Simulation

    NASA Astrophysics Data System (ADS)

    Dasgupta, Subinay

    We perform Monte Carlo simulation using Penna's bit string model, and compare the process of asexual reproduction by cloning with that by genetic crossover. We find them to be comparable as regards survival of a species, and also if a natural disaster is simulated.

  18. Genetic crossing vs cloning by computer simulation

    SciTech Connect

    Dasgupta, S.

    1997-06-01

    We perform Monte Carlo simulation using Penna's bit string model, and compare the process of asexual reproduction by cloning with that by genetic crossover. We find them to be comparable as regards survival of a species, and also if a natural disaster is simulated.
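    The Penna bit-string model used in both versions of this study is compact enough to sketch. Below is a minimal asexual (cloning) variant, with parameter values chosen for illustration rather than taken from the paper:

```python
import random

def penna_step(population, rng, t_limit=3, birth_age=8, mut_rate=1,
               genome_bits=32, capacity=2000):
    """One time step of the asexual Penna bit-string model.

    Each individual is (age, genome); bit k of the genome is a hereditary
    mutation that becomes deleterious at age k.  An individual dies when the
    number of active deleterious bits reaches t_limit, at age genome_bits, or
    randomly via the Verhulst (carrying-capacity) factor.
    """
    survivors = []
    verhulst = 1.0 - len(population) / capacity
    for age, genome in population:
        age += 1
        if age >= genome_bits:
            continue                        # died of old age
        active = bin(genome & ((1 << age) - 1)).count("1")
        if active >= t_limit or rng.random() >= verhulst:
            continue                        # died of mutations or crowding
        survivors.append((age, genome))
        if age >= birth_age:                # cloning with fresh mutations
            child = genome
            for _ in range(mut_rate):
                child |= 1 << rng.randrange(genome_bits)
            survivors.append((0, child))
    return survivors
```

    Iterating this step from a founder population lets one compare survival statistics; the crossover variant would combine bits from two parents when producing the child genome.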

  19. Spatial Learning and Computer Simulations in Science

    ERIC Educational Resources Information Center

    Lindgren, Robb; Schwartz, Daniel L.

    2009-01-01

    Interactive simulations are entering mainstream science education. Their effects on cognition and learning are often framed by the legacy of information processing, which emphasized amodal problem solving and conceptual organization. In contrast, this paper reviews simulations from the vantage of research on perception and spatial learning,…

  20. Additions and Improvements to the FLASH Code for Simulating High Energy Density Physics Experiments

    NASA Astrophysics Data System (ADS)

    Lamb, D. Q.; Daley, C.; Dubey, A.; Fatenejad, M.; Flocke, N.; Graziani, C.; Lee, D.; Tzeferacos, P.; Weide, K.

    2015-11-01

    FLASH is an open source, finite-volume Eulerian, spatially adaptive radiation hydrodynamics and magnetohydrodynamics code that incorporates capabilities for a broad range of physical processes, performs well on a wide range of computer architectures, and has a broad user base. Extensive capabilities have been added to FLASH to make it an open toolset for the academic high energy density physics (HEDP) community. We summarize these capabilities, with particular emphasis on recent additions and improvements. These include advancements in the optical ray tracing laser package, with methods such as bi-cubic 2D and tri-cubic 3D interpolation of electron number density, adaptive stepping and 2nd-, 3rd-, and 4th-order Runge-Kutta integration methods. Moreover, we showcase the simulated magnetic field diagnostic capabilities of the code, including induction coils, Faraday rotation, and proton radiography. We also describe several collaborations with the National Laboratories and the academic community in which FLASH has been used to simulate HEDP experiments. This work was supported in part at the University of Chicago by the DOE NNSA ASC through the Argonne Institute for Computing in Science under field work proposal 57789; and the NSF under grant PHY-0903997.

  1. High Fidelity Simulation of a Computer Room

    NASA Technical Reports Server (NTRS)

    Ahmad, Jasim; Chan, William; Chaderjian, Neal; Pandya, Shishir

    2005-01-01

    This viewgraph presentation reviews NASA's Columbia supercomputer and the mesh technology used to test the adequacy of the fluid and cooling of a computer room. A technical description of the Columbia supercomputer is also presented along with its performance capability.

  2. Some theoretical issues on computer simulations

    SciTech Connect

    Barrett, C.L.; Reidys, C.M.

    1998-02-01

    The subject of this paper is the development of mathematical foundations for a theory of simulation. Sequentially updated cellular automata (sCA) over arbitrary graphs are employed as a paradigmatic framework. In the development of the theory, the authors focus on the properties of causal dependencies among local mappings in a simulation. The main object of study is the mapping between a graph representing the dependencies among entities of a simulation and a graph representing the equivalence classes of systems obtained by all possible updates.

  3. Wall-models for large eddy simulation based on a generic additive-filter formulation

    NASA Astrophysics Data System (ADS)

    Sanchez Rocha, Martin

    Based on the philosophy of only resolving the large scales of turbulent motion, Large Eddy Simulation (LES) has demonstrated potential to provide high-fidelity turbulence simulations at low computational cost. However, when the scales that control the turbulence in a particular flow are not large, LES has to increase its computational cost significantly to provide accurate predictions. This is the case in wall-bounded flows, where the grid resolution required by LES to resolve the near-wall structures is close to the requirements to resolve the smallest dissipative scales in turbulence. Therefore, to reduce this demanding requirement, it has been proposed to model the near-wall region with Reynolds-Averaged Navier-Stokes (RANS) models, in what is known as the hybrid RANS/LES approach. In this work, the mathematical implications of merging two different turbulence modeling approaches are addressed by deriving the exact hybrid RANS/LES Navier-Stokes equations. These equations are derived by introducing an additive filter, which linearly combines the RANS and LES operators with a blending function. The equations derived with the additive filter predict additional hybrid terms, which represent the interactions between RANS and LES formulations. Theoretically, the prediction of the hybrid terms demonstrates that the hybridization of the two approaches cannot be accomplished only by the turbulence model equations, as is claimed in current hybrid RANS/LES models. The importance of the exact hybrid RANS/LES equations is demonstrated by conducting numerical calculations on a turbulent flat-plate boundary layer. Results indicate that the hybrid terms help to maintain an equilibrated model transition when the hybrid formulation switches from RANS to LES. Results also indicate that, when the hybrid terms are not included, the accuracy of the calculations strongly relies on the blending function implemented in the additive filter. On the other hand, if the exact equations are
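    The additive-filter construction described in this abstract can be written compactly. The following is a reconstruction under assumed notation (k for the blending function, overbar for the RANS average, tilde for the LES filter), not the thesis's own equations:

```latex
% Additive filter: a blend of the RANS average and the LES filter,
% with blending function k (k = 1 pure RANS, k = 0 pure LES):
\langle u_i \rangle_H = k\,\overline{u_i} + (1-k)\,\widetilde{u_i}

% The stress generated by this filter is not simply the blend of the RANS
% and LES stresses; an additional cross ("hybrid") term appears:
\tau^{H}_{ij} = \langle u_i u_j \rangle_H - \langle u_i \rangle_H \langle u_j \rangle_H
             = k\,\tau^{RANS}_{ij} + (1-k)\,\tau^{LES}_{ij}
             + k(1-k)\,\bigl(\overline{u_i} - \widetilde{u_i}\bigr)\bigl(\overline{u_j} - \widetilde{u_j}\bigr)
```

    The k(1-k) cross term vanishes in the pure RANS and pure LES limits but is active wherever the formulation transitions, consistent with the abstract's observation that the hybrid terms matter precisely where the model switches from RANS to LES.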

  4. Use of advanced computers for aerodynamic flow simulation

    NASA Technical Reports Server (NTRS)

    Bailey, F. R.; Ballhaus, W. F.

    1980-01-01

    The current and projected use of advanced computers for large-scale aerodynamic flow simulation applied to engineering design and research is discussed. The design use of mature codes run on conventional, serial computers is compared with the fluid research use of new codes run on parallel and vector computers. The role of flow simulations in design is illustrated by the application of a three dimensional, inviscid, transonic code to the Sabreliner 60 wing redesign. Research computations that include a more complete description of the fluid physics by use of Reynolds averaged Navier-Stokes and large-eddy simulation formulations are also presented. Results of studies for a numerical aerodynamic simulation facility are used to project the feasibility of design applications employing these more advanced three dimensional viscous flow simulations.

  5. Neutron stimulated emission computed tomography: a Monte Carlo simulation approach.

    PubMed

    Sharma, A C; Harrawood, B P; Bender, J E; Tourassi, G D; Kapadia, A J

    2007-10-21

    A Monte Carlo simulation has been developed for neutron stimulated emission computed tomography (NSECT) using the GEANT4 toolkit. NSECT is a new approach to biomedical imaging that allows spectral analysis of the elements present within the sample. In NSECT, a beam of high-energy neutrons interrogates a sample, and the nuclei in the sample are stimulated to an excited state by inelastic scattering of the neutrons. The characteristic gammas emitted by the excited nuclei are captured in a spectrometer to form multi-energy spectra. Currently, a tomographic image is formed using a collimated neutron beam to define the line-integral paths for the tomographic projections. These projection data are reconstructed to form a representation of the distribution of individual elements in the sample. To facilitate the development of this technique, a Monte Carlo simulation model has been constructed from the GEANT4 toolkit. This simulation includes modeling of the neutron beam source and collimation, the samples, the neutron interactions within the samples, the emission of characteristic gammas, and the detection of these gammas in a germanium crystal. In addition, the model allows the absorbed radiation dose to be calculated for internal components of the sample. NSECT presents challenges not typically addressed in Monte Carlo modeling of high-energy physics applications. In order to address issues critical to the clinical development of NSECT, this paper will describe the GEANT4 simulation environment and three separate simulations performed to accomplish three specific aims. First, comparison of a simulation to a tomographic experiment will verify the accuracy of both the gamma energy spectra produced and the positioning of the beam relative to the sample. Second, parametric analysis of simulations performed with different user-defined variables will determine the best way to effectively model low energy neutrons in tissue, which is a concern with the high hydrogen content in

  6. Parallel Computing Environments and Methods for Power Distribution System Simulation

    SciTech Connect

    Lu, Ning; Taylor, Zachary T.; Chassin, David P.; Guttromson, Ross T.; Studham, Scott S.

    2005-11-10

    The development of cost-effective, high-performance parallel computing on multi-processor supercomputers makes it attractive to port excessively time-consuming simulation software from personal computers (PCs) to supercomputers. The power distribution system simulator (PDSS) takes a bottom-up approach and simulates load at the appliance level, where detailed thermal models for appliances are used. This approach works well for a small power distribution system consisting of a few thousand appliances. When the number of appliances increases, the simulation exhausts the PC's memory and its run time grows to a point where the approach is no longer feasible for modeling a practical large power distribution system. This paper presents an effort to port a PC-based power distribution system simulator (PDSS) to a 128-processor shared-memory supercomputer. The paper offers an overview of the parallel computing environment and a description of the modifications made to the PDSS model. The performance of the PDSS running on a standalone PC and on the supercomputer is compared. Future research directions for utilizing parallel computing in power distribution system simulation are also addressed.

  7. Computational field simulation of temporally deforming geometries

    SciTech Connect

    Boyalakuntla, K.; Soni, B.K.; Thornburg, H.J.

    1996-12-31

    A NURBS-based moving grid generation technique is presented to simulate temporally deforming geometries. Grid generation for a complex configuration can be a time-consuming process, and temporally varying geometries necessitate the regeneration of such a grid at every time step. Non-Uniform Rational B-Spline (NURBS) control point information is used for geometry description. The parametric definition of the NURBS is utilized in the development of the methodology to generate a well-distributed grid in a timely manner. The numerical simulation of a temporally deforming geometry is accomplished by appropriately linking to an unsteady, multi-block, thin-layer Navier-Stokes solver. The present method greatly reduces CPU requirements for time-dependent remeshing, facilitating the simulation of more complex unsteady problems. This effort is the first step toward multidisciplinary design optimization, which involves coupling aerodynamics, heat transfer, and structural analysis. Applications include the simulation of temporally deforming bodies.
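    The parametric NURBS evaluation that such a technique rests on can be sketched with the standard Cox-de Boor recursion. The function names and the quarter-circle example below are our own illustration, not the paper's code:

    ```python
    import numpy as np

    def bspline_basis(i, p, u, knots):
        """Cox-de Boor recursion for the B-spline basis function N_{i,p}(u)."""
        if p == 0:
            return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
        left = 0.0
        denom = knots[i + p] - knots[i]
        if denom > 0:
            left = (u - knots[i]) / denom * bspline_basis(i, p - 1, u, knots)
        right = 0.0
        denom = knots[i + p + 1] - knots[i + 1]
        if denom > 0:
            right = (knots[i + p + 1] - u) / denom * bspline_basis(i + 1, p - 1, u, knots)
        return left + right

    def nurbs_point(u, ctrl, weights, p, knots):
        """Evaluate a point on a NURBS curve as a weighted rational combination
        of control points -- the parametric definition used for grid generation."""
        num = np.zeros(len(ctrl[0]))
        den = 0.0
        for i in range(len(ctrl)):
            b = bspline_basis(i, p, u, knots) * weights[i]
            num += b * np.asarray(ctrl[i], dtype=float)
            den += b
        return num / den
    ```

    With control points (1,0), (1,1), (0,1), weights (1, √2/2, 1), degree 2, and knot vector [0,0,0,1,1,1], every evaluated point lies on the unit circle, which is the classic check that the rational form reproduces conics exactly.
    
    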

  8. Computer simulation of water reclamation processors

    NASA Technical Reports Server (NTRS)

    Fisher, John W.; Hightower, T. M.; Flynn, Michael T.

    1991-01-01

    The development of detailed simulation models of water reclamation processors based on the ASPEN PLUS simulation program is discussed. Individual models have been developed for vapor compression distillation, vapor phase catalytic ammonia removal, and supercritical water oxidation. These models are used for predicting process behavior. Particular attention is given to the methodology used to complete this work and the insights gained from this type of model development.

  9. Estimating computer communication network performance using network simulations

    SciTech Connect

    Garcia, A.B.

    1985-01-01

    A generalized queuing model simulation of store-and-forward computer communication networks is developed and implemented using Simulation Language for Alternative Modeling (SLAM). A baseline simulation model is validated by comparison with published analytic models. The baseline model is expanded to include an ACK/NAK data link protocol, four-level message precedence, finite queues, and a response traffic scenario. Network performance, as indicated by average message delay and message throughput, is estimated using the simulation model.
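    The delay and throughput estimates described above come from a queuing simulation. A minimal sketch for a single store-and-forward node with Poisson arrivals and exponential transmission times (a generic illustration, not the SLAM model; parameter names are ours):

    ```python
    import random

    def simulate_mm1(lam, mu, n_messages, seed=1):
        """Single store-and-forward node: Poisson arrivals at rate lam,
        exponential transmission times at rate mu, FIFO queue.
        Returns the average message delay (queueing + transmission)."""
        rng = random.Random(seed)
        t_arrival = 0.0    # time the current message arrives
        server_free = 0.0  # time the link finishes its previous message
        total_delay = 0.0
        for _ in range(n_messages):
            t_arrival += rng.expovariate(lam)
            start = max(t_arrival, server_free)      # wait if link is busy
            server_free = start + rng.expovariate(mu)
            total_delay += server_free - t_arrival   # departure - arrival
        return total_delay / n_messages
    ```

    For lam = 0.5 and mu = 1.0 the M/M/1 theory predicts a mean delay of 1/(mu - lam) = 2.0, which is the kind of published analytic result a baseline simulation model can be validated against.
    
    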

  10. Computer simulation results of attitude estimation of earth orbiting satellites

    NASA Technical Reports Server (NTRS)

    Kou, S. R.

    1976-01-01

    Computer simulation results of attitude estimation of Earth-orbiting satellites (including Space Telescope) subjected to environmental disturbances and noises are presented. A decomposed linear recursive filter and a Kalman filter were used as estimation tools. Six programs were developed for this simulation; all were written in the BASIC language and were run on HP 9830A and HP 9866A computers. Simulation results show that the decomposed linear recursive filter is accurate in estimation and fast in response time. Furthermore, for higher order systems, this filter has computational advantages (i.e., smaller integration and roundoff errors) over a Kalman filter.
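    The recursive estimators compared above can be illustrated with a minimal scalar Kalman filter. This is a textbook sketch under an assumed random-walk state model, not the study's six-program implementation:

    ```python
    def kalman_1d(measurements, q, r, x0=0.0, p0=1.0):
        """Scalar Kalman filter for a random-walk state observed in noise.
        q: process-noise variance, r: measurement-noise variance."""
        x, p = x0, p0
        estimates = []
        for z in measurements:
            # predict: random-walk model, so the state carries over and
            # its variance grows by the process noise
            p += q
            # update: blend prediction and measurement by the Kalman gain
            k = p / (p + r)
            x += k * (z - x)
            p *= (1 - k)
            estimates.append(x)
        return estimates
    ```

    Fed a constant true state, the estimate converges toward it as the gain shrinks; with q = 0 the filter reduces to a running average, which makes the convergence easy to verify by hand.
    
    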

  11. Computer Simulation Performed for Columbia Project Cooling System

    NASA Technical Reports Server (NTRS)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,024 intel titanium processors) system. The simulation asseses the performance of the cooling system and identified deficiencies, and recommended modifications to eliminate them. It used two in house software packages on NAS supercomputers: Chimera Grid tools to generate a geometric model of the computer room, OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses on any modern computer room. Columbia_CFD_black.tiff

  12. A scalable parallel black oil simulator on distributed memory parallel computers

    NASA Astrophysics Data System (ADS)

    Wang, Kun; Liu, Hui; Chen, Zhangxin

    2015-11-01

    This paper presents our work on developing a parallel black oil simulator for distributed memory computers based on our in-house parallel platform. The parallel simulator is designed to overcome the performance issues of common simulators that are implemented for personal computers and workstations. The finite difference method is applied to discretize the black oil model. In addition, some advanced techniques are employed to strengthen the robustness and parallel scalability of the simulator, including an inexact Newton method, matrix decoupling methods, and algebraic multigrid methods. A new multi-stage preconditioner is proposed to accelerate the solution of linear systems from the Newton methods. Numerical experiments show that our simulator is scalable and efficient, and is capable of simulating extremely large-scale black oil problems with tens of millions of grid blocks using thousands of MPI processes on parallel computers.
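    The inexact Newton method named above solves each Newton linear system only approximately, to a tolerance tied to the nonlinear residual. A hedged sketch of that idea, with a simple Richardson inner solver standing in for the simulator's preconditioned solvers (function names and the forcing parameter eta are ours):

    ```python
    import numpy as np

    def inexact_newton(F, J, x0, eta=0.1, tol=1e-8, max_outer=50, max_inner=500):
        """Inexact Newton: at each outer step, solve J(x) s = -F(x) only until
        the linear residual drops below eta * ||F(x)|| (the forcing condition),
        here via Richardson iteration on the normal equations."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_outer):
            f = F(x)
            if np.linalg.norm(f) < tol:
                return x
            A = J(x)
            s = np.zeros_like(x)
            alpha = 1.0 / np.linalg.norm(A.T @ A)  # Frobenius norm >= spectral, so the step is stable
            for _ in range(max_inner):
                r = A @ s + f                      # residual of J s = -F
                if np.linalg.norm(r) <= eta * np.linalg.norm(f):
                    break                          # inexact solve is "good enough"
                s -= alpha * (A.T @ r)
            x = x + s
        return x
    ```

    Loosening the inner tolerance trades a few extra outer iterations for much cheaper linear solves, which is the motivation for using inexact Newton in large reservoir models.
    
    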

  13. 19 CFR 201.14 - Computation of time, additional hearings, postponements, continuances, and extensions of time.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Computation of time, additional hearings, postponements, continuances, and extensions of time. 201.14 Section 201.14 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES OF GENERAL APPLICATION Initiation and Conduct of Investigations § 201.14 Computation of time,...

  14. Avoiding pitfalls in simulating real-time computer systems

    NASA Technical Reports Server (NTRS)

    Smith, R. S.

    1984-01-01

    The software simulation of a computer target system on a computer host system, known as an interpretive computer simulator (ICS), functionally models and implements the action of the target hardware. For an ICS to function as efficiently as possible and to avoid certain pitfalls in designing an ICS, it is important that the details of the hardware architectural design of both the target and the host computers be known. This paper discusses both host selection considerations and ICS design features that, without proper consideration, could make the resulting ICS too slow to use or too costly to maintain and expand.

  15. Icing simulation: A survey of computer models and experimental facilities

    NASA Technical Reports Server (NTRS)

    Potapczuk, M. G.; Reinmann, J. J.

    1991-01-01

    A survey of the current methods for simulation of the response of an aircraft or aircraft subsystem to an icing encounter is presented. The topics discussed include computer code modeling of aircraft icing and performance degradation, an evaluation of experimental facility simulation capabilities, and ice protection system evaluation tests in simulated icing conditions. Current research focused on upgrading the simulation fidelity of both experimental and computational methods is discussed. The need for increased understanding of the physical processes governing ice accretion, ice shedding, and iced airfoil aerodynamics is examined.

  16. Two inviscid computational simulations of separated flow about airfoils

    NASA Technical Reports Server (NTRS)

    Barnwell, R. W.

    1976-01-01

    Two inviscid computational simulations of separated flow about airfoils are described. The basic computational method is the line relaxation finite-difference method. Viscous separation is approximated with inviscid free-streamline separation. The point of separation is specified, and the pressure in the separation region is calculated. In the first simulation, the empiricism of constant pressure in the separation region is employed. This empiricism is easier to implement with the present method than with singularity methods. In the second simulation, acoustic theory is used to determine the pressure in the separation region. The results of both simulations are compared with experiment.

  17. Computer simulator for a mobile telephone system

    NASA Technical Reports Server (NTRS)

    Schilling, D. L.

    1981-01-01

    A software simulator was developed to assist NASA in the design of the land mobile satellite service. Structured programming techniques were used: the algorithm was developed in an ALGOL-like pseudolanguage and then encoded in FORTRAN IV. The basic input data to the system is a sine-wave signal, although future plans call for actual sampled voice as the input signal. The simulator is capable of studying all possible combinations of types and modes of calls through the use of five communication scenarios: single-hop system; double-hop, single-gateway system; double-hop, double-gateway system; mobile-to-wireline system; and wireline-to-mobile system. The transmitter, fading channel, and interference source simulation are also discussed.

  18. Computer simulation of bounded plasma systems

    SciTech Connect

    Lawson, W.S.

    1987-03-05

    The physical and numerical problems of kinetic simulation of a bounded electrostatic plasma system in one planar dimension are examined, and solutions to them are presented. These problems include particle absorption, reflection and emission at boundaries, the solution of Poisson's equation under non-periodic boundary conditions, and the treatment of an external circuit connecting the boundaries. Some comments are also made regarding the problems of higher dimensions. The methods which are described here are implemented in a code named PDW1, which is available from Professor C.K. Birdsall, Plasma Theory and Simulation Group, Cory Hall, University of California, Berkeley, CA 94720.
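    Solving Poisson's equation under non-periodic boundary conditions, as mentioned above, reduces in one planar dimension to a tridiagonal system with Dirichlet boundary values. A generic sketch using the Thomas algorithm (an illustration of the numerical problem, not the PDW1 implementation):

    ```python
    def poisson_1d(rho, dx, phi_left, phi_right):
        """Solve phi'' = -rho on interior grid points with fixed (Dirichlet)
        boundary potentials, via the Thomas algorithm for the tridiagonal system
        (phi[i-1] - 2*phi[i] + phi[i+1]) / dx**2 = -rho[i]."""
        n = len(rho)                      # number of interior points
        b = [-2.0] * n                    # diagonal
        c = [1.0] * n                     # super-diagonal
        d = [-r * dx * dx for r in rho]   # right-hand side
        d[0] -= phi_left                  # fold boundary values into the RHS
        d[-1] -= phi_right
        # forward elimination (sub-diagonal entries are all 1.0)
        for i in range(1, n):
            w = 1.0 / b[i - 1]
            b[i] -= w * c[i - 1]
            d[i] -= w * d[i - 1]
        # back substitution
        phi = [0.0] * n
        phi[-1] = d[-1] / b[-1]
        for i in range(n - 2, -1, -1):
            phi[i] = (d[i] - c[i] * phi[i + 1]) / b[i]
        return [phi_left] + phi + [phi_right]
    ```

    With zero charge density the solution is the linear potential between the two electrodes, a quick sanity check for the bounded (capacitor-like) geometry.
    
    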

  19. Computer Simulations in the Science Classroom.

    ERIC Educational Resources Information Center

    Richards, John; And Others

    1992-01-01

    Explorer is an interactive environment based on a constructivist epistemology of learning that integrates animated computer models with analytic capabilities for learning science. The system includes graphs, a spreadsheet, scripting, and interactive tools. Two examples involving the dynamics of colliding objects and electric circuits illustrate…

  20. Computer Simulation of Electric Field Lines.

    ERIC Educational Resources Information Center

    Kirkup, L.

    1985-01-01

    Describes a computer program which plots electric field line plots. Includes program listing, sample diagrams produced on a BBC model B microcomputer (which could be produced on other microcomputers by modifying the program), and a discussion of the properties of field lines. (JN)

  1. Using Computer Simulations to Integrate Learning.

    ERIC Educational Resources Information Center

    Liao, Thomas T.

    1983-01-01

    Describes the primary design criteria and the classroom activities involved in "The Yellow Light Problem," a minicourse on decision making in the secondary school Mathematics, Engineering and Science Achievement (MESA) program in California. Activities include lectures, discussions, science and math labs, computer labs, and development of…

  2. Computational Fluid Dynamic simulations of pipe elbow flow.

    SciTech Connect

    Homicz, Gregory Francis

    2004-08-01

    One problem facing today's nuclear power industry is flow-accelerated corrosion and erosion in pipe elbows. The Korean Atomic Energy Research Institute (KAERI) is performing experiments in their Flow-Accelerated Corrosion (FAC) test loop to better characterize these phenomena, and develop advanced sensor technologies for the condition monitoring of critical elbows on a continuous basis. In parallel with these experiments, Sandia National Laboratories is performing Computational Fluid Dynamic (CFD) simulations of the flow in one elbow of the FAC test loop. The simulations are being performed using the FLUENT commercial software developed and marketed by Fluent, Inc. The model geometry and mesh were created using the GAMBIT software, also from Fluent, Inc. This report documents the results of the simulations that have been made to date; baseline results employing the RNG k-ε turbulence model are presented. The predicted value for the diametrical pressure coefficient is in reasonably good agreement with published correlations. Plots of the velocities, pressure field, wall shear stress, and turbulent kinetic energy adjacent to the wall are shown within the elbow section. Somewhat to our surprise, these indicate that the maximum values of both wall shear stress and turbulent kinetic energy occur near the elbow entrance, on the inner radius of the bend. Additional simulations were performed for the same conditions, but with the RNG k-ε model replaced by either the standard k-ε or the realizable k-ε turbulence model. The predictions using the standard k-ε model are quite similar to those obtained in the baseline simulation. However, with the realizable k-ε model, more significant differences are evident. The maximums in both wall shear stress and turbulent kinetic energy now appear on the outer radius, near the elbow exit, and are approximately 11% and 14% greater, respectively, than those predicted in the baseline calculation.

  3. Numerical simulation of supersonic wake flow with parallel computers

    SciTech Connect

    Wong, C.C.; Soetrisno, M.

    1995-07-01

    Simulating a supersonic wake flow field behind a conical body is a computing intensive task. It requires a large number of computational cells to capture the dominant flow physics and a robust numerical algorithm to obtain a reliable solution. High performance parallel computers with unique distributed processing and data storage capability can provide this need. They have larger computational memory and faster computing time than conventional vector computers. We apply the PINCA Navier-Stokes code to simulate a wind-tunnel supersonic wake experiment on Intel Gamma, Intel Paragon, and IBM SP2 parallel computers. These simulations are performed to study the mean flow in the near wake region of a sharp, 7-degree half-angle, adiabatic cone at Mach number 4.3 and freestream Reynolds number of 40,600. Overall the numerical solutions capture the general features of the hypersonic laminar wake flow and compare favorably with the wind tunnel data. With a refined and clustering grid distribution in the recirculation zone, the calculated location of the rear stagnation point is consistent with the 2D axisymmetric and 3D experiments. In this study, we also demonstrate the importance of having a large local memory capacity within a computer node and the effective utilization of the number of computer nodes to achieve good parallel performance when simulating a complex, large-scale wake flow problem.

  4. Computer simulation of on-orbit manned maneuvering unit operations

    NASA Technical Reports Server (NTRS)

    Stuart, G. M.; Garcia, K. D.

    1986-01-01

    Simulation of spacecraft on-orbit operations is discussed with reference to Martin Marietta's Space Operations Simulation laboratory's use of computer software models to drive a six-degree-of-freedom moving base carriage and two target gimbal systems. In particular, key simulation issues and related computer software models associated with providing real-time, man-in-the-loop simulations of the Manned Maneuvering Unit (MMU) are addressed, with special attention given to how effectively these models and motion systems simulate the MMU's actual on-orbit operations. The weightless effects of the space environment require the development of entirely new devices for locomotion. Since access to space is very limited, it is necessary to design, build, and test these new devices within the physical constraints of earth using simulators. The simulation method discussed here is the technique of using computer software models to drive a Moving Base Carriage (MBC) that is capable of providing simultaneous six-degree-of-freedom motions. This method, utilized at Martin Marietta's Space Operations Simulation (SOS) laboratory, provides the ability to simulate the operation of manned spacecraft, provides the pilot with proper three-dimensional visual cues, and allows training of on-orbit operations. The purpose here is to discuss significant MMU simulation issues, the related models that were developed in response to these issues, and how effectively these models simulate the MMU's actual on-orbit operations.

  5. Advanced Simulation and Computing Business Plan

    SciTech Connect

    Rummel, E.

    2015-07-09

    To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with industry partners—partners upon whom the ASC Program relies for today’s and tomorrow’s high performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.

  6. Multidimensional computer simulation of Stirling cycle engines

    NASA Astrophysics Data System (ADS)

    Hall, Charles A.; Porsching, Thomas A.

    1992-07-01

    This report summarizes the activities performed under NASA-Grant NAG3-1097 during 1991. During that period, work centered on the following tasks: (1) to investigate more effective solvers for ALGAE; (2) to modify the plotting package for ALGAE; and (3) to validate ALGAE by simulating oscillating flow problems similar to those studied by Kurzweg and Ibrahim.

  7. Multidimensional computer simulation of Stirling cycle engines

    NASA Technical Reports Server (NTRS)

    Hall, Charles A.; Porsching, Thomas A.

    1992-01-01

    This report summarizes the activities performed under NASA-Grant NAG3-1097 during 1991. During that period, work centered on the following tasks: (1) to investigate more effective solvers for ALGAE; (2) to modify the plotting package for ALGAE; and (3) to validate ALGAE by simulating oscillating flow problems similar to those studied by Kurzweg and Ibrahim.

  8. Student Ecosystems Problem Solving Using Computer Simulation.

    ERIC Educational Resources Information Center

    Howse, Melissa A.

    The purpose of this study was to determine the procedural knowledge brought to, and created within, a pond ecology simulation by students. Environmental Decision Making (EDM) is an ecosystems modeling tool that allows users to pose their own problems and seek satisfying solutions. Of specific interest was the performance of biology majors who had…

  9. Process Training Derived from a Computer Simulation Theory

    ERIC Educational Resources Information Center

    Holzman, Thomas G.; And Others

    1976-01-01

    Discusses a study which investigated whether a computer simulation model could suggest subroutines that were instructable and whether instruction on these subroutines could facilitate subjects' solutions to the problem task. (JM)

  10. Development and Formative Evaluation of Computer Simulated College Chemistry Experiments.

    ERIC Educational Resources Information Center

    Cavin, Claudia S.; Cavin, E. D.

    1978-01-01

    This article describes the design, preparation, and initial evaluation of a set of computer-simulated chemistry experiments. The experiments entailed the use of an atomic emission spectroscope and a single-beam visible absorption spectrophotometer. (Author/IRT)

  11. Computer Simulations as a Teaching Tool in Community Colleges

    ERIC Educational Resources Information Center

    Grimm, Floyd M., III

    1978-01-01

    Describes the implementation of a computer assisted instruction program at Harford Community College. Eight different biology simulation programs are used covering topics in ecology, genetics, biochemistry, and sociobiology. (MA)

  12. MINEXP, A Computer-Simulated Mineral Exploration Program

    ERIC Educational Resources Information Center

    Smith, Michael J.; And Others

    1978-01-01

    This computer simulation is designed to put students into a realistic decision making situation in mineral exploration. This program can be used with different exploration situations such as ore deposits, petroleum, ground water, etc. (MR)

  13. Issues in Computer Simulation in Military Maintenance Training

    ERIC Educational Resources Information Center

    Brock, John F.

    1978-01-01

    This article discusses the state of computer-based simulation, reviews the early phases of ISD, suggests that current ISD approaches are missing critical inputs, and proposes a research and development program. (Author)

  14. An Exercise in Biometrical Genetics Based on a Computer Simulation.

    ERIC Educational Resources Information Center

    Murphy, P. J.

    1983-01-01

    Describes an exercise in biometrical genetics based on the noninteractive use of a computer simulation of a wheat hydridization program. Advantages of using the material in this way are also discussed. (Author/JN)

  15. GATE Monte Carlo simulation in a cloud computing environment

    NASA Astrophysics Data System (ADS)

    Rowedder, Blake Austin

    The GEANT4-based GATE is a unique and powerful Monte Carlo (MC) platform, which provides a single code library allowing the simulation of specific medical physics applications, e.g. PET, SPECT, CT, radiotherapy, and hadron therapy. However, this rigorous yet flexible platform is used only sparingly in the clinic due to its lengthy calculation time. By accessing the powerful computational resources of a cloud computing environment, GATE's runtime can be significantly reduced to clinically feasible levels without the sizable investment of a local high performance cluster. This study investigated a reliable and efficient execution of GATE MC simulations using a commercial cloud computing service. Amazon's Elastic Compute Cloud was used to launch several nodes equipped with GATE. Job data was initially broken up on the local computer, then uploaded to the worker nodes on the cloud. The results were automatically downloaded and aggregated on the local computer for display and analysis. Five simulations were repeated for every cluster size between 1 and 20 nodes. Ultimately, increasing cluster size resulted in a decrease in calculation time that could be expressed with an inverse power model. Comparing the benchmark results to the published values and error margins indicated that the simulation results were not affected by the cluster size, and thus that the integrity of a calculation is preserved in a cloud computing environment. The runtime of a 53 minute long simulation was decreased to 3.11 minutes when run on a 20-node cluster. The ability to improve the speed of simulation suggests that fast MC simulations are viable for imaging and radiotherapy applications. With high-performance computing continuing to fall in price and grow in accessibility, implementing Monte Carlo techniques with cloud computing for clinical applications will become increasingly attractive.
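    The inverse power runtime model mentioned above can be recovered from benchmark data with an ordinary least-squares fit in log-log space. A generic sketch (the function and the synthetic coefficients in the example are ours, not the study's fit):

    ```python
    import math

    def fit_inverse_power(nodes, runtimes):
        """Least-squares fit of T(n) = a * n**(-b) by linear regression on
        log T = log a - b * log n. Returns (a, b)."""
        xs = [math.log(n) for n in nodes]
        ys = [math.log(t) for t in runtimes]
        m = len(xs)
        xbar = sum(xs) / m
        ybar = sum(ys) / m
        slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
                sum((x - xbar) ** 2 for x in xs)
        a = math.exp(ybar - slope * xbar)  # intercept back-transformed from log space
        return a, -slope                   # b is the negated slope
    ```

    With b near 1 the fit indicates close-to-linear speedup; a fitted b noticeably below 1 quantifies the diminishing returns of adding nodes.
    
    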

  16. Launch Site Computer Simulation and its Application to Processes

    NASA Technical Reports Server (NTRS)

    Sham, Michael D.

    1995-01-01

    This paper provides an overview of computer simulation, the Lockheed-developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon-driven model that uses commercial off-the-shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.

  17. Computer Simulated Growth of Icosahedral Glass

    NASA Astrophysics Data System (ADS)

    Leino, Y. A. J.; Salomaa, M. M.

    1990-01-01

    One possible model for materials displaying classically forbidden symmetry properties (apart from perfect quasicrystals) is the icosahedral glass model. We simulate the random growth of two types of two-dimensional icosahedral glasses consisting of Penrose tiles. First, we restrict the growth with the arrow rules; then we let the structure develop totally freely. The diffraction patterns have a clear five-fold symmetry in both cases. The diffraction peak intensities do not differ, but the shapes of the central peaks vary depending on whether the arrow rules are imposed or not. Finally, we show that the half-width of the central peak decreases as the size of the simulation increases, until a finite disorder-limited value is reached. This phenomenon is in agreement with the behaviour of physical quasicrystallites and in contradiction with perfect mathematical quasicrystals, which have Bragg peaks of zero width.

  18. Computational Aerothermodynamic Simulation Issues on Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.; White, Jeffery A.

    2004-01-01

    The synthesis of physical models for gas chemistry and turbulence from the structured grid codes LAURA and VULCAN into the unstructured grid code FUN3D is described. A directionally symmetric, Total Variation Diminishing (STVD) algorithm and an entropy fix (eigenvalue limiter) keyed to local cell Reynolds number are introduced to improve solution quality for hypersonic aeroheating applications. A simple grid-adaptation procedure is incorporated within the flow solver. Simulations of flow over an ellipsoid (perfect gas, inviscid) and the Shuttle Orbiter (viscous, chemical nonequilibrium), with comparisons to the structured grid solvers LAURA (cylinder, Shuttle Orbiter) and VULCAN (flat plate), are presented to show current capabilities. The quality of heating in 3D stagnation regions is very sensitive to algorithm options. In general, high-aspect-ratio tetrahedral elements complicate the simulation of high-Reynolds-number viscous flow compared to locally structured meshes aligned with the flow.

  19. Computer simulation of a geomagnetic substorm

    NASA Technical Reports Server (NTRS)

    Lyon, J. G.; Brecht, S. H.; Huba, J. D.; Fedder, J. A.; Palmadesso, P. J.

    1981-01-01

    A global two-dimensional simulation of a substormlike process occurring in earth's magnetosphere is presented. The results are consistent with an empirical substorm model - the neutral-line model. Specifically, the introduction of a southward interplanetary magnetic field forms an open magnetosphere. Subsequently, a substorm neutral line forms at about 15 earth radii or closer in the magnetotail, and plasma sheet thinning and plasma acceleration occur. Eventually the substorm neutral line moves tailward toward its presubstorm position.

  20. Computer simulation of surface and film processes

    NASA Technical Reports Server (NTRS)

    Tiller, W. A.

    1981-01-01

    A molecular dynamics technique based upon Lennard-Jones type pair interactions is used to investigate time-dependent as well as equilibrium properties. The case study deals with systems containing Si and O atoms. In this case a more involved potential energy function (PEF) is employed and the system is simulated via a Monte-Carlo procedure. This furnishes the equilibrium properties of the system at its interfaces and surfaces as well as in the bulk.

  1. Evaluation of a Computer Simulation in a Therapeutics Case Discussion.

    ERIC Educational Resources Information Center

    Kinkade, Raenel E.; And Others

    1995-01-01

    A computer program was used to simulate a case presentation in pharmacotherapeutics. Students (n=24) used their knowledge of the disease (glaucoma) and various topical agents on the computer program's formulary to "treat" the patient. Comparison of results with a control group found the method as effective as traditional case presentation on…

  2. Use of Computer Simulations in Microbial and Molecular Genetics.

    ERIC Educational Resources Information Center

    Wood, Peter

    1984-01-01

    Describes five computer programs: four simulations of genetic and physical mapping experiments and one interactive learning program on the genetic coding mechanism. The programs were originally written in BASIC for the VAX-11/750 V.3. mainframe computer and have been translated into Applesoft BASIC for Apple IIe microcomputers. (JN)

  3. COFLO: A Computer Aid for Teaching Ecological Simulation.

    ERIC Educational Resources Information Center

    Levow, Roy B.

    A computer-assisted course was designed to provide students with an understanding of modeling and simulation techniques in quantitative ecology. It deals with continuous systems and has two segments. One develops mathematical and computer tools, beginning with abstract systems and their relation to physical systems. Modeling principles are next…

  4. Coached, Interactive Computer Simulations: A New Technology for Training.

    ERIC Educational Resources Information Center

    Hummel, Thomas J.

    This paper provides an overview of a prototype simulation-centered intelligent computer-based training (CBT) system--implemented using expert system technology--which provides: (1) an environment in which trainees can learn and practice complex skills; (2) a computer-based coach or mentor to critique performance, suggest improvements, and provide…

  5. Generating dynamic simulations of movement using computed muscle control.

    PubMed

    Thelen, Darryl G; Anderson, Frank C; Delp, Scott L

    2003-03-01

    Computation of muscle excitation patterns that produce coordinated movements of muscle-actuated dynamic models is an important and challenging problem. Using dynamic optimization to compute excitation patterns comes at a large computational cost, which has limited the use of muscle-actuated simulations. This paper introduces a new algorithm, which we call computed muscle control, that uses static optimization along with feedforward and feedback controls to drive the kinematic trajectory of a musculoskeletal model toward a set of desired kinematics. We illustrate the algorithm by computing a set of muscle excitations that drive a 30-muscle, 3-degree-of-freedom model of pedaling to track measured pedaling kinematics and forces. Only 10 min of computer time were required to compute muscle excitations that reproduced the measured pedaling dynamics, which is over two orders of magnitude faster than conventional dynamic optimization techniques. Simulated kinematics were within 1 degree of experimental values, simulated pedal forces were within one standard deviation of measured pedal forces for nearly all of the crank cycle, and computed muscle excitations were similar in timing to measured electromyographic patterns. The speed and accuracy of this new algorithm improves the feasibility of using detailed musculoskeletal models to simulate and analyze movement. PMID:12594980
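
    The feedforward-plus-feedback structure at the core of computed muscle control can be sketched in a few lines. The function name and gain values below are illustrative assumptions, not taken from the paper:

```python
def cmc_target_acceleration(q, qd, q_exp, qd_exp, qdd_exp, kp=100.0, kv=20.0):
    """Computed-muscle-control style tracking: the target acceleration is
    the experimental (feedforward) acceleration plus PD feedback on the
    position and velocity tracking errors. Gains kp and kv are
    illustrative, not values from the paper."""
    return qdd_exp + kv * (qd_exp - qd) + kp * (q_exp - q)

# With zero tracking error the experimental acceleration passes through
# unchanged; any error adds a corrective term that the subsequent static
# optimization step would convert into muscle excitations.
```

    Feeding this target acceleration to a static optimizer at each time step, rather than solving one large dynamic optimization over the whole movement, is what yields the reported orders-of-magnitude speed-up.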

  6. Teaching Macroeconomics with a Computer Simulation. Final Report.

    ERIC Educational Resources Information Center

    Dolbear, F. Trenery, Jr.

    The study of macroeconomics--the determination and control of aggregative variables such as gross national product, unemployment and inflation--may be facilitated by the use of a computer simulation policy game. An aggregative model of the economy was constructed and programmed for a computer and (hypothetical) historical data were generated. The…

  7. The Use of Computer Simulations in High School Curricula.

    ERIC Educational Resources Information Center

    Visich, Marian, Jr.; Braun, Ludwig

    The Huntington Computer Project has developed 17 simulation games which can be used for instructional purposes in high schools. These games were designed to run on digital computers and to deal with material from either biology, physics, or social studies. Distribution was achieved through the Digital Equipment Corporation, which disseminated…

  8. A computer simulator for development of engineering system design methodologies

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Sobieszczanski-Sobieski, J.

    1987-01-01

    A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.

  9. A New Boundary Condition for Computer Simulations of Interfacial Systems

    SciTech Connect

    Wong, Ka-Yiu; Pettitt, Bernard M.; Montgomery, B.

    2000-08-18

    A new boundary condition for computer simulations of interfacial systems is presented. The simulation box used in this boundary condition is the asymmetric unit of space group Pb, and it contains only one interface. Compared to the simulation box using common periodic boundary conditions which contains two interfaces, the number of particles in the simulation is reduced by half. This boundary condition was tested against common periodic boundary conditions in molecular dynamics simulations of liquid water interacting with hydroxylated silica surfaces. It yielded results essentially identical to those with periodic boundary conditions and consumed less CPU time for comparable statistics.

  10. A new boundary condition for computer simulations of interfacial systems

    NASA Astrophysics Data System (ADS)

    Wong, Ka-Yiu; Pettitt, B. Montgomery

    2000-08-01

    A new boundary condition for computer simulations of interfacial systems is presented. The simulation box used in this boundary condition is the asymmetric unit of space group Pb, and it contains only one interface. Compared to the simulation box using common periodic boundary conditions which contains two interfaces, the number of particles in the simulation is reduced by half. This boundary condition was tested against common periodic boundary conditions in molecular dynamics simulations of liquid water interacting with hydroxylated silica surfaces. It yielded results essentially identical to those with periodic boundary conditions and consumed less CPU time for comparable statistics.

  11. Developing Computer-Based Interactive Video Simulations on Questioning Strategies.

    ERIC Educational Resources Information Center

    Rogers, Randall; Rieff, Judith

    1989-01-01

    This article presents a rationale for development and implementation of computer based interactive videotape (CBIV) in preservice teacher education; identifies advantages of CBIV simulations over other practice exercises; describes economical production procedures; discusses implications and importance of these simulations; and makes…

  12. Computer Simulation of Incomplete-Data Interpretation Exercise.

    ERIC Educational Resources Information Center

    Robertson, Douglas Frederick

    1987-01-01

    Described is a computer simulation that was used to help general education students enrolled in a large introductory geology course. The purpose of the simulation is to learn to interpret incomplete data. Students design a plan to collect bathymetric data for an area of the ocean. Procedures used by the students and instructor are included.…

  13. Learner Perceptions of Realism and Magic in Computer Simulations.

    ERIC Educational Resources Information Center

    Hennessy, Sara; O'Shea, Tim

    1993-01-01

    Discusses the possible lack of credibility in educational interactive computer simulations. Topics addressed include "Shopping on Mars," a collaborative adventure game for arithmetic calculation that uses direct manipulation in the microworld; the Alternative Reality Kit, a graphical animated environment for creating interactive simulations; and…

  14. Preliminary Evaluation of a Computer Simulation of Long Cane Use.

    ERIC Educational Resources Information Center

    Chubon, Robert A.; Keith, Ashley D.

    1989-01-01

    Developed and evaluated long cane mobility computer simulation as visual rehabilitation training device and research tool in graduate students assigned to basic instruction (BI) (N=10) or enhanced instruction (EI) (N=9). Found higher percentage of EI students completed simulation task. Concluded that students registered positive understanding changes,…

  15. Enhancing Computer Science Education with a Wireless Intelligent Simulation Environment

    ERIC Educational Resources Information Center

    Cook, Diane J.; Huber, Manfred; Yerraballi, Ramesh; Holder, Lawrence B.

    2004-01-01

    The goal of this project is to develop a unique simulation environment that can be used to increase students' interest and expertise in Computer Science curriculum. Hands-on experience with physical or simulated equipment is an essential ingredient for learning, but many approaches to training develop a separate piece of equipment or software for…

  16. Plant Closings and Capital Flight: A Computer-Assisted Simulation.

    ERIC Educational Resources Information Center

    Warner, Stanley; Breitbart, Myrna M.

    1989-01-01

    A course at Hampshire College was designed to simulate the decision-making environment in which constituencies in a medium-sized city would respond to the closing and relocation of a major corporate plant. The project, constructed as a role simulation with a computer component, is described. (MLW)

  17. Computer Simulation of Laboratory Experiments: An Unrealized Potential.

    ERIC Educational Resources Information Center

    Magin, D. J.; Reizes, J. A.

    1990-01-01

    Discussion of the use of computer simulation for laboratory experiments in undergraduate engineering education focuses on work at the University of New South Wales in the instructional design and software development of a package simulating a heat exchange device. The importance of integrating theory, design, and experimentation is also discussed.…

  18. Design Model for Learner-Centered, Computer-Based Simulations.

    ERIC Educational Resources Information Center

    Hawley, Chandra L.; Duffy, Thomas M.

    This paper presents a model for designing computer-based simulation environments within a constructivist framework for the K-12 school setting. The following primary criteria for the development of simulations are proposed: (1) the problem needs to be authentic; (2) the cognitive demand in learning should be authentic; (3) scaffolding supports a…

  19. Effectiveness of an Endodontic Diagnosis Computer Simulation Program.

    ERIC Educational Resources Information Center

    Fouad, Ashraf F.; Burleson, Joseph A.

    1997-01-01

    Effectiveness of a computer simulation to teach endodontic diagnosis was assessed using three groups (n=34,32,24) of dental students. All were lectured on diagnosis, pathology, and radiographic interpretation. One group then used the simulation, another had a seminar on the same material, and the third group had no further instruction. Results…

  20. The Design, Development, and Evaluation of an Evaluative Computer Simulation.

    ERIC Educational Resources Information Center

    Ehrlich, Lisa R.

    This paper discusses evaluation design considerations for a computer based evaluation simulation developed at the University of Iowa College of Medicine in Cardiology to assess the diagnostic skills of primary care physicians and medical students. The simulation developed allows for the assessment of diagnostic skills of physicians in the…

  1. Computer Simulation of the Population Growth (Schizosaccharomyces Pombe) Experiment.

    ERIC Educational Resources Information Center

    Daley, Michael; Hillier, Douglas

    1981-01-01

    Describes a computer program (available from authors) developed to simulate "Growth of a Population (Yeast) Experiment." Students actively revise the counting techniques with realistically simulated haemocytometer or eye-piece grid and are reminded of the necessary dilution technique. Program can be modified to introduce such variables as…

  2. Computer simulation program is adaptable to industrial processes

    NASA Technical Reports Server (NTRS)

    Schultz, F. E.

    1966-01-01

    The Reaction kinetics ablation program /REKAP/, developed to simulate ablation of various materials, provides mathematical formulations for computer programs which can simulate certain industrial processes. The programs are based on the use of nonsymmetrical difference equations that are employed to solve complex partial differential equation systems.
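
    The difference-equation approach described can be illustrated with a minimal explicit finite-difference step for 1-D heat conduction. This sketch uses a standard symmetric stencil for clarity; it is not REKAP's actual nonsymmetrical formulation:

```python
import numpy as np

def heat_step(u, alpha, dx, dt):
    """One explicit finite-difference update for the 1-D heat equation,
    the kind of difference-equation step underlying ablation and other
    transient-conduction codes. Boundary values are held fixed."""
    r = alpha * dt / dx**2          # explicit stability requires r <= 0.5
    u_new = u.copy()
    u_new[1:-1] = u[1:-1] + r * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return u_new
```

    A uniform or linear temperature profile is left unchanged by the update, which is a quick sanity check that the stencil conserves steady states.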

  3. Investigating the Effectiveness of Computer Simulations for Chemistry Learning

    ERIC Educational Resources Information Center

    Plass, Jan L.; Milne, Catherine; Homer, Bruce D.; Schwartz, Ruth N.; Hayward, Elizabeth O.; Jordan, Trace; Verkuilen, Jay; Ng, Florrie; Wang, Yan; Barrientos, Juan

    2012-01-01

    Are well-designed computer simulations an effective tool to support student understanding of complex concepts in chemistry when integrated into high school science classrooms? We investigated scaling up the use of a sequence of simulations of kinetic molecular theory and associated topics of diffusion, gas laws, and phase change, which we designed…

  4. Simulation of Robot Kinematics Using Interactive Computer Graphics.

    ERIC Educational Resources Information Center

    Leu, M. C.; Mahajan, R.

    1984-01-01

    Development of a robot simulation program based on geometric transformation softwares available in most computer graphics systems and program features are described. The program can be extended to simulate robots coordinating with external devices (such as tools, fixtures, conveyors) using geometric transformations to describe the…

  5. Computer Models Simulate Fine Particle Dispersion

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  6. Computer simulator for training operators of thermal cameras

    NASA Astrophysics Data System (ADS)

    Chrzanowski, Krzysztof; Krupski, Marcin

    2004-08-01

    A PC-based image generator SIMTERM developed for training operators of non-airborne military thermal imaging systems is presented in this paper. SIMTERM allows its users to generate images closely resembling thermal images of many military type targets at different scenarios obtained with the simulated thermal camera. High fidelity of simulation was achieved through the use of measurable parameters of the thermal camera as input data. Two modified versions of this computer simulator, developed for designers and test teams, are also presented.

  7. Computer simulations for minds-on learning with ``Project Spectra!''

    NASA Astrophysics Data System (ADS)

    Wood, E. L.; Renfrow, S.; Marks, N.; Christofferson, R.

    2010-12-01

    How do we gain information about the Sun? How do we know Mars has CO2 or that Titan has a nitrogen-rich atmosphere? How do we use light in astronomy? These concepts are something education professionals generally struggle with because they are abstract. Making use of visualizations and presenting material so it can be manipulated is the easiest way to conquer abstractions to bring them home to students. Using simulations and computer interactives (games) where students experience and manipulate the information makes concepts accessible. “Project Spectra!” is a science and engineering program that uses computer-based Flash interactives to expose students to astronomical spectroscopy and actual data in a way that is not possible with traditional in-class activities. Visualizing lessons with multi-media is a way to solidify understanding and retention of knowledge and is completely unlike its paper-and-pencil counterpart. To engage students in “Project Spectra!”, students are given a mission, which connects them with the research at hand. Missions range from exploring remote planetary atmospheres and surfaces, experimenting with the Sun using different filters, and comparing spectroscopic atmospheric features between different bodies. Additionally, students have an opportunity to learn about NASA missions, view movies, and see images connected with their mission. In the end, students are asked critical thinking questions and conduct web-based research. These interactives complement the in-class activities where students engineer spectrographs and explore the electromagnetic spectrum.

  8. Computer Simulations to Support Science Instruction and Learning: A critical review of the literature

    NASA Astrophysics Data System (ADS)

    Smetana, Lara Kathleen; Bell, Randy L.

    2012-06-01

    Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is currently known and providing guidance for future research. We report on the outcomes of 61 empirical studies dealing with the efficacy of, and implications for, computer simulations in science instruction. The overall findings suggest that simulations can be as effective, and in many ways more effective, than traditional (i.e. lecture-based, textbook-based and/or physical hands-on) instructional practices in promoting science content knowledge, developing process skills, and facilitating conceptual change. As with any other educational tool, the effectiveness of computer simulations is dependent upon the ways in which they are used. Thus, we outline specific research-based guidelines for best practice. Computer simulations are most effective when they (a) are used as supplements; (b) incorporate high-quality support structures; (c) encourage student reflection; and (d) promote cognitive dissonance. Used appropriately, computer simulations involve students in inquiry-based, authentic science explorations. Additionally, as educational technologies continue to evolve, advantages such as flexibility, safety, and efficiency deserve attention.

  9. Advanced computer graphic techniques for laser range finder (LRF) simulation

    NASA Astrophysics Data System (ADS)

    Bedkowski, Janusz; Jankowski, Stanislaw

    2008-11-01

    This paper presents advanced computer graphics techniques for laser range finder (LRF) simulation. The LRF is a common sensor for unmanned ground vehicles, autonomous mobile robots and security applications. Because the cost of the measurement system is extremely high, a simulation tool was designed. The simulation gives an opportunity to exercise algorithms such as obstacle avoidance [1], SLAM for robot localization [2], detection of vegetation and water obstacles in the surroundings of the robot chassis [3], and LRF measurement in crowds of people [1]. An Axis Aligned Bounding Box (AABB) technique and an alternative technique based on CUDA (NVIDIA Compute Unified Device Architecture) are presented.
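
    The AABB test at the heart of such a simulator can be sketched with the standard slab method. This is a minimal illustration of the technique, not the paper's implementation:

```python
def ray_aabb(origin, direction, box_min, box_max):
    """Slab-method ray/AABB intersection, the kind of test a simulated
    laser range finder beam performs against scene bounding boxes.
    Returns the entry distance along the ray, or None on a miss."""
    t_near, t_far = float("-inf"), float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:                      # ray parallel to this slab
            if o < lo or o > hi:
                return None
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            if t1 > t2:
                t1, t2 = t2, t1
            t_near, t_far = max(t_near, t1), min(t_far, t2)
            if t_near > t_far or t_far < 0:     # slabs do not overlap
                return None
    return max(t_near, 0.0)
```

    A simulated LRF beam would run this test against each obstacle's bounding box and keep the smallest hit distance as the range sample.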

  10. Computer Simulations of Supercooled Liquids and Glasses

    NASA Astrophysics Data System (ADS)

    Kob, Walter

    Glasses are materials that are ubiquitous in our daily life. We find them in such diverse items as window panes, optical fibers, computer chips, ceramics, all of which are oxide glasses, as well as in food, foams, polymers, gels, which are mainly of organic nature. Roughly speaking, glasses are solid materials that have no translational or orientational order on the scale beyond O(10) diameters of the constituent particles (atoms, colloids, …) [1]. Note that these materials are not necessarily homogeneous since, e.g., alkali-glasses such as Na2O-SiO2 show (disordered!) structural features on the length scale of 6-10 Å (compare to the interatomic distance of 1-2 Å) and gels can have structural inhomogeneities that extend up to macroscopic length scales.

  11. Estimating uncertainties in statistics computed from direct numerical simulation

    NASA Astrophysics Data System (ADS)

    Oliver, Todd A.; Malaya, Nicholas; Ulerich, Rhys; Moser, Robert D.

    2014-03-01

    Rigorous assessment of uncertainty is crucial to the utility of direct numerical simulation (DNS) results. Uncertainties in the computed statistics arise from two sources: finite statistical sampling and the discretization of the Navier-Stokes equations. Due to the presence of non-trivial sampling error, standard techniques for estimating discretization error (such as Richardson extrapolation) fail or are unreliable. This work provides a systematic and unified approach for estimating these errors. First, a sampling error estimator that accounts for correlation in the input data is developed. Then, this sampling error estimate is used as part of a Bayesian extension of Richardson extrapolation in order to characterize the discretization error. These methods are tested using the Lorenz equations and are shown to perform well. These techniques are then used to investigate the sampling and discretization errors in the DNS of a wall-bounded turbulent flow at Reτ ≈ 180. Both small (Lx/δ × Lz/δ = 4π × 2π) and large (Lx/δ × Lz/δ = 12π × 4π) domain sizes are investigated. For each case, a sequence of meshes was generated by first designing a "nominal" mesh using standard heuristics for wall-bounded simulations. These nominal meshes were then coarsened to generate a sequence of grid resolutions appropriate for the Bayesian Richardson extrapolation method. In addition, the small box case is computationally inexpensive enough to allow simulation on a finer mesh, enabling the results of the extrapolation to be validated in a weak sense. For both cases, it is found that while the sampling uncertainty is large enough to make the order of accuracy difficult to determine, the estimated discretization errors are quite small. This indicates that the commonly used heuristics provide adequate resolution for this class of problems. However, it is also found that, for some quantities, the discretization error is not small relative to sampling error, indicating that the
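
    The deterministic core of Richardson extrapolation, which the Bayesian extension described above augments with sampling-error estimates, can be sketched as follows (function names are illustrative):

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Classical Richardson estimate of the observed order of accuracy
    from solutions on three systematically refined meshes (ratio r).
    The Bayesian extension in the abstract additionally accounts for
    statistical sampling error; this is only the deterministic core."""
    return math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)

def extrapolated_value(f_medium, f_fine, p, r=2.0):
    """Richardson-extrapolated estimate of the mesh-converged value,
    given the observed order p."""
    return f_fine + (f_fine - f_medium) / (r**p - 1.0)
```

    For a second-order scheme yielding 1.16, 1.04 and 1.01 on meshes refined by a factor of 2, the observed order is 2 and the extrapolated value is 1.0; non-trivial sampling error in DNS statistics is exactly what makes this naive version unreliable, motivating the Bayesian treatment.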

  12. Computer simulation of digital signal modulation techniques in satellite communications

    NASA Astrophysics Data System (ADS)

    Carlson, C. D.

    1985-09-01

    A tutorial on digital signal modulation techniques used in satellite communications is presented, including computer simulation of the modulation techniques introduced. The purpose is to introduce digital signal modulation techniques and, through computer simulation, generate statistics that represent the characteristics of the FFT for each signal type. Further, an analysis of the statistics of the FFTs is conducted to determine whether there is any relationship between the components of the FFTs of the different signals. The statistic used to investigate this possible relationship is the F-distribution. The computer simulation is written and conducted in the FORTRAN programming language. A copy of the program, results of the simulation, and the statistical analysis conducted are included in the appendices.
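
    The kind of per-signal FFT statistics described can be sketched in a few lines (BPSK only, with illustrative carrier and sampling parameters; the original study used FORTRAN):

```python
import numpy as np

def fft_stats(symbols, fc=0.1, sps=8):
    """Modulate +/-1 symbols onto a cosine carrier (BPSK), take the FFT,
    and return the mean and variance of the magnitude spectrum -- the
    sort of per-signal-type statistics the simulation compares.
    fc is the normalized carrier frequency, sps the samples per symbol."""
    bb = np.repeat(np.asarray(symbols, dtype=float), sps)  # rectangular pulses
    n = np.arange(bb.size)
    signal = bb * np.cos(2.0 * np.pi * fc * n)
    mag = np.abs(np.fft.fft(signal))
    return mag.mean(), mag.var()
```

    Comparable magnitude-spectrum statistics computed for other signal types (e.g. QPSK or FSK) could then be compared with an F-test on the variances, in the spirit of the analysis described.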

  13. A demonstrative model of a lunar base simulation on a personal computer

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The initial demonstration model of a lunar base simulation is described. This initial model was developed on the personal computer level to demonstrate feasibility and technique before proceeding to a larger computer-based model. The demonstration model was built with Lotus Symphony Version 1.1 software on a personal computer with the MS-DOS operating system. The personal computer-based model determined the applicability of lunar base modeling techniques developed at an LSPI/NASA workshop. In addition, the personal computer-based demonstration model defined a modeling structure that could be employed on a larger, more comprehensive VAX-based lunar base simulation. Refinement of this personal computer model and the development of a VAX-based model is planned in the near future.

  14. Computational Simulations and the Scientific Method

    NASA Technical Reports Server (NTRS)

    Kleb, Bil; Wood, Bill

    2005-01-01

    As scientific simulation software becomes more complicated, the scientific-software implementor's need for component tests from new model developers becomes more crucial. The community's ability to follow the basic premise of the Scientific Method requires independently repeatable experiments, and model innovators are in the best position to create these test fixtures. Scientific software developers also need to quickly judge the value of the new model, i.e., its cost-to-benefit ratio in terms of gains provided by the new model and implementation risks such as cost, time, and quality. This paper asks two questions. The first is whether other scientific software developers would find published component tests useful, and the second is whether model innovators think publishing test fixtures is a feasible approach.

  15. Computer simulations of adsorbed liquid crystal films

    NASA Astrophysics Data System (ADS)

    Wall, Greg D.; Cleaver, Douglas J.

    2003-01-01

    The structures adopted by adsorbed thin films of Gay-Berne particles in the presence of a coexisting vapour phase are investigated by molecular dynamics simulation. The films are adsorbed at a flat substrate which favours planar anchoring, whereas the nematic-vapour interface favours normal alignment. On cooling, a system with a high molecule-substrate interaction strength exhibits substrate-induced planar orientational ordering and considerable stratification is observed in the density profiles. In contrast, a system with weak molecule-substrate coupling adopts a director orientation orthogonal to the substrate plane, owing to the increased influence of the nematic-vapour interface. There are significant differences between the structures adopted at the two interfaces, in contrast with the predictions of density functional treatments of such systems.

  16. Computing abstraction hierarchies by numerical simulation

    SciTech Connect

    Bundy, A.; Giunchiglia, F.; Sebastiani, R.; Walsh, T.

    1996-12-31

    We present a novel method for building ABSTRIPS-style abstraction hierarchies in planning. The aim of this method is to minimize the amount of backtracking between abstraction levels. Previous approaches have determined the criticality of operator preconditions by reasoning about plans directly. Here, we adopt a simpler and faster approach where we use numerical simulation of the planning process. We demonstrate the theoretical advantages of our approach by identifying some simple properties lacking in previous approaches but possessed by our method. We demonstrate the empirical advantages of our approach by a set of four benchmark experiments using the ABTWEAK system. We compare the quality of the abstraction hierarchies generated with those built by the ALPINE and HIGHPOINT algorithms.

  17. Osmosis : a molecular dynamics computer simulation study

    NASA Astrophysics Data System (ADS)

    Lion, Thomas

    Osmosis is a phenomenon of critical importance in a variety of processes ranging from the transport of ions across cell membranes and the regulation of blood salt levels by the kidneys to the desalination of water and the production of clean energy using potential osmotic power plants. However, despite its importance and over one hundred years of study, there is an ongoing confusion concerning the nature of the microscopic dynamics of the solvent particles in their transfer across the membrane. In this thesis the microscopic dynamical processes underlying osmotic pressure and concentration gradients are investigated using molecular dynamics (MD) simulations. I first present a new derivation for the local pressure that can be used for determining osmotic pressure gradients. Using this result, the steady-state osmotic pressure is studied in a minimal model for an osmotic system and the steady-state density gradients are explained using a simple mechanistic hopping model for the solvent particles. The simulation setup is then modified, allowing us to explore the timescales involved in the relaxation dynamics of the system in the period preceding the steady state. Further consideration is also given to the relative roles of diffusive and non-diffusive solvent transport in this period. Finally, in a novel modification to the classic osmosis experiment, the solute particles are driven out-of-equilibrium by the input of energy. The effect of this modification on the osmotic pressure and the osmotic flow is studied and we find that active solute particles can cause reverse osmosis to occur. The possibility of defining a new "osmotic effective temperature" is also considered and compared to the results of diffusive and kinetic temperatures.

  18. The computer scene generation for star simulator hardware-in-the-loop simulation

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Yu, Hong; Du, Huijie; Lei, Jie

    2011-08-01

    The star sensor simulation system is used to test star sensor performance on the ground; the sensor itself is designed for star identification and spacecraft attitude determination. For hardware-in-the-loop simulation of the star sensor simulation system, a computer star scene based on an astronomical star chart is generated using OpenGL.

  19. Computational simulation of the nonlinear response of suspension bridges

    SciTech Connect

    McCallen, D.B.; Astaneh-Asl, A.

    1997-10-01

    Accurate computational simulation of the dynamic response of long-span bridges presents one of the greatest challenges facing the earthquake engineering community. The size of these structures, in terms of physical dimensions and number of main load bearing members, makes computational simulation of transient response an arduous task. Discretization of a large bridge with general purpose finite element software often results in a computational model of such size that excessive computational effort is required for three dimensional nonlinear analyses. The aim of the current study was the development of efficient, computationally based methodologies for the nonlinear analysis of cable supported bridge systems which would allow accurate characterization of a bridge with a relatively small number of degrees of freedom. This work has led to the development of a special purpose software program for the nonlinear analysis of cable supported bridges, and the methodologies and software are described and illustrated in this paper.

  20. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    NASA Astrophysics Data System (ADS)

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-01

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. This work was presented in part at the 2010 Annual Meeting of the American Association of Physicists in Medicine (AAPM), Philadelphia, PA.

  1. Toward Real-Time Monte Carlo Simulation Using a Commercial Cloud Computing Infrastructure+

    PubMed Central

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-01-01

    Purpose Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. Methods We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the Message Passing Interface (MPI), and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. Results The output of the cloud-based MC simulation is identical to that produced by the single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 hours on a local computer can be executed in 3.3 minutes on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Conclusion Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. PMID:21841211

  2. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    SciTech Connect

    Mike Bockelie; Dave Swensen; Martin Denison; Zumao Chen; Mike Maguire; Adel Sarofim; Changguan Yang; Hong-Shig Shim

    2004-01-28

    This is the thirteenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a Virtual Engineering-based framework for simulating the performance of Advanced Power Systems. Within the last quarter, good progress has been made on all aspects of the project. Software development efforts have focused on a preliminary detailed software design for the enhanced framework. Given the complexity of the individual software tools from each team (i.e., Reaction Engineering International, Carnegie Mellon University, Iowa State University), a robust, extensible design is required for the success of the project. In addition to achieving a preliminary software design, significant progress has been made on several development tasks for the program. These include: (1) the enhancement of the controller user interface to support detachment from the Computational Engine and support for multiple computer platforms, (2) modification of the Iowa State University interface-to-kernel communication mechanisms to meet the requirements of the new software design, (3) decoupling of the Carnegie Mellon University computational models from their parent IECM (Integrated Environmental Control Model) user interface for integration with the new framework and (4) development of a new CORBA-based model interfacing specification. A benchmarking exercise to compare process and CFD based models for entrained flow gasifiers was completed. A summary of our work on intrinsic kinetics for modeling coal gasification has been completed. Plans for implementing soot and tar models into our entrained flow gasifier models are outlined. Plans for implementing a model for mercury capture based on conventional capture technology, but applied to an IGCC system, are outlined.

  3. Numerical simulation of polymer flows: A parallel computing approach

    SciTech Connect

    Aggarwal, R.; Keunings, R.; Roux, F.X.

    1993-12-31

    We present a parallel algorithm for the numerical simulation of viscoelastic fluids on distributed memory computers. The algorithm has been implemented within a general-purpose commercial finite element package used in polymer processing applications. Results obtained on the Intel iPSC/860 computer demonstrate high parallel efficiency in complex flow problems. However, since the computational load is unknown a priori, load balancing is a challenging issue. We have developed an adaptive allocation strategy which dynamically reallocates the work load to the processors based upon the history of the computational procedure. We compare the results obtained with the adaptive and static scheduling schemes.
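
    The adaptive allocation idea can be sketched as a greedy repartition of measured per-element costs from the previous time step. This is an illustration of history-based load balancing in general (the classic LPT heuristic), not the authors' actual scheme.

```python
def rebalance(costs, n_procs):
    """Greedy (LPT) repartition: sort elements by measured cost from the
    previous step, then assign each to the currently lightest processor."""
    order = sorted(range(len(costs)), key=lambda i: -costs[i])
    loads = [0.0] * n_procs
    assign = [0] * len(costs)
    for i in order:
        p = min(range(n_procs), key=lambda q: loads[q])  # lightest proc
        assign[i] = p
        loads[p] += costs[i]
    return assign, loads
```

    Re-running this each time step with freshly measured costs is what turns a static schedule into an adaptive one.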

  4. Large Scale Computer Simulation of Erythrocyte Membranes

    NASA Astrophysics Data System (ADS)

    Harvey, Cameron; Revalee, Joel; Laradji, Mohamed

    2007-11-01

    The cell membrane is crucial to the life of the cell. Apart from partitioning the inner and outer environments of the cell, it also acts as a support for complex and specialized molecular machinery, important for both the mechanical integrity of the cell and its multitude of physiological functions. Due to its relative simplicity, the red blood cell has been a favorite experimental prototype for investigations of the structural and functional properties of the cell membrane. The erythrocyte membrane is a composite quasi two-dimensional structure composed essentially of a self-assembled fluid lipid bilayer and a polymerized protein meshwork, referred to as the cytoskeleton or membrane skeleton. In the case of the erythrocyte, the polymer meshwork is mainly composed of spectrin, anchored to the bilayer through specialized proteins. Using a coarse-grained model, recently developed by us, of self-assembled lipid membranes with implicit solvent and soft-core potentials, we simulated large-scale red-blood-cell bilayers with dimensions of ∼10^-1 μm^2, with explicit cytoskeleton. Our aim is to investigate the renormalization of the elastic properties of the bilayer due to the underlying spectrin meshwork.

  5. New Pedagogies on Teaching Science with Computer Simulations

    NASA Astrophysics Data System (ADS)

    Khan, Samia

    2011-06-01

    Teaching science with computer simulations is a complex undertaking. This case study examines how an experienced science teacher taught chemistry using computer simulations and the impact of his teaching on his students. Classroom observations over 3 semesters, teacher interviews, and student surveys were collected. The data was analyzed for (1) patterns in teacher-student-computer interactions, and (2) the outcome of these interactions on student learning. Using Technological Pedagogical Content Knowledge (TPCK) as a theoretical framework, analysis of the data indicates that computer simulations were employed in a unique instructional cycle across 11 topics in the science curriculum and that several teacher-developed heuristics were important to guiding the pedagogical approach. The teacher followed a pattern of "generate-evaluate-modify" (GEM) to teach chemistry, and simulation technology (T) was integrated in every stage of GEM (or T-GEM). Analysis of the student survey suggested that engagement with T-GEM enhanced conceptual understanding of chemistry. The author postulates the affordances of computer simulations and suggests T-GEM and its heuristics as an effective and viable pedagogy for teaching science with technology.

  6. Computer simulation tests of optimized neutron powder diffractometer configurations

    NASA Astrophysics Data System (ADS)

    Cussen, L. D.; Lieutenant, K.

    2016-06-01

    Recent work has developed a new mathematical approach to optimally choose beam elements for constant wavelength neutron powder diffractometers. This article compares Monte Carlo computer simulations of existing instruments with simulations of instruments using configurations chosen using the new approach. The simulations show that large performance improvements over current best practice are possible. The tests here are limited to instruments optimized for samples with a cubic structure which differs from the optimization for triclinic structure samples. A novel primary spectrometer design is discussed and simulation tests show that it performs as expected and allows a single instrument to operate flexibly over a wide range of measurement resolution.

  7. A computer-based simulator of the atmospheric turbulence

    NASA Astrophysics Data System (ADS)

    Konyaev, Petr A.

    2015-11-01

    Computer software for modeling atmospheric turbulence is developed on the basis of a time-varying random medium simulation algorithm and a split-step Fourier transform method for solving the wave propagation equation. A judicious choice of the simulator parameters, such as the velocity of the evolution and motion of the medium and the turbulence spectrum and scales, enables different effects of a random medium on the optical wavefront to be simulated. The implementation of the simulation software is shown to be simple and efficient due to parallel programming functions from the Intel® Parallel Studio MKL libraries.
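
    One split step of the kind described consists of a free-space diffraction half applied in Fourier space followed by a thin random phase screen in real space. The sketch below assumes a paraxial propagator and a supplied screen; the beam and screen parameters in the test are arbitrary.

```python
import numpy as np

def split_step_propagate(field, dz, wavelength, dx, phase_screen):
    """One split step of beam propagation through turbulence:
    paraxial free-space diffraction applied in Fourier space,
    then a thin random phase screen applied in real space."""
    n = field.shape[0]
    k = 2.0 * np.pi / wavelength
    kx = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)
    KX, KY = np.meshgrid(kx, kx)
    # Unit-modulus propagator: the step is unitary, so energy is conserved.
    H = np.exp(-1j * (KX**2 + KY**2) * dz / (2.0 * k))
    field = np.fft.ifft2(np.fft.fft2(field) * H)
    return field * np.exp(1j * phase_screen)
```

    Chaining many such steps with evolving, moving phase screens is what produces the time-varying wavefront distortions the simulator models.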

  8. Executive Summary: Special Section on Credible Computational Fluid Dynamics Simulations

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B.

    1998-01-01

    This summary presents the motivation for the Special Section on the credibility of computational fluid dynamics (CFD) simulations, its objective, its background and context, its content, and its major conclusions. Verification and validation (V&V) are the processes for establishing the credibility of CFD simulations. Validation assesses whether correct things are performed and verification assesses whether they are performed correctly. Various aspects of V&V are discussed. Progress is made in verification of simulation models. Considerable effort is still needed for developing a systematic validation method that can assess the credibility of simulated reality.

  9. Usage of a reconfigurable computer to simulate multiparticle systems

    NASA Astrophysics Data System (ADS)

    Fragner, Heinrich

    2007-03-01

    In this article we focus on the implementation of a Lattice Monte Carlo simulation for a generic pair potential within a reconfigurable computing platform. The approach presented was used to simulate a specific soft matter system. We found the performed simulations to be in excellent accordance with previous theoretical and simulation studies. By taking advantage of the shortened processing time, we were also able to find new micro- and macroscopic properties of this system. Furthermore we analyzed analytically the effects of the spatial discretization introduced by the Lattice Monte Carlo algorithm.
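
    A minimal Lattice Monte Carlo of the Metropolis type for a generic nearest-neighbour pair potential, in the spirit of the abstract. The lattice-gas Hamiltonian and all parameters are illustrative; a production code (or a hardware implementation) would update the energy incrementally rather than recomputing it per move.

```python
import math
import random

def lattice_energy(occ, L, J=1.0):
    """Lattice-gas energy: -J per occupied nearest-neighbour pair
    (periodic boundaries; each pair counted once via right/up bonds)."""
    E = 0.0
    for x in range(L):
        for y in range(L):
            if occ[x][y]:
                if occ[(x + 1) % L][y]:
                    E -= J
                if occ[x][(y + 1) % L]:
                    E -= J
    return E

def metropolis_sweep(occ, L, T, rng, J=1.0):
    """One Metropolis sweep of single-particle hops.  For clarity the
    full energy is recomputed per move; real codes use the local
    energy difference instead."""
    for _ in range(L * L):
        x, y = rng.randrange(L), rng.randrange(L)
        if not occ[x][y]:
            continue
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        nx, ny = (x + dx) % L, (y + dy) % L
        if occ[nx][ny]:
            continue                      # target site occupied
        E0 = lattice_energy(occ, L, J)
        occ[x][y], occ[nx][ny] = 0, 1     # trial hop
        dE = lattice_energy(occ, L, J) - E0
        if dE > 0.0 and rng.random() >= math.exp(-dE / T):
            occ[x][y], occ[nx][ny] = 1, 0  # reject: undo the hop
    return occ
```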

  10. Combining high performance simulation, data acquisition, and graphics display computers

    NASA Technical Reports Server (NTRS)

    Hickman, Robert J.

    1989-01-01

    Issues involved in the continuing development of an advanced simulation complex are discussed. This approach provides the capability to perform the majority of tests on advanced systems, non-destructively. The controlled test environments can be replicated to examine the response of the systems under test to alternative treatments of the system control design, or test the function and qualification of specific hardware. Field tests verify that the elements simulated in the laboratories are sufficient. The digital computer is hosted by a Digital Equipment Corp. MicroVAX computer with an Aptec Computer Systems Model 24 I/O computer performing the communication function. An Applied Dynamics International AD100 performs the high speed simulation computing and an Evans and Sutherland PS350 performs on-line graphics display. A Scientific Computer Systems SCS40 acts as a high performance FORTRAN program processor to support the complex, by generating numerous large files from programs coded in FORTRAN that are required for the real time processing. Four programming languages are involved in the process, FORTRAN, ADSIM, ADRIO, and STAPLE. FORTRAN is employed on the MicroVAX host to initialize and terminate the simulation runs on the system. The generation of the data files on the SCS40 also is performed with FORTRAN programs. ADSIM and ADIRO are used to program the processing elements of the AD100 and its IOCP processor. STAPLE is used to program the Aptec DIP and DIA processors.

  11. Modeling and Computer Simulation: Molecular Dynamics and Kinetic Monte Carlo

    SciTech Connect

    Wirth, B.D.; Caturla, M.J.; Diaz de la Rubia, T.

    2000-10-10

    Recent years have witnessed tremendous advances in the realistic multiscale simulation of complex physical phenomena, such as irradiation and aging effects of materials, made possible by the enormous progress achieved in computational physics for calculating reliable, yet tractable interatomic potentials and the vast improvements in computational power and parallel computing. As a result, computational materials science is emerging as an important complement to theory and experiment to provide fundamental materials science insight. This article describes the atomistic modeling techniques of molecular dynamics (MD) and kinetic Monte Carlo (KMC), and an example of their application to radiation damage production and accumulation in metals. It is important to note at the outset that the primary objective of atomistic computer simulation should be obtaining physical insight into atomic-level processes. Classical molecular dynamics is a powerful method for obtaining insight about the dynamics of physical processes that occur on relatively short time scales. Current computational capability allows treatment of atomic systems containing as many as 10^9 atoms for times on the order of 100 ns (10^-7 s). The main limitation of classical MD simulation is the relatively short times accessible. Kinetic Monte Carlo provides the ability to reach macroscopic times by modeling diffusional processes and time-scales rather than individual atomic vibrations. Coupling MD and KMC has developed into a powerful, multiscale tool for the simulation of radiation damage in metals.
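
    The KMC idea of advancing time by event rates rather than atomic vibrations reduces to the residence-time (Gillespie/BKL) algorithm. The sketch below uses generic event rates, not a radiation-damage model.

```python
import math
import random

def kmc_step(rates, rng):
    """One residence-time KMC step: select an event with probability
    proportional to its rate, then draw the elapsed time from an
    exponential distribution with the total rate."""
    total = sum(rates)
    r = rng.random() * total
    acc = 0.0
    event = len(rates) - 1            # fallback guards against rounding
    for i, k in enumerate(rates):
        acc += k
        if r < acc:
            event = i
            break
    # 1 - random() lies in (0, 1], so the log argument is never zero.
    dt = -math.log(1.0 - rng.random()) / total
    return event, dt
```

    Because the time increment is 1/total rate on average, fast diffusional hops and rare events are handled in the same loop, which is what lets KMC reach macroscopic times.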

  12. Computers vs. wind tunnels for aerodynamic flow simulations

    NASA Technical Reports Server (NTRS)

    Chapman, D. R.; Mark, H.; Pirtle, M. W.

    1975-01-01

    It is pointed out that in other fields of computational physics, such as ballistics, celestial mechanics, and neutronics, computations have already displaced experiments as the principal means of obtaining dynamic simulations. In the case of aerodynamic investigations, the complexity of the computational work involved in solving the Navier-Stokes equations is the reason that such investigations rely currently mainly on wind-tunnel testing. However, because of inherent limitations of the wind-tunnel approach and economic considerations, it appears that at some time in the future aerodynamic studies will chiefly rely on computational flow data provided by the computer. Taking into account projected development trends, it is estimated that computers with the required capabilities for a solution of the complete viscous, time-dependent Navier-Stokes equations will be available in the mid-1980s.

  13. A computer simulation of aircraft evacuation with fire

    NASA Technical Reports Server (NTRS)

    Middleton, V. E.

    1983-01-01

    A computer simulation was developed to assess passenger survival during the post-crash evacuation of a transport category aircraft when fire is a major threat. The computer code, FIREVAC, computes individual passenger exit paths and times to exit, taking into account delays and congestion caused by the interaction among the passengers and changing cabin conditions. Simple models for the physiological effects of the toxic cabin atmosphere are included with provision for including more sophisticated models as they become available. Both wide-body and standard-body aircraft may be simulated. Passenger characteristics are assigned stochastically from experimentally derived distributions. Results of simulations of evacuation trials and hypothetical evacuations under fire conditions are presented.

  14. Computer simulation of two-phase flow in nuclear reactors

    SciTech Connect

    Wulff, W.

    1992-09-01

    Two-phase flow models dominate the economic resource requirements for development and use of computer codes for analyzing thermohydraulic transients in nuclear power plants. Six principles are presented on mathematical modeling and selection of numerical methods, along with suggestions on programming and machine selection, all aimed at reducing the cost of analysis. Computer simulation is contrasted with traditional computer calculation. The advantages of run-time interactive access operation in a simulation environment are demonstrated. It is explained that the drift-flux model is better suited for two-phase flow analysis in nuclear reactors than the two-fluid model, because of the latter's closure problem. The advantage of analytical over numerical integration is demonstrated. Modeling and programming techniques are presented which minimize the number of needed arithmetical and logical operations and thereby increase the simulation speed, while decreasing the cost.
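
    The drift-flux model favoured here relates void fraction to the phase superficial velocities through a single algebraic closure, the Zuber-Findlay relation. The sketch below is a generic illustration; the distribution parameter and drift velocity take textbook-style values, not values from this report.

```python
def void_fraction(j_g, j_f, c0=1.2, v_gj=0.35):
    """Zuber-Findlay drift-flux void fraction:
        alpha = j_g / (C0 * j + V_gj),   j = j_g + j_f
    with distribution parameter C0 and drift velocity V_gj (m/s);
    the defaults are illustrative churn-turbulent values."""
    j = j_g + j_f
    return j_g / (c0 * j + v_gj)
```

    An algebraic closure like this avoids the extra field equations (and the closure problem) of the two-fluid model, which is the cost argument the abstract makes.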

  15. Quantum ring-polymer contraction method: Including nuclear quantum effects at no additional computational cost in comparison to ab initio molecular dynamics

    NASA Astrophysics Data System (ADS)

    John, Christopher; Spura, Thomas; Habershon, Scott; Kühne, Thomas D.

    2016-04-01

    We present a simple and accurate computational method which facilitates ab initio path-integral molecular dynamics simulations, where the quantum-mechanical nature of the nuclei is explicitly taken into account, at essentially no additional computational cost in comparison to the corresponding calculation using classical nuclei. The predictive power of the proposed quantum ring-polymer contraction method is demonstrated by computing various static and dynamic properties of liquid water at ambient conditions using density functional theory. This development will enable routine inclusion of nuclear quantum effects in ab initio molecular dynamics simulations of condensed-phase systems.

  16. 19 CFR 210.6 - Computation of time, additional hearings, postponements, continuances, and extensions of time.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Computation of time, additional hearings, postponements, continuances, and extensions of time. 210.6 Section 210.6 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION INVESTIGATIONS OF UNFAIR PRACTICES IN IMPORT TRADE ADJUDICATION AND ENFORCEMENT Rules of General Applicability §...

  17. High performance computing for domestic petroleum reservoir simulation

    SciTech Connect

    Zyvoloski, G.; Auer, L.; Dendy, J.

    1996-06-01

    This is the final report of a two-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory. High-performance computing offers the prospect of greatly increasing the resolution at which petroleum reservoirs can be represented in simulation models. The increases in resolution can be achieved through large increases in computational speed and memory, if machine architecture and numerical methods for solution of the multiphase flow equations can be used to advantage. Perhaps more importantly, the increased speed and size of today`s computers make it possible to add physical processes to simulation codes that heretofore were too expensive in terms of computer time and memory to be practical. These factors combine to allow the development of new, more accurate methods for optimizing petroleum reservoir production.

  18. KU-Band rendezvous radar performance computer simulation model

    NASA Technical Reports Server (NTRS)

    Griffin, J. W.

    1980-01-01

    The preparation of a real time computer simulation model of the KU band rendezvous radar to be integrated into the shuttle mission simulator (SMS), the shuttle engineering simulator (SES), and the shuttle avionics integration laboratory (SAIL) simulator is described. To meet crew training requirements, a radar tracking performance model and a target modeling method were developed. The parent-simulation/radar-simulation interface requirements and the method selected to model target scattering properties, including an application of this method to the SPAS spacecraft, are described. The radar search and acquisition mode performance model and the radar track mode signal processor model are examined and analyzed. The angle, angle rate, range, and range rate tracking loops are also discussed.

  19. Dataflow computing approach in high-speed digital simulation

    NASA Technical Reports Server (NTRS)

    Ercegovac, M. D.; Karplus, W. J.

    1984-01-01

    New computational tools and methodologies for the digital simulation of continuous systems were explored, and programmability and cost-effective performance in multiprocessor organizations for real-time simulation were investigated. The approach is based on functional style languages and data flow computing principles, which allow for the natural representation of parallelism in algorithms and provide a suitable basis for the design of cost-effective, high-performance distributed systems. The objectives of this research are to: (1) perform comparative evaluation of several existing data flow languages and develop an experimental data flow language suitable for real time simulation using multiprocessor systems; (2) investigate the main issues that arise in the architecture and organization of data flow multiprocessors for real time simulation; and (3) develop and apply performance evaluation models in typical applications.

  20. Computational simulation of high temperature metal matrix composite behavior

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Chamis, Christos C.

    1991-01-01

    Computational procedures are described to simulate the thermal and mechanical behavior of high temperature metal matrix composite (HT MMC) in the following four broad areas: (1) behavior of HT MMC from micromechanics to laminate; (2) HT MMC structural response for simple and complex structural components; (3) HT MMC microfracture; and (4) tailoring of HT MMC behavior for optimum specific performance. Representative results from each area are presented to illustrate the effectiveness of the computational simulation procedures. Relevant reports are referenced for extended discussion regarding the specific area.

  1. n-body simulations using message passing parallel computers.

    NASA Astrophysics Data System (ADS)

    Grama, A. Y.; Kumar, V.; Sameh, A.

    The authors present new parallel formulations of the Barnes-Hut method for n-body simulations on message passing computers. These parallel formulations partition the domain efficiently incurring minimal communication overhead. This is in contrast to existing schemes that are based on sorting a large number of keys or on the use of global data structures. The new formulations are augmented by alternate communication strategies which serve to minimize communication overhead. The impact of these communication strategies is experimentally studied. The authors report on experimental results obtained from an astrophysical simulation on an nCUBE2 parallel computer.
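
    The core Barnes-Hut economy, replacing a distant group of bodies by its centre of mass, can be checked without building the full tree. The sketch below is 2D with G = m = 1 and an invented cluster geometry; it compares the monopole shortcut against the exact pairwise sum.

```python
import math

def direct_force(p, sources):
    """Exact pairwise inverse-square force on particle p (G = m = 1)."""
    fx = fy = 0.0
    for x, y in sources:
        dx, dy = x - p[0], y - p[1]
        r3 = (dx * dx + dy * dy) ** 1.5
        fx += dx / r3
        fy += dy / r3
    return fx, fy

def monopole_force(p, sources):
    """Barnes-Hut far-field shortcut: treat the whole group as a single
    body at its centre of mass (monopole term only)."""
    n = len(sources)
    cx = sum(x for x, _ in sources) / n
    cy = sum(y for _, y in sources) / n
    dx, dy = cx - p[0], cy - p[1]
    r3 = (dx * dx + dy * dy) ** 1.5
    return n * dx / r3, n * dy / r3
```

    In the full method a tree cell is approximated this way only when its size-to-distance ratio is below the opening angle theta; the communication strategies in the paper decide which remote cells each processor must fetch to apply this test.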

  2. Computer simulations of the explosive consolidation of powders

    SciTech Connect

    Reaugh, J.E.

    1987-07-01

    We have used computer simulations to study the explosive consolidation of powders in various experimental configurations. The objectives of these studies are (1) to help design experimental geometries that permit recovery of dense, consolidated powders, and (2) to help understand why failures occur in other geometries. We present results of our computer simulations of the geometries used by various experimenters for the consolidation of tungsten powder into rods, aluminum nitride powder into tubes and diamond powder into disks. We show that the stress histories experienced by the powders in these geometries are qualitatively different.

  3. Urban earthquake simulation of Tokyo metropolis using full K computer

    NASA Astrophysics Data System (ADS)

    Fujita, Kohei; Ichimura, Tsuyoshi; Hori, Muneo

    2016-04-01

    Reflecting detailed urban geographic information data in earthquake simulation of cities is expected to improve the reliability of damage estimates for future earthquakes. Such simulations require high-resolution computation of large and complex domains, and thus fast, scalable finite element solvers capable of utilizing supercomputers are needed. Targeting massively parallel scalar supercomputers, we have been developing a fast low-order unstructured finite element solver by combining multi-precision arithmetic, the multi-grid method, predictors, and techniques for utilizing multi-cores and SIMD units of CPUs. In this talk, I will show the developed method and its scalability/performance on the K computer. I will also show some small-scale measurement results on Intel Haswell CPU servers for checking performance portability. As an application example, I will show an urban earthquake simulation targeted on a 10 km by 9 km area of central Tokyo with 320 thousand structures. Here the surface ground is modeled by 33 billion elements and 133 billion degrees of freedom, and its seismic response is computed using the whole K computer with 82944 compute nodes. The fast and scalable finite element method can be applied to earthquake wave propagation problems through the earth crust or elastic/viscoelastic crustal deformation analyses and is expected to be useful for improving the resolution of such simulations in the future.

  4. Energy Efficient Biomolecular Simulations with FPGA-based Reconfigurable Computing

    SciTech Connect

    Hampton, Scott S; Agarwal, Pratul K

    2010-05-01

    Reconfigurable computing (RC) is being investigated as a hardware solution for improving time-to-solution for biomolecular simulations. A number of popular molecular dynamics (MD) codes are used to study various aspects of biomolecules. These codes are now capable of simulating nanosecond time-scale trajectories per day on conventional microprocessor-based hardware, but biomolecular processes often occur at the microsecond time-scale or longer. A wide gap exists between the desired and achievable simulation capability; therefore, there is considerable interest in alternative algorithms and hardware for improving the time-to-solution of MD codes. The fine-grain parallelism provided by Field Programmable Gate Arrays (FPGA) combined with their low power consumption make them an attractive solution for improving the performance of MD simulations. In this work, we use an FPGA-based coprocessor to accelerate the compute-intensive calculations of LAMMPS, a popular MD code, achieving up to 5.5 fold speed-up on the non-bonded force computations of the particle mesh Ewald method and up to 2.2 fold speed-up in overall time-to-solution, and potentially an increase by a factor of 9 in power-performance efficiencies for the pair-wise computations. The results presented here provide an example of the multi-faceted benefits to an application in a heterogeneous computing environment.

  5. Computer simulation of hypersonic flow over the Space Shuttle Orbiter

    NASA Technical Reports Server (NTRS)

    Inouye, M.

    1977-01-01

    Computer simulations of the flow field around the Space Shuttle Orbiter are described. Results of inviscid calculations are presented for the shock wave pattern and bottom centerline pressure distribution at 30 deg angle of attack. Results of viscous calculations are presented for wall pressure and heat transfer distributions for simple configurations representative of regions where shock wave-boundary layer interactions occur. The computer codes are verified by comparisons with wind-tunnel data and can be applied to flight conditions.

  6. Computer simulations of ions in radio-frequency traps

    NASA Technical Reports Server (NTRS)

    Williams, A.; Prestage, J. D.; Maleki, L.; Djomehri, J.; Harabetian, E.

    1990-01-01

    The motion of ions in a trapped-ion frequency standard affects the stability of the standard. In order to study the motion and structures of large ion clouds in a radio-frequency (RF) trap, a computer simulation of the system that incorporates the effect of thermal excitation of the ions was developed. Results are presented from the simulation for cloud sizes up to 512 ions, emphasizing cloud structures in the low-temperature regime.
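A zero-temperature toy version of such cloud-structure calculations: ions in a one-dimensional harmonic pseudopotential with mutual Coulomb repulsion relax into a chain. Everything here (dimensionless units, steepest-descent relaxation) is an illustrative assumption, not the authors' simulation:

```python
import numpy as np

def relax_ion_chain(n, k=1.0, q2=1.0, eta=1e-3, steps=40000, seed=0):
    """Relax n ions in a 1-D harmonic pseudopotential with mutual
    Coulomb repulsion to their equilibrium 'crystal' positions by
    steepest descent (dimensionless units, toy model)."""
    x = np.random.default_rng(seed).standard_normal(n)
    for _ in range(steps):
        d = x[:, None] - x[None, :]
        np.fill_diagonal(d, np.inf)            # no self-force
        # trap force plus pairwise Coulomb repulsion
        f = -k * x + q2 * np.sum(np.sign(d) / d ** 2, axis=1)
        x += eta * f
    return np.sort(x)
```

For two ions the force balance k*d/2 = q2/d^2 gives a spacing d = (2*q2/k)^(1/3), which the relaxation recovers.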

  7. [Computer simulated images of radiopharmaceutical distributions in anthropomorphic phantoms]

    SciTech Connect

    Not Available

    1991-05-17

    We have constructed an anatomically correct human geometry, which can be used to store radioisotope concentrations in 51 internal organs. Each organ is associated with an index number that references its attenuating characteristics (composition and density). The initial development of Computer Simulated Images of Radiopharmaceuticals in Anthropomorphic Phantoms (CSIRDAP) over the first 3 years has been very successful. All components of the simulation have been coded, made operational and debugged.

  8. Quantum computer simulation using the CUDA programming model

    NASA Astrophysics Data System (ADS)

    Gutiérrez, Eladio; Romero, Sergio; Trenas, María A.; Zapata, Emilio L.

    2010-02-01

    Quantum computing is emerging as a field of great theoretical interest. Simulating a quantum computer is a problem with high memory and computational requirements, which makes the use of parallel platforms advisable. In this work we deal with the simulation of an ideal quantum computer on the Compute Unified Device Architecture (CUDA), as such a problem can benefit from the high computational capacity of Graphics Processing Units (GPUs). Modern GPUs are becoming very powerful computational architectures, which is causing growing interest in their application to general-purpose computation. CUDA provides an execution model oriented towards a more general exploitation of the GPU, allowing it to be used as a massively parallel SIMT (Single-Instruction Multiple-Thread) multiprocessor. A simulator that takes memory reference locality into account is proposed, showing that achieving high performance depends strongly on explicit exploitation of the memory hierarchy. Several strategies have been experimentally evaluated, obtaining good performance in comparison with conventional platforms.
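The memory-locality issue highlighted above comes from the access pattern of gate application: the two amplitudes a single-qubit gate mixes sit 2^target apart in the 2^n-element state vector, so low-order targets give contiguous (coalescing-friendly) accesses while high-order targets stride across memory. A CPU-side NumPy sketch of the kernel (illustrative, not the CUDA code):

```python
import numpy as np

def apply_1q_gate(state, gate, target):
    """Apply a 2x2 `gate` to qubit `target` of a 2**n state vector.

    The paired amplitudes sit a stride of 2**target apart, which is
    exactly the locality issue a GPU simulator must manage."""
    stride = 1 << target
    psi = state.reshape(-1, 2 * stride)       # rows group each pair block
    a = psi[:, :stride].copy()                # amplitudes with target bit 0
    b = psi[:, stride:]                       # amplitudes with target bit 1
    psi[:, :stride] = gate[0, 0] * a + gate[0, 1] * b
    psi[:, stride:] = gate[1, 0] * a + gate[1, 1] * b
    return state

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
```

On a GPU, each row of `psi` maps naturally onto a thread block, and performance hinges on keeping those rows in fast memory.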

  9. A Computer Simulation of Community Pharmacy Practice for Educational Use

    PubMed Central

    Ling, Tristan; Bereznicki, Luke; Westbury, Juanita; Chalmers, Leanne; Peterson, Gregory; Ollington, Robert

    2014-01-01

    Objective. To provide a computer-based learning method for pharmacy practice that is as effective as paper-based scenarios, but more engaging and less labor-intensive. Design. We developed a flexible and customizable computer simulation of community pharmacy, in which students work through scenarios that encapsulate an entire patient presentation. We compared the traditional paper-based teaching method to our computer-based approach using equivalent scenarios. The paper-based group had 2 tutors while the computer group had none. Both groups were given a prescenario and postscenario clinical knowledge quiz and survey. Assessment. Students in the computer-based group had generally greater improvements in their clinical knowledge score, and third-year students using the computer-based method also showed more improvement in history-taking and counseling competencies. Third-year students also found the simulation fun and engaging. Conclusion. Our simulation of community pharmacy provided an educational experience as effective as the paper-based alternative, despite the lack of a human tutor. PMID:26056406

  10. The advanced computational testing and simulation toolkit (ACTS)

    SciTech Connect

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Notably, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly the implementation of numerical algorithms, and support for code development, execution, and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, and the reduction of duplicated effort.

  11. Incorporation of shuttle CCT parameters in computer simulation models

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terry

    1990-01-01

    Computer simulations of shuttle missions have become increasingly important during recent years. The complexity of mission planning for satellite launch and repair operations, which usually involve EVA, has led to the need for accurate visibility and access studies. The PLAID modeling package used in the Man-Systems Division at Johnson currently has the necessary capabilities for such studies. In addition, the modeling package is used for spatial location and orientation of shuttle components for film overlay studies such as the current investigation of the hydrogen leaks found on the shuttle. However, there are a number of differences between the simulation studies and actual mission viewing. These include image blur caused by the finite resolution of the CCT monitors in the shuttle and signal noise from the video tubes of the cameras. During the course of this investigation the shuttle CCT camera and monitor parameters are incorporated into the existing PLAID framework. These parameters are specific to certain camera/lens combinations, and the SNR characteristics of these combinations are included in the noise models. The monitor resolution is incorporated using a Gaussian spread function such as that found in the screen phosphors in the shuttle monitors. Another difference between the traditional PLAID generated images and actual mission viewing lies in the lack of shadows and reflections of light from surfaces. Ray tracing of the scene explicitly includes the lighting and material characteristics of surfaces. The results of some preliminary studies using ray tracing techniques for the image generation process combined with the camera and monitor effects are also reported.
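The Gaussian spread function mentioned for the monitor phosphors amounts to convolving the rendered frame with a Gaussian point-spread function. A minimal stand-in (separable NumPy convolution with an assumed sigma; not the PLAID code):

```python
import numpy as np

def gaussian_psf_blur(img, sigma):
    """Blur a grayscale image with a separable Gaussian point-spread
    function, a simple stand-in for monitor phosphor spread."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    k /= k.sum()                               # normalize the kernel
    # separable convolution: blur rows, then columns
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode='same'), 0, tmp)
```

Because the Gaussian is separable, two 1-D passes reproduce the full 2-D blur at far lower cost.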

  12. A Computational Workbench Environment For Virtual Power Plant Simulation

    SciTech Connect

    Bockelie, Michael J.; Swensen, David A.; Denison, Martin K.; Sarofim, Adel F.

    2001-11-06

    In this paper we describe our progress toward creating a computational workbench for performing virtual simulations of Vision 21 power plants. The workbench provides a framework for incorporating a full complement of models, ranging from simple heat/mass balance reactor models that run in minutes to detailed models that can require several hours to execute. The workbench is being developed using the SCIRun software system. To leverage a broad range of visualization tools the OpenDX visualization package has been interfaced to the workbench. In Year One our efforts have focused on developing a prototype workbench for a conventional pulverized coal fired power plant. The prototype workbench uses a CFD model for the radiant furnace box and reactor models for downstream equipment. In Year Two and Year Three, the focus of the project will be on creating models for gasifier based systems and implementing these models into an improved workbench. In this paper we describe our work effort for Year One and outline our plans for future work. We discuss the models included in the prototype workbench and the software design issues that have been addressed to incorporate such a diverse range of models into a single software environment. In addition, we highlight our plans for developing the energyplex based workbench that will be developed in Year Two and Year Three.

  13. AKSATINT - SATELLITE INTERFERENCE ANALYSIS AND SIMULATION USING PERSONAL COMPUTERS

    NASA Technical Reports Server (NTRS)

    Kantak, A.

    1994-01-01

    In the late seventies, the number of communication satellites in service increased, and interference became an increasingly important consideration in designing satellite/ground station communications systems. Satellite Interference Analysis and Simulation Using Personal Computers, AKSATINT, models the interference experienced by a generic satellite communications receiving station due to an interfering satellite. Both the desired and the interfering satellites are considered to be in elliptical orbits. The simulation contains computation of orbital positions of both satellites using classical orbital elements, calculation of the satellite antennae look angles for both satellites and elevation angles at the desired-satellite ground-station antenna, and computation of the Doppler effect due to the motions of the satellites and the Earth's rotation. AKSATINT also computes the interference-to-signal-power ratio, taking into account losses suffered by the links. After computing the interference-to-signal-power ratio, the program computes the statistical quantities. The statistical formulation of the interference effect is presented in the form of a histogram of the interference-to-desired-signal power ratio. The program includes a flowchart, a sample run, and results of that run. AKSATINT is expected to be of general use to system designers and frequency managers in selecting the proper frequency under an interference scenario. The AKSATINT program is written in BASIC. It was designed to operate on the IBM Personal Computer AT or compatibles, and has been implemented under MS DOS 3.2. AKSATINT was developed in 1987.
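Of the quantities listed, the Doppler computation is the simplest to sketch. A hedged first-order version (Python rather than the program's BASIC; the orbit numbers in the test are made up for illustration):

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def received_frequency(f_carrier, r_sat, v_sat, r_station, v_station):
    """First-order Doppler-shifted carrier frequency seen at the ground
    station (positive radial velocity = satellite receding).
    Illustrative sketch, not the AKSATINT code."""
    los = r_sat - r_station                              # line-of-sight vector
    v_radial = np.dot(v_sat - v_station, los) / np.linalg.norm(los)
    return f_carrier * (1.0 - v_radial / C)
```

A receding satellite lowers the received frequency; an approaching one raises it, which is the sign convention the radial-velocity projection encodes.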

  14. Computer simulation of plasma and N-body problems

    NASA Technical Reports Server (NTRS)

    Harries, W. L.; Miller, J. B.

    1975-01-01

    The following FORTRAN language computer codes are presented: (1) efficient two- and three-dimensional central force potential solvers; (2) a three-dimensional simulator of an isolated galaxy which incorporates the potential solver; (3) a two-dimensional particle-in-cell simulator of the Jeans instability in an infinite self-gravitating compressible gas; and (4) a two-dimensional particle-in-cell simulator of a rotating self-gravitating compressible gaseous system of which rectangular coordinate and superior polar coordinate versions were written.

  15. Computer simulation of Aphis gossypii insects using Penna aging model

    NASA Astrophysics Data System (ADS)

    Giarola, L. T. P.; Martins, S. G. F.; Toledo Costa, M. C. P.

    2006-08-01

    A computer simulation was made for the population dynamics of Aphis gossypii in laboratory and field conditions. The age structure was inserted in the dynamics through bit string model for biological aging, proposed by Penna in 1995. The influence of different host plants and of climatic factors such as temperature and precipitation was considered in the simulation starting from experimental data. The results obtained indicate that the simulation is an appropriate instrument for understanding of the population dynamics of these species and for the establishment of biological control strategies.
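In the Penna model referenced above, each genome is a bit string read in step with age: a set bit at position a is a deleterious mutation expressed from age a onward, and an individual dies once T expressed mutations accumulate. A minimal sketch in which the parameter values (T, R, B, M, genome length) are illustrative assumptions, not the paper's:

```python
import random

def penna_step(pop, T=3, R=8, B=1, genome_bits=32, rng=random):
    """One time step of a minimal Penna bit-string aging model.

    pop: list of (age, genome) pairs. An individual dies when T
    expressed mutations accumulate or the genome is exhausted.
    Parents of age >= R produce B offspring, each carrying one new
    random mutation. Parameters are illustrative only."""
    new_pop = []
    for age, genome in pop:
        age += 1
        # count mutations expressed up to the current age
        expressed = bin(genome & ((1 << min(age, genome_bits)) - 1)).count('1')
        if expressed >= T or age >= genome_bits:
            continue                                   # death
        new_pop.append((age, genome))
        if age >= R:
            for _ in range(B):
                child = genome | (1 << rng.randrange(genome_bits))
                new_pop.append((0, child))
    return new_pop
```

A real study would add a Verhulst (carrying-capacity) factor and, as in the paper, couple the parameters to host plant and climate data.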

  16. Computer simulation of multigrid body dynamics and control

    NASA Technical Reports Server (NTRS)

    Swaminadham, M.; Moon, Young I.; Venkayya, V. B.

    1990-01-01

    The objective is to set up and analyze benchmark problems on multibody dynamics and to verify the predictions of two multibody computer simulation codes. TREETOPS and DISCOS have been used to run three example problems - a one degree-of-freedom spring-mass-dashpot system, an inverted pendulum system, and a triple pendulum. To study the dynamics and control interaction, an inverted planar pendulum with an external body force and a torsional control spring was modeled as a hinge-connected two-rigid-body system. TREETOPS and DISCOS were used to perform the time-history simulation of this problem. System state-space variables and their time derivatives from the two simulation codes were compared.
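The first benchmark, the one degree-of-freedom spring-mass-dashpot system, is small enough to reproduce directly. A hedged sketch (semi-implicit Euler with assumed parameter values; not the TREETOPS or DISCOS formulation):

```python
import math

def simulate_smd(m=1.0, c=0.5, k=4.0, x0=1.0, v0=0.0, dt=1e-3, steps=5000):
    """Time history of the one degree-of-freedom benchmark
    m*x'' + c*x' + k*x = 0, integrated by semi-implicit Euler.
    Returns a list of (t, x, v) states."""
    x, v = x0, v0
    hist = [(0.0, x, v)]
    for i in range(1, steps + 1):
        a = -(c * v + k * x) / m
        v += a * dt           # update velocity first (symplectic flavor)
        x += v * dt
        hist.append((i * dt, x, v))
    return hist
```

With these (underdamped) values the numeric state history can be checked against the analytic damped oscillation, which is the kind of cross-verification the benchmark study performed between codes.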

  17. Additive Manufacturing Modeling and Simulation: A Literature Review for Electron Beam Free Form Fabrication

    NASA Technical Reports Server (NTRS)

    Seufzer, William J.

    2014-01-01

    Additive manufacturing is coming into industrial use and has several desirable attributes. Control of the deposition remains a complex challenge, and so this literature review was initiated to capture current modeling efforts in the field of additive manufacturing. This paper summarizes about 10 years of modeling and simulation related to both welding and additive manufacturing. The goals were to learn who is doing what in modeling and simulation, to summarize various approaches taken to create models, and to identify research gaps. Later sections in the report summarize implications for closed-loop-control of the process, implications for local research efforts, and implications for local modeling efforts.

  18. Improving wound care simulation with the addition of odor: a descriptive, quasi-experimental study.

    PubMed

    Roberson, Donna W; Neil, Janice A; Bryant, Elizabeth T

    2008-08-01

    Improving problem-solving skills and expertise in complex clinical care provision requires engaging students in the learning process--a challenging goal when clinical practicums and supervisors are limited. High-fidelity simulation has created many new opportunities for educating healthcare professionals. Because addressing malodorous wounds is a common problem that may be difficult to "teach," a descriptive, quasi-experimental simulation study was conducted. Following completion of a wound care simulation and Laerdal's Simulation Experience Evaluation Tool by 137 undergraduate nursing students, 50 control subjects were randomly selected and 49 volunteer students (experimental group) participated in a wound care simulation after one of three cheeses with a strong odor was added to simulate a malodorous wound. Compared to the control group, study group responses were significantly better (P < 0.001) for eight of the 12 survey variables tested and indicated the addition of odor was beneficial in enhancing the perceived realism and value of the simulation. Students responded that the addition of odor in the simulation laboratory improved realism and they felt better prepared to handle malodorous wounds in a clinical setting. An unanticipated outcome was the enhanced feeling of involvement associated with paired care teams as opposed to working in larger groups. The results of this study indicate that wound care education outcomes improve when nursing students are able to practice using a multi-sensorial wound care simulation model. PMID:18716340

  19. Computational simulation of periodic thermal processes in the roof deck

    NASA Astrophysics Data System (ADS)

    Stastnik, S.

    2016-06-01

    Climate change in Central Europe highlights the importance of protecting buildings against overheating during the summer season, with the roof deck playing the crucial thermal insulation and accumulation role. This paper demonstrates the possibility of computational simulation of such periodic thermal processes, evaluating thermal attenuation using complex arithmetic and comparing the results with real experimental data.
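The complex-arithmetic evaluation of thermal attenuation has a compact closed form: a surface temperature oscillation of angular frequency omega decays into a layer of diffusivity a with damping depth delta = sqrt(2a/omega) and a linearly growing phase lag. A sketch with illustrative material values (not the paper's data):

```python
import cmath
import math

def attenuation(x, thermal_diffusivity, period):
    """Complex-arithmetic attenuation of a periodic temperature wave.

    Returns (amplitude_ratio, phase_lag_seconds) at depth x for a
    surface oscillation of the given period. Valid while x/delta < pi
    (no phase wrapping)."""
    omega = 2 * math.pi / period
    delta = math.sqrt(2 * thermal_diffusivity / omega)   # damping depth
    transfer = cmath.exp(-(1 + 1j) * x / delta)          # complex transfer factor
    return abs(transfer), -cmath.phase(transfer) / omega

# illustrative: a concrete deck, a ~ 7e-7 m^2/s, daily temperature cycle
ratio, lag = attenuation(0.10, 7e-7, 86400.0)
```

At a depth of one damping depth the amplitude falls to 1/e and the wave lags the surface by one radian of the daily cycle, which is the attenuation behavior the paper evaluates for the roof deck.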

  20. Time Advice and Learning Questions in Computer Simulations

    ERIC Educational Resources Information Center

    Rey, Gunter Daniel

    2011-01-01

    Students (N = 101) used an introductory text and a computer simulation to learn fundamental concepts about statistical analyses (e.g., analysis of variance, regression analysis and General Linear Model). Each learner was randomly assigned to one cell of a 2 (with or without time advice) x 3 (with learning questions and corrective feedback, with…

  1. Role of Computer Graphics in Simulations for Teaching Physiology.

    ERIC Educational Resources Information Center

    Modell, H. I.; And Others

    1983-01-01

    Discusses a revision of existing respiratory physiology simulations to promote active learning experiences for individual students. Computer graphics were added to aid student's conceptualization of the physiological system. Specific examples are provided, including those dealing with alveolar gas equations and effects of anatomic shunt flow on…

  2. Effectiveness of Computer Simulation for Enhancing Higher Order Thinking.

    ERIC Educational Resources Information Center

    Gokhale, Anu A.

    1996-01-01

    Electronics students (16 controls, 16 experimentals) designed, built, and tested an amplifier. The experimentals did so after it was designed through computer simulation (using Electronics Workbench software). The experimental group performed significantly better on problem-solving tests; both groups did the same on drill and practice tests. (SK)

  3. Faster quantum chemistry simulation on fault-tolerant quantum computers

    NASA Astrophysics Data System (ADS)

    Cody Jones, N.; Whitfield, James D.; McMahon, Peter L.; Yung, Man-Hong; Van Meter, Rodney; Aspuru-Guzik, Alán; Yamamoto, Yoshihisa

    2012-11-01

    Quantum computers can in principle simulate quantum physics exponentially faster than their classical counterparts, but some technical hurdles remain. We propose methods which substantially improve the performance of a particular form of simulation, ab initio quantum chemistry, on fault-tolerant quantum computers; these methods generalize readily to other quantum simulation problems. Quantum teleportation plays a key role in these improvements and is used extensively as a computing resource. To improve execution time, we examine techniques for constructing arbitrary gates which perform substantially faster than circuits based on the conventional Solovay-Kitaev algorithm (Dawson and Nielsen 2006 Quantum Inform. Comput. 6 81). For a given approximation error ɛ, arbitrary single-qubit gates can be produced fault-tolerantly and using a restricted set of gates in time which is O(log ɛ) or O(log log ɛ); with sufficient parallel preparation of ancillas, constant average depth is possible using a method we call programmable ancilla rotations. Moreover, we construct and analyze efficient implementations of first- and second-quantized simulation algorithms using the fault-tolerant arbitrary gates and other techniques, such as implementing various subroutines in constant time. A specific example we analyze is the ground-state energy calculation for lithium hydride.

  4. Simulations Using a Computer/Videodisc System: Instructional Design Considerations.

    ERIC Educational Resources Information Center

    Ehrlich, Lisa R.

    Instructional design considerations involved in using level four videodisc systems when designing simulations are explored. Discussion of the hardware and software system characteristics notes that computer based training offers the features of text, graphics, color, animation, and highlighting techniques, while a videodisc player offers all of…

  5. Social Choice in a Computer-Assisted Simulation

    ERIC Educational Resources Information Center

    Thavikulwat, Precha

    2009-01-01

    Pursuing a line of inquiry suggested by Crookall, Martin, Saunders, and Coote, the author applied, within the framework of design science, an optimal-design approach to incorporate into a computer-assisted simulation two innovative social choice processes: the multiple period double auction and continuous voting. Expectations that the…

  6. Integrating Computer Simulations into High School Physics Teaching.

    ERIC Educational Resources Information Center

    Ronen, Miky; And Others

    1992-01-01

    Describes a project aimed at examining the problems involved in a large scale integration of computerized simulations into the present structure of physics teaching in Israeli high schools. Assessment by students, teachers, and the project team indicated general satisfaction with the project. Discusses difficulties of implementing computer-based…

  7. A Computer Simulated Experiment in Complex Order Kinetics

    ERIC Educational Resources Information Center

    Merrill, J. C.; And Others

    1975-01-01

    Describes a computer simulation experiment in which physical chemistry students can determine all of the kinetic parameters of a reaction, such as order of the reaction with respect to each reagent, forward and reverse rate constants for the overall reaction, and forward and reverse activation energies. (MLH)

  8. Instructional Advice, Time Advice and Learning Questions in Computer Simulations

    ERIC Educational Resources Information Center

    Rey, Gunter Daniel

    2010-01-01

    Undergraduate students (N = 97) used an introductory text and a computer simulation to learn fundamental concepts about statistical analyses (e.g., analysis of variance, regression analysis and General Linear Model). Each learner was randomly assigned to one cell of a 2 (with or without instructional advice) x 2 (with or without time advice) x 2…

  9. Biology Students Building Computer Simulations Using StarLogo TNG

    ERIC Educational Resources Information Center

    Smith, V. Anne; Duncan, Ishbel

    2011-01-01

    Confidence is an important issue for biology students in handling computational concepts. This paper describes a practical in which honours-level bioscience students simulate complex animal behaviour using StarLogo TNG, a freely-available graphical programming environment. The practical consists of two sessions, the first of which guides students…

  10. Accelerating sino-atrium computer simulations with graphic processing units.

    PubMed

    Zhang, Hong; Xiao, Zheng; Lin, Shien-fong

    2015-01-01

    Sino-atrial node cells (SANCs) play a significant role in rhythmic firing. To investigate their role in arrhythmia and interactions with the atrium, computer simulations based on cellular dynamic mathematical models are generally used. However, the large-scale computation usually makes research difficult, given the limited computational power of Central Processing Units (CPUs). In this paper, an accelerating approach with Graphic Processing Units (GPUs) is proposed in a simulation consisting of the SAN tissue and the adjoining atrium. By using the operator splitting method, the computational task was parallelized. Three parallelization strategies were then put forward. The strategy with the shortest running time was further optimized by considering block size, data transfer and partition. The results showed that for a simulation with 500 SANCs and 30 atrial cells, the execution time taken by the non-optimized program decreased by 62% with respect to a serial program running on a CPU. The execution time decreased by 80% after the program was optimized. The larger the tissue was, the more significant the acceleration became. The results demonstrated the effectiveness of the proposed GPU-accelerating methods and their promising applications in more complicated biological simulations. PMID:26406070
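The operator-splitting idea can be shown on a generic one-dimensional reaction-diffusion cable (not the actual SAN/atrium cell model): the reaction substep is independent per cell, which is what maps onto GPU threads, while the diffusion substep couples neighbours.

```python
import numpy as np

def split_step(v, react, D, dx, dt):
    """One operator-splitting step for dv/dt = react(v) + D * d2v/dx2.

    The reaction substep is embarrassingly parallel over cells (the
    part a GPU accelerates); the diffusion substep couples neighbours
    via an explicit finite difference with no-flux boundaries."""
    # reaction substep (independent per cell)
    v = v + dt * react(v)
    # diffusion substep
    lap = np.empty_like(v)
    lap[1:-1] = v[2:] - 2 * v[1:-1] + v[:-2]
    lap[0] = v[1] - v[0]          # mirrored (no-flux) boundary
    lap[-1] = v[-2] - v[-1]
    return v + dt * D / dx ** 2 * lap
```

Replacing `react` with the stiff ionic-current right-hand side of a cell model, and the 1-D Laplacian with the tissue coupling, gives the structure the paper parallelizes; the explicit diffusion step requires dt*D/dx^2 < 1/2 for stability.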

  11. Interview and Interrogation Training using a Computer-Simulated Subject.

    ERIC Educational Resources Information Center

    Olsen, Dale E.

    Interactive, multimedia software involving a simulated subject has been created to help trainees develop interview and interrogation techniques using personal computers, because practice interviews are not always realistic and are too expensive. New and experienced law enforcement agents, among others, need such extensive training in techniques…

  12. Interaction of Intuitive Physics with Computer-Simulated Physics.

    ERIC Educational Resources Information Center

    Flick, Lawrence B.

    1990-01-01

    The question of how children solve force and motion problems in computer simulations without explicit knowledge of the underlying physics was investigated. Keystroke sequences made by children were saved and analyzed, and children were interviewed to understand their perception of the relationship between keyboard input and on-screen action. (CW)

  13. Advanced Simulation and Computing Co-Design Strategy

    SciTech Connect

    Ang, James A.; Hoang, Thuc T.; Kelly, Suzanne M.; McPherson, Allen; Neely, Rob

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  14. Improving a Computer Networks Course Using the Partov Simulation Engine

    ERIC Educational Resources Information Center

    Momeni, B.; Kharrazi, M.

    2012-01-01

    Computer networks courses are hard to teach as there are many details in the protocols and techniques involved that are difficult to grasp. Employing programming assignments as part of the course helps students to obtain a better understanding and gain further insight into the theoretical lectures. In this paper, the Partov simulation engine and…

  15. Computational Modelling and Simulation Fostering New Approaches in Learning Probability

    ERIC Educational Resources Information Center

    Kuhn, Markus; Hoppe, Ulrich; Lingnau, Andreas; Wichmann, Astrid

    2006-01-01

    Discovery learning in mathematics in the domain of probability based on hands-on experiments is normally limited because of the difficulty in providing sufficient materials and data volume in terms of repetitions of the experiments. Our cooperative, computational modelling and simulation environment engages students and teachers in composing and…

  16. Interaction of intuitive physics with computer-simulated physics

    NASA Astrophysics Data System (ADS)

    Flick, Lawrence B.

    How do children solve force and motion problems in computer simulations without explicit knowledge of the underlying physics? This question was addressed by saving the keystroke input of 19 sixth-grade children in computer memory as each interacted with a simulated, frictionless object using Logo turtle-graphics. The keystroke sequences were first used to determine subject performance on the gamelike features of the simulation. A second analysis used the Newtonian structure of the program to investigate alternative methods for controlling turtle velocity. Five boys and five girls were interviewed during the simulation concerning the perceived relationship between keyboard input and turtle behavior. Subjects who could clearly state some keyboard effects did not score high on either computer analysis, yet achieved the most general solutions of the computer problem. They did so by exploring turtle behavior under a greater variety of conditions than the subjects who achieved partial solutions. For the successful subjects, the turtle was related by analogy to useful information from existing conceptions of motion.

  17. Teaching Physics (and Some Computation) Using Intentionally Incorrect Simulations

    ERIC Educational Resources Information Center

    Cox, Anne J.; Junkin, William F., III; Christian, Wolfgang; Belloni, Mario; Esquembre, Francisco

    2011-01-01

    Computer simulations are widely used in physics instruction because they can aid student visualization of abstract concepts, they can provide multiple representations of concepts (graphical, trajectories, charts), they can approximate real-world examples, and they can engage students interactively, all of which can enhance student understanding of…

  18. Computation techniques for the simulation of turbomachinery compressible flows

    NASA Astrophysics Data System (ADS)

    Veuillot, J. P.; Cambier, L.

    Computation techniques for the simulation of turbomachinery compressible flows via the numerical solution of Euler and Navier-Stokes equations are described. In a discussion of the Euler and Navier-Stokes equations for turbomachinery flow calculations, attention is given to equations for a rotating system, quasi-three-dimensional formulation, and turbulence modeling. Examples of Navier-Stokes calculations are presented.

  19. A Computer Simulation for the Study of Waves.

    ERIC Educational Resources Information Center

    Bork, Alfred M.; Robson, John

    A computer program, designed for use in the second quarter of the beginning course for science and engineering majors at the University of California, Irvine, simulates an experimental investigation of a pulse in a rope. A full trial run is given, in which the student's problem is to discover enough about the disturbance of the rope to answer…

  20. Computers With Wings: Flight Simulation and Personalized Landscapes

    NASA Astrophysics Data System (ADS)

    Oss, Stefano

    2005-03-01

    We propose, as a special way to explore the physics of flying objects, using a flight simulator with personalized scenery that reproduces the territory where students live. This approach increases students' participation and attention in physics classes, and also creates several opportunities for side activities on topics of various kinds, from history and geography to computer science and much more.

  1. Robotics, Artificial Intelligence, Computer Simulation: Future Applications in Special Education.

    ERIC Educational Resources Information Center

    Moore, Gwendolyn B.; And Others

    1986-01-01

    Describes possible applications of new technologies to special education. Discusses results of a study designed to explore the use of robotics, artificial intelligence, and computer simulations to aid people with handicapping conditions. Presents several scenarios in which specific technological advances may contribute to special education…

  2. Robotics, Artificial Intelligence, Computer Simulation: Future Applications in Special Education.

    ERIC Educational Resources Information Center

    Moore, Gwendolyn B.; And Others

    The report describes three advanced technologies--robotics, artificial intelligence, and computer simulation--and identifies the ways in which they might contribute to special education. A hybrid methodology was employed to identify existing technology and forecast future needs. Following this framework, each of the technologies is defined,…

  3. Computational Simulation of a Water-Cooled Heat Pump

    NASA Technical Reports Server (NTRS)

    Bozarth, Duane

    2008-01-01

    A Fortran-language computer program for simulating the operation of a water-cooled vapor-compression heat pump in any orientation with respect to gravity has been developed by modifying a prior general-purpose heat-pump design code used at Oak Ridge National Laboratory (ORNL).

  4. Computer Simulation and Laboratory Work in the Teaching of Mechanics.

    ERIC Educational Resources Information Center

    Borghi, L.; And Others

    1987-01-01

    Describes a teaching strategy designed to help high school students learn mechanics by involving them in simple experimental work, observing didactic films, running computer simulations, and executing more complex laboratory experiments. Provides an example of the strategy as it is applied to the topic of projectile motion. (TW)

  5. Symbolic Quantum Computation Simulation in SymPy

    NASA Astrophysics Data System (ADS)

    Cugini, Addison; Curry, Matt; Granger, Brian

    2010-10-01

    Quantum computing is an emerging field which aims to use quantum mechanics to solve difficult computational problems with greater efficiency than on a classical computer. There is a need to create software that (i) helps newcomers learn the field, (ii) enables practitioners to design and simulate quantum circuits and (iii) provides an open foundation for further research in the field. Towards these ends we have created a package, in the open-source symbolic computation library SymPy, that simulates the quantum circuit model of quantum computation using Dirac notation. This framework builds on the extant powerful symbolic capabilities of SymPy to perform its simulations in a fully symbolic manner. We use object-oriented design to abstract circuits as ordered collections of quantum gate and qubit objects. The gate objects can either be applied directly to the qubit objects or be represented as matrices in different bases. The package is also capable of performing the quantum Fourier transform and Shor's algorithm. A notion of measurement is made possible through the use of a non-commutative gate object. In this talk, we describe the software and show examples of quantum circuits on single- and multi-qubit states that involve common algorithms, gates and measurements.
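    As a brief illustration of the gate/qubit objects and symbolic measurement described above (a sketch against the current `sympy.physics.quantum` API, not code from the talk), a Bell state can be built and measured exactly:

```python
from sympy.physics.quantum.qapply import qapply
from sympy.physics.quantum.qubit import Qubit, measure_all
from sympy.physics.quantum.gate import H, CNOT

# Hadamard on qubit 1, then CNOT with control 1 and target 0:
# |00> -> (|00> + |11>)/sqrt(2), the Bell state, held symbolically.
state = qapply(CNOT(1, 0) * H(1) * Qubit('00'))

# Symbolic measurement: each outcome with its exact probability.
for outcome, prob in measure_all(state):
    print(outcome, prob)
```

    Because the amplitudes stay symbolic, the measurement probabilities come out as exact rationals rather than floating-point approximations.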

  6. Computer Simulation Of Radiographic Screen-Film Images

    NASA Astrophysics Data System (ADS)

    Metter, Richard V.; Dillon, Peter L.; Huff, Kenneth E.; Rabbani, Majid

    1986-06-01

    A method is described for computer simulation of radiographic screen-film images. This method is based on a previously published model of the screen-film imaging process. The x-ray transmittance of a test object is sampled at a pitch of 50 μm by scanning a high-resolution, low-noise direct-exposure radiograph. This transmittance is then used, along with the x-ray exposure incident upon the object, to determine the expected number of quanta per pixel incident upon the screen. The random nature of x-ray arrival and absorption, x-ray quantum to light photon conversion, and photon absorption by the film is simulated by appropriate random number generation. Standard FFT techniques are used for computing the effects of scattering. Finally, the computed film density for each pixel is produced on a high-resolution, low-noise output film by a scanning printer. The simulation allows independent specification of x-ray exposure, x-ray quantum absorption, light conversion statistics, light scattering, and film characteristics (sensitometry and granularity). Each of these parameters is independently measured for radiographic systems of interest. The simulator is tested by comparing actual radiographic images with simulated images resulting from the independently measured parameters. Images are also shown illustrating the effects of changes in these parameters on image quality. Finally, comparison is made with a "perfect" imaging system where information content is only limited by the finite number of x-rays.
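    The chain of stochastic stages described above (Poisson x-ray arrival and absorption, Poisson light conversion, scattering applied via FFT) can be sketched in a few lines; the numbers below (absorption fraction, conversion gain, blur width) are illustrative placeholders, not the paper's measured parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_screen_film(transmittance, quanta_per_pixel,
                         absorption=0.5, gain=100.0, blur_sigma=2.0):
    """Toy screen-film chain: Poisson x-ray arrival/absorption,
    Poisson light-photon conversion, FFT-based light scattering."""
    # Expected absorbed x-ray quanta per pixel, then the random realization.
    mean_absorbed = transmittance * quanta_per_pixel * absorption
    absorbed = rng.poisson(mean_absorbed)
    # Each absorbed x-ray yields a Poisson number of light photons.
    light = rng.poisson(absorbed * gain).astype(float)
    # Light scattering in the screen as a Gaussian blur applied in Fourier space.
    f = np.fft.fftfreq(light.shape[0])
    fx, fy = np.meshgrid(f, f)
    mtf = np.exp(-2.0 * (np.pi * blur_sigma) ** 2 * (fx ** 2 + fy ** 2))
    return np.real(np.fft.ifft2(np.fft.fft2(light) * mtf))

# Uniform half-transmitting object, 1000 incident quanta per pixel.
image = simulate_screen_film(np.full((64, 64), 0.5), quanta_per_pixel=1000)
```

    Each stage can be switched off or re-parameterized independently, mirroring the paper's point that exposure, absorption, conversion statistics, and scattering are separately specifiable.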

  7. Bibliography for Verification and Validation in Computational Simulations

    SciTech Connect

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  8. Optimizing Quantum Simulation for Heterogeneous Computing: a Hadamard Transformation Study

    NASA Astrophysics Data System (ADS)

    de Avila, Anderson B.; Schumalfuss, Murilo F.; Reiser, Renata H. S.; Pilla, Mauricio L.; Maron, Adriano K.

    2015-10-01

    The D-GM execution environment improves distributed simulation of quantum algorithms in heterogeneous computing environments comprising both multi-core CPUs and GPUs. The main contribution of this work consists of the optimization of the environment VirD-GM, conceived in three steps: (i) the theoretical studies and implementation of the abstractions of the Mixed Partial Process defined in the qGM model, focusing on the reduction of memory consumption for multidimensional QTs; (ii) the distributed/parallel implementation of such abstractions, allowing execution on clusters of GPUs; (iii) and optimizations that predict multiplications by zero-values of the quantum states/transformations, implying a reduction in the number of computations. The results obtained in this work include the distributed/parallel simulation of Hadamard gates up to 21 qubits, showing scalability with the increase in the number of computing nodes.

  9. Computational Simulation of the Formation and Material Behavior of Ice

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.; Singhal, Surendra N.; Chamis, Christos C.

    1994-01-01

    Computational methods are described for simulating the formation and the material behavior of ice in prevailing transient environments. The methodology developed at the NASA Lewis Research Center was adopted. A three-dimensional finite-element heat transfer analyzer was used to predict the thickness of ice formed under prevailing environmental conditions. A multi-factor interaction model for simulating the material behavior of time-variant ice layers is presented. The model, used in conjunction with laminated composite mechanics, updates the material properties of an ice block as its thickness increases with time. A sample case of ice formation in a body of water was used to demonstrate the methodology. The results showed that the formation and the material behavior of ice can be computationally simulated using the available composites technology.

  10. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    NASA Astrophysics Data System (ADS)

    Sharma, Gulshan B.; Robertson, Douglas D.

    2013-07-01

    Shoulder arthroplasty success has been attributed to many factors including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design should withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally by simulating bone remodeling using an intact human scapula: initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula's material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element's remodeling stimulus was compared to its corresponding reference stimulus and its material properties were modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent densities were plotted and compared. The location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than actual
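    The iterate-compare-update loop described above can be sketched as follows; the rule is the standard strain-energy-density remodeling form with a lazy zone, and the FE solve is replaced by random placeholder output (illustrative constants, not the study's actual loads or properties):

```python
import numpy as np

rng = np.random.default_rng(1)

def remodel(density, sed, reference_stimulus, rate=1.0, lazy_zone=0.1):
    """Update element densities from a strain-energy-density stimulus."""
    stimulus = sed / density                     # stimulus per unit mass
    error = stimulus / reference_stimulus - 1.0  # deviation from reference
    # No change inside the lazy zone; otherwise adapt proportionally.
    update = np.where(np.abs(error) > lazy_zone, rate * error, 0.0)
    return np.clip(density + update * density, 0.01, 1.8)  # g/cm^3 bounds

density = np.full(100, 0.8)                      # homogeneous start
for _ in range(10):                              # 10 load iterations
    sed = rng.uniform(0.001, 0.01, size=100)     # stand-in for FE output
    density = remodel(density, sed, reference_stimulus=0.006)
```

    In the real pipeline, each iteration would re-run the FE model with the updated material properties before computing the next stimulus.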

  12. Computational fluid dynamics simulations of oscillating wings and comparison to lifting-line theory

    NASA Astrophysics Data System (ADS)

    Keddington, Megan

    Computational fluid dynamics (CFD) analysis was performed in order to compare the solutions of oscillating wings with Prandtl's lifting-line theory. Quasi-steady and steady-periodic simulations were completed using the CFD software Star-CCM+. The simulations were performed for a number of frequencies in a pure plunging setup. Additional simulations were then completed using a setup of combined pitching and plunging at multiple frequencies. Results from the CFD simulations were compared to the quasi-steady lifting-line solution in the form of the axial-force, normal-force, power, and thrust coefficients, as well as the efficiency obtained for each simulation. The mean values were evaluated for each simulation and compared to the quasi-steady lifting-line solution. It was found that as the frequency of oscillation increased, the quasi-steady lifting-line solution became progressively less accurate.

  13. Computer-intensive simulation of solid-state NMR experiments using SIMPSON

    NASA Astrophysics Data System (ADS)

    Tošner, Zdeněk; Andersen, Rasmus; Stevensson, Baltzar; Edén, Mattias; Nielsen, Niels Chr.; Vosegaard, Thomas

    2014-09-01

    Conducting large-scale solid-state NMR simulations requires fast computer software, potentially in combination with efficient computational resources, to complete within a reasonable time frame. Such simulations may involve large spin systems, multiple-parameter fitting of experimental spectra, or multiple-pulse experiment design using parameter scans, non-linear optimization, or optimal control procedures. To efficiently accommodate such simulations, we here present an improved version of the widely distributed open-source SIMPSON NMR simulation software package adapted to contemporary high-performance hardware setups. The software is optimized for fast performance on standard stand-alone computers, multi-core processors, and large clusters of identical nodes. We describe the novel features for fast computation, including internal matrix manipulations, propagator setups, and acquisition strategies. For efficient calculation of powder averages, we implemented the interpolation method of Alderman, Solum, and Grant, as well as the recently introduced fast Wigner transform interpolation technique. The potential of the optimal control toolbox is greatly enhanced by higher-precision gradients in combination with the efficient optimization algorithm known as limited-memory Broyden-Fletcher-Goldfarb-Shanno. In addition, advanced parallelization can be used in all types of calculations, providing significant time reductions. SIMPSON thus reflects current knowledge in the field of numerical simulations of solid-state NMR experiments. The efficiency and novel features are demonstrated on representative simulations.

  14. Computational simulation for analysis and synthesis of impact resilient structure

    NASA Astrophysics Data System (ADS)

    Djojodihardjo, Harijono

    2013-10-01

    Impact resilient structures are of great interest in many engineering applications, varying from civil, land vehicle, aircraft, and space structures, to mention a few examples. To design such structures, one has to resort to fundamental principles and take into account progress in analytical and computational approaches as well as in material science and technology. With such perspectives, this work looks at a generic beam and plate structure subject to impact loading and carries out analysis and numerical simulation. The first objective of the work is to develop a computational algorithm to analyze a flat plate, as a generic structure subjected to impact loading, for numerical simulation and parametric study. The analysis is based on dynamic response analysis, with consideration given to the elastic-plastic region. The second objective is to utilize the computational algorithm for direct numerical simulation; in a parallel scheme, a commercial off-the-shelf numerical code is utilized for parametric study, optimization, and synthesis. Through such analysis and numerical simulation, effort is devoted to arriving at an optimum configuration in terms of loading, structural dimensions, material properties, and composite lay-up, among others. Results are discussed in view of practical applications.

  15. Cosmic reionization on computers. I. Design and calibration of simulations

    SciTech Connect

    Gnedin, Nickolay Y.

    2014-09-20

    Cosmic Reionization On Computers is a long-term program of numerical simulations of cosmic reionization. Its goal is to model fully self-consistently (albeit not necessarily from the first principles) all relevant physics, from radiative transfer to gas dynamics and star formation, in simulation volumes of up to 100 comoving Mpc, and with spatial resolution approaching 100 pc in physical units. In this method paper, we describe our numerical method, the design of simulations, and the calibration of numerical parameters. Using several sets (ensembles) of simulations in 20 h⁻¹ Mpc and 40 h⁻¹ Mpc boxes with spatial resolution reaching 125 pc at z = 6, we are able to match the observed galaxy UV luminosity functions at all redshifts between 6 and 10, as well as obtain reasonable agreement with the observational measurements of the Gunn-Peterson optical depth at z < 6.

  16. Computation of surface tensions using expanded ensemble simulations.

    PubMed

    de Miguel, Enrique

    2008-04-17

    A method for the direct simulation of the surface tension is examined. The technique is based on the thermodynamic route to the interfacial tension and makes use of the expanded ensemble simulation method for the calculation of the free energy difference between two inhomogeneous systems with the same number of particles, temperature, and volume, but different interfacial area. The method is completely general and suitable for systems with either continuous or discontinuous interactions. The adequacy of the expanded ensemble method is assessed by computing the interfacial tension of the planar vapor-liquid interface of Lennard-Jones, Lennard-Jones dimer, Gay-Berne, and square-well model fluids; in the latter, the interactions are discontinuous and the present method does not exhibit the asymmetry of other related methods, such as the test-area method. The expanded ensemble simulation results are compared with simulation data obtained from other techniques (mechanical and test area), with overall good agreement. PMID:18358023
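    The thermodynamic route underlying the method, γ = ΔF/ΔA, can be sketched in its simplest perturbative form (closer to the test-area estimate the paper compares against than to the staged expanded-ensemble sampling itself); the energy-change samples below are synthetic stand-ins for MD output, in reduced units:

```python
import numpy as np

kB_T = 1.0  # reduced units

def surface_tension_from_perturbation(delta_U, delta_A):
    """Thermodynamic route gamma = dF/dA: the free-energy difference is an
    exponential (Zwanzig) average over sampled energy changes delta_U
    produced by perturbing the interfacial area by delta_A."""
    delta_F = -kB_T * np.log(np.mean(np.exp(-delta_U / kB_T)))
    return delta_F / delta_A

# Synthetic samples of the energy change on expanding the area by delta_A.
rng = np.random.default_rng(2)
samples = rng.normal(loc=0.05, scale=0.02, size=10000)
gamma = surface_tension_from_perturbation(samples, delta_A=0.1)
```

    The expanded-ensemble version replaces the single-shot average with a biased random walk through intermediate subensembles of different area, which is what removes the forward/reverse asymmetry noted in the abstract.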

  17. CAD-based graphical computer simulation in endoscopic surgery.

    PubMed

    Kuehnapfel, U G; Neisius, B

    1993-06-01

    This article presents new techniques for three-dimensional, kinematic realtime simulation of dextrous endoscopic instruments. The integrated simulation package KISMET is used for engineering design verification and evaluation. Geometric and kinematic computer models of the mechanisms and the laparoscopic workspace were created. Using the advanced capabilities of high-performance graphical workstations combined with state-of-the-art simulation software, it is possible to generate displays of the surgical instruments acting realistically on the organs of the digestive system. The organ geometry is modelled in a high degree of detail. Apart from discussing the use of KISMET for the development of MFM-II (Modular Flexible MIS Instrument, Release II), the paper indicates further applications of realtime 3D graphical simulation methods in endoscopic surgery. PMID:8055320

  18. Computer simulations of realistic microstructures: Implications for simulation-based materials design

    NASA Astrophysics Data System (ADS)

    Singh, Harpreet

    The conventional route of materials development typically involves fabricating numerous batches of specimens having a range of different microstructures, generated via variations of process parameters, and measuring the relevant properties of these microstructures to identify the combination of processing conditions that yields the material having the desired properties. Clearly, such a trial-and-error-based materials development methodology is expensive, time consuming, and inefficient. Consequently, it is of interest to explore alternate strategies that can decrease the cost and time required for the development of advanced materials such as composites. The availability of powerful and inexpensive computational power, together with progress in computational materials science, permits a modeling- and simulation-assisted materials design methodology that may require fewer experiments, and therefore lower cost and time, for materials development. The key facets of such a technology would be computational tools for (i) creating models to generate computer-simulated realistic microstructures; (ii) capturing the process-microstructure relationship using these models; and (iii) implementing simulated microstructures in computational models of materials behavior. Therefore, development of a general and flexible methodology for simulation of realistic microstructures is crucial for the development of simulation-based materials design and development technology. Accordingly, this research concerns the development of such a methodology for simulation of realistic microstructures, based on experimental quantitative stereological data on a few microstructures, that can capture relevant details of microstructural geometry (including spatial clustering and second-phase particle orientations) and its variations with process parameters in terms of a set of simulation parameters. 
The interpolation and extrapolation of the simulation parameters can then permit generation

  19. Computational Aerodynamic Simulations of a Spacecraft Cabin Ventilation Fan Design

    NASA Technical Reports Server (NTRS)

    Tweedt, Daniel L.

    2010-01-01

    Quieter working environments for astronauts are needed if future long-duration space exploration missions are to be safe and productive. Ventilation and payload cooling fans are known to be dominant sources of noise, with the International Space Station being a good case in point. To address this issue cost effectively, early attention to fan design, selection, and installation has been recommended, leading to an effort by NASA to examine the potential for small-fan noise reduction by improving fan aerodynamic design. As a preliminary part of that effort, the aerodynamics of a cabin ventilation fan designed by Hamilton Sundstrand has been simulated using computational fluid dynamics codes, and the computed solutions analyzed to quantify various aspects of the fan aerodynamics and performance. Four simulations were performed at the design rotational speed: two at the design flow rate and two at off-design flow rates. Following a brief discussion of the computational codes, various aerodynamic- and performance-related quantities derived from the computed flow fields are presented along with relevant flow field details. The results show that the computed fan performance is in generally good agreement with stated design goals.

  20. QDENSITY/QCWAVE: A Mathematica quantum computer simulation update

    NASA Astrophysics Data System (ADS)

    Tabakin, Frank

    2016-04-01

    The Mathematica quantum computer simulation packages QDENSITY and QCWAVE are updated for Mathematica 9-10.3. An overview is given of the new QDensity, QCWave, BTSystem and Circuits packages, which includes: (1) improved treatment of tensor products of states and density matrices, (2) major extension to include qutrit (triplet), as well as qubit (binary) and hybrid qubit/qutrit systems in the associated BTSystem package, (3) updated sample quantum computation algorithms, (4) entanglement studies, including Schmidt decomposition, entropy, mutual information, partial transposition, and calculation of the quantum discord. Examples of Bell's theorem and concurrence are also included. This update will hopefully aid in studies of QC dynamics.
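    The Schmidt decomposition and entanglement entropy mentioned in item (4) have a compact numeric analogue (NumPy here rather than Mathematica, as a sketch of the computation, not the package's code):

```python
import numpy as np

def schmidt_entropy(state, dim_a, dim_b):
    """Schmidt decomposition of a bipartite pure state via SVD, and the
    resulting entanglement entropy in bits (log base 2)."""
    # Reshaping the state vector into a dim_a x dim_b matrix makes its
    # singular values exactly the Schmidt coefficients.
    coeffs = np.linalg.svd(state.reshape(dim_a, dim_b), compute_uv=False)
    probs = coeffs[coeffs > 1e-12] ** 2
    return float(-np.sum(probs * np.log2(probs)))

# Bell state (|00> + |11>)/sqrt(2): maximally entangled, one ebit.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
print(schmidt_entropy(bell, 2, 2))
```

    A product state such as |00> gives entropy 0 by the same routine, which makes this a handy sanity check when experimenting with the entanglement measures the package provides.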

  1. Computational simulations and experimental validation of a furnace brazing process

    SciTech Connect

    Hosking, F.M.; Gianoulakis, S.E.; Malizia, L.A.

    1998-12-31

    Modeling of a furnace brazing process is described. The computational tools predict the thermal response of loaded hardware in a hydrogen brazing furnace to programmed furnace profiles. Experiments were conducted to validate the model and resolve computational uncertainties. Critical boundary conditions that affect materials and processing response to the furnace environment were determined. "Global" and local issues (i.e., at the furnace/hardware and joint levels, respectively) are discussed. The ability to accurately simulate and control furnace conditions is examined.

  2. The very local Hubble flow: Computer simulations of dynamical history

    NASA Astrophysics Data System (ADS)

    Chernin, A. D.; Karachentsev, I. D.; Valtonen, M. J.; Dolgachev, V. P.; Domozhilova, L. M.; Makarov, D. I.

    2004-02-01

    The phenomenon of the very local (≤3 Mpc) Hubble flow is studied on the basis of data from recent precision observations. A set of computer simulations is performed to trace the trajectories of the flow galaxies back in time to the epoch of the formation of the Local Group. It is found that the ``initial conditions'' of the flow are drastically different from the linear velocity-distance relation. The simulations also enable one to recognize the major trends of the flow evolution and to identify the dynamical role of the universal antigravity produced by the cosmic vacuum.
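    The competition between gravity and vacuum antigravity that governs such trajectories can be sketched with a one-body radial integration; the Local Group mass, Hubble constant, and initial conditions below are illustrative round numbers, not the paper's values:

```python
import numpy as np
from scipy.integrate import solve_ivp

G = 4.5e-15        # Mpc^3 / (Msun Gyr^2)
M_LG = 2.0e12      # Local Group mass, Msun (illustrative)
H0 = 0.0716        # 1/Gyr (roughly 70 km/s/Mpc)
OMEGA_L = 0.7

def rhs(t, y):
    r, v = y
    # Point-mass gravity plus the outward vacuum (dark energy) term.
    a = -G * M_LG / r**2 + OMEGA_L * H0**2 * r
    return [v, a]

# A dwarf galaxy at 1.0 Mpc receding at ~70 km/s (~0.072 Mpc/Gyr),
# integrated forward for 12 Gyr.
sol = solve_ivp(rhs, (0.0, 12.0), [1.0, 0.072], rtol=1e-8)
r_final = sol.y[0, -1]
```

    Setting the two acceleration terms equal gives the zero-gravity radius (about 1.3-1.4 Mpc for these numbers), beyond which the vacuum term dominates and the flow accelerates; the paper's back-tracing runs the same dynamics with time reversed.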

  3. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    SciTech Connect

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  4. Computer simulations of the motion and decay of global strings

    SciTech Connect

    Hagmann, C.; Sikivie, P.

    1990-01-01

    Computer simulations have been carried out of the motion and decay of global strings, including spectrum analysis of the energy stored in the scalar field which describes the global string and the radiated Nambu-Goldstone bosons. We simulated relaxing pieces of bent string and collapsing loops. We find, for the string sizes investigated, that the spectrum of field energy hardens rather than softens while the string decays into Nambu-Goldstone radiation. We argue on theoretical grounds that this is indeed the most plausible general behaviour. 19 refs., 12 figs.

  5. A computer simulation approach to measurement of human control strategy

    NASA Technical Reports Server (NTRS)

    Green, J.; Davenport, E. L.; Engler, H. F.; Sears, W. E., III

    1982-01-01

    Human control strategy is measured through use of a psychologically-based computer simulation which reflects a broader theory of control behavior. The simulation is called the human operator performance emulator, or HOPE. HOPE was designed to emulate control learning in a one-dimensional preview tracking task and to measure control strategy in that setting. When given a numerical representation of a track and information about current position in relation to that track, HOPE generates positions for a stick controlling the cursor to be moved along the track. In other words, HOPE generates control stick behavior corresponding to that which might be used by a person learning preview tracking.
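    A minimal stand-in for the preview-tracking behaviour HOPE emulates can be written in a few lines; the gain and preview window below are hypothetical, chosen only to show the idea of steering toward upcoming track points:

```python
def preview_control(track, pos, i, gain=0.4, preview=5):
    """Move the stick toward a weighted (here: plain) average of the
    next few track points visible in the preview window."""
    upcoming = track[i:i + preview]
    target = sum(upcoming) / len(upcoming)
    return pos + gain * (target - pos)

# One-dimensional track with a step change, as in a preview tracking task.
track = [0.0] * 10 + [1.0] * 40
pos = 0.0
trace = []
for i in range(len(track) - 5):
    pos = preview_control(track, pos, i)
    trace.append(pos)
```

    The cursor starts moving before the step arrives, because the step enters the preview window early; fitting parameters such as the gain and window length to a person's tracking data is the kind of strategy measurement the abstract describes.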

  6. A computer program for scanning transmission ion microscopy simulation

    NASA Astrophysics Data System (ADS)

    Wu, R.; Shen, H.; Mi, Y.; Sun, M. D.; Yang, M. J.

    2005-04-01

    With the installation of the Scanning Proton Microprobe system at Fudan University, we are in the process of developing a three-dimensional reconstruction technique based on scanning transmission ion microscopy-computed tomography (STIM-CT). As a first step, a computer program for STIM simulation has been established. The program is written in Visual C++®, using object-oriented programming (OOP), and is a standard multiple-document Windows® program that can be run under all MS Windows® operating systems. It is menu-driven and uses a multiple-process technique. The stopping-power theory is based on the Bethe-Bloch formula. In order to simplify the calculation, an improved cylindrical coordinate model was introduced in the program instead of the usual spherical or cylindrical coordinate model. The simulated results for a sample at several rotation angles are presented.
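    The core of a STIM energy-loss calculation is integrating the stopping power along the ion path through the sample. A sketch follows, with a placeholder 1/E stopping-power trend standing in for a real Bethe-Bloch evaluation (the function `s` and all numbers are illustrative, not measured data):

```python
def transmitted_energy(e_in, thickness_um, stopping_power, step_um=0.1):
    """Integrate dE/dx = -S(E) through a layer of given thickness;
    stopping_power(E) is assumed to return keV/um."""
    e = e_in
    x = 0.0
    while x < thickness_um and e > 0.0:
        e -= stopping_power(e) * step_um
        x += step_um
    return max(e, 0.0)

# Placeholder stopping power roughly mimicking the 1/E Bethe trend
# at proton energies of a few MeV (illustrative only).
s = lambda e_kev: 2.0e4 / e_kev            # keV/um

e_out = transmitted_energy(3000.0, 20.0, s)  # 3 MeV proton, 20 um sample
```

    Repeating this line integral over a raster of beam positions and rotation angles yields the transmitted-energy maps from which the CT reconstruction proceeds.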

  7. Using Computer Simulation for Neurolab 2 Mission Planning

    NASA Technical Reports Server (NTRS)

    Sanders, Betty M.

    1997-01-01

    This paper presents an overview of the procedure used in the creation of a computer simulation video generated by the Graphics Research and Analysis Facility at NASA/Johnson Space Center. The simulation was preceded by an analysis of the anthropometric characteristics of crew members and the workspace requirements for 13 experiments to be conducted on Neurolab 2, which is dedicated to neuroscience and behavioral research. Neurolab 2 is being carried out as a partnership among national domestic research institutes and international space agencies. The video is a tour of the Spacelab module as it will be configured for STS-90, scheduled for launch in the spring of 1998, and identifies experiments that can be conducted in parallel during that mission. This paper also addresses methods for using computer modeling to facilitate the mission planning activity.

  8. Development of computer simulations for landfill methane recovery

    SciTech Connect

    Massmann, J.W.; Moore, C.A.; Sykes, R.M.

    1981-12-01

    Two- and three-dimensional finite-difference computer programs simulating methane recovery systems in landfills have been developed. These computer programs model multicomponent combined pressure and diffusional flow in porous media. Each program and the processes it models are described in this report. Examples of the capabilities of each program are also presented. The two-dimensional program was used to simulate methane recovery systems in a cylindrically shaped landfill. The effects of various pump locations, geometries, and extraction rates were determined. The three-dimensional program was used to model the Puente Hills landfill, a field test site in southern California. The biochemical and microbiological details of methane generation in landfills are also given. Effects of environmental factors, such as moisture, oxygen, temperature, and nutrients on methane generation are discussed and an analytical representation of the gas generation rate is developed.
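    The finite-difference machinery such programs are built on can be illustrated with a single explicit diffusion update (pure diffusion on a periodic grid only; the actual programs also model combined pressure-driven flow and multiple gas components):

```python
import numpy as np

def diffusion_step(c, d, dx, dt):
    """One explicit finite-difference step of 2-D diffusion:
    c_new = c + dt * d * laplacian(c), periodic boundaries via roll."""
    lap = (np.roll(c, 1, 0) + np.roll(c, -1, 0) +
           np.roll(c, 1, 1) + np.roll(c, -1, 1) - 4.0 * c) / dx**2
    return c + dt * d * lap

# Methane concentration spreading from a single source cell.
c = np.zeros((50, 50))
c[25, 25] = 1.0
for _ in range(100):
    c = diffusion_step(c, d=1.0e-5, dt=1.0, dx=0.5)
```

    The explicit scheme is stable only while d·dt/dx² stays below 1/4 in 2-D, which is one reason production codes of this kind often move to implicit time stepping.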

  9. Computer Simulation for Air-coupled Ultrasonic Testing

    NASA Astrophysics Data System (ADS)

    Yamawaki, H.

    2014-06-01

    Air-coupled ultrasound is used as a non-contact ultrasonic testing method. For wider application of the air-coupled ultrasonic technique, it is necessary to understand how ultrasound propagates between air and a solid. The transmittance of ultrasonic waves from air into solids is extremely small, on the order of 10⁻⁵; however, it was revealed that, by using computer simulation methods based on the two-stage elastic wave equation, in which the two independent variables are stress and particle velocity, visualization calculations of ultrasonic propagation between air and solid are possible. In this report, the calculation of air-coupled ultrasound using the new improved FDM for computer simulation of ultrasonic propagation in solids is shown. Waveforms obtained by a one-dimensional calculation are discussed with regard to the principle and performance of the calculation. Visualization of ultrasonic incidence on a cylindrical steel pipe is demonstrated as an example of the method's applicability to ultrasonic testing.
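    The two-variable (stress and particle velocity) scheme described above can be sketched in one dimension on a staggered grid; the material constants below are rough values for air, and the grid, time step, and source are illustrative:

```python
import numpy as np

def velocity_stress_1d(nx=400, nt=600, dx=1e-4, dt=1e-8,
                       rho=1.2, modulus=1.42e5):
    """1-D velocity-stress (two-variable) finite-difference scheme:
    rho * dv/dt = ds/dx and ds/dt = K * dv/dx, leapfrogged on a
    staggered grid. rho in kg/m^3, modulus in Pa (roughly air)."""
    v = np.zeros(nx)        # particle velocity at cell centers
    s = np.zeros(nx + 1)    # stress at cell faces
    for n in range(nt):
        # Gaussian source pulse injected at one stress node.
        s[100] += np.exp(-((n * dt - 1e-6) / 3e-7) ** 2)
        v += dt / rho * (s[1:] - s[:-1]) / dx
        s[1:-1] += dt * modulus * (v[1:] - v[:-1]) / dx
    return v

wave = velocity_stress_1d()
```

    With these numbers the sound speed sqrt(K/rho) is about 344 m/s and the Courant number c·dt/dx is well below 1, so the explicit update remains stable; extending the state to full stress and velocity vectors on a 2-D staggered grid gives the scheme used for the pipe visualization.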

  10. Lessons from computer simulations of ablation of atrial fibrillation.

    PubMed

    Jacquemet, Vincent

    2016-05-01

    This paper reviews the simulations of catheter ablation in computer models of the atria, from the first attempts to the most recent anatomical models. It describes how postulated substrates of atrial fibrillation can be incorporated into mathematical models, how modelling studies can be designed to test ablation strategies, what their current trade-offs and limitations are, and what clinically relevant lessons can be learnt from these simulations. Drawing a parallel between clinical and modelling studies, six ablation targets are considered: pulmonary vein isolation, linear ablation, ectopic foci, complex fractionated atrial electrogram, rotors and ganglionated plexi. The examples presented for each ablation target illustrate a major advantage of computer models, the ability to identify why a therapy is successful or not in a given atrial fibrillation substrate. The integration of pathophysiological data to create detailed models of arrhythmogenic substrates is expected to solidify the understanding of ablation mechanisms and to provide theoretical arguments supporting substrate-specific ablation strategies. PMID:26846178

  11. Computational simulation of flows in an entire centrifugal heart pump.

    PubMed

    Nakamura, S; Yano, K

    1999-06-01

    A prototype computational code to numerically simulate the blood flows in an entire centrifugal heart pump has been developed. The unsteady incompressible Navier-Stokes equations are solved on a parallel computer, the Cray T3E. By domain decomposition, the whole flow space is decomposed into a number of subdomains, to each of which a structured algebraic grid is assigned. The grids for the inlet eye and blade regions are on the rotating frame while grids for other regions are on the nonrotating frame, and the edge of the rotating grids slides over the edge of the nonrotating grids. The code is able to simulate the flows in the rotor, volute, and diffuser as well as to find pump performance indicators. The present paper presents an overview of the code and describes a study on the effect of volute width. PMID:10392287

  12. A Computational Approach for Probabilistic Analysis of Water Impact Simulations

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.

    2009-01-01

    NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many challenges similar to those faced in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time-consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.
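The response-surface idea described here, fitting an inexpensive surrogate to a handful of expensive runs and then interpolating, can be sketched as follows. Here `expensive_sim` is a hypothetical stand-in for an LS-DYNA run, and the input ranges and quadratic basis are illustrative:

```python
import numpy as np

# Hypothetical stand-in: pretend each call is one expensive LS-DYNA impact run
def expensive_sim(velocity, angle):
    return 0.5 * velocity**2 + 3.0 * np.sin(angle) + velocity * angle

# Small design of experiments over the impact parameters
rng = np.random.default_rng(0)
V = rng.uniform(5.0, 15.0, 30)      # impact velocity samples
A = rng.uniform(0.0, 0.5, 30)       # impact angle samples (rad)
y = expensive_sim(V, A)

# Quadratic response surface: y ~ c0 + c1*V + c2*A + c3*V^2 + c4*A^2 + c5*V*A
X = np.column_stack([np.ones_like(V), V, A, V**2, A**2, V * A])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def surrogate(v, a):
    """Evaluate the fitted surface at a fraction of the simulation cost."""
    return np.array([1.0, v, a, v * v, a * a, v * a]) @ coef
```

Once fitted, the surrogate can be queried thousands of times (e.g. for the sensitivity and injury-condition studies the abstract mentions) without rerunning the expensive code.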

  13. Flexing computational muscle: modeling and simulation of musculotendon dynamics.

    PubMed

    Millard, Matthew; Uchida, Thomas; Seth, Ajay; Delp, Scott L

    2013-02-01

    Muscle-driven simulations of human and animal motion are widely used to complement physical experiments for studying movement dynamics. Musculotendon models are an essential component of muscle-driven simulations, yet neither the computational speed nor the biological accuracy of the simulated forces has been adequately evaluated. Here we compare the speed and accuracy of three musculotendon models: two with an elastic tendon (an equilibrium model and a damped equilibrium model) and one with a rigid tendon. Our simulation benchmarks demonstrate that the equilibrium and damped equilibrium models produce similar force profiles but have different computational speeds. At low activation, the damped equilibrium model is 29 times faster than the equilibrium model when using an explicit integrator and 3 times faster when using an implicit integrator; at high activation, the two models have similar simulation speeds. In the special case of simulating a muscle with a short tendon, the rigid-tendon model produces forces that match those generated by the elastic-tendon models, but simulates 2-54 times faster when an explicit integrator is used and 6-31 times faster when an implicit integrator is used. The equilibrium, damped equilibrium, and rigid-tendon models reproduce forces generated by maximally-activated biological muscle with mean absolute errors less than 8.9%, 8.9%, and 20.9% of the maximum isometric muscle force, respectively. When compared to forces generated by submaximally-activated biological muscle, the forces produced by the equilibrium, damped equilibrium, and rigid-tendon models have mean absolute errors less than 16.2%, 16.4%, and 18.5%, respectively. To encourage further development of musculotendon models, we provide implementations of each of these models in OpenSim version 3.1 and benchmark data online, enabling others to reproduce our results and test their models of musculotendon dynamics. PMID:23445050
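A rough sketch of why the rigid-tendon model is fast: with an inextensible tendon the fiber length follows algebraically from the musculotendon length, so no state equation needs integrating. The curve shapes and constants below are simplified illustrations, not the OpenSim implementations:

```python
import numpy as np

def rigid_tendon_force(lMT, vMT, act, Fmax=1000.0, lM_opt=0.1, lT_slack=0.2):
    """Simplified rigid-tendon muscle force (no pennation, toy curves)."""
    lM = lMT - lT_slack                       # fiber length: tendon cannot stretch
    lN = lM / lM_opt                          # normalized fiber length
    fl = np.exp(-((lN - 1.0) / 0.45) ** 2)    # Gaussian active force-length curve
    # Crude linearized force-velocity: lengthening raises force, shortening lowers it
    fv = float(np.clip(1.0 + 0.25 * vMT / lM_opt, 0.0, 1.4))
    # Toy passive element, engaged only beyond optimal fiber length
    fp = 0.05 * Fmax * np.expm1(5.0 * (lN - 1.0)) if lN > 1.0 else 0.0
    return act * Fmax * fl * fv + fp
```

At optimal fiber length, zero velocity, and full activation this returns Fmax, as expected for an isometric maximum; the elastic-tendon models in the paper instead solve an equilibrium (or damped-equilibrium) equation for fiber length at every time step.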

  14. pV3-Gold Visualization Environment for Computer Simulations

    NASA Technical Reports Server (NTRS)

    Babrauckas, Theresa L.

    1997-01-01

    A new visualization environment, pV3-Gold, can be used during and after a computer simulation to extract and visualize the physical features in the results. This environment, which is an extension of the pV3 visualization environment developed at the Massachusetts Institute of Technology with guidance and support by researchers at the NASA Lewis Research Center, features many tools that allow users to display data in various ways.

  15. IMPROVING TACONITE PROCESSING PLANT EFFICIENCY BY COMPUTER SIMULATION, Final Report

    SciTech Connect

    William M. Bond; Salih Ersayin

    2007-03-30

    This project involved industrial scale testing of a mineral processing simulator to improve the efficiency of a taconite processing plant, namely the Minorca mine. The Concentrator Modeling Center at the Coleraine Minerals Research Laboratory, University of Minnesota Duluth, enhanced the capabilities of available software, Usim Pac, by developing mathematical models needed for accurate simulation of taconite plants. This project provided funding for this technology to prove itself in the industrial environment. As the first step, data representing existing plant conditions were collected by sampling and sample analysis. Data were then balanced and provided a basis for assessing the efficiency of individual devices and the plant, and also for performing simulations aimed at improving plant efficiency. Performance evaluation served as a guide in developing alternative process strategies for more efficient production. A large number of computer simulations were then performed to quantify the benefits and effects of implementing these alternative schemes. Modification of makeup ball size was selected as the most feasible option for the target performance improvement. This was combined with replacement of existing hydrocyclones with more efficient ones. After plant implementation of these modifications, plant sampling surveys were carried out to validate findings of the simulation-based study. Plant data showed very good agreement with the simulated data, confirming results of simulation. After the implementation of modifications in the plant, several upstream bottlenecks became visible. Despite these bottlenecks limiting full capacity, concentrator energy improvement of 7% was obtained. Further improvements in energy efficiency are expected in the near future. The success of this project demonstrated the feasibility of a simulation-based approach. Currently, the Center provides simulation-based service to all the iron ore mining companies operating in northern

  16. Numerical simulations of the thermoacoustic computed tomography breast imaging system

    NASA Astrophysics Data System (ADS)

    Kiser, William Lester, Jr.

    A thermoacoustic wave is produced when an object absorbs energy and experiences a subsequent thermal expansion. We have developed a Thermoacoustic Computed Tomography (TACT) breast imaging system to exploit the thermoacoustic phenomenon as a method of soft tissue imaging. By exposing the breast to short pulses of 434 MHz microwaves, ultrasonic pulses are generated and detected with a hemispherical transducer array submersed in a water bath. Filtering and back projecting the transducer signals generates a 3-D image that maps the localized microwave absorption properties of the breast. In an effort to understand the factors limiting image quality, the TACT system was numerically simulated. The simulations were used to generate the transducer signals that would be collected by the TACT system during a scan of an object. These simulated data streams were then fed into the system image reconstruction software to provide images of simulated phantoms. The effects of transducer diameter, transducer response, transducer array geometry and stimulating pulse width on the spatial and contrast resolution of the system were quantified using the simulations. The spatial resolution was highly dependent upon location in the imaging volume. This was due to the off axis response of transducers of finite aperture. Simulated data were compared with experimental data, obtained by imaging a parallelepiped resolution phantom, to verify the accuracy of the simulation code. A contrast-detail phantom was numerically simulated to determine the ability of the system to image spheres of diameters <1 cm with absorption values on the order of physiologic saline, when located in a background of noise. The results of the contrast-detail analysis were dependent on the location of the spheres in the imaging volume and the diameter of the simulated transducers. This work sets the foundation for the initial image quality studies of the TACT system. Improvements to the current imaging system, based on
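The "filtering and back projecting" step can be caricatured with a naive delay-and-sum reconstruction. The geometry, sound speed, and synthetic point-source data below are illustrative, and the real TACT pipeline filters the signals before projecting:

```python
import numpy as np

def delay_and_sum(signals, sensors, grid, c=1500.0, dt=1e-7):
    """Naive delay-and-sum back projection onto a set of image points."""
    image = np.zeros(len(grid))
    for sig, pos in zip(signals, sensors):
        d = np.linalg.norm(grid - pos, axis=1)              # pixel-to-sensor distance
        idx = np.clip((d / c / dt).astype(int), 0, len(sig) - 1)
        image += sig[idx]                                   # sample at time of flight
    return image

# Synthetic data: point absorber at the origin, 8 sensors on a 5 cm ring
angles = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
sensors = 0.05 * np.column_stack([np.cos(angles), np.sin(angles)])
t = np.arange(2000) * 1e-7
tof = 0.05 / 1500.0                                         # time of flight to the ring
signals = np.array([np.exp(-((t - tof) / 2e-7) ** 2)] * 8)  # same pulse at each sensor

grid = np.array([[0.0, 0.0], [0.02, 0.0]])                  # on-source vs off-source pixel
img = delay_and_sum(signals, sensors, grid)
```

The delayed signals add coherently only at the source location, which is also why finite-aperture transducers degrade resolution off-axis: their response smears the effective time of flight.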

  17. Cane Toad or Computer Mouse? Real and Computer-Simulated Laboratory Exercises in Physiology Classes

    ERIC Educational Resources Information Center

    West, Jan; Veenstra, Anneke

    2012-01-01

    Traditional practical classes in many countries are being rationalised to reduce costs. The challenge for university educators is to provide students with the opportunity to reinforce theoretical concepts by running something other than a traditional practical program. One alternative is to replace wet labs with comparable computer simulations.…

  18. Some computer simulations based on the linear relative risk model

    SciTech Connect

    Gilbert, E.S.

    1991-10-01

    This report presents the results of computer simulations designed to evaluate and compare the performance of the likelihood ratio statistic and the score statistic for making inferences about the linear relative risk model. The work was motivated by data on workers exposed to low doses of radiation, and the report includes illustration of several procedures for obtaining confidence limits for the excess relative risk coefficient based on data from three studies of nuclear workers. The computer simulations indicate that with small sample sizes and highly skewed dose distributions, asymptotic approximations to the score statistic or to the likelihood ratio statistic may not be adequate. For testing the null hypothesis that the excess relative risk is equal to zero, the asymptotic approximation to the likelihood ratio statistic was adequate, but use of the asymptotic approximation to the score statistic rejected the null hypothesis too often. Frequently the likelihood was maximized at the lower constraint, and when this occurred, the asymptotic approximations for the likelihood ratio and score statistics did not perform well in obtaining upper confidence limits. The score statistic and likelihood ratio statistics were found to perform comparably in terms of power and width of the confidence limits. It is recommended that with modest sample sizes, confidence limits be obtained using computer simulations based on the score statistic. Although nuclear worker studies are emphasized in this report, its results are relevant for any study investigating linear dose-response functions with highly skewed exposure distributions. 22 refs., 14 tabs.
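The kind of simulation the report recommends can be sketched for a toy cohort: Poisson counts with rate λ₀(1 + βd) under a skewed dose distribution, comparing the simulated null distribution of the standardized score statistic against its asymptotic normal approximation. All parameter values below are illustrative, and the baseline rate is treated as known for simplicity:

```python
import numpy as np

rng = np.random.default_rng(1)

n = 200
dose = rng.exponential(0.05, n)       # highly skewed doses, mostly near zero
lam0 = 0.1                            # baseline event rate (assumed known here)

def score_stat(cases, dose, lam0):
    """Standardized score statistic for H0: beta = 0 in rate lam0*(1 + beta*dose)."""
    U = np.sum(dose * (cases - lam0))        # score function at beta = 0
    I = lam0 * np.sum(dose ** 2)             # Fisher information at beta = 0
    return U / np.sqrt(I)

# Build the null distribution by simulation; its 95th percentile can be compared
# with the asymptotic normal value 1.645 to judge the approximation
stats = np.array([score_stat(rng.poisson(lam0, n), dose, lam0)
                  for _ in range(5000)])
p95 = np.percentile(stats, 95)
```

With skewed doses and rare events the simulated percentile typically departs from 1.645, which is exactly the failure of the asymptotic approximation the report warns about.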

  19. Improving computational efficiency of Monte Carlo simulations with variance reduction

    SciTech Connect

    Turner, A.

    2013-07-01

    CCFE perform Monte-Carlo transport simulations on large and complex tokamak models such as ITER. Such simulations are challenging since streaming and deep penetration effects are equally important. In order to make such simulations tractable, both variance reduction (VR) techniques and parallel computing are used. It has been found that the application of VR techniques in such models significantly reduces the efficiency of parallel computation due to 'long histories'. VR in MCNP can be accomplished using energy-dependent weight windows. The weight window represents an 'average behaviour' of particles, and large deviations in the arriving weight of a particle give rise to extreme amounts of splitting being performed and a long history. When running on parallel clusters, a long history can have a detrimental effect on the parallel efficiency - if one process is computing the long history, the other CPUs complete their batch of histories and wait idle. Furthermore some long histories have been found to be effectively intractable. To combat this effect, CCFE has developed an adaptation of MCNP which dynamically adjusts the WW where a large weight deviation is encountered. The method effectively 'de-optimises' the WW, reducing the VR performance but this is offset by a significant increase in parallel efficiency. Testing with a simple geometry has shown the method does not bias the result. This 'long history method' has enabled CCFE to significantly improve the performance of MCNP calculations for ITER on parallel clusters, and will be beneficial for any geometry combining streaming and deep penetration effects. (authors)
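The weight-window mechanics described above (splitting when a particle arrives heavy, Russian roulette when it arrives light, with a cap that trades variance-reduction performance for shorter histories) can be sketched generically. The thresholds and the cap value are illustrative, not the CCFE/MCNP settings:

```python
import random

def apply_weight_window(weight, w_low, w_high, max_split=10):
    """Weight-window check for one particle: returns the list of daughter weights.
    Capping the split count mimics 'de-optimising' the window to avoid
    pathologically long histories (cap value illustrative)."""
    if weight > w_high:
        n = min(int(weight / w_high) + 1, max_split)    # split into n lighter copies
        return [weight / n] * n
    if weight < w_low:
        survival = weight / w_low                       # Russian roulette: kill or
        return [w_low] if random.random() < survival else []  # promote to w_low
    return [weight]                                     # inside the window: unchanged
```

Both branches preserve the expected weight (splitting exactly, roulette in expectation), so the estimator stays unbiased; only the variance/runtime trade-off changes, which is why capping the split count does not bias the result.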

  20. Scalable High Performance Computing: Direct and Large-Eddy Turbulent Flow Simulations Using Massively Parallel Computers

    NASA Technical Reports Server (NTRS)

    Morgan, Philip E.

    2004-01-01

    This final report contains reports of research related to the tasks "Scalable High Performance Computing: Direct and Large-Eddy Turbulent Flow Simulations Using Massively Parallel Computers" and "Develop High-Performance Time-Domain Computational Electromagnetics Capability for RCS Prediction, Wave Propagation in Dispersive Media, and Dual-Use Applications." The discussion of Scalable High Performance Computing reports on three objectives: validate, assess scalability, and apply two parallel flow solvers for three-dimensional Navier-Stokes flows; develop and validate a high-order parallel solver for Direct Numerical Simulation (DNS) and Large Eddy Simulation (LES) problems; and investigate and develop a high-order Reynolds-averaged Navier-Stokes turbulence model. The discussion of High-Performance Time-Domain Computational Electromagnetics reports on five objectives: enhancement of an electromagnetics code (CHARGE) to be able to effectively model antenna problems; utilize lessons learned in high-order/spectral solution of swirling 3D jets to apply to solving the electromagnetics project; transition a high-order fluids code, FDL3DI, to be able to solve Maxwell's Equations using compact differencing; develop and demonstrate improved radiation-absorbing boundary conditions for high-order CEM; and extend the high-order CEM solver to address variable material properties. The report also contains a review of work done by the systems engineer.

  1. The smart vapor retarder: An innovation inspired by computer simulations

    SciTech Connect

    Kuenzel, H.M.

    1998-12-31

    Water management is the new trend in civil engineering. Since it is difficult to ensure perfect vapor- and watertightness of building components, a limited moisture ingress is acceptable as long as the drying process is effective enough to avoid moisture damage. Recent computer models for the simulation of heat and moisture transport are valuable tools for the risk assessment of structures and their repair or retrofit. Unventilated, insulated assemblies with a vapor-resistant exterior layer can accumulate water because winter condensation and summer drying are not balanced. The balance can be reestablished if the vapor retarder is more permeable in summer than in winter. Parametric computer studies have defined the required properties of such a vapor retarder. Developed according to the computed specifications, the smart vapor retarder shows a seasonal variation in vapor permeability of a factor of ten. The secret of this behavior lies in the humidity-dependent vapor diffusion resistance of the film material.
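A minimal sketch of the humidity-dependent behaviour: the vapor diffusion resistance swings by roughly a factor of ten between dry (winter) and humid (summer) conditions. Only that factor-of-ten swing comes from the text; the log-linear interpolation shape and the limit values are assumptions for illustration:

```python
def diffusion_resistance(rh, mu_dry=100.0, mu_humid=10.0):
    """Humidity-dependent vapor diffusion resistance factor (dimensionless).
    Log-linear interpolation between assumed dry and humid limits."""
    rh = min(max(rh, 0.0), 1.0)                 # clamp relative humidity to [0, 1]
    return mu_dry * (mu_humid / mu_dry) ** rh   # factor-of-ten swing over the range

winter = diffusion_resistance(0.0)   # dry winter conditions: high resistance
summer = diffusion_resistance(1.0)   # humid summer conditions: low resistance
```

High resistance in winter limits condensation from indoor moisture, while low resistance in summer lets the accumulated moisture dry outward, which is the balance the parametric studies were designed to establish.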

  2. An FPGA computing demo core for space charge simulation

    SciTech Connect

    Wu, Jinyuan; Huang, Yifei; /Fermilab

    2009-01-01

    In accelerator physics, space charge simulation requires a large amount of computing power. In a particle system, each calculation requires time- and resource-consuming operations such as multiplications, divisions, and square roots. Because of the flexibility of field programmable gate arrays (FPGAs), we implemented this task with efficient use of the available computing resources and completely eliminated the non-calculating operations that are indispensable in regular micro-processors (e.g. instruction fetch, instruction decoding, etc.). We designed and tested a 16-bit demo core for computing Coulomb's force in an Altera Cyclone II FPGA device. To save resources, the inverse square-root cube operation in our design is computed using a memory look-up table addressed with the nine to ten most significant non-zero bits. At a 200 MHz internal clock, our demo core reaches a throughput of 200 M pairs/s/core, faster than a typical 2 GHz micro-processor by about a factor of 10. Temperature and power consumption of FPGAs were also lower than those of micro-processors. Fast and convenient, FPGAs can serve as alternatives to time-consuming micro-processors for space charge simulation.
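The table-lookup trick for the inverse square-root cube, i.e. (r²)^(−3/2), turns a squared distance directly into the 1/r³ factor of the Coulomb force without any division or square root. It can be mimicked in software; here the table is addressed with the top 10 bits of a fixed-point r² in [0, 1), a simplification of the FPGA's significant-bit addressing scheme:

```python
import numpy as np

BITS = 10
# table[i] approximates x**-1.5 for x near i/2**BITS; index 0 is clamped to
# avoid the singularity at x = 0
table = np.array([(max(i, 1) / 2.0 ** BITS) ** -1.5 for i in range(2 ** BITS)])

def inv_r3(r2):
    """Approximate (r2)**-1.5 by table lookup on the leading bits of r2 in [0, 1)."""
    return table[int(r2 * 2 ** BITS)]
```

A single memory read replaces a square root and two multiplications per particle pair, which is the resource saving the demo core exploits; the coarse addressing costs well under 1% relative error away from the table edges.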

  3. Addition of flexible body option to the TOLA computer program, part 1

    NASA Technical Reports Server (NTRS)

    Dick, J. W.; Benda, B. J.

    1975-01-01

    This report describes a flexible body option that was developed and added to the Takeoff and Landing Analysis (TOLA) computer program. The addition of the flexible body option to TOLA allows it to be used to study essentially any conventional type of airplane in the ground operating environment. It provides the capability to predict the total motion of selected points on the airplane. The analytical methods incorporated in the program and operating instructions for the option are described. A program listing is included along with several example problems to aid in interpretation of the operating instructions and to illustrate program usage.

  4. Precision Constraints from Computational Cosmology and Type Ia Supernova Simulations

    NASA Astrophysics Data System (ADS)

    Bernstein, Joseph P.; Kuhlmann, S. E.; Norris, B.; Biswas, R.

    2011-01-01

    The evidence for dark energy represents one of the greatest mysteries of modern science. The research undertaken probes the implications of dark energy via analysis of large scale structure and detonation-based Type Ia supernova light curve simulations. It is presently an exciting time to be involved in cosmology because planned astronomical surveys will effectively result in dark sector probes becoming systematics-limited, making numerical simulations crucial to the formulation of precision constraints. This work aims to assist in reaching the community goal of 1% constraints on the dark energy equation of state parameter. Reaching this goal will require 1) hydrodynamic+N-body simulations with a minimum of a 1 Gpc box size, 2048³ hydrodynamic cells, and 10¹¹ dark matter particles, which push the limits of existing codes, and 2) a better understanding of the explosion mechanism(s) for Type Ia supernovae, together with larger, high-quality data sets from present and upcoming supernova surveys. Initial results are discussed from two projects. The first is computational cosmology studies aimed at enabling the large simulations discussed above. The second is radiative transfer calculations drawn from Type Ia supernova explosion simulations aimed at bridging the gap between simulated light curves and those observed from, e.g., the Sloan Digital Sky Survey II and, eventually, the Dark Energy Survey.

  5. Numerical simulation of landfill aeration using computational fluid dynamics.

    PubMed

    Fytanidis, Dimitrios K; Voudrias, Evangelos A

    2014-04-01

    The present study is an application of Computational Fluid Dynamics (CFD) to the numerical simulation of landfill aeration systems. Specifically, the CFD algorithms provided by the commercial solver ANSYS Fluent 14.0, combined with an in-house source code developed to modify the main solver, were used. The unsaturated multiphase flow of air and liquid phases and the biochemical processes for aerobic biodegradation of the organic fraction of municipal solid waste were simulated taking into consideration their temporal and spatial evolution, as well as complex effects, such as oxygen mass transfer across phases, unsaturated flow effects (capillary suction and unsaturated hydraulic conductivity), temperature variations due to biochemical processes and environmental correction factors for the applied kinetics (Monod and 1st order kinetics). The developed model results were compared with literature experimental data. Also, pilot scale simulations and sensitivity analysis were implemented. Moreover, simulation results of a hypothetical single aeration well were shown, while its zone of influence was estimated using both the pressure and oxygen distribution. Finally, a case study was simulated for a hypothetical landfill aeration system. Both a static scenario (steadily positive or negative relative pressure with time) and a hybrid scenario (following a square-wave pattern of positive and negative relative pressure with time) for the aeration wells were examined. The results showed that the present model is capable of simulating landfill aeration and the obtained results were in good agreement with corresponding previous experimental and numerical investigations. PMID:24525420
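The "Monod kinetics with environmental correction factors" mentioned above can be sketched as a dual-Monod rate expression with a temperature factor. All constants and the shape of the correction are illustrative assumptions, not values from the paper:

```python
import numpy as np

def monod_rate(S, O2, mu_max=2.0, Ks=10.0, Ko=0.5, T=308.0, T_opt=311.0):
    """Aerobic degradation rate with dual-Monod terms for substrate and oxygen
    and a simple Gaussian temperature correction (all values illustrative)."""
    f_T = np.exp(-((T - T_opt) / 10.0) ** 2)       # hypothetical temperature factor
    return mu_max * (S / (Ks + S)) * (O2 / (Ko + O2)) * f_T
```

The rate saturates toward mu_max as substrate and oxygen become non-limiting, and falls off away from the optimal temperature; in the CFD model such a rate is evaluated in every cell from the locally simulated oxygen and temperature fields.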

  6. Effective computer simulation of equilibrium adsorption with limited solubility

    NASA Astrophysics Data System (ADS)

    Zedek, Lukáš; Šembera, Jan

    2014-12-01

    The construction of radioactive waste repositories in a compact granite rock massif somewhere in the Czech Republic is under discussion. To support the decision-making, various computer simulations are performed. These simulations help to evaluate risks connected with potential isolation failures, followed by transport of released radionuclides in the groundwater. Transport of radionuclides is affected by many natural processes. In this paper we focus on two important processes: adsorption and limited solubility (surface precipitation) of radionuclides in groundwater. The presented approach is based on interpolation instead of the time-demanding solution of nonlinear equations arising from the mathematical description of the problem. The effectiveness of the proposed approach lies in the creation of an interpolation table of points lying on an isotherm and transformed to a mass-balance-maintaining system of coordinates. This table is used to project data from the transport part of the simulation onto the isotherm. The interpolation table points are computed using a derivative-free nonlinear equation solver, which avoids some numerical solution difficulties. Results achieved with our approach are comparable to solutions obtained with the Newton-Raphson method, but the simulation times are shorter.
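The interpolation-table idea, tabulating points on the isotherm in mass-balance coordinates and then projecting by interpolation instead of solving a nonlinear equation in every cell and time step, can be sketched with a Langmuir isotherm. The isotherm form and parameters are illustrative stand-ins:

```python
import numpy as np

# Langmuir isotherm: sorbed s(c) = s_max * K * c / (1 + K * c)  (illustrative)
s_max, K = 2.0, 0.5
c_grid = np.linspace(0.0, 50.0, 500)                 # dissolved concentration
s_grid = s_max * K * c_grid / (1.0 + K * c_grid)     # sorbed concentration
m_grid = c_grid + s_grid                             # total mass: the mass-balance coordinate

def dissolved(total_mass):
    """Project a total mass onto the isotherm by table interpolation, replacing
    a per-cell Newton-Raphson solve of c + s(c) = m."""
    return np.interp(total_mass, m_grid, c_grid)

c = dissolved(5.0)
residual = c + s_max * K * c / (1.0 + K * c) - 5.0   # mass-balance error of the lookup
```

Because m_grid is monotonic, the lookup is unambiguous and costs one binary search per cell; the residual measures the interpolation error relative to the exact nonlinear solve.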

  7. Computer simulations of a tumor surface octapeptide epitope.

    PubMed

    Reid, R H; Hooper, C A; Brooks, B R

    1989-01-01

    Molecular dynamics using the CHARMM and GEMM programs with the Star Technologies ST 100 array processor, functioning at the speed of supercomputers, was used as a searching algorithm for conformational exploration of the octapeptide Gly-Asn-Thr-Ile-Val-Ala-Glu. This poorly soluble octapeptide is the N-terminal epitope of an 11 kDa glycoprotein antigen residing on human ductal carcinoma (breast) cells. Very long (nanosecond) simulations were required. Both the alpha-helix and the N-acetyl-N1-methylamide-derived minimized starting structures gave the same lowest-potential-energy conformation in simulations at 600 K. The same conformation was found only when using the latter starting conformation in simulations at 300 K. The lowest-potential-energy conformation was stabilized by 4 hydrophobic contacts and 13 H bonds completing one turn of a left-handed helix. PMID:2470439

  8. Basic plasma and fusion theory and computer simulations survey

    SciTech Connect

    Kawakami, I.; Nishikawa, K.

    1983-12-01

    The College of Science and Technology at Nihon University and the Institute for Fusion Theory at Hiroshima University discuss the history of the role of theory and simulation in fusion-oriented research. Recent activities include a one-dimensional tokamak transport code at Nagoya University and three-dimensional resistive MHD simulation studies of spheromaks. Other recent activities discussed include the tokamak computer code system TRITON, transport flux in currentless ECH-produced plasma in Heliotron-E, and thermal electron transport in the presence of a steep temperature gradient. The Japan-U.S. Joint Institute for Fusion Theory's present activities are discussed, including subject areas in three-dimensional simulation studies, nonequilibrium statistical physics, anomalous transport and drift wave turbulence and hot-electron physics.

  9. Simulation of Tailrace Hydrodynamics Using Computational Fluid Dynamics Models

    SciTech Connect

    Cook, Christopher B.; Richmond, Marshall C.

    2001-05-01

    This report investigates the feasibility of using computational fluid dynamics (CFD) tools to characterize hydrodynamic flow fields surrounding the tailrace zone below large hydraulic structures. Previous and ongoing studies using CFD tools to simulate gradually varied flow with multiple constituents and forebay/intake hydrodynamics have shown that CFD tools can provide valuable information for hydraulic and biological evaluation of fish passage near hydraulic structures. These studies, however, are incapable of simulating the rapidly varying flow fields that involve breakup of the free surface, such as those through and below high-flow outfalls and spillways. Although the use of CFD tools for these types of flows is still an active area of research, initial applications discussed in this report show that these tools are capable of simulating the primary features of these highly transient flow fields.

  10. Lightweight computational steering of very large scale molecular dynamics simulations

    SciTech Connect

    Beazley, D.M.; Lomdahl, P.S.

    1996-09-01

    We present a computational steering approach for controlling, analyzing, and visualizing very large scale molecular dynamics simulations involving tens to hundreds of millions of atoms. Our approach relies on extensible scripting languages and an easy to use tool for building extensions and modules. The system is extremely easy to modify, works with existing C code, is memory efficient, and can be used from inexpensive workstations and networks. We demonstrate how we have used this system to manipulate data from production MD simulations involving as many as 104 million atoms running on the CM-5 and Cray T3D. We also show how this approach can be used to build systems that integrate common scripting languages (including Tcl/Tk, Perl, and Python), simulation code, user extensions, and commercial data analysis packages.

  11. Advanced manned space flight simulation and training: An investigation of simulation host computer system concepts

    NASA Technical Reports Server (NTRS)

    Montag, Bruce C.; Bishop, Alfred M.; Redfield, Joe B.

    1989-01-01

    The findings of a preliminary investigation by Southwest Research Institute (SwRI) into simulation host computer concepts are presented. It is designed to aid NASA in evaluating simulation technologies for use in spaceflight training. The focus of the investigation is on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.

  12. Molecular Dynamic Simulations of Nanostructured Ceramic Materials on Parallel Computers

    SciTech Connect

    Vashishta, Priya; Kalia, Rajiv

    2005-02-24

    Large-scale molecular-dynamics (MD) simulations have been performed to gain insight into: (1) sintering, structure, and mechanical behavior of nanophase SiC and SiO2; (2) effects of dynamic charge transfers on the sintering of nanophase TiO2; (3) high-pressure structural transformation in bulk SiC and GaAs nanocrystals; (4) nanoindentation in Si3N4; and (5) lattice mismatched InAs/GaAs nanomesas. In addition, we have designed a multiscale simulation approach that seamlessly embeds MD and quantum-mechanical (QM) simulations in a continuum simulation. The above research activities have involved strong interactions with researchers at various universities, government laboratories, and industries. 33 papers have been published and 22 talks have been given based on the work described in this report.

  13. Animats: computer-simulated animals in behavioral research.

    PubMed

    Watts, J M

    1998-10-01

    The term animat refers to a class of simulated animals. This article is intended as a nontechnical introduction to animat research. Animats can be robots interacting with the real world or computer simulations. In this article, the use of computer-generated animats is emphasized. The scientific use of animats has been pioneered by artificial intelligence and artificial life researchers. Behavior-based artificial intelligence uses animats capable of autonomous and adaptive activity as conceptual tools in the design of usefully intelligent systems. Artificial life proponents view some human artifacts, including informational structures that show adaptive behavior and self-replication, as animats may do, as analogous to biological organisms. Animat simulations may be used for rapid and inexpensive evaluation of new livestock environments or management techniques. The animat approach is a powerful heuristic for understanding the mechanisms that underlie behavior. The simple rules and capabilities of animat models generate emergent and sometimes unpredictable behavior. Adaptive variability in animat behavior may be exploited using artificial neural networks. These have computational properties similar to natural neurons and are capable of learning. Artificial neural networks can control behavior at all levels of an animat's functional organization. Improving the performance of animats often requires genetic programming. Genetic algorithms are computer programs that are capable of self-replication, simulating biological reproduction. Animats may thus evolve over generations. Selective forces may be provided by a human overseer or be part of the simulated environment. Animat techniques allow researchers to culture behavior outside the organism that usually produces it. This approach could contribute new insights in theoretical ethology on questions including the origins of social behavior and cooperation, adaptation, and the emergent nature of complex behavior. Animat

  14. Computational study of the rate constants and free energies of intramolecular radical addition to substituted anilines

    PubMed Central

    Seddiqzai, Meriam; Dahmen, Tobias; Sure, Rebecca

    2013-01-01

    The intramolecular radical addition to aniline derivatives was investigated by DFT calculations. The computational methods were benchmarked by comparing the calculated values of the rate constant for the 5-exo cyclization of the hexenyl radical with the experimental values. The dispersion-corrected PW6B95-D3 functional provided very good results with deviations for the free activation barrier compared to the experimental values of only about 0.5 kcal mol−1 and was therefore employed in further calculations. Corrections for intramolecular London dispersion and solvation effects in the quantum chemical treatment are essential to obtain consistent and accurate theoretical data. For the investigated radical addition reaction it turned out that the polarity of the molecules is important and that a combination of electrophilic radicals with preferably nucleophilic arenes results in the highest rate constants. This is opposite to the Minisci reaction where the radical acts as nucleophile and the arene as electrophile. The substitution at the N-atom of the aniline is crucial. Methyl substitution leads to slower addition than phenyl substitution. Carbamates as substituents are suitable only when the radical center is not too electrophilic. No correlations between free reaction barriers and energies (ΔG‡ and ΔGR) are found. Addition reactions leading to indanes or dihydrobenzofurans are too slow to be useful synthetically. PMID:24062821

  15. Computational study of the rate constants and free energies of intramolecular radical addition to substituted anilines.

    PubMed

    Gansäuer, Andreas; Seddiqzai, Meriam; Dahmen, Tobias; Sure, Rebecca; Grimme, Stefan

    2013-01-01

    The intramolecular radical addition to aniline derivatives was investigated by DFT calculations. The computational methods were benchmarked by comparing the calculated values of the rate constant for the 5-exo cyclization of the hexenyl radical with the experimental values. The dispersion-corrected PW6B95-D3 functional provided very good results with deviations for the free activation barrier compared to the experimental values of only about 0.5 kcal mol−1 and was therefore employed in further calculations. Corrections for intramolecular London dispersion and solvation effects in the quantum chemical treatment are essential to obtain consistent and accurate theoretical data. For the investigated radical addition reaction it turned out that the polarity of the molecules is important and that a combination of electrophilic radicals with preferably nucleophilic arenes results in the highest rate constants. This is opposite to the Minisci reaction where the radical acts as nucleophile and the arene as electrophile. The substitution at the N-atom of the aniline is crucial. Methyl substitution leads to slower addition than phenyl substitution. Carbamates as substituents are suitable only when the radical center is not too electrophilic. No correlations between free reaction barriers and energies (ΔG‡ and ΔGR) are found. Addition reactions leading to indanes or dihydrobenzofurans are too slow to be useful synthetically. PMID:24062821

  16. Turbulence computations with 3-D small-scale additive turbulent decomposition and data-fitting using chaotic map combinations

    SciTech Connect

    Mukerji, S.

    1997-12-31

    Although the equations governing turbulent fluid flow, the Navier-Stokes (N.-S.) equations, have been known for well over a century and there is a clear technological necessity in obtaining solutions to these equations, turbulence remains one of the principal unsolved problems in physics today. It is still not possible to make accurate quantitative predictions about turbulent flows without relying heavily on empirical data. In principle, it is possible to obtain turbulent solutions from a direct numerical simulation (DNS) of the N.-S. equations. The author first provides a brief introduction to the dynamics of turbulent flows. The N.-S. equations, which govern fluid flow, are described thereafter. Then he gives a brief overview of DNS calculations and where they stand at present. He next introduces the two most popular approaches for doing turbulent computations currently in use, namely, Reynolds averaging of the N.-S. equations (RANS) and large-eddy simulation (LES). Approximations, often ad hoc ones, are present in these methods because use is made of heuristic models for turbulence quantities (the Reynolds stresses) which are otherwise unknown. He then introduces a new computational method called additive turbulent decomposition (ATD), the small-scale version of which is the topic of this research. The rest of the thesis is organized as follows. In Chapter 2 he describes the ATD procedure in greater detail: how the dependent variables are split and the decomposition into large- and small-scale sets of equations. In Chapter 3 the spectral projection of the small-scale momentum equations is derived in detail. In Chapter 4 results of the computations with the small-scale ATD equations are presented. In Chapter 5 he describes the data-fitting procedure which can be used to directly specify the parameters of a chaotic-map turbulence model.

  17. Average-Case Complexity Versus Approximate Simulation of Commuting Quantum Computations.

    PubMed

    Bremner, Michael J; Montanaro, Ashley; Shepherd, Dan J

    2016-08-19

    We use the class of commuting quantum computations known as IQP (instantaneous quantum polynomial time) to strengthen the conjecture that quantum computers are hard to simulate classically. We show that, if either of two plausible average-case hardness conjectures holds, then IQP computations are hard to simulate classically up to constant additive error. One conjecture relates to the hardness of estimating the complex-temperature partition function for random instances of the Ising model; the other concerns approximating the number of zeroes of random low-degree polynomials. We observe that both conjectures can be shown to be valid in the setting of worst-case complexity. We arrive at these conjectures by deriving spin-based generalizations of the boson sampling problem that avoid the so-called permanent anticoncentration conjecture. PMID:27588839

  18. Accelerating Climate and Weather Simulations through Hybrid Computing

    NASA Technical Reports Server (NTRS)

    Zhou, Shujia; Cruz, Carlos; Duffy, Daniel; Tucker, Robert; Purcell, Mark

    2011-01-01

    Unconventional multi- and many-core processors (e.g. IBM (R) Cell B.E.(TM) and NVIDIA (R) GPU) have emerged as effective accelerators in trial climate and weather simulations. Yet these climate and weather models typically run on parallel computers with conventional processors (e.g. Intel, AMD, and IBM) using Message Passing Interface. To address challenges involved in efficiently and easily connecting accelerators to parallel computers, we investigated using IBM's Dynamic Application Virtualization (TM) (IBM DAV) software in a prototype hybrid computing system with representative climate and weather model components. The hybrid system comprises two Intel blades and two IBM QS22 Cell B.E. blades, connected with both InfiniBand(R) (IB) and 1-Gigabit Ethernet. The system significantly accelerates a solar radiation model component by offloading compute-intensive calculations to the Cell blades. Systematic tests show that IBM DAV can seamlessly offload compute-intensive calculations from Intel blades to Cell B.E. blades in a scalable, load-balanced manner. However, noticeable communication overhead was observed, mainly due to IP over the IB protocol. Full utilization of IB Sockets Direct Protocol and the lower latency production version of IBM DAV will reduce this overhead.

  19. Further developments in cloud statistics for computer simulations

    NASA Technical Reports Server (NTRS)

    Chang, D. T.; Willand, J. H.

    1972-01-01

    This study is a part of NASA's continued program to provide global statistics of cloud parameters for computer simulation. The primary emphasis was on the development of the data bank of the global statistical distributions of cloud types and cloud layers and their applications in the simulation of the vertical distributions of in-cloud parameters such as liquid water content. These statistics were compiled from actual surface observations as recorded in Standard WBAN forms. Data for a total of 19 stations were obtained and reduced. These stations were selected to be representative of the 19 primary cloud climatological regions defined in previous studies of cloud statistics. Using the data compiled in this study, a limited study was conducted of the homogeneity of cloud regions, the latitudinal dependence of cloud-type distributions, the dependence of these statistics on sample size, and other factors in the statistics which are of significance to the problem of simulation. The application of the statistics in cloud simulation was investigated. In particular, the inclusion of the new statistics in an expanded multi-step Monte Carlo simulation scheme is suggested and briefly outlined.
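    A multi-step Monte Carlo scheme of the kind suggested here begins by drawing cloud properties from tabulated statistics. The sketch below is hypothetical: the cloud-type frequencies are invented placeholders, not values from the report's data bank, and a real scheme would then condition layer and in-cloud draws on the sampled type:

```python
import random

random.seed(1)

# Hypothetical cloud-type frequencies for one climatological region
# (illustrative placeholders, not values from the report's data bank).
CLOUD_TYPE_PROBS = [("clear", 0.30), ("cumulus", 0.25),
                    ("stratus", 0.25), ("cirrus", 0.20)]

def sample_cloud_type():
    # Inverse-CDF draw from the tabulated distribution.
    r = random.random()
    cum = 0.0
    for ctype, p in CLOUD_TYPE_PROBS:
        cum += p
        if r < cum:
            return ctype
    return CLOUD_TYPE_PROBS[-1][0]   # guard against rounding

def simulate(n):
    counts = {ctype: 0 for ctype, _ in CLOUD_TYPE_PROBS}
    for _ in range(n):
        counts[sample_cloud_type()] += 1
    return counts

counts = simulate(10000)
print(counts)
```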

  20. Three Dimensional Computer Graphics Federates for the 2012 Smackdown Simulation

    NASA Technical Reports Server (NTRS)

    Fordyce, Crystal; Govindaiah, Swetha; Muratet, Sean; O'Neil, Daniel A.; Schricker, Bradley C.

    2012-01-01

    The Simulation Interoperability Standards Organization (SISO) Smackdown is a two-year-old annual event held at the 2012 Spring Simulation Interoperability Workshop (SIW). A primary objective of the Smackdown event is to provide college students with hands-on experience in developing distributed simulations using High Level Architecture (HLA). Participating for the second time, the University of Alabama in Huntsville (UAHuntsville) deployed four federates: two simulated a communications server and a lunar communications satellite with a radio, while the other two generated 3D computer graphics displays for the communication satellite constellation and for the surface-based lunar resupply mission. Using the Light-Weight Java Graphics Library, the satellite display federate presented a lunar-texture mapped sphere of the moon and four Telemetry Data Relay Satellites (TDRS), which received object attributes from the lunar communications satellite federate to drive their motion. The surface mission display federate was an enhanced version of the federate developed by ForwardSim, Inc. for the 2011 Smackdown simulation. Enhancements included a dead-reckoning algorithm and a visual indication of which communication satellite was in line of sight of Hadley Rille. This paper concentrates on these two federates by describing the functions, algorithms, HLA object attributes received from other federates, development experiences and recommendations for future participating Smackdown teams.
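    A first-order (constant-velocity) dead-reckoning algorithm of the kind mentioned above can be sketched as follows; the threshold and state values are illustrative assumptions, not those used in the Smackdown federate:

```python
# First-order (constant-velocity) dead reckoning between HLA attribute updates.
# Threshold and state values are illustrative, not the federate's actual numbers.

def extrapolate(pos, vel, dt):
    # Receiver-side estimate of a remote entity's position.
    return [p + v * dt for p, v in zip(pos, vel)]

def needs_update(true_pos, dr_pos, threshold=0.5):
    # Sender publishes a fresh attribute update only when its own
    # dead-reckoned ghost drifts past the agreed threshold.
    err2 = sum((t - d) ** 2 for t, d in zip(true_pos, dr_pos))
    return err2 > threshold ** 2

last_update_pos, last_update_vel = [0.0, 0.0, 0.0], [1.0, 0.5, 0.0]
ghost = extrapolate(last_update_pos, last_update_vel, 2.0)
print(ghost, needs_update([2.0, 1.2, 0.0], ghost))
```

Between updates every federate extrapolates with the same rule, so network traffic drops while displays stay visually smooth.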

  1. Computational simulation of materials notes for lectures given at UCSB, May 1996--June 1996

    SciTech Connect

    LeSar, R.

    1997-01-01

    This report presents information from a lecture given on the computational simulation of materials. The purpose is to introduce modern computerized simulation methods for materials properties and response.

  2. Scientific and computational challenges of the fusion simulation project (FSP)

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2008-07-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Project (FSP). The primary objective is to develop advanced software designed to use leadership-class computers for carrying out multiscale physics simulations to provide information vital to delivering a realistic integrated fusion simulation model with unprecedented physics fidelity. This multiphysics capability will be unprecedented in that in the current FES applications domain, the largest-scale codes are used to carry out first-principles simulations of mostly individual phenomena in realistic 3D geometry while the integrated models are much smaller-scale, lower-dimensionality codes with significant empirical elements used for modeling and designing experiments. The FSP is expected to be the most up-to-date embodiment of the theoretical and experimental understanding of magnetically confined thermonuclear plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing a reliable ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices on all relevant time and space scales. From a computational perspective, the fusion energy science application goal to produce high-fidelity, whole-device modeling capabilities will demand computing resources in the petascale range and beyond, together with the associated multicore algorithmic formulation needed to address burning plasma issues relevant to ITER — a multibillion dollar collaborative device involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied

  3. Computer simulations of realistic three-dimensional microstructures

    NASA Astrophysics Data System (ADS)

    Mao, Yuxiong

    A novel and efficient methodology is developed for computer simulations of realistic two-dimensional (2D) and three-dimensional (3D) microstructures. The simulations incorporate realistic 2D and 3D complex morphologies/shapes, spatial patterns, anisotropy, volume fractions, and size distributions of the microstructural features statistically similar to those in the corresponding real microstructures. The methodology permits simulations of sufficiently large 2D as well as 3D microstructural windows that incorporate short-range (on the order of particle/feature size) as well as long-range (hundred times the particle/feature size) microstructural heterogeneities and spatial patterns at high resolution. The utility of the technique has been successfully demonstrated through its application to the 2D microstructures of the constituent particles in wrought Al-alloys, the 3D microstructure of discontinuously reinforced Al-alloy (DRA) composites containing SiC particles that have complex 3D shapes/morphologies and spatial clustering, and 3D microstructure of boron modified Ti-6Al-4V composites containing fine TiB whiskers and coarse primary TiB particles. The simulation parameters are correlated with the materials processing parameters (such as composition, particle size ratio, extrusion ratio, extrusion temperature, etc.), which enables the simulations of rational virtual 3D microstructures for the parametric studies on microstructure-properties relationships. The simulated microstructures have been implemented in the 3D finite-elements (FE)-based framework for simulations of micro-mechanical response and stress-strain curves. Finally, a new unbiased and assumption free dual-scale virtual cycloids probe for estimating surface area of 3D objects constructed by 2D serial section images is also presented.

  4. Quantum game simulator, using the circuit model of quantum computation

    NASA Astrophysics Data System (ADS)

    Vlachos, Panagiotis; Karafyllidis, Ioannis G.

    2009-10-01

    We present a general two-player quantum game simulator that can simulate any two-player quantum game described by a 2×2 payoff matrix (two-strategy games). The user can determine the payoff matrices for both players, their strategies and the amount of entanglement between their initial strategies. The outputs of the simulator are the expected payoffs of each player as a function of the other player's strategy parameters and the amount of entanglement. The simulator also produces contour plots that divide the strategy spaces of the game in regions in which players can get larger payoffs if they choose to use a quantum strategy against any classical one. We also apply the simulator to two well-known quantum games, the Battle of the Sexes and the Chicken game. Program summary: Program title: Quantum Game Simulator (QGS) Catalogue identifier: AEED_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEED_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 3416 No. of bytes in distributed program, including test data, etc.: 583 553 Distribution format: tar.gz Programming language: Matlab R2008a (C) Computer: Any computer that can sufficiently run Matlab R2008a Operating system: Any system that can sufficiently run Matlab R2008a Classification: 4.15 Nature of problem: Simulation of two-player quantum games described by a payoff matrix. Solution method: The program calculates the matrices that comprise the Eisert setup for quantum games based on the quantum circuit model. There are 5 parameters that can be altered. We define 3 of them as constant. We play the quantum game for all possible values of the other 2 parameters and store the results in a matrix. Unusual features: The software provides an easy way of simulating any two-player quantum game.
Running time: Approximately
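    The Eisert circuit underlying this kind of simulator can be reproduced in a few lines of linear algebra. The sketch below is an independent illustration (not the distributed Matlab code): it evaluates the expected payoffs for one strategy pair of a Prisoner's-Dilemma-style game at maximal entanglement, with the payoff values chosen as a conventional example:

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)

def J(gamma):
    # Entangling gate of the Eisert-Wilkens-Lewenstein scheme.
    return np.cos(gamma / 2) * np.kron(I2, I2) + 1j * np.sin(gamma / 2) * np.kron(X, X)

def U(theta, phi):
    # Two-parameter strategy unitary; theta = pi, phi = 0 is classical "defect".
    return np.array([[np.exp(1j * phi) * np.cos(theta / 2), np.sin(theta / 2)],
                     [-np.sin(theta / 2), np.exp(-1j * phi) * np.cos(theta / 2)]])

def expected_payoffs(ua, ub, payoff_a, payoff_b, gamma):
    psi0 = np.zeros(4, dtype=complex)
    psi0[0] = 1.0                                   # start in |CC>
    psi = J(gamma).conj().T @ (np.kron(ua, ub) @ (J(gamma) @ psi0))
    probs = np.abs(psi) ** 2                        # outcomes CC, CD, DC, DD
    return probs @ payoff_a, probs @ payoff_b

# Prisoner's-dilemma payoffs for outcomes (CC, CD, DC, DD).
pa = np.array([3.0, 0.0, 5.0, 1.0])
pb = np.array([3.0, 5.0, 0.0, 1.0])

# Mutual classical defection at maximal entanglement (gamma = pi/2).
a, b = expected_payoffs(U(np.pi, 0.0), U(np.pi, 0.0), pa, pb, np.pi / 2)
print(round(float(a), 3), round(float(b), 3))
```

Sweeping the strategy parameters over a grid instead of a single pair gives the payoff surfaces and contour plots the abstract describes.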

  5. Use of a Computer Simulation To Develop Mental Simulations for Understanding Relative Motion Concepts.

    ERIC Educational Resources Information Center

    Monaghan, James M.; Clement, John

    1999-01-01

    Presents evidence for students' qualitative and quantitative difficulties with apparently simple one-dimensional relative-motion problems, students' spontaneous visualization of relative-motion problems, the visualizations facilitating solution of these problems, and students' memories of the online computer simulation used as a framework for…

  6. Benchmarking computational fluid dynamics models for lava flow simulation

    NASA Astrophysics Data System (ADS)

    Dietterich, Hannah; Lev, Einat; Chen, Jiangzhi

    2016-04-01

    Numerical simulations of lava flow emplacement are valuable for assessing lava flow hazards, forecasting active flows, interpreting past eruptions, and understanding the controls on lava flow behavior. Existing lava flow models vary in simplifying assumptions, physics, dimensionality, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess existing models and guide the development of new codes, we conduct a benchmarking study of computational fluid dynamics models for lava flow emplacement, including VolcFlow, OpenFOAM, FLOW-3D, and COMSOL. Using the new benchmark scenarios defined in Cordonnier et al. (Geol Soc SP, 2015) as a guide, we model viscous, cooling, and solidifying flows over horizontal and sloping surfaces, topographic obstacles, and digital elevation models of natural topography. We compare model results to analytical theory, analogue and molten basalt experiments, and measurements from natural lava flows. Overall, the models accurately simulate viscous flow with some variability in flow thickness where flows intersect obstacles. OpenFOAM, COMSOL, and FLOW-3D can each reproduce experimental measurements of cooling viscous flows, and FLOW-3D simulations with temperature-dependent rheology match results from molten basalt experiments. We can apply these models to reconstruct past lava flows in Hawai'i and Saudi Arabia using parameters assembled from morphology, textural analysis, and eruption observations as natural test cases. Our study highlights the strengths and weaknesses of each code, including accuracy and computational costs, and provides insights regarding code selection.

  7. Trace contaminant control simulation computer program, version 8.1

    NASA Technical Reports Server (NTRS)

    Perry, J. L.

    1994-01-01

    The Trace Contaminant Control Simulation computer program is a tool for assessing the performance of various process technologies for removing trace chemical contamination from a spacecraft cabin atmosphere. Included in the simulation are chemical and physical adsorption by activated charcoal, chemical adsorption by lithium hydroxide, absorption by humidity condensate, and low- and high-temperature catalytic oxidation. Means are provided for simulating regenerable as well as nonregenerable systems. The program provides an overall mass balance of chemical contaminants in a spacecraft cabin given specified generation rates. Removal rates are based on device flow rates specified by the user and calculated removal efficiencies based on cabin concentration and removal technology experimental data. Versions 1.0 through 8.0 are documented in NASA TM-108409. TM-108409 also contains a source file listing for version 8.0. Changes to version 8.0 are documented in this technical memorandum and a source file listing for the modified version, version 8.1, is provided. Detailed descriptions for the computer program subprograms are extracted from TM-108409 and modified as necessary to reflect version 8.1. Version 8.1 supersedes version 8.0. Information on a separate user's guide is available from the author.
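    The overall mass balance such a program computes can be illustrated with a single well-mixed contaminant and one removal device. The sketch below is a toy model with invented numbers, not the TCCS code itself:

```python
# Single-contaminant, well-mixed cabin mass balance (toy numbers, not TCCS data):
#   V dC/dt = G - eta * Q * C
V = 100.0   # cabin free volume, m^3
G = 0.5     # contaminant generation rate, mg/h
Q = 10.0    # flow rate through the removal device, m^3/h
eta = 0.8   # single-pass removal efficiency

def cabin_concentration(hours, dt=0.01):
    c = 0.0                                  # mg/m^3, clean cabin at t = 0
    steps = int(hours / dt)
    for _ in range(steps):
        c += dt * (G - eta * Q * c) / V      # explicit Euler step
    return c

steady_state = G / (eta * Q)                 # analytic equilibrium
print(round(cabin_concentration(200.0), 4), round(steady_state, 4))
```

The real program tracks many contaminants and several devices, with efficiencies that depend on cabin concentration and experimental data, but each follows this same balance of generation against removal.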

  8. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    SciTech Connect

    Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Zumao Chen; Temi Linjewile; Adel Sarofim; Bene Risio

    2003-04-25

    This is the tenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on all aspects of the project. Calculations for a full Vision 21 plant configuration have been performed for two gasifier types. An improved process model for simulating entrained flow gasifiers has been implemented into the workbench. Model development has focused on: a pre-processor module to compute global gasification parameters from standard fuel properties and intrinsic rate information; a membrane based water gas shift; and reactors to oxidize fuel cell exhaust gas. The data visualization capabilities of the workbench have been extended by implementing the VTK visualization software that supports advanced visualization methods, including inexpensive Virtual Reality techniques. The ease-of-use, functionality and plug-and-play features of the workbench were highlighted through demonstrations of the workbench at a DOE sponsored coal utilization conference. A white paper has been completed that contains recommendations on the use of component architectures, model interface protocols and software frameworks for developing a Vision 21 plant simulator.

  9. Simulating Subsurface Reactive Flows on Ultrascale Computers with PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Mills, R. T.; Hammond, G. E.; Lichtner, P. C.; Lu, C.; Smith, B. F.; Philip, B.

    2009-12-01

    To provide true predictive utility, subsurface simulations often must accurately resolve--in three dimensions--complicated, multi-phase flow fields in highly heterogeneous geology with numerous chemical species and complex chemistry. This task is especially daunting because of the wide range of spatial scales involved--from the pore scale to the field scale--ranging over six orders of magnitude, and the wide range of time scales ranging from seconds or less to millions of years. This represents a true "Grand Challenge" computational problem, requiring not only the largest-scale ("ultrascale") supercomputers, but accompanying advances in algorithms for the efficient numerical solution of systems of PDEs using these machines, and in mathematical modeling techniques that can adequately capture the truly multi-scale nature of these problems. We describe some of the specific challenges involved and present the software and algorithmic approaches being used in the computer code PFLOTRAN to provide scalable performance for such simulations on tens of thousands of processors. We focus particularly on scalable techniques for solving the large (up to billions of total degrees of freedom), sparse algebraic systems that arise. We also describe ongoing work to address disparate time and spatial scales by both the development of adaptive mesh refinement methods and the use of multiple continuum formulations. Finally, we present some examples from recent simulations conducted on Jaguar, the 150,152-processor-core Cray XT5 system at Oak Ridge National Laboratory that is currently one of the most powerful supercomputers in the world.

  10. Textbook Multigrid Efficiency for Computational Fluid Dynamics Simulations

    NASA Technical Reports Server (NTRS)

    Brandt, Achi; Thomas, James L.; Diskin, Boris

    2001-01-01

    Considerable progress over the past thirty years has been made in the development of large-scale computational fluid dynamics (CFD) solvers for the Euler and Navier-Stokes equations. Computations are used routinely to design the cruise shapes of transport aircraft through complex-geometry simulations involving the solution of 25-100 million equations; in this arena the number of wind-tunnel tests for a new design has been substantially reduced. However, simulations of the entire flight envelope of the vehicle, including maximum lift, buffet onset, flutter, and control effectiveness have not been as successful in eliminating the reliance on wind-tunnel testing. These simulations involve unsteady flows with more separation and stronger shock waves than at cruise. The main reasons limiting further inroads of CFD into the design process are: (1) the reliability of turbulence models; and (2) the time and expense of the numerical simulation. Because of the prohibitive resolution requirements of direct simulations at high Reynolds numbers, transition and turbulence modeling is expected to remain an issue for the near term. The focus of this paper addresses the latter problem by attempting to attain optimal efficiencies in solving the governing equations. Typically current CFD codes based on the use of multigrid acceleration techniques and multistage Runge-Kutta time-stepping schemes are able to converge lift and drag values for cruise configurations within approximately 1000 residual evaluations. An optimally convergent method is defined as having textbook multigrid efficiency (TME), meaning the solutions to the governing system of equations are attained in a computational work which is a small (less than 10) multiple of the operation count in the discretized system of equations (residual equations). 
In this paper, a distributed relaxation approach to achieving TME for the Reynolds-averaged Navier-Stokes (RANS) equations is discussed along with the foundations that form the
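    The multigrid cycling that underlies textbook efficiency can be illustrated on a model problem. The following sketch is not from the paper: it applies V-cycles with weighted-Jacobi smoothing, full-weighting restriction, and linear-interpolation prolongation to the 1D Poisson equation, driving the algebraic error below the discretization error in a handful of cycles:

```python
import numpy as np

def relax(u, f, h, sweeps):
    # Weighted-Jacobi smoothing for -u'' = f on a uniform grid.
    for _ in range(sweeps):
        u[1:-1] += (2.0 / 3.0) * (0.5 * (u[:-2] + u[2:] + h * h * f[1:-1]) - u[1:-1])
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2.0 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
    return r

def v_cycle(u, f, h):
    if len(u) <= 3:
        u[1] = 0.5 * (u[0] + u[2] + h * h * f[1])    # exact coarsest solve
        return u
    u = relax(u, f, h, 3)
    r = residual(u, f, h)
    rc = r[::2].copy()                               # full-weighting restriction
    rc[1:-1] = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]
    ec = v_cycle(np.zeros_like(rc), rc, 2.0 * h)     # coarse-grid correction
    e = np.zeros_like(u)
    e[::2] = ec                                      # linear-interpolation prolongation
    e[1::2] = 0.5 * (ec[:-1] + ec[1:])
    u += e
    return relax(u, f, h, 3)

n = 129
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
f = np.pi ** 2 * np.sin(np.pi * x)                   # manufactured: u = sin(pi x)
u = np.zeros(n)
for _ in range(10):
    u = v_cycle(u, f, h)
err = float(np.max(np.abs(u - np.sin(np.pi * x))))
print(err)
```

TME for the RANS equations requires much more (distributed relaxation tailored to each factor of the system), but the cycle structure is the same.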

  11. An analysis of the 70-meter antenna hydrostatic bearing by means of computer simulation

    NASA Technical Reports Server (NTRS)

    Bartos, R. D.

    1993-01-01

    Recently, the computer program 'A Computer Solution for Hydrostatic Bearings with Variable Film Thickness,' used to design the hydrostatic bearing of the 70-meter antennas, was modified to improve the accuracy with which the program predicts the film height profile and oil pressure distribution between the hydrostatic bearing pad and the runner. This article presents a description of the modified computer program, the theory upon which its computations are based, computer simulation results, and a discussion of those results.

  12. Fast computer simulation of reconstructed image from rainbow hologram based on GPU

    NASA Astrophysics Data System (ADS)

    Shuming, Jiao; Yoshikawa, Hiroshi

    2015-10-01

    A fast computer simulation solution for rainbow hologram reconstruction based on GPU is proposed. In the commonly used segment Fourier transform method for rainbow hologram reconstruction, the computation of the 2D Fourier transform on each hologram segment is very time consuming. GPU-based parallel computing can be applied to improve the computing speed. Simulation results indicate that, compared with CPU computing, the proposed GPU implementation reduces the computation time by a factor of up to eight.
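    The per-segment transforms are independent of one another, which is what makes the step GPU-friendly. A CPU-side sketch of the tiling (illustrative only; the hologram data and segment size are invented stand-ins):

```python
import numpy as np

rng = np.random.default_rng(0)

def segment_fft(hologram, seg):
    # 2D FFT of each seg x seg tile independently. The tiles share no data,
    # so on a GPU each one can be handled by a separate batched transform.
    h, w = hologram.shape
    out = np.empty((h, w), dtype=complex)
    for i in range(0, h, seg):
        for j in range(0, w, seg):
            out[i:i + seg, j:j + seg] = np.fft.fft2(hologram[i:i + seg, j:j + seg])
    return out

holo = rng.standard_normal((256, 256))   # stand-in hologram, not real fringe data
spectra = segment_fft(holo, 64)
print(spectra.shape)
```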

  13. A digital computer program for the dynamic interaction simulation of controls and structure (DISCOS), volume 1

    NASA Technical Reports Server (NTRS)

    Bodley, C. S.; Devers, A. D.; Park, A. C.; Frisch, H. P.

    1978-01-01

    A theoretical development and associated digital computer program system for the dynamic simulation and stability analysis of passive and actively controlled spacecraft are presented. The dynamic system (spacecraft) is modeled as an assembly of rigid and/or flexible bodies not necessarily in a topological tree configuration. The computer program system is used to investigate total system dynamic characteristics, including interaction effects between rigid and/or flexible bodies, control systems, and a wide range of environmental loadings. In addition, the program system is used for designing attitude control systems and for evaluating total dynamic system performance, including time domain response and frequency domain stability analyses.

  14. Computing the acoustic radiation force exerted on a sphere using the translational addition theorem.

    PubMed

    Silva, Glauber T; Baggio, André L; Lopes, J Henrique; Mitri, Farid G

    2015-03-01

    In this paper, the translational addition theorem for spherical functions is employed to calculate the acoustic radiation force produced by an arbitrary shaped beam on a sphere arbitrarily suspended in an inviscid fluid. The procedure is also based on the partial-wave expansion method, which depends on the beam-shape and scattering coefficients. Given a set of beam-shape coefficients (BSCs) for an acoustic beam relative to a reference frame, the translational addition theorem can be used to obtain the BSCs relative to the sphere positioned anywhere in the medium. The scattering coefficients are obtained from the acoustic boundary conditions across the sphere's surface. The method based on the addition theorem is particularly useful to avoid quadrature schemes to obtain the BSCs. We use it to compute the acoustic radiation force exerted by a spherically focused beam (in the paraxial approximation) on a silicone-oil droplet (compressible fluid sphere). The analysis is carried out in the Rayleigh (i.e., the particle diameter is much smaller than the wavelength) and Mie (i.e., the particle diameter is of the order of the wavelength or larger) scattering regimes. The obtained results show that the paraxial focused beam can only trap particles in the Rayleigh scattering regime. PMID:25768823

  15. Krylov subspace methods for computing hydrodynamic interactions in Brownian dynamics simulations

    PubMed Central

    Ando, Tadashi; Chow, Edmond; Saad, Yousef; Skolnick, Jeffrey

    2012-01-01

    Hydrodynamic interactions play an important role in the dynamics of macromolecules. The most common way to take into account hydrodynamic effects in molecular simulations is in the context of a Brownian dynamics simulation. However, the calculation of correlated Brownian noise vectors in these simulations is computationally very demanding and alternative methods are desirable. This paper studies methods based on Krylov subspaces for computing Brownian noise vectors. These methods are related to Chebyshev polynomial approximations, but do not require eigenvalue estimates. We show that only low accuracy is required in the Brownian noise vectors to accurately compute values of dynamic and static properties of polymer and monodisperse suspension models. With this level of accuracy, the computational time of Krylov subspace methods scales very nearly as O(N²) for the number of particles N up to 10 000, which was the limit tested. The performance of the Krylov subspace methods, especially the “block” version, is slightly better than that of the Chebyshev method, even without taking into account the additional cost of eigenvalue estimates required by the latter. Furthermore, at N = 10 000, the Krylov subspace method is 13 times faster than the exact Cholesky method. Thus, Krylov subspace methods are recommended for performing large-scale Brownian dynamics simulations with hydrodynamic interactions. PMID:22897254
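    The Krylov idea is to approximate the correlated noise vector D^(1/2)z from a small Lanczos tridiagonalization, avoiding both a Cholesky factorization and the eigenvalue estimates Chebyshev methods need. A minimal sketch, using an invented SPD matrix as a stand-in for the diffusion tensor:

```python
import numpy as np

rng = np.random.default_rng(2)

def lanczos_sqrt_mv(D, z, m):
    # Approximate D^{1/2} z from an m-step Lanczos tridiagonalization:
    # D^{1/2} z ~= ||z|| * V_m @ sqrt(T_m) @ e_1.
    n = len(z)
    V = np.zeros((n, m))
    alpha = np.zeros(m)
    beta = np.zeros(m - 1)
    V[:, 0] = z / np.linalg.norm(z)
    for j in range(m):
        w = D @ V[:, j]
        alpha[j] = V[:, j] @ w
        w = w - alpha[j] * V[:, j]
        if j > 0:
            w = w - beta[j - 1] * V[:, j - 1]
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            V[:, j + 1] = w / beta[j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    tvals, tvecs = np.linalg.eigh(T)              # small m x m eigenproblem
    sqrtT = tvecs @ np.diag(np.sqrt(tvals)) @ tvecs.T
    return np.linalg.norm(z) * (V @ sqrtT[:, 0])

# Invented SPD stand-in for a diffusion tensor (not the RPY tensor itself).
n = 200
A = rng.standard_normal((n, n))
D = A @ A.T / n + np.eye(n)

z = rng.standard_normal(n)                        # uncorrelated Gaussian noise
evals, evecs = np.linalg.eigh(D)
exact = evecs @ (np.sqrt(evals) * (evecs.T @ z))  # reference D^{1/2} z
approx = lanczos_sqrt_mv(D, z, 30)
rel_err = float(np.linalg.norm(approx - exact) / np.linalg.norm(exact))
print(rel_err)
```

Only matrix-vector products with D are needed, and as the paper notes, modest accuracy in the noise vector already suffices for accurate dynamic and static properties.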

  16. Laser Additive Melting and Solidification of Inconel 718: Finite Element Simulation and Experiment

    NASA Astrophysics Data System (ADS)

    Romano, John; Ladani, Leila; Sadowski, Magda

    2016-03-01

    The field of powdered metal additive manufacturing is experiencing a surge in public interest finding uses in aerospace, defense, and biomedical industries. The relative youth of the technology coupled with public interest makes the field a vibrant research topic. The authors have expanded upon previously published finite element models used to analyze the processing of novel engineering materials through the use of laser- and electron beam-based additive manufacturing. In this work, the authors present a model for simulating fabrication of Inconel 718 using laser melting processes. Thermal transport phenomena and melt pool geometries are discussed and validation against experimental findings is presented. After comparing experimental and simulation results, the authors present two correction correlations to transform the modeling results into meaningful predictions of actual laser melting melt pool geometries in Inconel 718.

  17. Computer simulation of channeling spectra with the cassis program

    NASA Astrophysics Data System (ADS)

    Kling, A.; Soares, J. C.; Silva, M. F. Da

    The recently developed Monte Carlo program CASSIS for the simulation of channeling phenomena was upgraded to enable the generation of energy spectra for Rutherford backscattering in channeling conditions. The basic concepts of the added module are described and discussed. The capability of the program is demonstrated by comparing experimental and simulated results for simple bulk materials such as diamond and silicon, as well as for the trigonal lithium niobate lattice. In all three cases excellent agreement between experiment and calculation was obtained. Additionally, spectra measured for a fully strained Si0.75Ge0.25 layer on silicon, which shows interesting channeling effects such as catastrophic dechanneling, were successfully simulated.

  18. A Computer-Based Simulation of an Acid-Base Titration

    ERIC Educational Resources Information Center

    Boblick, John M.

    1971-01-01

    Reviews the advantages of computer simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)

  19. Relative performances of several scientific computers for a liquid molecular dynamics simulation. [Computers tested are: VAX 11/70, CDC 7600, CRAY-1, CRAY-1*, VAX-FPSAP

    SciTech Connect

    Ceperley, D.M.

    1980-08-01

    Some of the computational characteristics of simulations, and the author's experience in using his standard simulation program CLAMPS on several scientific computers, are discussed. CLAMPS is capable of performing Metropolis Monte Carlo and molecular dynamics simulations of arbitrary mixtures of single atoms. What makes a good simulation computer is also summarized.
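    The Metropolis Monte Carlo machinery that a code like CLAMPS implements can be shown in miniature. The sketch below samples a single particle in a harmonic well U(x) = x²/2 and is only an illustration of the accept/reject rule, not CLAMPS itself.

```python
import numpy as np

def metropolis_harmonic(n_steps, beta=1.0, step=1.0, seed=0):
    """Metropolis sampling of exp(-beta*U(x)) for U(x) = x^2 / 2."""
    rng = np.random.default_rng(seed)
    x = 0.0
    samples = np.empty(n_steps)
    for i in range(n_steps):
        trial = x + rng.uniform(-step, step)   # symmetric proposal
        d_u = 0.5 * (trial**2 - x**2)          # energy change
        # Accept downhill moves always, uphill with Boltzmann probability.
        if d_u <= 0.0 or rng.random() < np.exp(-beta * d_u):
            x = trial
        samples[i] = x
    return samples
```

    For beta = 1 the stationary distribution is a unit Gaussian, so after a burn-in the sample variance should approach 1, a quick sanity check on any Metropolis implementation.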

  20. Computational strategies in the dynamic simulation of constrained flexible MBS

    NASA Technical Reports Server (NTRS)

    Amirouche, F. M. L.; Xie, M.

    1993-01-01

    This research focuses on the computational dynamics of flexible constrained multibody systems. First, a recursive mapping formulation of the kinematical expressions in a minimum dimension, together with the matrix representation of the equations of motion, is presented. The method employs Kane's equations, the finite element method, and concepts of continuum mechanics. The generalized active forces are extended to include the effects of high-temperature conditions, such as creep, thermal stress, and elastic-plastic deformation. The time-variant constraint relations for rolling/contact conditions between two flexible bodies are also studied. Constraints for validating MBS simulation of gear meshing contact using a modified Timoshenko beam theory are also presented. The last part deals with minimizing vibration/deformation of the elastic beam in multibody systems by making use of time-variant boundary conditions. The methodologies and computational procedures developed are being implemented in a program called DYAMUS.

  1. Large Eddy Simulation in the Computation of Jet Noise

    NASA Technical Reports Server (NTRS)

    Mankbadi, R. R.; Goldstein, M. E.; Povinelli, L. A.; Hayder, M. E.; Turkel, E.

    1999-01-01

    In principle, noise can be predicted by solving the full (time-dependent) compressible Navier-Stokes equations (FCNSE) with the computational domain extended to the far field: the fluctuating near field of the jet produces propagating pressure waves that become far-field sound, so the fluctuating flow field as a function of time is needed to calculate sound from first principles. However, extending the domain to the far field is not feasible, and at the high Reynolds numbers of technological interest turbulence has a large range of scales, so direct numerical simulation (DNS) cannot capture the small scales of turbulence. Because the large scales are more efficient than the small scales in radiating sound, the emphasis is thus on calculating the sound radiated by the large scales.

  2. Probabilistic lifetime strength of aerospace materials via computational simulation

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Keating, Jerome P.; Lovelace, Thomas B.; Bast, Callie C.

    1991-01-01

    The results of a second year effort of a research program are presented. The research included development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic phenomenological constitutive relationship, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects of primitive variables. These primitive variables often originate in the environment and may include stress from loading, temperature, chemical, or radiation attack. This multifactor interaction constitutive equation is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the constitutive equation using actual experimental materials data together with the multiple linear regression of that data.

  3. Computational simulation of intermingled-fiber hybrid composite behavior

    NASA Technical Reports Server (NTRS)

    Mital, Subodh K.; Chamis, Christos C.

    1992-01-01

    Three-dimensional finite-element analysis and a micromechanics based computer code ICAN (Integrated Composite Analyzer) are used to predict the composite properties and microstresses of a unidirectional graphite/epoxy primary composite with varying percentages of S-glass fibers used as hybridizing fibers at a total fiber volume of 0.54. The three-dimensional finite-element model used in the analyses consists of a group of nine fibers, all unidirectional, in a three-by-three unit cell array. There is generally good agreement between the composite properties and microstresses obtained from both methods. The results indicate that the finite-element methods and the micromechanics equations embedded in the ICAN computer code can be used to obtain the properties of intermingled fiber hybrid composites needed for the analysis/design of hybrid composite structures. However, the finite-element model should be large enough to simulate the conditions assumed in the micromechanics equations.

  4. Simulation of computed radiography with imaging plate detectors

    SciTech Connect

    Tisseur, D.; Costin, M.; Mathy, F.; Schumm, A.

    2014-02-18

    Computed radiography (CR) using phosphor imaging plate detectors is taking an increasing place in radiographic testing. CR uses equipment similar to conventional radiography except that the classical X-ray film is replaced by a numerical detector, called an imaging plate (IP), which is made of a photostimulable layer and is read by a scanning device through photostimulated luminescence. Such digital radiography has already demonstrated important benefits in terms of exposure time, decrease of source energies and thus reduction of the radioprotection area, besides being a solution without effluents. This paper presents a model for the simulation of radiography with imaging plate detectors in CIVA, together with examples of validation of the model. The study consists of a cross comparison between experimental and simulation results obtained on a step wedge with a classical X-ray tube. Results are presented in particular with wire Image Quality Indicators (IQI) and duplex IQI.

  5. Adaptive quantum computation in changing environments using projective simulation

    PubMed Central

    Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

    2015-01-01

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks. PMID:26260263

  6. Computer simulation in template-directed oligonucleotide synthesis

    NASA Technical Reports Server (NTRS)

    Kanavarioti, Anastassia; Benasconi, Claude F.

    1990-01-01

    It is commonly assumed that template-directed polymerizations have played a key role in prebiotic evolution. A computer simulation that models up to 33 competing reactions was used to investigate the product distribution in a template-directed oligonucleotide synthesis as a function of time and concentration of the reactants. The study focuses on the poly(C)-directed elongation reaction of oligoguanylates, and how it is affected by the competing processes of hydrolysis and dimerization of the activated monomer, which have the potential of severely curtailing the elongation and reducing the size and yield of the synthesized polymers. The simulations show that realistic and probably prebiotically plausible conditions can be found where hydrolysis and dimerization are either negligible or where a high degree of polymerization can be attained even in the face of substantial hydrolysis and/or dimerization.
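    The competition between elongation and hydrolysis described above can be illustrated with a two-channel first-order kinetics sketch. The rate constants below are invented for illustration and are not the paper's fitted values; the full model tracks up to 33 coupled reactions.

```python
# Activated monomer A is consumed by two competing first-order channels:
# template-directed elongation (k_elong) and hydrolysis (k_hyd).
# Hypothetical rate constants (per hour), for illustration only.
k_elong, k_hyd = 0.5, 0.1

def integrate(a0=1.0, dt=1e-3, t_end=50.0):
    """Euler integration of d[A]/dt = -(k_elong + k_hyd)*[A], tracking
    how much monomer ends up in chains vs. hydrolyzed."""
    a, chains, hydrolyzed = a0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        da_e = k_elong * a * dt
        da_h = k_hyd * a * dt
        a -= da_e + da_h
        chains += da_e
        hydrolyzed += da_h
    return a, chains, hydrolyzed
```

    For parallel first-order channels the fraction incorporated into chains is exactly k_elong / (k_elong + k_hyd), so the polymerization yield depends only on the rate ratio, which is the quantity the competing-process simulations probe.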

  7. Computer Simulation of Bubble Growth in Metals Due to He

    SciTech Connect

    FOILES, STEPHEN M.; HOYT, JEFFREY J.

    2001-03-01

    Atomistic simulations of the growth of helium bubbles in metals are performed. The metal is represented by embedded atom method potentials for palladium. The helium bubbles are treated via an expanding repulsive spherical potential within the metal lattice. The simulations predict bubble pressures that decrease monotonically with increasing helium to metal ratios. The swelling of the material associated with the bubble growth is also computed. It is found that the rate of swelling increases with increasing helium to metal ratio consistent with experimental observations on the swelling of metal tritides. Finally, the detailed defect structure due to the bubble growth was investigated. Dislocation networks are observed to form that connect the bubbles. Unlike early model assumptions, prismatic loops between the bubbles are not retained. These predictions are compared to available experimental evidence.

  8. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    SciTech Connect

    Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Adel Sarofim; Bene Risio

    2002-07-28

    This is the seventh Quarterly Technical Report for DOE Cooperative Agreement No.: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on the development of the IGCC workbench. A series of parametric CFD simulations for single stage and two stage generic gasifier configurations have been performed. An advanced flowing slag model has been implemented into the CFD based gasifier model. A literature review has been performed on published gasification kinetics. Reactor models have been developed and implemented into the workbench for the majority of the heat exchangers, gas clean up system and power generation system for the Vision 21 reference configuration. Modifications to the software infrastructure of the workbench have been commenced to allow interfacing to the workbench reactor models that utilize the CAPE-Open software interface protocol.

  9. Software Development Processes Applied to Computational Icing Simulation

    NASA Technical Reports Server (NTRS)

    Levinson, Laurie H.; Potapezuk, Mark G.; Mellor, Pamela A.

    1999-01-01

    The development of computational icing simulation methods is making the transition from research use to commonplace use in design and certification efforts. As such, standards of code management, design validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. This paper discusses these concepts with regard to current and future icing simulation code development efforts as implemented by the Icing Branch of the NASA Lewis Research Center in collaboration with the NASA Lewis Engineering Design and Analysis Division. With the application of the techniques outlined in this paper, the LEWICE ice accretion code has become a more stable and reliable software product.

  10. Simulating the impacts of fire: A computer program

    NASA Astrophysics Data System (ADS)

    Ffolliott, Peter F.; Guertin, D. Phillip; Rasmussen, William D.

    1988-11-01

    Recurrent fire has played a dominant role in the ecology of southwestern ponderosa pine forests. To assess the benefits or losses of fire in these forests, a computer simulation model, called BURN, considers vegetation (mortality, regeneration, and production of herbaceous vegetation), wildlife (populations and habitats), and hydrology (streamflow and water quality). In the formulation of the model, graphical representations (time-trend response curves) of increases or losses (compared to an unburned control) after the occurrence of fire are converted to fixed-term annual ratios, and then to annuities for the simulation components. Annuity values higher than 1.0 indicate benefits, while annuity values lower than 1.0 indicate losses. Studies in southwestern ponderosa pine forests utilized in the development of BURN are described briefly.
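    The ratio-to-annuity conversion described above can be sketched with a standard present-value annuity formula. The discount rate and the post-fire response ratios below are invented for illustration; they are not values from BURN.

```python
# Convert a series of annual post-fire response ratios (burned/unburned)
# into an equivalent constant annuity via present value.
def annuity(ratios, rate=0.04):
    """Annuity A such that the PV of A for n years equals the PV of
    the year-by-year ratios (rate = annual discount rate)."""
    n = len(ratios)
    pv = sum(v / (1.0 + rate) ** (t + 1) for t, v in enumerate(ratios))
    return pv * rate / (1.0 - (1.0 + rate) ** (-n))

# Hypothetical herbage response: a short-lived gain that decays to parity.
herbage_ratios = [1.6, 1.4, 1.2, 1.1, 1.0]
```

    A series of all 1.0 ratios yields an annuity of exactly 1.0 (no net effect), so annuities above 1.0 read as benefits and below 1.0 as losses, matching the model's interpretation.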

  11. Scientific and Computational Challenges of the Fusion Simulation Program (FSP)

    SciTech Connect

    William M. Tang

    2011-02-09

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP) a major national initiative in the United States with the primary objective being to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that are properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e

  12. Computational Materials: Modeling and Simulation of Nanostructured Materials and Systems

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.; Hinkley, Jeffrey A.

    2003-01-01

    The paper provides details on the structure and implementation of the Computational Materials program at the NASA Langley Research Center. Examples are given that illustrate the suggested approaches to predicting the behavior and influencing the design of nanostructured materials such as high-performance polymers, composites, and nanotube-reinforced polymers. Primary simulation and measurement methods applicable to multi-scale modeling are outlined. Key challenges including verification and validation of models are highlighted and discussed within the context of NASA's broad mission objectives.

  13. Computer Simulation of Einstein-Podolsky-Rosen-Bohm Experiments

    NASA Astrophysics Data System (ADS)

    de Raedt, H.; Michielsen, K.

    2016-07-01

    We review an event-based simulation approach which reproduces the statistical distributions of quantum physics experiments by generating detection events one-by-one according to an unknown distribution and without solving a wave equation. Einstein-Podolsky-Rosen-Bohm laboratory experiments are used as an example to illustrate the applicability of this approach. It is shown that computer experiments that employ the same post-selection procedure as the one used in laboratory experiments produce data that is in excellent agreement with quantum theory.

  14. [Multiparticle computer simulation of protein interactions in the photosynthetic membrane].

    PubMed

    Riznichenko, G Iu; Kovalenko, I B; Abaturova, A M; D'iakonova, A N; Kniazeva, O S; Ustinin, D M; Khrushchev, S S; Rubin, A B

    2011-01-01

    The basic principles of the design of direct multiparticle models and the results of multiparticle computer simulation of electron transfer by mobile protein carriers in the photosynthetic membrane of a chloroplast thylakoid are presented. The reactions of complex formation of the protein plastocyanin with the protein cytochrome f and with the pigment-protein complex of photosystem I, as well as of the protein ferredoxin with the protein FNR and photosystem I, are considered. The role of diffusion and electrostatic interactions, and the effects of the shape of the reaction volume and of ionic strength on the rate of electron transport, are discussed. PMID:22117434

  15. Tracking Non-rigid Structures in Computer Simulations

    SciTech Connect

    Gezahegne, A; Kamath, C

    2008-01-10

    A key challenge in tracking moving objects is the correspondence problem, that is, the correct propagation of object labels from one time step to another. This is especially true when the objects are non-rigid structures, changing shape, and merging and splitting over time. In this work, we describe a general approach to tracking thousands of non-rigid structures in an image sequence. We show how we can minimize memory requirements and generate accurate results while working with only two frames of the sequence at a time. We demonstrate our results using data from computer simulations of a fluid-mix problem.
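    One simple two-frame correspondence rule consistent with the description above is maximal pixel overlap: each object in the current frame inherits the previous-frame label it overlaps most. The sketch below, with a naive flood-fill labeler, is a hypothetical illustration of that idea, not the authors' algorithm.

```python
import numpy as np

def label_components(mask):
    """4-connected component labeling of a boolean mask (raster order)."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for i, j in zip(*np.nonzero(mask)):
        if labels[i, j]:
            continue
        current += 1
        stack = [(i, j)]
        labels[i, j] = current
        while stack:
            a, b = stack.pop()
            for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                x, y = a + da, b + db
                if (0 <= x < mask.shape[0] and 0 <= y < mask.shape[1]
                        and mask[x, y] and not labels[x, y]):
                    labels[x, y] = current
                    stack.append((x, y))
    return labels, current

def propagate_labels(prev_labels, curr_mask):
    """Give each current-frame object the previous-frame label it
    overlaps most; objects with no overlap get fresh labels."""
    curr, n = label_components(curr_mask)
    out = np.zeros_like(prev_labels)
    next_id = prev_labels.max() + 1
    for obj in range(1, n + 1):
        pix = curr == obj
        hits = prev_labels[pix]
        hits = hits[hits > 0]
        if hits.size:
            out[pix] = np.bincount(hits).argmax()
        else:
            out[pix] = next_id
            next_id += 1
    return out
```

    Only the previous frame's label image and the current mask are needed, which mirrors the paper's point about working with two frames at a time to bound memory.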

  16. A Computer Simulation for Teaching Quantal Time Development

    NASA Astrophysics Data System (ADS)

    Styer, Daniel F.

    1996-11-01

    The computer program QMTime (for MS-DOS machines) enables students to simulate quantal time development in one dimension. A variety of initial wave packets (Gaussian, Lorentzian, etc.) can evolve in time under the influence of a variety of potential energy functions (step, ramp, square well, harmonic oscillator, etc.) with or without an external driving force. A novel visualization technique simultaneously displays the magnitude and phase of complex-valued wavefunctions. Either position-space or momentum-space wavefunctions, or both, can be shown. The program is particularly effective in demonstrating the classical limit of quantum mechanics. This program is part of the CUPS (Consortium for Upper level Physics Software) project.
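    A common way to simulate quantal time development of this kind is the split-step Fourier method. The sketch below evolves a free Gaussian packet (hbar = m = 1) and illustrates the general technique only; it is an assumption, not QMTime's actual algorithm, and the potential array can be swapped for a step, ramp, or well.

```python
import numpy as np

# Split-step Fourier (Strang splitting) evolution of a 1-D wave packet.
n, L = 512, 40.0
dx = L / n
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)

# Gaussian packet centered at x = -5 moving right with momentum k0 = 2.
k0, sigma = 2.0, 1.0
psi = np.exp(-(x + 5.0) ** 2 / (4.0 * sigma**2) + 1j * k0 * x)
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dx)   # normalize

V = np.zeros(n)                       # free particle; edit for step/well
dt, steps = 0.01, 200                 # total time t = 2
half_v = np.exp(-0.5j * V * dt)       # half-step potential phase
kin = np.exp(-0.5j * k**2 * dt)       # full-step kinetic phase

for _ in range(steps):
    psi = half_v * np.fft.ifft(kin * np.fft.fft(half_v * psi))

norm = np.sum(np.abs(psi) ** 2) * dx
mean_x = np.sum(x * np.abs(psi) ** 2) * dx
```

    The evolution is unitary, so the norm stays 1, and for a free packet the mean position drifts at velocity k0, from -5 to -1 over t = 2 here, which makes a handy correctness check when teaching.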

  17. Computer simulations of human interferon gamma mutated forms

    NASA Astrophysics Data System (ADS)

    Lilkova, E.; Litov, L.; Petkov, P.; Petkov, P.; Markov, S.; Ilieva, N.

    2010-01-01

    In the general framework of the computer-aided drug design, the method of molecular-dynamics simulations is applied for investigation of the human interferon-gamma (hIFN-γ) binding to its two known ligands (its extracellular receptor and the heparin-derived oligosaccharides). A study of 100 mutated hIFN-γ forms is presented, the mutations encompassing residues 86-88. The structural changes are investigated by comparing the lengths of the α-helices, in which these residues are included, in the native hIFN-γ molecule and in the mutated forms. The most intriguing cases are examined in detail.

  18. Computer simulation of a fringe type laser velocimeter

    NASA Technical Reports Server (NTRS)

    Meyers, J. F.; Walsh, M. J.

    1974-01-01

    A computer simulation of a fringe type laser velocimeter has been written to determine the theoretical characteristics of the laser velocimeter when applied to a given flow field. The program includes the effect of particle size and composition on particle lag, light scattering characteristics and signal contrast. The model of the laser velocimeter includes the laser, optical system, photomultiplier, and counter type data processing electronics. The LV particle size analyzer is also modeled and incorporated in the program. An example application of the program to a Mach 5 wind tunnel using a backscatter laser velocimeter is presented.
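    The core fringe-model relation such a simulation rests on is that a particle crossing the fringe pattern scatters light at the Doppler frequency, so velocity = Doppler frequency × fringe spacing. A minimal sketch follows; the wavelength and beam half-angle are illustrative, not the paper's optical setup.

```python
import numpy as np

def fringe_spacing(wavelength, half_angle):
    """Fringe spacing d = lambda / (2 sin(kappa)), where kappa is the
    half-angle between the two crossing laser beams (radians)."""
    return wavelength / (2.0 * np.sin(half_angle))

def velocity_from_doppler(doppler_freq, wavelength, half_angle):
    """Particle velocity component normal to the fringes (m/s)."""
    return doppler_freq * fringe_spacing(wavelength, half_angle)
```

    For example, a 500 nm beam pair crossing with sin(kappa) = 0.05 gives 5 µm fringes, so a 1 MHz Doppler burst corresponds to 5 m/s; the counter-processor model in the paper extracts that frequency from noisy bursts.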

  19. Computational Strategies for Polymer Coated Steel Sheet Forming Simulations

    SciTech Connect

    Owen, D. R. J.; Andrade Pires, F. M.; Dutko, M.

    2007-05-17

    This contribution discusses current issues involved in the numerical simulation of large scale industrial forming processes that employ polymer coated steel sheet. The need for rigorous consideration of both theoretical and algorithmic issues is emphasized, particularly in relation to the computational treatment of finite strain deformation of polymer coated steel sheet in the presence of internal degradation. Other issues relevant to the effective treatment of the problem, including the modelling of frictional contact between the work piece and tools, low order element technology capable of dealing with plastic incompressibility and thermo mechanical coupling, are also addressed. The suitability of the overall approach is illustrated by the solution of an industrially relevant problem.

  20. Computer simulations for the adsorption of polymers onto surfaces

    SciTech Connect

    Balazs, A.C.

    1992-01-01

    The objective is to develop theoretical models and computer simulations to examine the adsorption of polymers onto a variety of surfaces, and to understand how the chain architecture and conditions such as the surface or solvent affect the extent of adsorption and the morphology of the interfacial layers. Results obtained last year are summarized under the following headings: behavior of copolymers at a liquid-liquid interface, grafted homopolymer chains in a poor solvent, amphiphilic comb copolymers in oil/water solutions, modeling polymer adsorption onto influenza virus, and behavior of polymers in concentrated surfactant solutions. Plans for future work are also given. 17 refs. (DLC)

  1. Extending a Flight Management Computer for Simulation and Flight Experiments

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.; Sugden, Paul C.

    2005-01-01

    In modern transport aircraft, the flight management computer (FMC) has evolved from a flight planning aid to an important hub for pilot information and origin-to-destination optimization of flight performance. Current trends indicate increasing roles of the FMC in aviation safety, aviation security, increasing airport capacity, and improving environmental impact from aircraft. Related research conducted at the Langley Research Center (LaRC) often requires functional extension of a modern, full-featured FMC. Ideally, transport simulations would include an FMC simulation that could be tailored and extended for experiments. However, due to the complexity of a modern FMC, a large investment (millions of dollars over several years) and scarce domain knowledge are needed to create such a simulation for transport aircraft. As an intermediate alternative, the Flight Research Services Directorate (FRSD) at LaRC created a set of reusable software products to extend flight management functionality upstream of a Boeing-757 FMC, transparently simulating or sharing its operator interfaces. The paper details the design of these products and highlights their use on NASA projects.

  2. Computational strategies for three-dimensional flow simulations on distributed computer systems

    NASA Technical Reports Server (NTRS)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-01-01

    This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.
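    As a rough picture of why task-queue balancing was worth evaluating against static balancing, here is a toy cost model of the two strategies; it is illustrative only and not the TEAM/PVM implementation.

```python
import heapq

def static_makespan(costs, workers):
    """Static balancing: split tasks into contiguous equal-count blocks,
    one block per worker; makespan is the heaviest block."""
    per = (len(costs) + workers - 1) // workers
    loads = [sum(costs[i * per:(i + 1) * per]) for i in range(workers)]
    return max(loads)

def queue_makespan(costs, workers):
    """Task-queue balancing: each idle worker pulls the next task
    (greedy list scheduling via a min-heap of worker loads)."""
    heap = [0.0] * workers
    heapq.heapify(heap)
    for c in costs:
        heapq.heappush(heap, heapq.heappop(heap) + c)
    return max(heap)
```

    When per-task costs are uneven, as with flow-solver blocks of different sizes, the queue adapts while the static split can leave one worker with all the heavy tasks, which is exactly the kind of imbalance the study measured.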

  3. Using Microcomputer Simulations in the Classroom: Examples from Undergraduate and Faculty Computer Literacy Courses.

    ERIC Educational Resources Information Center

    Hart, Jeffrey A.

    Examples of the use of computer simulations in two undergraduate courses, (American Foreign Policy and Introduction to International Politics), and a faculty computer literacy course on simulations and artificial intelligence, are provided in this compilation of various instructional items. A list of computer simulations available for various…

  4. I Sing the Body Electric: Students Use Computer Simulations To Enhance their Understanding of Human Physiology.

    ERIC Educational Resources Information Center

    Coleman, Frances

    1998-01-01

    Describes how computer simulations can enhance students' learning of physiology. Discusses how computer models enhance experimentation; using computer modeling in high school science; three steps in students' writing of a simulation; and the value of simulations. Lists six software vendors who offer packages on the PC or Macintosh platforms. (AEF)

  5. Computer simulation of coal preparation plants. Part 2. User's manual. Final report

    SciTech Connect

    Gottfried, B.S.; Tierney, J.W.

    1985-12-01

    This report describes a comprehensive computer program that allows the user to simulate the performance of realistic coal preparation plants. The program is very flexible in the sense that it can accommodate any particular plant configuration that may be of interest. This allows the user to compare the performance of different plant configurations and to determine the impact of various modes of operation with the same configuration. In addition, the program can be used to assess the degree of cleaning obtained with different coal feeds for a given plant configuration and a given mode of operation. Use of the simulator requires that the user specify the appearance of the plant configuration, the plant operating conditions, and a description of the coal feed. The simulator will then determine the flowrates within the plant, and a description of each flowrate (i.e., the weight distribution, percent ash, pyritic sulfur and total sulfur, moisture, and Btu content). The simulation program has been written in modular form using the Fortran language. It can be implemented on a great many different types of computers, ranging from large scientific mainframes to IBM-type personal computers with a fixed disk. Some customization may be required, however, to ensure compatibility with the features of Fortran available on a particular computer. Part I of this report contains a general description of the methods used to carry out the simulation. Each of the major types of units is described separately, in addition to a description of the overall system analysis. Part II is intended as a user's manual. It contains a listing of the mainframe version of the program, instructions for its use (on both a mainframe and a microcomputer), and output for a representative sample problem.

  6. The effects of computer simulation versus hands-on dissection and the placement of computer simulation within the learning cycle on student achievement and attitude

    NASA Astrophysics Data System (ADS)

    Hopkins, Kathryn Susan

    The value of dissection as an instructional strategy has been debated, but not evidenced in research literature. The purpose of this study was to examine the efficacy of using computer simulated frog dissection as a substitute for traditional hands-on frog dissection and to examine the possible enhancement of achievement by combining the two strategies in a specific sequence. In this study, 134 biology students at two Central Texas schools were divided into the five following treatment groups: computer simulation of frog dissection, computer simulation before dissection, traditional hands-on frog dissection, dissection before computer simulation, and textual worksheet materials. The effects on achievement were evaluated by labeling 10 structures on three diagrams, identifying 11 pinned structures on a prosected frog, and answering 9 multiple-choice questions over the dissection process. Attitude was evaluated using a thirty item survey with a five-point Likert scale. The quasi-experimental design was pretest/post-test/post-test nonequivalent group for both control and experimental groups, a 2 x 2 x 5 completely randomized factorial design (gender, school, five treatments). The pretest/post-test design was incorporated to control for prior knowledge using analysis of covariance. The dissection only group evidenced a significantly higher performance than all other treatments except dissection-then-computer on the post-test segment requiring students to label pinned anatomical parts on a prosected frog. Interactions between treatment and school in addition to interaction between treatment and gender were found to be significant. The diagram and attitude post-tests evidenced no significant difference. Results on the nine multiple-choice questions about dissection procedures indicated a significant difference between schools. The interaction between treatment and school was also found to be significant. On a delayed post-test, a significant difference in gender was

  7. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    SciTech Connect

    Mike Bockelie; Dave Swensen; Martin Denison; Adel Sarofim; Connie Senior

    2004-12-22

    , immersive environment. The Virtual Engineering Framework (VEF), in effect a prototype framework, was developed through close collaboration with NETL supported research teams from Iowa State University Virtual Reality Applications Center (ISU-VRAC) and Carnegie Mellon University (CMU). The VEF is open source, compatible across systems ranging from inexpensive desktop PCs to large-scale, immersive facilities and provides support for heterogeneous distributed computing of plant simulations. The ability to compute plant economics through an interface that coupled the CMU IECM tool to the VEF was demonstrated, and the ability to couple the VEF to Aspen Plus, a commercial flowsheet modeling tool, was demonstrated. Models were interfaced to the framework using VES-Open. Tests were performed for interfacing CAPE-Open-compliant models to the framework. Where available, the developed models and plant simulations have been benchmarked against data from the open literature. The VEF has been installed at NETL. The VEF provides simulation capabilities not available in commercial simulation tools. It provides DOE engineers, scientists, and decision makers with a flexible and extensible simulation system that can be used to reduce the time, technical risk, and cost to develop the next generation of advanced, coal-fired power systems that will have low emissions and high efficiency. Furthermore, the VEF provides a common simulation system that NETL can use to help manage Advanced Power Systems Research projects, including both combustion- and gasification-based technologies.

  8. Modeling and Simulation Reliable Spacecraft On-Board Computing

    NASA Technical Reports Server (NTRS)

    Park, Nohpill

    1999-01-01

    The proposed project will investigate modeling- and simulation-driven testing and fault-tolerance schemes for Spacecraft On-Board Computing, thereby achieving reliable spacecraft telecommunication. A spacecraft communication system has inherent capabilities of providing multipoint and broadcast transmission, connectivity between any two distant nodes within a wide-area coverage, quick network configuration/reconfiguration, rapid allocation of space-segment capacity, and distance-insensitive cost. To realize the capabilities mentioned above, both the size and cost of the ground-station terminals have to be reduced by using a reliable, high-throughput, fast and cost-effective on-board computing system, which is known to be a critical contributor to the overall performance of space mission deployment. Controlled vulnerability of mission data (measured in sensitivity), improved performance (measured in throughput and delay) and fault tolerance (measured in reliability) are some of the most important features of these systems. The system should be thoroughly tested and diagnosed before fault tolerance is incorporated into it. Testing and fault-tolerance strategies should be driven by accurate performance models (i.e., throughput, delay, reliability and sensitivity) to find an optimal solution in terms of reliability and cost. The modeling and simulation tools will be integrated with a system architecture module, a testing module and a module for fault tolerance, all of which interact through a central graphical user interface.
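    The kind of reliability model the abstract refers to can be made concrete with a standard redundancy calculation. The sketch below (an illustrative textbook formula, not the project's actual model) compares a simplex module with a triple-modular-redundancy (TMR) arrangement under a constant failure rate; the failure rate and mission time are hypothetical values chosen for the example.

```python
import math

def simplex_reliability(lam: float, t: float) -> float:
    """Reliability of a single module with constant failure rate lam (per hour)."""
    return math.exp(-lam * t)

def tmr_reliability(lam: float, t: float) -> float:
    """Reliability of triple-modular redundancy with perfect voting:
    the system survives if at least 2 of 3 identical modules survive."""
    r = simplex_reliability(lam, t)
    return 3 * r**2 - 2 * r**3

# Hypothetical numbers: failure rate 1e-4 per hour over a 1000-hour mission
lam, t = 1e-4, 1000.0
print(simplex_reliability(lam, t))  # single module
print(tmr_reliability(lam, t))      # TMR system (higher, while r > 0.5)
```

    TMR improves reliability only while each module's reliability exceeds 0.5, which is why mission duration enters the trade-off between reliability and cost.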

  9. Computer simulations of glasses: the potential energy landscape

    NASA Astrophysics Data System (ADS)

    Raza, Zamaan; Alling, Björn; Abrikosov, Igor A.

    2015-07-01

    We review the current state of research on glasses, discussing the theoretical background and computational models employed to describe them. This article focuses on the use of the potential energy landscape (PEL) paradigm to account for the phenomenology of glassy systems, and the way in which it can be applied in simulations and the interpretation of their results. This article provides a broad overview of the rich phenomenology of glasses, followed by a summary of the theoretical frameworks developed to describe this phenomenology. We discuss the background of the PEL in detail, the onerous task of how to generate computer models of glasses, various methods of analysing numerical simulations, and the literature on the most commonly used model systems. Finally, we tackle the problem of how to distinguish a good glass former from a good crystal former from an analysis of the PEL. In summarising the state of the potential energy landscape picture, we develop the foundations for new theoretical methods that allow the ab initio prediction of the glass-forming ability of new materials by analysis of the PEL.

  10. Computer simulations of glasses: the potential energy landscape.

    PubMed

    Raza, Zamaan; Alling, Björn; Abrikosov, Igor A

    2015-07-29

    We review the current state of research on glasses, discussing the theoretical background and computational models employed to describe them. This article focuses on the use of the potential energy landscape (PEL) paradigm to account for the phenomenology of glassy systems, and the way in which it can be applied in simulations and the interpretation of their results. This article provides a broad overview of the rich phenomenology of glasses, followed by a summary of the theoretical frameworks developed to describe this phenomenology. We discuss the background of the PEL in detail, the onerous task of how to generate computer models of glasses, various methods of analysing numerical simulations, and the literature on the most commonly used model systems. Finally, we tackle the problem of how to distinguish a good glass former from a good crystal former from an analysis of the PEL. In summarising the state of the potential energy landscape picture, we develop the foundations for new theoretical methods that allow the ab initio prediction of the glass-forming ability of new materials by analysis of the PEL. PMID:26139691
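    A core operation in PEL studies is the "quench": minimising the energy of an instantaneous configuration to find its inherent structure, the local minimum at the bottom of its basin. The toy sketch below illustrates the idea on a hypothetical one-dimensional landscape (the potential, step size and tolerance are invented for illustration; real studies minimise a many-body potential over all atomic coordinates).

```python
import math

def potential(x: float) -> float:
    """A toy one-dimensional 'landscape' with several local minima."""
    return math.sin(3 * x) + 0.1 * x * x

def gradient(x: float, h: float = 1e-6) -> float:
    """Central-difference estimate of dV/dx."""
    return (potential(x + h) - potential(x - h)) / (2 * h)

def quench(x: float, step: float = 1e-3, tol: float = 1e-8,
           max_iter: int = 200000) -> float:
    """Steepest-descent minimisation: maps a configuration to the
    inherent structure of the basin it started in."""
    for _ in range(max_iter):
        g = gradient(x)
        if abs(g) < tol:
            break
        x -= step * g
    return x

# Nearby starting points quench to the same minimum; distant ones to different minima
print(round(quench(0.0), 4), round(quench(2.5), 4))
```

    Tiling configuration space by which minimum each point quenches to is exactly the basin decomposition on which the PEL picture of glasses rests.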

  11. Computer simulation of orthognathic surgery with video imaging

    NASA Astrophysics Data System (ADS)

    Sader, Robert; Zeilhofer, Hans-Florian U.; Horch, Hans-Henning

    1994-04-01

    Patients with extreme jaw imbalance must often undergo operative corrections. The goal of therapy is to harmonize the stomatognathic system and to correct the facial profile aesthetically. A new procedure will be presented which supports the maxillo-facial surgeon in planning the operation and which also presents the patient with the result of the treatment in video images. Once an x-ray has been digitized, it is possible to produce individualized cephalometric analyses. Using an on-screen cephalogram, all current orthognathic operations can be simulated, whereby the bony segments are moved according to given parameters and a new soft-tissue profile is calculated. The profile of the patient is fed into the computer by way of a video system and correlated to the cephalogram. Using the simulated operation, the computer calculates a new video image of the patient which presents the expected postoperative appearance. In studies of patients treated between 1987 and 1991, 76 out of 121 patients could be evaluated. The deviation in profile change varied between 0.0 and 1.6 mm. A side effect of the practical applications was an increase in patient compliance.

  12. Deviated nasal septum hinders intranasal sprays: A computer simulation study

    PubMed Central

    Frank, Dennis O.; Kimbell, Julia S.; Cannon, Daniel; Pawar, Sachin S.; Rhee, John S.

    2013-01-01

    Background This study investigates how a deviated nasal septum affects the quantity and distribution of spray particles, and examines the effects of inspiratory airflow and head position on particle transport. Methods Deposition of spray particles was analysed using a three-dimensional computational fluid dynamics model created from a computed tomography scan of a human nose with leftward septal deviation and right inferior turbinate hypertrophy. Five simulations were conducted using Fluent™ software, with particle sizes ranging from 20 to 110 μm, a spray speed of 3 m/s, a plume angle of 68°, and with steady-state inspiratory airflow either present (15.7 L/min) or absent, at varying head positions. Results With inspiratory airflow present, posterior deposition on the obstructed side was approximately four times less than on the contralateral side, regardless of head position, and the difference was statistically significant (p<0.05). When airflow was absent, predicted deposition beyond the nasal valve on the left and right sides was between 16% and 69% lower and was positively influenced by a dependent head position. Conclusions Simulations predicted that septal deviation significantly diminished drug delivery on the obstructed side. Furthermore, increased particle penetration was associated with the presence of nasal airflow. Head position is an important factor in particle deposition patterns when inspiratory airflow is absent. PMID:22888490
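    A back-of-the-envelope way to see why particle size dominates deposition in such simulations is the Stokes stopping distance: the distance a droplet launched at the spray speed travels before drag removes its initial momentum. The sketch below applies that textbook formula to the particle sizes and spray speed quoted in the abstract; the droplet density and air viscosity are assumed typical values, not taken from the study.

```python
def stokes_stopping_distance(diameter_m: float, density_kg_m3: float,
                             speed_m_s: float, mu: float = 1.8e-5) -> float:
    """Stokes stopping distance S = rho_p * d^2 * U / (18 * mu), in metres.
    Larger values favour inertial impaction in the anterior nose rather
    than transport past the nasal valve."""
    return density_kg_m3 * diameter_m ** 2 * speed_m_s / (18.0 * mu)

# Sizes from the study (20-110 micrometres) at the 3 m/s spray speed,
# assuming water-like droplets (1000 kg/m^3) in air (mu = 1.8e-5 Pa*s)
for d_um in (20, 60, 110):
    d = d_um * 1e-6
    print(d_um, stokes_stopping_distance(d, 1000.0, 3.0))
```

    The quadratic dependence on diameter means the largest droplets coast centimetres while the smallest stop within millimetres, which is why size strongly shapes the deposition pattern.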

  13. Methods for increased computational efficiency of multibody simulations

    NASA Astrophysics Data System (ADS)

    Epple, Alexander

    This thesis is concerned with the efficient numerical simulation of finite element based flexible multibody systems. Scaling operations are systematically applied to the governing index-3 differential algebraic equations in order to solve the problem of ill conditioning for small time step sizes. The importance of augmented Lagrangian terms is demonstrated. The use of fast sparse solvers is justified for the solution of the linearized equations of motion resulting in significant savings of computational costs. Three time stepping schemes for the integration of the governing equations of flexible multibody systems are discussed in detail. These schemes are the two-stage Radau IIA scheme, the energy-decaying scheme, and the generalized-α method. Their formulations are adapted to the specific structure of the governing equations of flexible multibody systems. The efficiency of the time integration schemes is comprehensively evaluated on a series of test problems. Formulations for structural and constraint elements are reviewed and the problem of interpolation of finite rotations in geometrically exact structural elements is revisited. This results in the development of a new improved interpolation algorithm, which preserves the objectivity of the strain field and guarantees stable simulations in the presence of arbitrarily large rotations. Finally, strategies for the spatial discretization of beams in the presence of steep variations in cross-sectional properties are developed. These strategies reduce the number of degrees of freedom needed to accurately analyze beams with discontinuous properties, resulting in improved computational efficiency.
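    Of the three integrators named, the generalized-α method is the most widely used in structural dynamics. The sketch below implements the Chung-Hulbert parametrisation for a single-DOF undamped oscillator, a minimal illustration of the scheme's controlled numerical dissipation; the thesis applies it to full flexible multibody systems with constraints.

```python
def generalized_alpha(m, k, x0, v0, h, steps, rho_inf=0.8):
    """Generalized-alpha integration of m*x'' + k*x = 0 for one DOF.
    rho_inf in [0, 1] sets the high-frequency spectral radius:
    1 = no dissipation, 0 = maximal (asymptotic annihilation)."""
    am = (2 * rho_inf - 1) / (rho_inf + 1)
    af = rho_inf / (rho_inf + 1)
    gamma = 0.5 - am + af
    beta = 0.25 * (1 - am + af) ** 2
    x, v = x0, v0
    a = -k * x / m  # consistent initial acceleration
    for _ in range(steps):
        # Balance at the generalized midpoints: solve for a_{n+1}
        lhs = m * (1 - am) + k * (1 - af) * h * h * beta
        rhs = -(m * am * a
                + k * (x + (1 - af) * (h * v + h * h * (0.5 - beta) * a)))
        a_new = rhs / lhs
        # Newmark updates for displacement and velocity
        x = x + h * v + h * h * ((0.5 - beta) * a + beta * a_new)
        v = v + h * ((1 - gamma) * a + gamma * a_new)
        a = a_new
    return x, v

# Undamped oscillator, omega = 1: the scheme stays stable and mildly dissipative
x, v = generalized_alpha(m=1.0, k=1.0, x0=1.0, v0=0.0, h=0.1, steps=1000)
print(x, v)
```

    With rho_inf < 1 the discrete energy decays slowly instead of growing, which is precisely the property that makes such schemes robust for stiff multibody problems.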

  14. GUI for Computational Simulation of a Propellant Mixer

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Richter, Hanz; Barbieri, Enrique; Granger, Jamie

    2005-01-01

    Control Panel is a computer program that generates a graphical user interface (GUI) for computational simulation of a rocket-test-stand propellant mixer in which gaseous hydrogen (GH2) is injected into flowing liquid hydrogen (LH2) to obtain a combined flow having desired thermodynamic properties. The GUI is used in conjunction with software that models the mixer as a system having three inputs (the positions of the GH2 and LH2 inlet valves and an outlet valve) and three outputs (the pressure inside the mixer and the outlet flow temperature and flow rate). The user can specify valve characteristics and thermodynamic properties of the input fluids via user-friendly dialog boxes. The user can enter temporally varying input values or temporally varying desired output values. The GUI provides (1) a set-point calculator function for determining fixed valve positions that yield desired output values and (2) simulation functions that predict the response of the mixer to variations in the properties of the LH2 and GH2 and manual- or feedback-control variations in valve positions. The GUI enables scheduling of a sequence of operations that includes switching from manual to feedback control when a certain event occurs.

  15. Enhancing simulation of efficiency with analytical tools. [combining computer simulation and analytical techniques for cost reduction

    NASA Technical Reports Server (NTRS)

    Seltzer, S. M.

    1974-01-01

    Some means of combining both computer simulation and analytical techniques are indicated in order to mutually enhance their efficiency as design tools and to motivate those involved in engineering design to consider using such combinations. While the idea is not new, heavy reliance on computers often seems to overshadow the potential utility of analytical tools. Although the example used is drawn from the area of dynamics and control, the principles espoused are applicable to other fields. In the example the parameter plane stability analysis technique is described briefly and extended beyond that reported in the literature to increase its utility (through a simple set of recursive formulas) and its applicability (through the portrayal of the effect of varying the sampling period of the computer). The numerical values that were rapidly selected by analysis were found to be correct for the hybrid computer simulation for which they were needed. This obviated the need for cut-and-try methods to choose the numerical values, thereby saving both time and computer utilization.
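    The effect of varying the sampling period on stability, which the parameter-plane technique portrays graphically, can be illustrated with a minimal discrete-time check. The sketch below uses a toy first-order plant with hypothetical numbers, not the system from the paper: it tests whether the closed-loop pole of a zero-order-hold (ZOH) discretised plant under sampled proportional feedback stays inside the unit circle as the sampling period grows.

```python
import math

def closed_loop_pole(a: float, K: float, T: float) -> float:
    """Pole of the ZOH-discretised plant x' = -a*x + u under sampled
    proportional feedback u = -K*x with sampling period T."""
    phi = math.exp(-a * T)   # state transition over one sample
    gam = (1.0 - phi) / a    # ZOH input map
    return phi - K * gam

def is_stable(a: float, K: float, T: float) -> bool:
    """Discrete-time stability: pole magnitude strictly inside the unit circle."""
    return abs(closed_loop_pole(a, K, T)) < 1.0

# For a fixed gain, lengthening the sampling period eventually destabilises the loop
a, K = 1.0, 5.0
for T in (0.1, 0.3, 0.5, 1.0):
    print(T, is_stable(a, K, T))
```

    A quick analytical sweep like this can bracket the stable region before any time is spent on full simulation runs, which is the complementary use of analysis the paper advocates.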

  16. Functional requirements for design of the Space Ultrareliable Modular Computer (SUMC) system simulator

    NASA Technical Reports Server (NTRS)

    Curran, R. T.; Hornfeck, W. A.

    1972-01-01

    The functional requirements for the design of an interpretive simulator for the space ultrareliable modular computer (SUMC) are presented. A review of applicable existing computer simulations is included along with constraints on the SUMC simulator functional design. Input requirements, output requirements, and language requirements for the simulator are discussed in terms of a SUMC configuration which may vary according to the application.

  17. Computer Simulations of Small Molecules in Membranes: Insights from Computer Simulations into the Interactions of Small Molecules with Lipid Bilayers

    NASA Technical Reports Server (NTRS)

    Pohorille, Andrew; New, Michael H.; Schweighofer, Karl; Wilson, Michael A.; DeVincenzi, Donald L. (Technical Monitor)

    2000-01-01

    Two of Ernest Overton's lasting contributions to biology are the Meyer-Overton relationship between the potency of an anesthetic and its solubility in oil, and the Overton rule which relates the permeability of a membrane to the oil-water partition coefficient of the permeating molecule. A growing body of experimental evidence, however, cannot be reconciled with these theories. In particular, the molecular nature of membranes, unknown to Overton, needs to be included in any description of these phenomena. Computer simulations are ideally suited for providing atomic-level information about the behavior of small molecules in membranes. The authors discuss simulation studies relevant to Overton's ideas. Through simulations it was found that anesthetics tend to concentrate at interfaces and their anesthetic potency correlates better with solubility at the water-membrane interface than with solubility in oil. Simulation studies of membrane permeation revealed the anisotropic nature of the membranes, as evidenced, for example, by the highly nonuniform distribution of free volume in the bilayer. This, in turn, influences the diffusion rates of solutes, which increase with the depth in the membrane. Small solutes tend to move by hopping between voids in the bilayer, and this hopping motion may be responsible for the deviation from the Overton rule of the permeation rates of these molecules.
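    The anisotropy described above is exactly what the inhomogeneous solubility-diffusion model captures: membrane permeability follows from integrating the local resistance 1/(K(z)D(z)) across the bilayer depth, so 1/P = ∫ dz / (K(z) D(z)). The sketch below evaluates that integral numerically with invented depth profiles for the partition coefficient K(z) and diffusivity D(z); the functional forms and units are purely illustrative.

```python
import math

def permeability(K, D, z_min, z_max, n=10000):
    """Inhomogeneous solubility-diffusion model:
    1/P = integral over membrane depth z of dz / (K(z) * D(z)),
    evaluated with a simple trapezoidal rule on n panels."""
    h = (z_max - z_min) / n
    total = 0.0
    for i in range(n + 1):
        z = z_min + i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w / (K(z) * D(z))
    return 1.0 / (total * h)

# Hypothetical depth profiles: partitioning peaks mid-membrane (z = 0 is the
# bilayer centre), and diffusivity increases toward the centre
K = lambda z: math.exp(-z * z)          # depth-dependent partition coefficient
D = lambda z: 1.0 + 0.5 * (1 - z * z)   # depth-dependent diffusivity
print(permeability(K, D, -1.0, 1.0))
```

    Because the depth of lowest K(z)D(z) dominates the integral, permeability is controlled by the least permeable stratum of the bilayer rather than by bulk oil solubility alone, which is how simulations refine the Overton rule.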

  18. Single-computer HWIL simulation facility for real-time vision systems

    NASA Astrophysics Data System (ADS)

    Fuerst, Simon; Werner, Stefan; Dickmanns, Ernst D.

    1998-07-01

    UBM has been working on autonomous vision systems for aircraft for more than a decade and a half. The systems developed use standard on-board sensors and two additional monochrome cameras for state estimation of the aircraft. A common task is to detect and track a runway for an autonomous landing approach. The cameras have different focal lengths and are mounted on a special pan-and-tilt camera platform. As the platform is equipped with two resolvers and two gyros, it can be stabilized inertially, and the system has the ability to actively focus on the objects of highest interest. For verification and testing, UBM has a special HWIL simulation facility for real-time vision systems. The central part of this simulation facility is a three-axis motion simulator (DBS). It is used to realize the computed orientation in the rotational degrees of freedom of the aircraft. The two-axis camera platform with its two CCD cameras is mounted on the inner frame of the DBS and points at the cylindrical projection screen, on which a synthetic view is displayed. As the performance of visual perception systems has increased significantly in recent years, a new, more powerful synthetic vision system was required. A single Onyx2 machine replaced all the former simulation computers. This computer is powerful enough to simulate the aircraft, generate a high-resolution synthetic view, control the DBS and communicate with the image-processing computers. Further improvements are the significantly reduced delay times for closed-loop simulations and the elimination of communication overhead.

  19. Real-Time Simulation Computation System. [for digital flight simulation of research aircraft

    NASA Technical Reports Server (NTRS)

    Fetter, J. L.

    1981-01-01

    The Real-Time Simulation Computation System, which will provide the flexibility necessary for operation in the research environment at the Ames Research Center, is discussed. Designing the system with common subcomponents and using modular construction techniques enhances expandability and maintainability. The 10-MHz serial transmission scheme is the basis of the Input/Output Unit System and is the driving force providing the system flexibility. Error checking and detection performed on the transmitted data provide reliability measurements and assurances that accurate data are received at the simulators.

  20. A computer program for simulating geohydrologic systems in three dimensions

    USGS Publications Warehouse

    Posson, D.R.; Hearne, G.A.; Tracy, J.V.; Frenzel, P.F.

    1980-01-01

    This document is directed toward individuals who wish to use a computer program to simulate ground-water flow in three dimensions. The strongly implicit procedure (SIP) numerical method is used to solve the set of simultaneous equations. New data-processing techniques and program input and output options are emphasized. The aquifer system to be modeled may be heterogeneous and anisotropic, and may include both artesian and water-table conditions. Systems which consist of well-defined alternating layers of highly permeable and poorly permeable material may be represented by a sequence of equations for two-dimensional flow in each of the highly permeable units. Boundaries where head or flux is user-specified may be irregularly shaped. The program also allows the user to represent streams as limited-source boundaries when the streamflow is small in relation to the hydraulic stress on the system. The data-processing techniques relating to 'cube' input and output, to swapping of layers, to restarting of simulation, to free-format NAMELIST input, to the details of each subroutine's logic, and to the overlay program structure are discussed. The program is capable of processing large models that might overflow computer memories with conventional programs. Detailed instructions for selecting program options, for initializing the data arrays, for defining 'cube' output lists and maps, and for plotting hydrographs of calculated and observed heads and/or drawdowns are provided. Output may be restricted to those nodes of particular interest, thereby reducing the volumes of printout for modelers, which may be critical when working at remote terminals. 'Cube' input commands allow the modeler to set aquifer parameters and initialize the model with very few input records. Appendixes provide instructions to compile the program, definitions and cross-references for program variables, summary of the FLECS structured FORTRAN programming language, listings of the FLECS and
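    The heart of such a program is the iterative solution of the finite-difference flow equations. The sketch below solves a steady-state head distribution on a small 2-D grid by Gauss-Seidel iteration; this is a deliberately simpler stand-in for the SIP solver the program actually uses, with hypothetical fixed-head boundaries on the left and right and no-flow boundaries on top and bottom.

```python
def solve_heads(nx, ny, left, right, tol=1e-8, max_iter=100000):
    """Steady-state head in a homogeneous 2-D aquifer: fixed heads on the
    left/right columns, no-flow on the top/bottom rows, solved by
    Gauss-Seidel iteration (a simple stand-in for SIP)."""
    h = [[0.5 * (left + right) for _ in range(nx)] for _ in range(ny)]
    for j in range(ny):
        h[j][0], h[j][-1] = left, right
    for _ in range(max_iter):
        delta = 0.0
        for j in range(ny):
            for i in range(1, nx - 1):
                # No-flow boundaries implemented by mirroring the interior neighbour
                up = h[j - 1][i] if j > 0 else h[j + 1][i]
                dn = h[j + 1][i] if j < ny - 1 else h[j - 1][i]
                new = 0.25 * (h[j][i - 1] + h[j][i + 1] + up + dn)
                delta = max(delta, abs(new - h[j][i]))
                h[j][i] = new
        if delta < tol:  # converged when the largest update is negligible
            break
    return h

heads = solve_heads(nx=11, ny=5, left=100.0, right=90.0)
print([round(v, 3) for v in heads[2]])  # middle row: linear head gradient
```

    SIP accelerates exactly this kind of sweep by an approximate factorisation of the coefficient matrix, which is why it converges in far fewer iterations on large three-dimensional grids.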