Science.gov

Sample records for all-atom computer simulations

  1. All-atom Simulation of Amyloid Aggregates

    NASA Astrophysics Data System (ADS)

    Berhanu, Workalemahu M.; Alred, Erik J.; Bernhardt, Nathan A.; Hansmann, Ulrich H. E.

    Molecular simulations are now commonly used to complement experiments in the investigation of amyloid formation and its role in human diseases. While various enhanced sampling techniques are used in simulations of amyloid formation, this article focuses on standard atomistic simulations that evaluate the stability of fibril models. Such studies probe the limitations that arise from the choice of force field and from fibril polymorphism, the stability of in vivo and in vitro forms of Aβ fibril aggregates, and the role of heterologous seeding as a link between different amyloid diseases.

  2. Microtubule Elasticity: Connecting All-Atom Simulations with Continuum Mechanics

    NASA Astrophysics Data System (ADS)

    Sept, David; Mackintosh, Fred C.

    2010-01-01

    The mechanical properties of microtubules have been extensively studied using a wide range of biophysical techniques, seeking to understand the mechanics of these cylindrical polymers. Here we develop a method for connecting all-atom molecular dynamics simulations with continuum mechanics and show how this can be applied to understand microtubule mechanics. Our coarse-graining technique applied to the microscopic simulation system yields consistent predictions for the Young’s modulus and persistence length of microtubules, while clearly demonstrating how binding of the drug Taxol decreases the stiffness of microtubules. The techniques we develop should be widely applicable to other macromolecular systems.
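
    The continuum link used here rests on the standard relation between flexural rigidity and persistence length, Lp = EI/(kBT). Below is a minimal sketch of that arithmetic for a hollow-cylinder cross section; the Young's modulus and radii are illustrative placeholders, not values from this study.

        import math

        kB_T = 4.11e-21          # thermal energy at ~300 K, in J

        def persistence_length(youngs_modulus, r_outer, r_inner):
            """Persistence length Lp = E*I / (kB*T) of a hollow cylinder.

            youngs_modulus in Pa, radii in m; returns Lp in m.
            """
            second_moment = math.pi * (r_outer**4 - r_inner**4) / 4.0  # area moment I
            return youngs_modulus * second_moment / kB_T

        # Illustrative microtubule-like geometry (assumed values, not the paper's):
        E = 1.0e9                # 1 GPa
        Lp = persistence_length(E, r_outer=12.5e-9, r_inner=7.5e-9)
        print(f"persistence length ~ {Lp * 1e3:.1f} mm")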

  3. All-atom simulations of crowding effects on ubiquitin dynamics

    NASA Astrophysics Data System (ADS)

    Abriata, Luciano A.; Spiga, Enrico; Dal Peraro, Matteo

    2013-08-01

    It is well-known that crowded environments affect the stability of proteins, with strong biological and biotechnological implications; however, beyond this, crowding is also expected to affect the dynamic properties of proteins, an idea that is hard to probe experimentally. Here we report on a simulation study aimed at evaluating the effects of crowding on internal protein dynamics, based on fully all-atom descriptions of the protein, the solvent and the crowder. Our model system consists of ubiquitin, a protein whose dynamic features are closely related to its ability to bind to multiple partners, in a 325 g L-1 solution of glucose in water, a condition widely employed in in vitro studies of crowding effects. We observe a slight reduction in loop flexibility accompanied by a dramatic restriction of the conformational space explored in the timescale of the simulations (˜0.5 µs), indicating that crowding slows down collective motions and the rate of exploration of the conformational space. This effect is attributed to the extensive and long-lasting interactions observed between protein residues and glucose molecules throughout the entire protein surface. Potential implications of the observed effects are discussed.
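
    Loop flexibility of the kind discussed above is usually quantified as the per-atom root-mean-square fluctuation (RMSF) about the average structure of an aligned trajectory. A minimal numpy sketch, assuming aligned coordinates are already available as an array; the random data below merely stand in for a real trajectory.

        import numpy as np

        # Placeholder trajectory: (n_frames, n_atoms, 3), already aligned to a reference.
        rng = np.random.default_rng(0)
        coords = rng.normal(size=(500, 76, 3))    # e.g. the 76 C-alpha atoms of ubiquitin

        mean_pos = coords.mean(axis=0)            # average structure
        rmsf = np.sqrt(((coords - mean_pos) ** 2).sum(axis=-1).mean(axis=0))

        print("most flexible atom index:", int(rmsf.argmax()))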

  4. Coupling all-atom molecular dynamics simulations of ions in water with Brownian dynamics

    PubMed Central

    2016-01-01

    Molecular dynamics (MD) simulations of ions (K+, Na+, Ca2+ and Cl−) in aqueous solutions are investigated. Water is described using the SPC/E model. A stochastic coarse-grained description for ion behaviour is presented and parametrized using MD simulations. It is given as a system of coupled stochastic and ordinary differential equations, describing the ion position, velocity and acceleration. The stochastic coarse-grained model provides an intermediate description between all-atom MD simulations and Brownian dynamics (BD) models. It is used to develop a multiscale method which uses all-atom MD simulations in parts of the computational domain and (less detailed) BD simulations in the remainder of the domain. PMID:27118886
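
    The coarse-grained description referred to above couples stochastic and deterministic equations for the ion degrees of freedom. As a generic illustration (not the parametrization fitted in the paper), the sketch below integrates a simple Langevin model for one freely diffusing ion with the Euler-Maruyama scheme; all parameter values are assumed.

        import numpy as np

        kB_T  = 4.11e-21      # J, ~300 K
        mass  = 6.5e-26       # kg, roughly a K+ ion
        gamma = 1.0e13        # 1/s, friction coefficient (assumed)
        dt    = 1.0e-15       # s
        n_steps = 100000

        rng = np.random.default_rng(1)
        x = np.zeros(3)
        v = np.zeros(3)
        sigma = np.sqrt(2.0 * gamma * kB_T / mass * dt)   # noise amplitude per step

        for _ in range(n_steps):
            v += -gamma * v * dt + sigma * rng.normal(size=3)
            x += v * dt

        print("net displacement (nm):", x * 1e9)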

  5. Beyond Modeling: All-Atom Olfactory Receptor Model Simulations

    PubMed Central

    Lai, Peter C.; Crasto, Chiquito J.

    2012-01-01

    Olfactory receptors (ORs) are a type of GTP-binding protein-coupled receptor (GPCR). These receptors are responsible for mediating the sense of smell through their interaction with odor ligands. OR-odorant interactions mark the first step in the process that leads to olfaction. Computational studies on model OR structures can generate focused and novel hypotheses for further bench investigation by providing a view of these interactions at the molecular level beyond inferences that are drawn merely from static docking. Here we have shown the specific advantages of simulating the dynamic environment associated with OR-odorant interactions. We present a rigorous protocol which ranges from the creation of a computationally derived model of an olfactory receptor to simulating the interactions between an OR and an odorant molecule. Given the ubiquitous occurrence of GPCRs in the membranes of cells, we anticipate that the methodology we have developed for ORs will serve as a model for the computational structural biology of all GPCRs. PMID:22563330

  6. All-atom crystal simulations of DNA and RNA duplexes

    PubMed Central

    Liu, Chunmei; Janowski, Pawel A.; Case, David A.

    2014-01-01

    Background: Molecular dynamics simulations can complement experimental measures of structure and dynamics of biomolecules. The quality of such simulations can be tested by comparisons to models refined against experimental crystallographic data. Methods: We report simulations of a DNA and an RNA duplex in their crystalline environment. The calculations mimic the conditions for PDB entries 1D23 [d(CGATCGATCG)2] and 1RNA [(UUAUAUAUAUAUAA)2], and contain 8 unit cells, each with 4 copies of the Watson-Crick duplex; this yields in aggregate 64 µs of duplex sampling for DNA and 16 µs for RNA. Results: The duplex structures conform much more closely to the average structure seen in the crystal than do structures extracted from a solution simulation with the same force field. Sequence-dependent variations in helical parameters, and in groove widths, are largely maintained in the crystal structure, but are smoothed out in solution. However, the integrity of the crystal lattice is slowly degraded in both simulations, with the result that the interfaces between chains become heterogeneous. This problem is more severe for the DNA crystal, which has fewer inter-chain hydrogen bond contacts than does the RNA crystal. Conclusions: Crystal simulations using current force fields reproduce many features of observed crystal structures, but suffer from a gradual degradation of the integrity of the crystal lattice. General significance: The results offer insight into the ability of force-field simulations to preserve weak interactions between chains, which will also be important in non-crystalline applications that involve binding and recognition. PMID:25255706

  7. Benchmarking all-atom simulations using hydrogen exchange

    PubMed Central

    Skinner, John J.; Yu, Wookyung; Gichana, Elizabeth K.; Baxa, Michael C.; Hinshaw, James R.; Freed, Karl F.; Sosnick, Tobin R.

    2014-01-01

    Long-time molecular dynamics (MD) simulations are now able to fold small proteins reversibly to their native structures [Lindorff-Larsen K, Piana S, Dror RO, Shaw DE (2011) Science 334(6055):517–520]. These results indicate that modern force fields can reproduce the energy surface near the native structure. To test how well the force fields recapitulate the other regions of the energy surface, MD trajectories for a variant of protein G are compared with data from site-resolved hydrogen exchange (HX) and other biophysical measurements. Because HX monitors the breaking of individual H-bonds, this experimental technique identifies the stability and H-bond content of excited states, thus enabling quantitative comparison with the simulations. Contrary to experimental findings of a cooperative, all-or-none unfolding process, the simulated denatured state ensemble, on average, is highly collapsed with some transient or persistent native 2° structure. The MD trajectories of this protein G variant and other small proteins exhibit excessive intramolecular H-bonding even for the most expanded conformations, suggesting that the force fields require improvements in describing H-bonding and backbone hydration. Moreover, these comparisons provide a general protocol for validating the ability of simulations to accurately capture rare structural fluctuations. PMID:25349413
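
    Protection factors of the sort compared here are commonly estimated from a trajectory by classifying each frame as exchange-competent ("open") or protected ("closed") and converting the open fraction into an opening equilibrium constant. A minimal sketch, assuming such a per-frame classification is already available; how "open" is defined is a modeling choice not specified in this abstract.

        import numpy as np

        R_T = 0.593   # kcal/mol at ~298 K

        def hx_stability(open_flags):
            """Protection factor and opening free energy from per-frame open/closed flags.

            open_flags: boolean array, True when the amide is exchange-competent.
            """
            f_open = np.clip(open_flags.mean(), 1e-6, 1 - 1e-6)   # avoid log(0)
            k_op = f_open / (1.0 - f_open)                        # opening equilibrium constant
            return 1.0 / k_op, -R_T * np.log(k_op)

        # Placeholder: an amide that is open in about 2% of frames.
        flags = np.random.default_rng(2).random(10000) < 0.02
        pf, dg = hx_stability(flags)
        print(f"protection factor ~ {pf:.0f}, dG_open ~ {dg:.2f} kcal/mol")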

  8. ALMOST: an all atom molecular simulation toolkit for protein structure determination.

    PubMed

    Fu, Biao; Sahakyan, Aleksandr B; Camilloni, Carlo; Tartaglia, Gian Gaetano; Paci, Emanuele; Caflisch, Amedeo; Vendruscolo, Michele; Cavalli, Andrea

    2014-05-30

    Almost (all atom molecular simulation toolkit) is an open source computational package for structure determination and analysis of complex molecular systems including proteins and nucleic acids. Almost has been designed with two primary goals: to provide tools for molecular structure determination using various types of experimental measurements as conformational restraints, and to provide methods for the analysis and assessment of structural and dynamical properties of complex molecular systems. The methods incorporated in Almost include the determination of structural and dynamical features of proteins using distance restraints derived from nuclear Overhauser effect measurements, orientational restraints obtained from residual dipolar couplings and structural restraints from chemical shifts. Here, we present the first public release of Almost, highlight the key aspects of its computational design and discuss the main features currently implemented. Almost is available for the most common Unix-based operating systems, including Linux and Mac OS X. Almost is distributed free of charge under the GNU Public License, and is available both as source code and as a binary executable from the project web site at http://www.open-almost.org. Interested users can follow and contribute to the further development of Almost on http://sourceforge.net/projects/almost. PMID:24676684

  9. Examining the origins of the hydration force between lipid bilayers using all-atom simulations.

    PubMed

    Gentilcore, Anastasia N; Michaud-Agrawal, Naveen; Crozier, Paul S; Stevens, Mark J; Woolf, Thomas B

    2010-05-01

    Using 237 all-atom double bilayer simulations, we examined the thermodynamic and structural changes that occur as a phosphatidylcholine lipid bilayer stack is dehydrated. The simulated system represents a micropatch of the lipid multilayer systems that are studied experimentally using the surface force apparatus, atomic force microscopy and osmotic pressure studies. In these experiments, the hydration level of the system is varied, changing the separation between the bilayers, in order to understand the forces that the bilayers feel as they are brought together. These studies have found a curious, strongly repulsive force when the bilayers are very close to each other, which has been termed the "hydration force," though the origins of this force are not clearly understood. We computationally reproduce this repulsive, relative free energy change as the bilayers come together and draw qualitative conclusions as to the enthalpic and entropic origins of the free energy change. This analysis is supported by data showing structural changes in the waters, lipids and salts that have also been seen in experimental work. Increases in solvent ordering as the bilayers are dehydrated are found to be essential in causing the repulsion as the bilayers come together.
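
    In the osmotic pressure experiments mentioned above, the hydration repulsion is conventionally summarized by fitting pressure-versus-separation data to an exponential, P(d) = P0 exp(-d/λ); the same functional form can be fitted to simulation-derived values. The sketch below performs such a fit on made-up data points; the prefactor and decay length are placeholders, not results of this study.

        import numpy as np
        from scipy.optimize import curve_fit

        def hydration_pressure(d, p0, lam):
            return p0 * np.exp(-d / lam)

        # Made-up (separation in nm, pressure in atm) points, for illustration only.
        d = np.array([0.4, 0.6, 0.8, 1.0, 1.2, 1.4])
        p = np.array([900.0, 400.0, 180.0, 80.0, 35.0, 16.0])

        (p0, lam), _ = curve_fit(hydration_pressure, d, p, p0=(2000.0, 0.25))
        print(f"P0 ~ {p0:.0f} atm, decay length ~ {lam:.2f} nm")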

  10. Resolution-Adapted All-Atomic and Coarse-Grained Model for Biomolecular Simulations.

    PubMed

    Shen, Lin; Hu, Hao

    2014-06-10

    We develop here an adaptive multiresolution method for the simulation of complex heterogeneous systems such as proteins. The target molecular system is described with an atomistic structure while concurrently maintaining a mapping to coarse-grained models. The theoretical model, or force field, used to describe the interactions between two sites is automatically adjusted during the simulation according to the interaction distance/strength. Therefore, all-atomic, coarse-grained, or mixed all-atomic and coarse-grained models can be used together to describe the interactions between a group of atoms and its surroundings. Because the choice of theory is made on the force field level while the sampling is always carried out in the atomic space, the new adaptive method naturally preserves the atomic structure and thermodynamic properties of the entire system throughout the simulation. The new method will be very useful in many biomolecular simulations where atomistic details are critically needed.
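
    The central idea, selecting the interaction model per pair of sites while always propagating atomistic coordinates, can be illustrated with a distance-dependent blend of an atomistic and a coarse-grained pair potential. The switching function and Lennard-Jones parameters below are assumptions made for illustration, not the authors' actual functional forms.

        import numpy as np

        def switch(r, r_aa=0.9, r_cg=1.2):
            """1 inside the all-atom zone, 0 in the CG zone, smooth in between (r in nm)."""
            x = np.clip((r - r_aa) / (r_cg - r_aa), 0.0, 1.0)
            return 1.0 - x**2 * (3.0 - 2.0 * x)      # cubic smoothstep

        def lj(r, eps, sigma):
            sr6 = (sigma / r) ** 6
            return 4.0 * eps * (sr6**2 - sr6)

        def mixed_pair_energy(r):
            """Blend a fine-grained and a coarse-grained pair potential by distance."""
            u_aa = lj(r, eps=0.65, sigma=0.32)       # atomistic-like parameters (assumed)
            u_cg = lj(r, eps=1.20, sigma=0.47)       # softer, coarser bead (assumed)
            w = switch(r)
            return w * u_aa + (1.0 - w) * u_cg

        for r in (0.5, 1.0, 1.5):
            print(f"r = {r:.1f} nm  ->  U = {mixed_pair_energy(r):+.3f} kJ/mol")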

  11. Dissociation of a Dynamic Protein Complex Studied by All-Atom Molecular Simulations.

    PubMed

    Zhang, Liqun; Borthakur, Susmita; Buck, Matthias

    2016-02-23

    The process of protein complex dissociation remains to be understood at the atomic level of detail. Computers now allow microsecond timescale molecular-dynamics simulations, which make the visualization of such processes possible. Here, we investigated the dissociation process of the EphA2-SHIP2 SAM-SAM domain heterodimer complex using unrestrained all-atom molecular-dynamics simulations. Previous studies on this system have shown that alternate configurations are sampled, that their interconversion can be fast, and that the complex is dynamic by nature. Starting from different NMR-derived structures, mutants were designed to stabilize a subset of configurations by swapping ion pairs across the protein-protein interface. We focused on two mutants, K956D/D1235K and R957D/D1223R, with attenuated binding affinity compared with the wild-type proteins. In contrast to calculations on the wild-type complexes, the majority of simulations of these mutants showed protein dissociation within 2.4 μs. During the separation process, we observed domain rotation and pivoting as well as a translation and simultaneous rolling, typically to alternate and weaker binding interfaces. Several unsuccessful recapturing attempts occurred once the domains were moderately separated. An analysis of protein solvation suggests that the dissociation process correlates with a progressive loss of protein-protein contacts. Furthermore, an evaluation of internal protein dynamics using quasi-harmonic and order parameter analyses indicates that changes in protein internal motions are expected to contribute significantly to the thermodynamics of protein dissociation. Considering protein association as the reverse of the separation process, the initial role of charged/polar interactions is emphasized, followed by changes in protein and solvent dynamics. The trajectories show that protein separation does not follow a single distinct pathway, but suggest that the mechanism of dissociation is common in

  12. Simulation of lipid bilayer self-assembly using all-atom lipid force fields.

    PubMed

    Skjevik, Åge A; Madej, Benjamin D; Dickson, Callum J; Lin, Charles; Teigen, Knut; Walker, Ross C; Gould, Ian R

    2016-04-21

    In this manuscript we expand significantly on our earlier communication by investigating the bilayer self-assembly of eight different types of phospholipids in unbiased molecular dynamics (MD) simulations using three widely used all-atom lipid force fields. Irrespective of the underlying force field, the lipids are shown to spontaneously form stable lamellar bilayer structures within 1 microsecond, the majority of which display properties in satisfactory agreement with the experimental data. The lipids self-assemble via the same general mechanism, though at formation rates that differ between lipid types, between force fields, and even between repeats of the same lipid/force field combination. In addition to zwitterionic phosphatidylcholine (PC) and phosphatidylethanolamine (PE) lipids, anionic phosphatidylserine (PS) and phosphatidylglycerol (PG) lipids are represented. To our knowledge this is the first time bilayer self-assembly of phospholipids with negatively charged head groups has been demonstrated in all-atom MD simulations.

  13. All-atom simulation study of protein PTH(1-34) by using the Wang-Landau sampling method

    NASA Astrophysics Data System (ADS)

    Kim, Seung-Yeon; Kwak, Wooseop

    2014-12-01

    We perform simulations of the N-terminal 34-residue protein fragment PTH(1-34), consisting of 581 atoms, of the 84-residue human parathyroid hormone by using the all-atom ECEPP/3 force field and the Wang-Landau sampling method. Through a massive high-performance computation, the density of states and the partition function Z(T), as a continuous function of T, are obtained for PTH(1-34). From the continuous partition function Z(T), the partition function zeros of PTH(1-34) are evaluated for the first time. From both the specific heat and the partition function zeros, two characteristic transition temperatures are obtained for the all-atom protein PTH(1-34). The higher transition temperature T1 and the lower transition temperature T2 of PTH(1-34) can be interpreted as the collapse temperature Tθ and the folding temperature Tf, respectively.
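
    Wang-Landau sampling estimates the density of states g(E) by a random walk in energy whose acceptance is biased by the running estimate of g(E); once g(E) is known, Z(T) = sum_E g(E) exp(-E/kBT) and the specific heat follow. The sketch below runs the algorithm on a small 2D Ising lattice as a stand-in system (not the ECEPP/3 protein model used in the paper) with a simplified flatness schedule.

        import numpy as np

        rng = np.random.default_rng(3)
        L = 8
        spins = rng.choice([-1, 1], size=(L, L))

        def total_energy(s):
            return -np.sum(s * (np.roll(s, 1, 0) + np.roll(s, 1, 1)))

        def delta_e(s, i, j):
            nb = s[(i+1) % L, j] + s[(i-1) % L, j] + s[i, (j+1) % L] + s[i, (j-1) % L]
            return 2 * s[i, j] * nb

        n_sites = L * L
        energies = np.arange(-2 * n_sites, 2 * n_sites + 1, 4)   # attainable levels (plus a few unused ones)
        log_g = np.zeros(len(energies))
        hist = np.zeros(len(energies))
        idx = {e: k for k, e in enumerate(energies)}

        E = total_energy(spins)
        f = 1.0                                      # ln(f) modification factor
        while f > 1e-3:
            for _ in range(20000):
                i, j = rng.integers(L, size=2)
                dE = delta_e(spins, i, j)
                a, b = idx[E], idx[E + dE]
                if log_g[a] >= log_g[b] or rng.random() < np.exp(log_g[a] - log_g[b]):
                    spins[i, j] *= -1
                    E += dE
                k = idx[E]
                log_g[k] += f
                hist[k] += 1
            f *= 0.5                                 # simplified flatness schedule

        # Thermodynamics from g(E):  Z(T) = sum_E g(E) exp(-E/T), with kB = 1.
        visited = hist > 0
        lg = log_g[visited] - log_g[visited].min()
        Ev = energies[visited].astype(float)
        for T in (1.5, 2.27, 3.0):
            a = lg - Ev / T
            w = np.exp(a - a.max())                  # avoid overflow
            E_mean = np.sum(Ev * w) / np.sum(w)
            E2_mean = np.sum(Ev**2 * w) / np.sum(w)
            print(f"T = {T:.2f}:  C_v per spin ~ {(E2_mean - E_mean**2) / T**2 / n_sites:.3f}")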

  14. An all-atom simulation study of the ordering of liquid squalane near a solid surface

    NASA Astrophysics Data System (ADS)

    Tsige, Mesfin; Patnaik, Soumya S.

    2008-05-01

    An all-atom molecular dynamics study using the OPLS force field has been carried out to obtain new insights into the orientation and ordering of liquid squalane near a solid surface. As observed in previous experiments, the squalane molecules closest to a SiO2 substrate are found to be tightly bound, with their molecular axis preferentially parallel to the interface. Unlike linear alkanes, the squalane molecules are also found to lie preferentially parallel to the liquid/vapor interface. The simulation results predict that the molecular plane orientation of the squalane molecules changes from mainly parallel to perpendicular to the substrate with increasing distance from the substrate.
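
    Preferential alignment of molecular axes with an interface is typically reported as the second-Legendre-polynomial order parameter P2 = <(3 cos^2 θ - 1)/2> of the angle between the molecular long axis and the surface normal; P2 tends to -0.5 for axes lying in the surface plane and to 1 for perpendicular axes. A small sketch, assuming the long axis is approximated by the end-to-end vector of each molecule; the vectors below are placeholders.

        import numpy as np

        def p2_order_parameter(end_to_end, normal=(0.0, 0.0, 1.0)):
            """<P2(cos theta)> between molecular long axes and the surface normal.

            end_to_end: (n_molecules, 3) array of end-to-end vectors.
            """
            n = np.asarray(normal) / np.linalg.norm(normal)
            u = end_to_end / np.linalg.norm(end_to_end, axis=1, keepdims=True)
            cos_t = u @ n
            return float(np.mean(1.5 * cos_t**2 - 0.5))

        # Placeholder: axes mostly in the x-y plane, i.e. parallel to the substrate.
        rng = np.random.default_rng(4)
        vecs = rng.normal(size=(200, 3)) * np.array([1.0, 1.0, 0.15])
        print("P2 =", round(p2_order_parameter(vecs), 3))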

  15. Picosecond infrared laser-induced all-atom nonequilibrium molecular dynamics simulation of dissociation of viruses.

    PubMed

    Hoang Man, Viet; Van-Oanh, Nguyen-Thi; Derreumaux, Philippe; Li, Mai Suan; Roland, Christopher; Sagui, Celeste; Nguyen, Phuong H

    2016-04-28

    Since the discovery of the plant pathogen tobacco mosaic virus as the first viral entity in the late 1800s, viruses have traditionally been thought of mainly as pathogens. Recently, however, viruses have been exploited as nanoplatforms with applications in biomedicine and materials science. To this end, the large majority of current methods and tools have been developed to improve the physical stability of viral particles, which may be critical under the extreme physical or chemical conditions that viruses may encounter during purification, fabrication, storage and use. Considerably fewer studies are devoted to developing efficient methods to degrade or recycle such enhanced-stability biomaterials. With this in mind, we carry out all-atom nonequilibrium molecular dynamics simulations, inspired by recently developed mid-infrared free-electron laser pulse technology, to dissociate viruses. Adopting the poliovirus as a representative example, we find that the primary step in the dissociation process is a strong resonance between the amide I vibrational modes of the virus and the tuned laser frequencies. This process is determined by a balance between the formation and dissociation of the protein shell, reflecting the high plasticity of the virus. Furthermore, our approach makes it feasible to simulate the dissociation of viruses, which would otherwise be too expensive for conventional equilibrium all-atom simulations of such very large systems. Our work is a proof of concept that may open a new, efficient way to cleave or recycle virus-based materials, provides a valuable tool for elucidating mechanical aspects of viruses, and may play a role in the future fight against virus-related diseases.

  16. Accelerating All-Atom MD Simulations of Lipids Using a Modified Virtual-Sites Technique.

    PubMed

    Loubet, Bastien; Kopec, Wojciech; Khandelia, Himanshu

    2014-12-01

    We present two new implementations of the virtual-sites technique, which completely suppresses the degrees of freedom of the hydrogen atoms in a lipid bilayer and allows an increased time step of 5 fs in all-atom simulations with the CHARMM36 force field. One of our approaches uses the virtual-site construction available in GROMACS, while the other uses a new definition of the virtual sites of the CH2 groups. Our methods are tested on DPPC (no unsaturated chains), POPC (one unsaturated chain), and DOPC (two unsaturated chains) lipid bilayers. We calculate various physical properties of the membranes in our simulations with and without virtual sites and explain the observed differences and similarities. The best agreement is obtained for the original GROMACS virtual sites on the DOPC bilayer, where we get an area per lipid of 67.3 ± 0.3 Å² without virtual sites and 67.6 ± 0.3 Å² with virtual sites. In conclusion, the virtual-sites technique on lipid membranes is a powerful simulation tool, but it should be used with care. The procedure can be applied to other force fields and lipids in a straightforward manner.
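
    The area per lipid quoted above is normally computed from the lateral box dimensions of a tension-free bilayer simulation, A = Lx*Ly/(N_lipids/2), averaged over frames with an uncertainty from block averaging. A short sketch with placeholder box dimensions; the 128-lipid system size is an assumption.

        import numpy as np

        def area_per_lipid(box_xy, n_lipids):
            """Mean area per lipid and a crude block-averaged standard error.

            box_xy: (n_frames, 2) lateral box lengths in Angstrom.
            """
            a = box_xy[:, 0] * box_xy[:, 1] / (n_lipids / 2.0)    # per-leaflet area
            blocks = np.array_split(a, 5)
            means = np.array([b.mean() for b in blocks])
            return a.mean(), means.std(ddof=1) / np.sqrt(len(means))

        # Placeholder: 128-lipid bilayer with the box fluctuating around 65.7 x 65.7 A.
        rng = np.random.default_rng(5)
        box = 65.7 + rng.normal(scale=0.3, size=(2000, 2))
        apl, err = area_per_lipid(box, n_lipids=128)
        print(f"area per lipid = {apl:.1f} +/- {err:.1f} A^2")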

  17. All-atom molecular dynamics simulation of a photosystem I/detergent complex.

    PubMed

    Harris, Bradley J; Cheng, Xiaolin; Frymier, Paul

    2014-10-01

    All-atom molecular dynamics (MD) simulation was used to investigate the solution structure and dynamics of the photosynthetic pigment-protein complex photosystem I (PSI) from Thermosynechococcus elongatus embedded in a toroidal belt of n-dodecyl-β-d-maltoside (DDM) detergent. Evaluation of root-mean-square deviations (RMSDs) relative to the known crystal structure show that the protein complex surrounded by DDM molecules is stable during the 200 ns simulation time, and root-mean-square fluctuation (RMSF) analysis indicates that regions of high local mobility correspond to solvent-exposed regions such as turns in the transmembrane α-helices and flexible loops on the stromal and lumenal faces. Comparing the protein-detergent complex to a pure detergent micelle, the detergent surrounding the PSI trimer is found to be less densely packed but with more ordered detergent tails, contrary to what is seen in most lipid bilayer models. We also investigated any functional implications for the observed conformational dynamics and protein-detergent interactions, discovering interesting structural changes in the psaL subunits associated with maintaining the trimeric structure of the protein. Importantly, we find that the docking of soluble electron mediators such as cytochrome c6 and ferredoxin to PSI is not significantly impacted by the solubilization of PSI in detergent.

  18. Effect of Calcium and Magnesium on Phosphatidylserine Membranes: Experiments and All-Atomic Simulations

    PubMed Central

    Martín-Molina, Alberto; Rodríguez-Beas, César; Faraudo, Jordi

    2012-01-01

    It is known that phosphatidylserine (PS−) lipids have a very similar affinity for Ca2+ and Mg2+ cations, as revealed by electrokinetic and stability experiments. However, despite this similar affinity, experimental evidence shows that the presence of Ca2+ or Mg2+ induces very different aggregation behavior for PS− liposomes as characterized by their fractal dimensions. Also, turbidity measurements confirm substantial differences in aggregation behavior depending on the presence of Ca2+ or Mg2+ cations. These puzzling results suggest that although these two cations have a similar affinity for PS− lipids, they induce substantial structural differences in lipid bilayers containing each of these cations. In other words, these cations have strong ion-specific effects on the structure of PS− membranes. This interpretation is supported by all-atomic molecular-dynamics simulations showing that Ca2+ and Mg2+ cations have different binding sites and induce different membrane hydration. We show that although both ions are incorporated deep into the hydrophilic region of the membrane, they have different positions and configurations at the membrane. Absorbed Ca2+ cations present a peak at a distance ∼2 nm from the center of the lipid bilayer, and their most probable binding configuration involves two oxygen atoms from each of the charged moieties of the PS molecule (phosphate and carboxyl groups). In contrast, the distribution of absorbed Mg2+ cations has two different peaks, located a few angstroms before and after the Ca2+ peak. The most probable configurations (corresponding to these two peaks) involve binding to two oxygen atoms from carboxyl groups (the most superficial binding peak) or two oxygen atoms from phosphate groups (the most internal peak). Moreover, simulations also show differences in the hydration structure of the membrane: we obtained a hydration of 7.5 and 9 water molecules per lipid in simulations with Ca2+ and Mg2+, respectively. PMID:22824273

  19. All-Atom Molecular Dynamics Simulation of Protein Translocation through an α-Hemolysin Nanopore.

    PubMed

    Di Marino, Daniele; Bonome, Emma Letizia; Tramontano, Anna; Chinappi, Mauro

    2015-08-01

    Nanopore sensing is attracting the attention of a large and varied scientific community. One of the main issues in nanopore sensing is how to associate the measured current signals with specific features of the molecule under investigation. This is particularly relevant when the translocating molecule is a protein and the pore is sufficiently narrow to necessarily involve unfolding of the translocating protein. Recent experimental results characterized the cotranslocational unfolding of thioredoxin (Trx) passing through an α-hemolysin pore, providing evidence for the existence of a multistep process. In this study we report the results of all-atom molecular dynamics simulations of the same system. Our data indicate that Trx translocation involves two main barriers. The first one is an unfolding barrier associated with a translocation intermediate where the N-terminal region of Trx is stuck at the pore entrance in a conformation that strongly resembles the native one. After the abrupt unfolding of the N-terminal region, Trx enters the α-hemolysin vestibule. During this stage, the constriction is occupied not only by the translocating residue but also by a hairpin-like structure forming a tangle in the constriction. The second barrier is associated with the disentangling of this region.

  20. Analysis of Ligand-Receptor Association and Intermediate Transfer Rates in Multienzyme Nanostructures with All-Atom Brownian Dynamics Simulations.

    PubMed

    Roberts, Christopher C; Chang, Chia-En A

    2016-08-25

    We present the second-generation GeomBD Brownian dynamics software for determining interenzyme intermediate transfer rates and substrate association rates in biomolecular complexes. Substrate and intermediate association rates for a series of enzymes or biomolecules can be compared between the freely diffusing disorganized configuration and various colocalized or complexed arrangements for kinetic investigation of enhanced intermediate transfer. In addition, enzyme engineering techniques, such as synthetic protein conjugation, can be computationally modeled and analyzed to better understand changes in substrate association relative to native enzymes. Tools are provided to determine nonspecific ligand-receptor association residence times, and to visualize common sites of nonspecific association of substrates on receptor surfaces. To demonstrate features of the software, interenzyme intermediate substrate transfer rate constants are calculated and compared for all-atom models of DNA origami scaffold-bound bienzyme systems of glucose oxidase and horseradish peroxidase. Also, a DNA conjugated horseradish peroxidase enzyme was analyzed for its propensity to increase substrate association rates and substrate local residence times relative to the unmodified enzyme. We also demonstrate the rapid determination and visualization of common sites of nonspecific ligand-receptor association by using HIV-1 protease and an inhibitor, XK263. GeomBD2 accelerates simulations by precomputing van der Waals potential energy grids and electrostatic potential grid maps, and has a flexible and extensible support for all-atom and coarse-grained force fields. Simulation software is written in C++ and utilizes modern parallelization techniques for potential grid preparation and Brownian dynamics simulation processes. Analysis scripts, written in the Python scripting language, are provided for quantitative simulation analysis. GeomBD2 is applicable to the fields of biophysics, bioengineering
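
    Association rate constants from Brownian dynamics trajectories are conventionally obtained with the Northrup-Allison-McCammon expression: trajectories are started on a sphere of radius b, the fraction beta that satisfies the binding criterion before escaping past radius q is recorded, and k = kD(b)*beta / [1 - (1 - beta)*kD(b)/kD(q)] with kD(r) = 4*pi*D*r. The sketch below evaluates this standard formula with hypothetical inputs; it is not GeomBD code.

        import math

        def nam_rate(beta, b, q, diffusion):
            """Northrup-Allison-McCammon association rate constant.

            beta: fraction of BD trajectories (started at radius b) that bind.
            b, q: start and escape radii in cm; diffusion in cm^2/s.
            Returns k in M^-1 s^-1.
            """
            avogadro = 6.022e23
            k_b = 4.0 * math.pi * diffusion * b      # cm^3 per molecule per s
            k_q = 4.0 * math.pi * diffusion * q
            k = k_b * beta / (1.0 - (1.0 - beta) * k_b / k_q)
            return k * avogadro / 1000.0             # convert to M^-1 s^-1

        # Hypothetical inputs: 3% of trajectories bind, b = 60 A, q = 500 A, D = 1e-6 cm^2/s.
        print(f"k_on ~ {nam_rate(0.03, 60e-8, 500e-8, 1.0e-6):.2e} M^-1 s^-1")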

  21. A combined coarse-grained and all-atom simulation of TRPV1 channel gating and heat activation

    PubMed Central

    Qin, Feng

    2015-01-01

    The transient receptor potential (TRP) channels act as key sensors of various chemical and physical stimuli in eukaryotic cells. Despite years of study, the molecular mechanisms of TRP channel activation remain unclear. To elucidate the structural, dynamic, and energetic basis of gating in TRPV1 (a founding member of the TRPV subfamily), we performed coarse-grained modeling and all-atom molecular dynamics (MD) simulation based on the recently solved high resolution structures of the open and closed form of TRPV1. Our coarse-grained normal mode analysis captures two key modes of collective motions involved in the TRPV1 gating transition, featuring a quaternary twist motion of the transmembrane domains (TMDs) relative to the intracellular domains (ICDs). Our transition pathway modeling predicts a sequence of structural movements that propagate from the ICDs to the TMDs via key interface domains (including the membrane proximal domain and the C-terminal domain), leading to sequential opening of the selectivity filter followed by the lower gate in the channel pore (confirmed by modeling conformational changes induced by the activation of ICDs). The above findings of coarse-grained modeling are robust to perturbation by lipids. Finally, our MD simulation of the ICD identifies key residues that contribute differently to the nonpolar energy of the open and closed state, and these residues are predicted to control the temperature sensitivity of TRPV1 gating. These computational predictions offer new insights to the mechanism for heat activation of TRPV1 gating, and will guide our future electrophysiology and mutagenesis studies. PMID:25918362
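
    The coarse-grained normal mode analysis mentioned above is typically an elastic network model: C-alpha beads joined by uniform springs within a cutoff, with the low-frequency eigenvectors of the resulting Hessian describing the collective motions. A compact anisotropic-network sketch on placeholder coordinates follows; the cutoff and spring constant are generic choices, not the authors' parameters.

        import numpy as np

        def anm_modes(xyz, cutoff=15.0, k_spring=1.0):
            """Eigenmodes of an anisotropic elastic network model Hessian."""
            n = len(xyz)
            hess = np.zeros((3 * n, 3 * n))
            for i in range(n):
                for j in range(i + 1, n):
                    d = xyz[j] - xyz[i]
                    r2 = d @ d
                    if r2 > cutoff**2:
                        continue
                    block = -k_spring * np.outer(d, d) / r2
                    hess[3*i:3*i+3, 3*j:3*j+3] = block
                    hess[3*j:3*j+3, 3*i:3*i+3] = block
                    hess[3*i:3*i+3, 3*i:3*i+3] -= block
                    hess[3*j:3*j+3, 3*j:3*j+3] -= block
            evals, evecs = np.linalg.eigh(hess)
            return evals[6:], evecs[:, 6:]           # drop the six rigid-body modes

        # Placeholder "structure": 100 beads in a blob roughly 40 A across.
        rng = np.random.default_rng(6)
        coords = rng.normal(scale=12.0, size=(100, 3))
        eigvals, modes = anm_modes(coords)
        print("lowest non-trivial eigenvalues:", np.round(eigvals[:3], 4))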

  22. COFFDROP: A Coarse-Grained Nonbonded Force Field for Proteins Derived from All-Atom Explicit-Solvent Molecular Dynamics Simulations of Amino Acids

    PubMed Central

    2015-01-01

    We describe the derivation of a set of bonded and nonbonded coarse-grained (CG) potential functions for use in implicit-solvent Brownian dynamics (BD) simulations of proteins, derived from all-atom explicit-solvent molecular dynamics (MD) simulations of amino acids. Bonded potential functions were derived from 1 μs MD simulations of each of the 20 canonical amino acids, with histidine modeled in both its protonated and neutral forms; nonbonded potential functions were derived from 1 μs MD simulations of every possible pairing of the amino acids (231 different systems). The angle and dihedral probability distributions and radial distribution functions sampled during MD were used to optimize a set of CG potential functions through use of the iterative Boltzmann inversion (IBI) method. The optimized set of potential functions—which we term COFFDROP (COarse-grained Force Field for Dynamic Representation Of Proteins)—quantitatively reproduced all of the “target” MD distributions. In a first test of the force field, it was used to predict the clustering behavior of concentrated amino acid solutions; the predictions were directly compared with the results of corresponding all-atom explicit-solvent MD simulations and found to be in excellent agreement. In a second test, BD simulations of the small protein villin headpiece were carried out at concentrations that have recently been studied in all-atom explicit-solvent MD simulations by Petrov and Zagrovic (PLoS Comput. Biol. 2014, 5, e1003638). The anomalously strong intermolecular interactions seen in the MD study were reproduced in the COFFDROP simulations; a simple scaling of COFFDROP’s nonbonded parameters, however, produced results in better accordance with experiment. Overall, our results suggest that potential functions derived from simulations of pairwise amino acid interactions might be of quite broad applicability, with COFFDROP likely to be especially useful for modeling unfolded or intrinsically disordered proteins.
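
    Iterative Boltzmann inversion refines a tabulated coarse-grained pair potential until the CG simulation reproduces a target radial distribution function, using the update U_(i+1)(r) = U_i(r) + kBT ln[g_i(r)/g_target(r)]. The sketch below performs a single damped update step on synthetic RDFs, which stand in for the MD-derived targets used for COFFDROP.

        import numpy as np

        def ibi_update(u_current, g_current, g_target, kB_T=2.494, damping=0.2):
            """One iterative-Boltzmann-inversion step for a tabulated pair potential.

            u in kJ/mol; the g's are radial distribution functions on the same r grid.
            A damping factor < 1 is commonly used to keep the iteration stable.
            """
            ratio = np.clip(g_current, 1e-8, None) / np.clip(g_target, 1e-8, None)
            return u_current + damping * kB_T * np.log(ratio)

        # Synthetic example: start from the Boltzmann-inverted target RDF.
        r = np.linspace(2.0, 12.0, 101)                        # Angstrom
        g_target = 1.0 + 0.8 * np.exp(-(r - 5.0) ** 2)         # made-up target RDF
        u0 = -2.494 * np.log(np.clip(g_target, 1e-8, None))    # potential of mean force
        g_sim = 1.0 + 0.6 * np.exp(-(r - 5.2) ** 2)            # pretend CG-simulation RDF
        u1 = ibi_update(u0, g_sim, g_target)
        print("max potential correction (kJ/mol):", round(float(np.abs(u1 - u0).max()), 3))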

  23. Elastic properties of dynein motor domain obtained from all-atom molecular dynamics simulations

    PubMed Central

    Kamiya, Narutoshi; Mashimo, Tadaaki; Takano, Yu; Kon, Takahide; Kurisu, Genji; Nakamura, Haruki

    2016-01-01

    Dyneins are large microtubule motor proteins that convert ATP energy to mechanical power. High-resolution crystal structures of ADP-bound cytoplasmic dynein have revealed the organization of the motor domain, comprising the AAA+ ring, the linker, the stalk/strut and the C sequence. Recently, the ADP.vanadate-bound structure, which is similar to the ATP hydrolysis transition state, revealed how the structure of dynein changes upon ATP binding. Although both the ADP- and ATP-bound state structures have been resolved, the dynamic properties at the atomic level remain unclear. In this work, we built two models named ‘the ADP model’ and ‘the ATP model’, where ADP and ATP are bound to AAA1 in the AAA+ ring, respectively, to observe the initial procedure of the structural change from the unprimed to the primed state. We performed 200-ns molecular dynamics simulations for both models and compared their structures and dynamics. The motions of the stalk, consisting of a long coiled coil with a microtubule-binding domain, significantly differed between the two models. The elastic properties of the stalk were analyzed and compared with the experimental results. PMID:27334455

  24. Simplified protein models can rival all atom simulations in predicting folding pathways and structure

    PubMed Central

    Adhikari, Aashish N.; Freed, Karl F.; Sosnick, Tobin R.

    2014-01-01

    We demonstrate the ability of simultaneously determining a protein’s folding pathway and structure using a properly formulated model without prior knowledge of the native structure. Our model employs a natural coordinate system for describing proteins and a search strategy inspired by the observation that real proteins fold in a sequential fashion by incrementally stabilizing native-like substructures or "foldons". Comparable folding pathways and structures are obtained for the twelve proteins recently studied using atomistic molecular dynamics simulations [K. Lindorff-Larsen, S. Piana, R.O. Dror, D. E. Shaw, Science 334, 517 (2011)], with our calculations running several orders of magnitude faster. We find that native-like propensities in the unfolded state do not necessarily determine the order of structure formation, a departure from a major conclusion of the MD study. Instead, our results support a more expansive view wherein intrinsic local structural propensities may be enhanced or overridden in the folding process by environmental context. The success of our search strategy validates it as an expedient mechanism for folding both in silico and in vivo. PMID:23889448

  25. Dynamic performance of duolayers at the air/water interface. 2. Mechanistic insights from all-atom simulations.

    PubMed

    Christofferson, Andrew J; Yiapanis, George; Leung, Andy H M; Prime, Emma L; Tran, Diana N H; Qiao, Greg G; Solomon, David H; Yarovsky, Irene

    2014-09-18

    The novel duolayer system, comprising a monolayer of ethylene glycol monooctadecyl ether (C18E1) and the water-soluble polymer poly(vinylpyrrolidone) (PVP), has been shown to resist forces such as wind stress to a greater degree than the C18E1 monolayer alone. This paper reports all-atom molecular dynamics simulations comparing the monolayer (C18E1 alone) and duolayer systems under an applied force parallel to the air/water interface. The simulations show that, due to the presence of PVP at the interface, the duolayer film exhibits an increase in chain tilt, ordering, and density, as well as a lower lateral velocity compared to the monolayer. These results provide a molecular rationale for the improved performance of the duolayer system under wind conditions, as well as an atomic-level explanation for the observed efficacy of the duolayer system as an evaporation suppressant, which may serve as a useful guide for the future development of thin films where resistance to external perturbation is desirable.

  26. Insights into the Tunnel Mechanism of Cholesteryl Ester Transfer Protein through All-atom Molecular Dynamics Simulations.

    PubMed

    Lei, Dongsheng; Rames, Matthew; Zhang, Xing; Zhang, Lei; Zhang, Shengli; Ren, Gang

    2016-07-01

    Cholesteryl ester transfer protein (CETP) mediates cholesteryl ester (CE) transfer from the atheroprotective high density lipoprotein (HDL) cholesterol to the atherogenic low density lipoprotein cholesterol. In the past decade, this property has driven the development of CETP inhibitors, which have been evaluated in large scale clinical trials for treating cardiovascular diseases. Despite the pharmacological interest, little is known about the fundamental mechanism of CETP in CE transfer. Recent electron microscopy (EM) experiments have suggested a tunnel mechanism, and molecular dynamics simulations have shown that the flexible N-terminal distal end of CETP penetrates into the HDL surface and takes up a CE molecule through an open pore. However, it is not known whether a CE molecule can completely transfer through an entire CETP molecule. Here, we used all-atom molecular dynamics simulations to evaluate this possibility. The results showed that a hydrophobic tunnel inside CETP is sufficient to allow a CE molecule to completely transfer through the entire CETP within a predicted transfer time and at a rate comparable with those obtained through physiological measurements. Analyses of the detailed interactions revealed several residues that might be critical for CETP function, which may provide important clues for the effective development of CETP inhibitors and treatment of cardiovascular diseases. PMID:27143480

  27. Insights into activation and RNA binding of trp RNA-binding attenuation protein (TRAP) through all-atom simulations.

    PubMed

    Murtola, Teemu; Vattulainen, Ilpo; Falck, Emma

    2008-06-01

    Tryptophan biosynthesis in Bacillus stearothermophilus is regulated by a trp RNA binding attenuation protein (TRAP). It is a ring-shaped 11-mer of identical 74 residue subunits. Tryptophan binding pockets are located between adjacent subunits, and tryptophan binding activates TRAP to bind RNA. Here, we report results from all-atom molecular dynamics simulations of the system, complementing existing extensive experimental studies. We focus on two questions. First, we look at the activation mechanism, of which relatively little is known experimentally. We find that the absence of tryptophan allows larger motions close to the tryptophan binding site, and we see indication of a conformational change in the BC loop. However, complete deactivation seems to occur on much longer time scales than the 40 ns studied here. Second, we study the TRAP-RNA interactions. We look at the relative flexibilities of the different bases in the complex and analyze the hydrogen bonds between the protein and RNA. We also study the role of Lys37, Lys56, and Arg58, which have been experimentally identified as essential for RNA binding. Hydrophobic stacking of Lys37 with the nearby RNA base is confirmed, but we do not see direct hydrogen bonding between RNA and the other two residues, in contrast to the crystal structure. Rather, these residues seem to stabilize the RNA-binding surface, and their positive charge may also play a role in RNA binding. Simulations also indicate that TRAP is able to attract RNA nonspecifically, and the interactions are quantified in more detail using binding energy calculations. The formation of the final binding complex is a very slow process: within the simulation time scale of 40 ns, only two guanine bases become bound (and no others), indicating that the binding initiates at these positions. In general, our results are in good agreement with experimental studies, and provide atomic-scale insights into the processes. PMID:18186477

  28. Probing the Huntingtin 1-17 Membrane Anchor on a Phospholipid Bilayer by Using All-Atom Simulations

    PubMed Central

    Côté, Sébastien; Binette, Vincent; Salnikov, Evgeniy S.; Bechinger, Burkhard; Mousseau, Normand

    2015-01-01

    Mislocalization and aggregation of the huntingtin protein are related to Huntington’s disease. Its first exon—more specifically the first 17 amino acids (Htt17)—is crucial for the physiological and pathological functions of huntingtin. It regulates huntingtin’s activity through posttranslational modifications and serves as an anchor to membrane-containing organelles of the cell. Recently, structure and orientation of the Htt17 membrane anchor were determined using a combined solution and solid-state NMR approach. This prompted us to refine this model by investigating the dynamics and thermodynamics of this membrane anchor on a POPC bilayer using all-atom, explicit solvent molecular dynamics and Hamiltonian replica exchange. Our simulations are combined with various experimental measurements to generate a high-resolution atomistic model for the huntingtin Htt17 membrane anchor on a POPC bilayer. More precisely, we observe that the single α-helix structure is more stable in the phospholipid membrane than the NMR model obtained in the presence of dodecylphosphocholine detergent micelles. The resulting Htt17 monomer has its hydrophobic plane oriented parallel to the bilayer surface. Our results further unveil the key residues interacting with the membrane in terms of hydrogen bonds, salt-bridges, and nonpolar contributions. We also observe that Htt17 equilibrates at a well-defined insertion depth and that it perturbs the physical properties—order parameter, thickness, and area per lipid—of the bilayer in a manner that could favor its dimerization. Overall, our observations reinforce and refine the NMR measurements on the Htt17 membrane anchor segment of huntingtin that is of fundamental importance to its biological functions. PMID:25762330

  29. A coarse-graining approach for molecular simulation that retains the dynamics of the all-atom reference system by implementing hydrodynamic interactions

    SciTech Connect

    Markutsya, Sergiy; Lamm, Monica H

    2014-11-07

    We report on a new approach for deriving coarse-grained intermolecular forces that retains the frictional contribution that is often discarded by conventional coarse-graining methods. The approach is tested for water and an aqueous glucose solution, and the results from the new implementation for coarse-grained molecular dynamics simulation show remarkable agreement with the dynamics obtained from reference all-atom simulations. The agreement between the structural properties observed in the coarse-grained and all-atom simulations is also preserved. We discuss how this approach may be applied broadly to any existing coarse-graining method where the coarse-grained models are rigorously derived from all-atom reference systems.

  30. Variational Optimization of an All-Atom Implicit Solvent Force Field to Match Explicit Solvent Simulation Data.

    PubMed

    Bottaro, Sandro; Lindorff-Larsen, Kresten; Best, Robert B

    2013-12-10

    The development of accurate implicit solvation models with low computational cost is essential for addressing many large-scale biophysical problems. Here, we present an efficient solvation term based on a Gaussian solvent-exclusion model (EEF1) for simulations of proteins in aqueous environment, with the primary aim of having a good overlap with explicit solvent simulations, particularly for unfolded and disordered states - as would be needed for multiscale applications. In order to achieve this, we have used a recently proposed coarse-graining procedure based on minimization of an entropy-related objective function to train the model to reproduce the equilibrium distribution obtained from explicit water simulations. Via this methodology, we have optimized both a charge screening parameter and a backbone torsion term against explicit solvent simulations of an α-helical and a β-stranded peptide. The performance of the resulting effective energy function, termed EEF1-SB, is tested with respect to the properties of folded proteins, the folding of small peptides or fast-folding proteins, and NMR data for intrinsically disordered proteins. The results show that EEF1-SB provides a reasonable description of a wide range of systems, but its key advantage over other methods tested is that it captures very well the structure and dimension of disordered or weakly structured peptides. EEF1-SB is thus a computationally inexpensive (~ 10 times faster than Generalized-Born methods) and transferable approximation for treating solvent effects. PMID:24748852

  17. Energetics of nonpolar and polar compounds in cationic, anionic, and nonionic micelles studied by all-atom molecular dynamics simulation combined with a theory of solutions.

    PubMed

    Date, Atsushi; Ishizuka, Ryosuke; Matubayasi, Nobuyuki

    2016-05-21

    Energetic analysis was conducted for nonpolar and polar solutes bound in a cationic micelle of dodecyltrimethylammonium bromide (DTAB), an anionic micelle of sodium dodecyl sulfate (SDS), and a nonionic micelle of tetraethylene glycol monododecyl ether (Brij30). All-atom molecular dynamics simulation was performed, and the free energies of binding the solutes in the hydrophobic-core and headgroup regions of the micelles were computed using the energy-representation method. It was found in all the micelles examined that aromatic naphthalene is preferentially located more outward than aliphatic propane and that the polar solutes are localized at the interface of the hydrophobic and hydrophilic regions. The roles of the surfactant and water were then elucidated by decomposing the free energy into the contributions from the respective species. Water was observed to play a decisive role in determining the binding location of the solute, while the surfactant was found to be more important for the overall stabilization of the solute within the micelle. The effects of attractive and repulsive interactions of the solute with the surfactant and water were further examined, and their competition was analyzed in connection with the preferable location of the solute in the micellar system.

  18. Spontaneous conformational changes in the E. coli GroEL subunit from all-atom molecular dynamics simulations.

    PubMed

    Sliozberg, Yelena; Abrams, Cameron F

    2007-09-15

    The Escherichia coli chaperonin GroEL is a complex of identical subunit proteins (57 kDa each) arranged in a back-to-back stacking of two heptameric rings. Its hallmarks include nested positive intra-ring and negative inter-ring cooperativity in adenosine triphosphate (ATP) binding and the ability to mediate the folding of newly transcribed and/or denatured substrate proteins. We performed unbiased molecular dynamics simulations of the GroEL subunit protein in explicit water both with and without the nucleotide KMgATP to understand better the details of the structural transitions that enable these behaviors. Placing KMgATP in the equatorial domain binding pocket of a t state subunit, which corresponds to a low ATP-affinity state, produced a short-lived (6 ns) state that spontaneously transitioned to the high ATP-affinity r state. The important feature of this transition is a large-scale rotation of the intermediate domain's helix M to close the ATP binding pocket. Pivoting of helix M is accompanied by counterclockwise rotation and slight deformation of the apical domain, important for lowering the affinity for substrate protein. Aligning simulation conformations into model heptamer rings demonstrates that the t-->r transition in one subunit is not sterically hindered by t state neighbors, but requires breakage of Arg(197)-Glu(386) intersubunit salt bridges, which are important for inter-ring positive cooperativity. Lowest-frequency quasi-harmonic modes of vibration computed pre- and post-transition clearly show that natural vibrations facilitate the transition. Finally, we propose a novel mechanism for inter-ring cooperativity in ATP binding inspired by the observation of spontaneous insertion of the side chain of Ala(480) into the empty nucleotide pocket. PMID:17513353

  19. Investigating a link between all-atom model simulation and the Ising-based theory on the helix-coil transition. II. Nonstationary properties

    NASA Astrophysics Data System (ADS)

    Takano, Mitsunori; Nakamura, Hironori K.; Nagayama, Kuniaki; Suyama, Akira

    2003-06-01

    The all-atom and Ising-based models have both played important roles in shaping our understanding of the helix-coil transition. In this study, we address to what degree these two theoretical models can be consistent with each other in the nonstationary regime, complementing the preceding equilibrium study. We conducted molecular dynamics simulations of an all-atom model polyalanine chain and Monte Carlo simulations of a corresponding kinetic Ising chain. Nonstationary properties of each model were characterized through power spectrum, Allan variance, and autocorrelation analyses regarding the time course of a system order parameter. A clear difference was indicated between the two models: the Ising-based model showed a Lorentzian spectrum in the frequency domain and a single exponential form in the time domain, whereas the all-atom model showed a 1/f spectrum and a stretched exponential form. The observed stretched exponential form is in agreement with a very recent T-jump experiment. The effect of viscous damping on helix-coil dynamics was also studied. A possible source of the observed difference between the two models is discussed by considering the potential energy landscape, and the idea of dynamical disorder was introduced into the original Glauber model in the hope of bridging the gap between the two models. Other possible sources, e.g., the limitations of the Ising framework and the validity of the Markovian dynamics assumption, are also discussed.
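
    The spectral and Allan-variance analyses mentioned above operate on the time series of a scalar order parameter. A minimal sketch of both quantities is given below, applied to a synthetic series rather than the simulation data of the study.

        # Power spectrum and Allan variance of an order-parameter time series.
        import numpy as np

        def power_spectrum(x, dt):
            x = x - x.mean()
            psd = np.abs(np.fft.rfft(x))**2 * dt / len(x)
            return np.fft.rfftfreq(len(x), d=dt), psd

        def allan_variance(x, m):
            """Allan variance for an averaging window of m samples."""
            n = len(x) // m
            means = x[:n * m].reshape(n, m).mean(axis=1)
            return 0.5 * float(np.mean(np.diff(means) ** 2))

        rng = np.random.default_rng(1)
        series = np.cumsum(rng.standard_normal(4096)) * 0.01  # toy trajectory
        freqs, psd = power_spectrum(series, dt=1.0)
        print(allan_variance(series, m=16))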

  20. Probing the folded state and mechanical unfolding pathways of T4 lysozyme using all-atom and coarse-grained molecular simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Wenjun; Glenn, Paul

    2015-01-01

    The Bacteriophage T4 Lysozyme (T4L) is a prototype modular protein composed of an N-terminal domain and a C-terminal domain, which was extensively studied to understand the folding/unfolding mechanism of modular proteins. To offer detailed structural and dynamic insights to the folded-state stability and the mechanical unfolding behaviors of T4L, we have performed extensive equilibrium and steered molecular dynamics simulations of both the wild-type (WT) and a circular permutation (CP) variant of T4L using all-atom and coarse-grained force fields. Our all-atom and coarse-grained simulations of the folded state have consistently found greater stability of the C-domain than the N-domain in isolation, which is in agreement with past thermostability studies of T4L. While the all-atom simulation cannot fully explain the mechanical unfolding behaviors of the WT and the CP variant observed in an optical tweezers study, the coarse-grained simulations based on the Go model or a modified elastic network model (mENM) are in qualitative agreement with the experimental finding of greater unfolding cooperativity in the WT than the CP variant. Interestingly, the two coarse-grained models predict different structural mechanisms for the observed change in cooperativity between the WT and the CP variant—while the Go model predicts minor modification of the unfolding pathways by circular permutation (i.e., preserving the general order that the N-domain unfolds before the C-domain), the mENM predicts a dramatic change in unfolding pathways (e.g., different order of N/C-domain unfolding in the WT and the CP variant). Based on our simulations, we have analyzed the limitations of and the key differences between these models and offered testable predictions for future experiments to resolve the structural mechanism for cooperative folding/unfolding of T4L.

  1. Probing the folded state and mechanical unfolding pathways of T4 lysozyme using all-atom and coarse-grained molecular simulation

    SciTech Connect

    Zheng, Wenjun; Glenn, Paul

    2015-01-21

    The Bacteriophage T4 Lysozyme (T4L) is a prototype modular protein composed of an N-terminal domain and a C-terminal domain, which was extensively studied to understand the folding/unfolding mechanism of modular proteins. To offer detailed structural and dynamic insights to the folded-state stability and the mechanical unfolding behaviors of T4L, we have performed extensive equilibrium and steered molecular dynamics simulations of both the wild-type (WT) and a circular permutation (CP) variant of T4L using all-atom and coarse-grained force fields. Our all-atom and coarse-grained simulations of the folded state have consistently found greater stability of the C-domain than the N-domain in isolation, which is in agreement with past thermostability studies of T4L. While the all-atom simulation cannot fully explain the mechanical unfolding behaviors of the WT and the CP variant observed in an optical tweezers study, the coarse-grained simulations based on the Go model or a modified elastic network model (mENM) are in qualitative agreement with the experimental finding of greater unfolding cooperativity in the WT than the CP variant. Interestingly, the two coarse-grained models predict different structural mechanisms for the observed change in cooperativity between the WT and the CP variant—while the Go model predicts minor modification of the unfolding pathways by circular permutation (i.e., preserving the general order that the N-domain unfolds before the C-domain), the mENM predicts a dramatic change in unfolding pathways (e.g., different order of N/C-domain unfolding in the WT and the CP variant). Based on our simulations, we have analyzed the limitations of and the key differences between these models and offered testable predictions for future experiments to resolve the structural mechanism for cooperative folding/unfolding of T4L.

  2. Free energetics of carbon nanotube association in aqueous inorganic NaI salt solutions: Temperature effects using all-atom molecular dynamics simulations.

    PubMed

    Ou, Shu-Ching; Cui, Di; Wezowicz, Matthew; Taufer, Michela; Patel, Sandeep

    2015-06-15

    In this study, we examine the temperature dependence of free energetics of nanotube association using graphical processing unit-enabled all-atom molecular dynamics simulations (FEN ZI) with two (10,10) single-walled carbon nanotubes in 3 m NaI aqueous salt solution. Results suggest that the free energy, enthalpy and entropy changes for the association process are all reduced at the high temperature, in agreement with previous investigations using other hydrophobes. Via the decomposition of free energy into individual components, we found that solvent contribution (including water, anion, and cation contributions) is correlated with the spatial distribution of the corresponding species and is influenced distinctly by the temperature. We studied the spatial distribution and the structure of the solvent in different regions: intertube, intratube and the bulk solvent. By calculating the fluctuation of coarse-grained tube-solvent surfaces, we found that tube-water interfacial fluctuation exhibits the strongest temperature dependence. By taking ions to be a solvent-like medium in the absence of water, tube-anion interfacial fluctuation shows similar but weaker dependence on temperature, while tube-cation interfacial fluctuation shows no dependence in general. These characteristics are discussed via the malleability of their corresponding solvation shells relative to the nanotube surface. Hydrogen bonding profiles and tetrahedrality of water arrangement are also computed to compare the structure of solvent in the solvent bulk and intertube region. The hydrophobic confinement induces a relatively lower concentration environment in the intertube region, therefore causing different intertube solvent structures which depend on the tube separation. This study is relevant in the continuing discourse on hydrophobic interactions (as they impact generally a broad class of phenomena in biology, biochemistry, and materials science and soft condensed matter research), and
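
    The tetrahedrality measure mentioned in the abstract is commonly the Errington-Debenedetti order parameter computed from the four nearest neighbours of each water oxygen. A minimal sketch follows; the coordinates are synthetic and no periodic boundaries or neighbour searching are included.

        # q = 1 - (3/8) * sum_{j<k} (cos(psi_jk) + 1/3)^2 over the four nearest
        # neighbours; q ~ 1 for a perfect tetrahedron, ~ 0 for an ideal gas.
        import numpy as np
        from itertools import combinations

        def tetrahedrality(center, neighbours):
            vecs = neighbours - center
            vecs /= np.linalg.norm(vecs, axis=1, keepdims=True)
            s = sum((float(np.dot(vecs[j], vecs[k])) + 1.0 / 3.0) ** 2
                    for j, k in combinations(range(4), 2))
            return 1.0 - 3.0 / 8.0 * s

        center = np.zeros(3)
        neigh = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]], float)
        print(tetrahedrality(center, neigh))  # ~1 for a perfect tetrahedron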

  4. Free Energetics of Carbon Nanotube Association in Aqueous Inorganic NaI Salt Solutions: Temperature Effects using All-Atom Molecular Dynamics Simulations

    PubMed Central

    Ou, Shu-Ching; Cui, Di; Wezowicz, Matthew; Taufer, Michela; Patel, Sandeep

    2015-01-01

    In this study we examine the temperature dependence of free energetics of nanotube association by using GPU-enabled all-atom molecular dynamics simulations (FEN ZI) with two (10,10) single-walled carbon nanotubes in 3 m NaI aqueous salt solution. Results suggest that the free energy, enthalpy and entropy changes for the association process are all reduced at the high temperature, in agreement with previous investigations using other hydrophobes. Via the decomposition of free energy into individual components, we found that solvent contribution (including water, anion and cation contributions) is correlated with the spatial distribution of the corresponding species and is influenced distinctly by the temperature. We studied the spatial distribution and the structure of the solvent in different regions: intertube, intra-tube and the bulk solvent. By calculating the fluctuation of coarse-grained tube-solvent surfaces, we found that tube-water interfacial fluctuation exhibits the strongest temperature dependence. By taking ions to be a solvent-like medium in the absence of water, tube-anion interfacial fluctuation also shows similar but weaker dependence on temperature, while tube-cation interfacial fluctuation shows no dependence in general. These characteristics are discussed via the malleability of their corresponding solvation shells relative to the nanotube surface. Hydrogen bonding profiles and tetrahedrality of water arrangement are also computed to compare the structure of solvent in the solvent bulk and intertube region. The hydrophobic confinement induces a relatively lower concentration environment in the intertube region, therefore causing different intertube solvent structures which depend on the tube separation. This study is relevant in the continuing discourse on hydrophobic interactions (as they impact generally a broad class of phenomena in biology, biochemistry, and materials science and soft condensed matter research), and interpretations of

  5. Insight into the Properties of Cardiolipin Containing Bilayers from Molecular Dynamics Simulations, Using a Hybrid All-Atom/United-Atom Force Field.

    PubMed

    Aguayo, Daniel; González-Nilo, Fernando D; Chipot, Christophe

    2012-05-01

    Three models of cardiolipin (CL)-containing membranes were simulated using a new set of parameters for tetramyristoyl and tetraoleoyl CLs, developed in the framework of the united-atom CHARMM27-UA and the all-atom CHARMM36 force fields with the aim of performing molecular dynamics (MD) simulations of cardiolipin-containing mixed-lipid membranes. The new parameters use a hybrid representation of all-atom head groups in conjunction with an implicit-hydrogen united-atom (UA) representation to describe the oleoyl and myristoyl chains of the CLs, in lieu of the fully atomistic description, thereby allowing longer simulations to be undertaken. The physicochemical properties of the bilayers were determined and compared with previously reported data. Furthermore, using tetramyristoyl CL mixed with POPG and POPE lipids, a mitochondrial membrane was simulated. The results presented here show the different behavior of the bilayers as a result of the lipid composition, where the length of the acyl chain and the conformation of the headgroup can be associated with the mitochondrial membrane properties. The new hybrid CL parameters prove to be well suited for the simulation of the molecular structure of CL-containing bilayers and can be extended to other lipid bilayers composed of CLs with different acyl chains or alternate head groups.

  6. Insight into the Properties of Cardiolipin Containing Bilayers from Molecular Dynamics Simulations, Using a Hybrid All-Atom/United-Atom Force Field.

    PubMed

    Aguayo, Daniel; González-Nilo, Fernando D; Chipot, Christophe

    2012-05-01

    Three models of cardiolipin (CL)-containing membranes were simulated using a new set of parameters for tetramyristoyl and tetraoleoyl CLs, developed in the framework of the united-atom CHARMM27-UA and the all-atom CHARMM36 force fields with the aim of performing molecular dynamics (MD) simulations of cardiolipin-containing mixed-lipid membranes. The new parameters use a hybrid representation of all-atom head groups in conjunction with an implicit-hydrogen united-atom (UA) representation to describe the oleoyl and myristoyl chains of the CLs, in lieu of the fully atomistic description, thereby allowing longer simulations to be undertaken. The physicochemical properties of the bilayers were determined and compared with previously reported data. Furthermore, using tetramyristoyl CL mixed with POPG and POPE lipids, a mitochondrial membrane was simulated. The results presented here show the different behavior of the bilayers as a result of the lipid composition, where the length of the acyl chain and the conformation of the headgroup can be associated with the mitochondrial membrane properties. The new hybrid CL parameters prove to be well suited for the simulation of the molecular structure of CL-containing bilayers and can be extended to other lipid bilayers composed of CLs with different acyl chains or alternate head groups. PMID:26593668

  7. Evaluation of protein-protein docking model structures using all-atom molecular dynamics simulations combined with the solution theory in the energy representation

    NASA Astrophysics Data System (ADS)

    Takemura, Kazuhiro; Guo, Hao; Sakuraba, Shun; Matubayasi, Nobuyuki; Kitao, Akio

    2012-12-01

    We propose a method to evaluate binding free energy differences among distinct protein-protein complex model structures through all-atom molecular dynamics simulations in explicit water using the solution theory in the energy representation. Complex model structures are generated from a pair of monomeric structures using the rigid-body docking program ZDOCK. After structure refinement by side chain optimization and all-atom molecular dynamics simulations in explicit water, complex models are evaluated based on the sum of their conformational and solvation free energies, the latter calculated from the energy distribution functions obtained from relatively short molecular dynamics simulations of the complex in water and of pure water based on the solution theory in the energy representation. We examined protein-protein complex model structures of two protein-protein complex systems, bovine trypsin/CMTI-1 squash inhibitor (PDB ID: 1PPE) and RNase SA/barstar (PDB ID: 1AY7), for which both complex and monomer structures were determined experimentally. For each system, we calculated the energies for the crystal complex structure and twelve generated model structures, including models both very similar to and very different from the crystal structure. In both systems, the sum of the conformational and solvation free energies tended to be lower for the structure similar to the crystal. We concluded that our energy calculation method is useful for selecting low energy complex models similar to the crystal structure from among a set of generated models.
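
    The ranking step described above reduces, schematically, to sorting candidate complexes by the sum of a conformational energy and a solvation free energy. The sketch below illustrates only that bookkeeping; the model names and numbers are invented placeholders.

        # Rank docking models by E_conf + dG_solv (kcal/mol); lowest total wins.
        def rank_models(models):
            """models: list of (name, E_conf, dG_solv) tuples."""
            return sorted(models, key=lambda m: m[1] + m[2])

        models = [("model_01", -120.5, -35.0),
                  ("model_02", -118.0, -30.5),
                  ("model_03", -125.0, -20.0)]
        for name, e_conf, dg_solv in rank_models(models):
            print(f"{name}: total = {e_conf + dg_solv:.1f}")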

  8. Conformational landscape of the HIV-V3 hairpin loop from all-atom free-energy simulations

    NASA Astrophysics Data System (ADS)

    Verma, Abhinav; Wenzel, Wolfgang

    2008-03-01

    Small beta hairpins have many distinct biological functions, including their involvement in chemokine and viral receptor recognition. The relevance of structural similarities between different hairpin loops with near homologous sequences is not yet understood, calling for the development of methods for de novo hairpin structure prediction and simulation. De novo folding of beta strands is more difficult than that of helical proteins because of nonlocal hydrogen bonding patterns that connect amino acids that are distant in the amino acid sequence, and because there is a large variety of possible hydrogen bond patterns. Here we use a greedy version of the basin hopping technique with our free-energy forcefield PFF02 to reproducibly and predictively fold the hairpin structure of an HIV-V3 loop. We performed 20 independent basin hopping runs for 500 cycles corresponding to 7.4×10^7 energy evaluations each. The lowest energy structure found in the simulation has a backbone root mean square deviation (bRMSD) of only 2.04 Å to the native conformation. The lowest 9 out of the 20 simulations converged to conformations deviating less than 2.5 Å bRMSD from native.

  9. Density relaxation and particle motion characteristics in a non-ionic deep eutectic solvent (acetamide + urea): time-resolved fluorescence measurements and all-atom molecular dynamics simulations.

    PubMed

    Das, Anuradha; Das, Suman; Biswas, Ranjit

    2015-01-21

    Temperature dependent relaxation dynamics, particle motion characteristics, and heterogeneity aspects of deep eutectic solvents (DESs) made of acetamide (CH3CONH2) and urea (NH2CONH2) have been investigated by employing time-resolved fluorescence measurements and all-atom molecular dynamics simulations. Three different compositions (f) for the mixture [fCH3CONH2 + (1 - f)NH2CONH2] have been studied in a temperature range of 328-353 K which is ∼120-145 K above the measured glass transition temperatures (∼207 K) of these DESs but much lower than the individual melting temperature of either of the constituents. Steady state fluorescence emission measurements using probe solutes with sharply different lifetimes do not indicate any dependence on excitation wavelength in these metastable molten systems. Time-resolved fluorescence anisotropy measurements reveal near-hydrodynamic coupling between medium viscosity and rotation of a dissolved dipolar solute. Stokes shift dynamics have been found to be too fast to be detected by the time-resolution (∼70 ps) employed, suggesting extremely rapid medium polarization relaxation. All-atom simulations reveal Gaussian distribution for particle displacements and van Hove correlations, and significant overlap between non-Gaussian (α2) and new non-Gaussian (γ) heterogeneity parameters. In addition, no stretched exponential relaxations have been detected in the simulated wavenumber dependent acetamide dynamic structure factors. All these results are in sharp contrast to earlier observations for ionic deep eutectics with acetamide [Guchhait et al., J. Chem. Phys. 140, 104514 (2014)] and suggest a fundamental difference in interaction and dynamics between ionic and non-ionic deep eutectic solvent systems.
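
    The non-Gaussian parameter used above to probe dynamic heterogeneity has a simple closed form. The sketch below evaluates it for synthetic displacements; for truly Gaussian displacements it is close to zero.

        # alpha_2(t) = 3*<dr^4> / (5*<dr^2>^2) - 1, from displacements at lag t.
        import numpy as np

        def non_gaussian_alpha2(displacements):
            """displacements: (n_particles, 3) array of displacement vectors."""
            r2 = np.sum(displacements**2, axis=1)
            return 3.0 * np.mean(r2**2) / (5.0 * np.mean(r2)**2) - 1.0

        rng = np.random.default_rng(2)
        dr = rng.standard_normal((1000, 3))   # Gaussian toy displacements
        print(non_gaussian_alpha2(dr))        # ~0 for Gaussian statistics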

  10. Density relaxation and particle motion characteristics in a non-ionic deep eutectic solvent (acetamide + urea): Time-resolved fluorescence measurements and all-atom molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Das, Anuradha; Das, Suman; Biswas, Ranjit

    2015-01-01

    Temperature dependent relaxation dynamics, particle motion characteristics, and heterogeneity aspects of deep eutectic solvents (DESs) made of acetamide (CH3CONH2) and urea (NH2CONH2) have been investigated by employing time-resolved fluorescence measurements and all-atom molecular dynamics simulations. Three different compositions (f) for the mixture [fCH3CONH2 + (1 - f)NH2CONH2] have been studied in a temperature range of 328-353 K which is ˜120-145 K above the measured glass transition temperatures (˜207 K) of these DESs but much lower than the individual melting temperature of either of the constituents. Steady state fluorescence emission measurements using probe solutes with sharply different lifetimes do not indicate any dependence on excitation wavelength in these metastable molten systems. Time-resolved fluorescence anisotropy measurements reveal near-hydrodynamic coupling between medium viscosity and rotation of a dissolved dipolar solute. Stokes shift dynamics have been found to be too fast to be detected by the time-resolution (˜70 ps) employed, suggesting extremely rapid medium polarization relaxation. All-atom simulations reveal Gaussian distribution for particle displacements and van Hove correlations, and significant overlap between non-Gaussian (α2) and new non-Gaussian (γ) heterogeneity parameters. In addition, no stretched exponential relaxations have been detected in the simulated wavenumber dependent acetamide dynamic structure factors. All these results are in sharp contrast to earlier observations for ionic deep eutectics with acetamide [Guchhait et al., J. Chem. Phys. 140, 104514 (2014)] and suggest a fundamental difference in interaction and dynamics between ionic and non-ionic deep eutectic solvent systems.

  11. Density relaxation and particle motion characteristics in a non-ionic deep eutectic solvent (acetamide + urea): Time-resolved fluorescence measurements and all-atom molecular dynamics simulations

    SciTech Connect

    Das, Anuradha; Das, Suman; Biswas, Ranjit

    2015-01-21

    Temperature dependent relaxation dynamics, particle motion characteristics, and heterogeneity aspects of deep eutectic solvents (DESs) made of acetamide (CH3CONH2) and urea (NH2CONH2) have been investigated by employing time-resolved fluorescence measurements and all-atom molecular dynamics simulations. Three different compositions (f) for the mixture [fCH3CONH2 + (1 − f)NH2CONH2] have been studied in a temperature range of 328-353 K which is ∼120-145 K above the measured glass transition temperatures (∼207 K) of these DESs but much lower than the individual melting temperature of either of the constituents. Steady state fluorescence emission measurements using probe solutes with sharply different lifetimes do not indicate any dependence on excitation wavelength in these metastable molten systems. Time-resolved fluorescence anisotropy measurements reveal near-hydrodynamic coupling between medium viscosity and rotation of a dissolved dipolar solute. Stokes shift dynamics have been found to be too fast to be detected by the time-resolution (∼70 ps) employed, suggesting extremely rapid medium polarization relaxation. All-atom simulations reveal Gaussian distribution for particle displacements and van Hove correlations, and significant overlap between non-Gaussian (α2) and new non-Gaussian (γ) heterogeneity parameters. In addition, no stretched exponential relaxations have been detected in the simulated wavenumber dependent acetamide dynamic structure factors. All these results are in sharp contrast to earlier observations for ionic deep eutectics with acetamide [Guchhait et al., J. Chem. Phys. 140, 104514 (2014)] and suggest a fundamental difference in interaction and dynamics between ionic and non-ionic deep eutectic solvent systems.

  12. Density relaxation and particle motion characteristics in a non-ionic deep eutectic solvent (acetamide + urea): time-resolved fluorescence measurements and all-atom molecular dynamics simulations.

    PubMed

    Das, Anuradha; Das, Suman; Biswas, Ranjit

    2015-01-21

    Temperature dependent relaxation dynamics, particle motion characteristics, and heterogeneity aspects of deep eutectic solvents (DESs) made of acetamide (CH3CONH2) and urea (NH2CONH2) have been investigated by employing time-resolved fluorescence measurements and all-atom molecular dynamics simulations. Three different compositions (f) for the mixture [fCH3CONH2 + (1 - f)NH2CONH2] have been studied in a temperature range of 328-353 K which is ∼120-145 K above the measured glass transition temperatures (∼207 K) of these DESs but much lower than the individual melting temperature of either of the constituents. Steady state fluorescence emission measurements using probe solutes with sharply different lifetimes do not indicate any dependence on excitation wavelength in these metastable molten systems. Time-resolved fluorescence anisotropy measurements reveal near-hydrodynamic coupling between medium viscosity and rotation of a dissolved dipolar solute. Stokes shift dynamics have been found to be too fast to be detected by the time-resolution (∼70 ps) employed, suggesting extremely rapid medium polarization relaxation. All-atom simulations reveal Gaussian distribution for particle displacements and van Hove correlations, and significant overlap between non-Gaussian (α2) and new non-Gaussian (γ) heterogeneity parameters. In addition, no stretched exponential relaxations have been detected in the simulated wavenumber dependent acetamide dynamic structure factors. All these results are in sharp contrast to earlier observations for ionic deep eutectics with acetamide [Guchhait et al., J. Chem. Phys. 140, 104514 (2014)] and suggest a fundamental difference in interaction and dynamics between ionic and non-ionic deep eutectic solvent systems. PMID:25612718

  13. Relationship between population of the fibril-prone conformation in the monomeric state and oligomer formation times of peptides: Insights from all-atom simulations

    NASA Astrophysics Data System (ADS)

    Nam, Hoang Bao; Kouza, Maksim; Zung, Hoang; Li, Mai Suan

    2010-04-01

    Despite much progress in understanding the aggregation process of biomolecules, the factors that govern its rates have not been fully understood. This problem is of particular importance since many conformational diseases such as Alzheimer's, Parkinson's, and type-II diabetes are associated with protein oligomerization. Having performed all-atom simulations with explicit water and various force fields for two short peptides KFFE and NNQQ, we show that their oligomer formation times are strongly correlated with the population of the fibril-prone conformation in the monomeric state. The larger the population, the faster the aggregation process. Our result not only suggests that this quantity plays a key role in the self-assembly of polypeptide chains but also opens a new way to understand the fibrillogenesis of biomolecules at the monomeric level. The nature of oligomer ordering of NNQQ is studied in detail.

  14. All-atom molecular dynamics simulations of actin-myosin interactions: a comparative study of cardiac α myosin, β myosin, and fast skeletal muscle myosin.

    PubMed

    Li, Minghui; Zheng, Wenjun

    2013-11-26

    Myosins are a superfamily of actin-binding motor proteins with significant variations in kinetic properties (such as actin binding affinity) between different isoforms. It remains unknown how such kinetic variations arise from the structural and dynamic tuning of the actin-myosin interface at the amino acid residue level. To address this key issue, we have employed molecular modeling and simulations to investigate, with atomistic details, the isoform dependence of actin-myosin interactions in the rigor state. By combining electron microscopy-based docking with homology modeling, we have constructed three all-atom models for human cardiac α myosin, human cardiac β myosin, and rabbit fast skeletal muscle myosin, each in complex with three actin subunits in the rigor state. Starting from these models, we have performed extensive all-atom molecular dynamics (MD) simulations (total of 100 ns per system) and then used the MD trajectories to calculate actin-myosin binding free energies with contributions from both electrostatic and nonpolar forces. Our binding calculations are in good agreement with the experimental finding of isoform-dependent differences in actin binding affinity between these myosin isoforms. Such differences are traced to changes in actin-myosin electrostatic interactions (i.e., hydrogen bonds and salt bridges) that are highly dynamic and involve several flexible actin-binding loops. By partitioning the actin-myosin binding free energy among individual myosin residues, we have also identified key myosin residues involved in the actin-myosin interactions, some of which were previously validated experimentally or implicated in cardiomyopathy mutations, and the rest make promising targets for future mutational experiments. PMID:24224850

  15. All-atom simulations and free-energy calculations of coiled-coil peptides with lipid bilayers: binding strength, structural transition, and effect on lipid dynamics

    PubMed Central

    Woo, Sun Young; Lee, Hwankyu

    2016-01-01

    Peptides E and K, which are synthetic coiled-coil peptides for membrane fusion, were simulated with lipid bilayers composed of lipids and cholesterols at different ratios using all-atom models. We first calculated free energies of binding from umbrella sampling simulations, showing that both E and K peptides tend to adsorb onto the bilayer surface, which occurs more strongly in the bilayer composed of smaller lipid headgroups. Then, unrestrained simulations show that K peptides insert more deeply into the bilayer while partially retaining their helical structure, whereas E peptides insert less deeply and predominantly become random coils, indicating a structural transition from helices to random coils, in quantitative agreement with experiments. This is because K peptides electrostatically interact with lipid phosphates, and because the hydrocarbons of the lysines of the K peptide are longer than those of the glutamic acids of the E peptide and thus form stronger hydrophobic interactions with lipid tails. This deeper insertion of the K peptide increases the bilayer dynamics and creates a vacancy below the peptide, leading to the rearrangement of smaller lipids. These findings help explain the experimentally observed or proposed differences in the insertion depth, binding strength, and structural transition of E and K peptides, and support the snorkeling effect. PMID:26926570

  16. Effect of water on structure and dynamics of [BMIM][PF6] ionic liquid: An all-atom molecular dynamics simulation investigation.

    PubMed

    Sharma, Anirban; Ghorai, Pradip Kr

    2016-03-21

    Composition dependent structural and dynamical properties of aqueous hydrophobic 1-butyl-3-methylimidazolium hexafluorophosphate ([BMIM][PF6]) ionic liquid (IL) have been investigated by using all-atom molecular dynamics simulation. We observe that addition of water does not significantly increase the number of dissociated ions in the solution relative to the pure state. As a consequence, the self-diffusion coefficients of the cation and anion remain comparable to each other at all water concentrations, as is observed for the pure state. Voronoi polyhedra analysis shows a strong dependence of the local environment on the IL concentration. Void and neck distributions in the Voronoi tessellation are approximately Gaussian for the pure IL, but upon subsequent addition of water we observe deviation from Gaussian behaviour, with an asymmetric broadening and a long exponential tail at large void radius, particularly at higher water concentrations. The increase in void space and neck size at higher water concentration facilitates ionic motion, thereby decreasing the dynamical heterogeneity and IL reorientation time and significantly increasing the self-diffusion coefficient.

  17. Effect of water on structure and dynamics of [BMIM][PF6] ionic liquid: An all-atom molecular dynamics simulation investigation

    NASA Astrophysics Data System (ADS)

    Sharma, Anirban; Ghorai, Pradip Kr.

    2016-03-01

    Composition dependent structural and dynamical properties of aqueous hydrophobic 1-butyl-3-methylimidazolium hexafluorophosphate ([BMIM][PF6]) ionic liquid (IL) have been investigated by using all-atom molecular dynamics simulation. We observe that addition of water does not significantly increase the number of dissociated ions in the solution relative to the pure state. As a consequence, the self-diffusion coefficients of the cation and anion remain comparable to each other at all water concentrations, as is observed for the pure state. Voronoi polyhedra analysis shows a strong dependence of the local environment on the IL concentration. Void and neck distributions in the Voronoi tessellation are approximately Gaussian for the pure IL, but upon subsequent addition of water we observe deviation from Gaussian behaviour, with an asymmetric broadening and a long exponential tail at large void radius, particularly at higher water concentrations. The increase in void space and neck size at higher water concentration facilitates ionic motion, thereby decreasing the dynamical heterogeneity and IL reorientation time and significantly increasing the self-diffusion coefficient.

  18. Effects of Water Models on Binding Affinity: Evidence from All-Atom Simulation of Binding of Tamiflu to A/H5N1 Neuraminidase

    PubMed Central

    Nguyen, Trang Truc; Viet, Man Hoang

    2014-01-01

    The influence of the water models SPC, SPC/E, TIP3P, and TIP4P on ligand binding affinity is examined by calculating the binding free energy ΔGbind of oseltamivir carboxylate (Tamiflu) to the wild type of the glycoprotein neuraminidase from the pandemic A/H5N1 virus. ΔGbind is estimated by the Molecular Mechanics-Poisson-Boltzmann Surface Area (MM-PBSA) method and all-atom simulations with different combinations of these aqueous models and four force fields: AMBER99SB, CHARMM27, GROMOS96 43a1, and OPLS-AA/L. It is shown that there is no correlation between the binding free energy and the water density in the binding pocket in CHARMM. However, for the three remaining force fields ΔGbind decreases with increasing water density. SPC/E provides the lowest binding free energy for any force field, while the water effect is the most pronounced in CHARMM. In agreement with the popular GROMACS recommendation, the binding scores obtained with the combinations AMBER-TIP3P, OPLS-TIP4P, and GROMOS-SPC are the most relevant to the experiments. For wild-type neuraminidase we have found that SPC is more suitable for CHARMM than TIP3P recommended by GROMACS for studying ligand binding. However, our study of three of its mutants reveals that TIP3P is presumably the best choice for CHARMM. PMID:24672329
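
    The MM-PBSA estimate referenced above combines a molecular-mechanics energy with polar and nonpolar solvation terms averaged over snapshots, for the complex and for the free partners. The sketch below shows only this arithmetic with dummy numbers; entropic contributions are omitted.

        # dG_bind ~ <E_MM + G_polar + G_nonpolar>_complex - receptor - ligand.
        import numpy as np

        def mmpbsa_average(e_mm, g_polar, g_nonpolar):
            """Each argument: per-snapshot energies (kcal/mol)."""
            return float(np.mean(np.asarray(e_mm) + np.asarray(g_polar)
                                 + np.asarray(g_nonpolar)))

        g_complex  = mmpbsa_average([-900.0, -905.0], [-250.0, -248.0], [30.0, 31.0])
        g_receptor = mmpbsa_average([-700.0, -702.0], [-200.0, -199.0], [25.0, 25.5])
        g_ligand   = mmpbsa_average([-150.0, -151.0], [-40.0, -41.0], [5.0, 5.2])
        print("dG_bind ~", g_complex - g_receptor - g_ligand)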

  19. Putative membrane lytic sites of P-type and S-type cardiotoxins from snake venoms as probed by all-atom molecular dynamics simulations.

    PubMed

    Gorai, Biswajit; Karthikeyan, Muthusamy; Sivaraman, Thirunavukkarasu

    2016-10-01

    Cardiotoxins (CTXs), belonging to the three-finger toxin superfamily of snake venoms, are among the principal toxic components of the venoms, and these protein toxins exhibit membrane-lytic activities when the venoms are injected into victims. In the present study, complex formation of CTX VI (a P-type CTX from Naja atra) and CTX1 (an S-type CTX from Naja naja) with zwitterionic POPC bilayers (a major lipid component of cell membranes) has been studied in near-physiological conditions for a total dynamic time scale of 1.35 μs using all-atom molecular dynamics (MD) simulations. Comprehensive analyses of the MD data revealed that residues such as Leu1, Lys2, Tyr11, Lys31, Asp57 and Arg58 of CTX VI, and Ala16, Lys30 and Arg58 of CTX1 were crucial for establishing interactions with the POPC bilayer. Moreover, loop I, along with the globular head and loop II of CTX VI, and loop II of CTX1 were found to be the structural regions chiefly governing complex formation of the respective proteins with POPC. Rationalizations for the differential binding modes of the CTXs and implications of the findings for designing small-molecule inhibitors of the toxins are also discussed. Graphical abstract: Binding modes of a P-type CTX and an S-type CTX towards the POPC bilayer. PMID:27628673

  20. Discriminate protein decoys from native by using a scoring function based on ubiquitous Phi and Psi angles computed for all atom.

    PubMed

    Mishra, Avdesh; Iqbal, Sumaiya; Hoque, Md Tamjidul

    2016-06-01

    The success of solving the protein folding and structure prediction problems in molecular and structural biology relies on an accurate energy function. With the rapid advancement of the computational biology and bioinformatics fields, there is a growing need to solve unknown folds and structures faster, and thus an accurate energy function is indispensable. To address this need, we develop a new potential function, namely 3DIGARS3.0, which is a linearly weighted combination of 3DIGARS, mined accessible surface area (ASA), and ubiquitously computed Phi (uPhi) and Psi (uPsi) energies, optimized by a Genetic Algorithm (GA). We use a dataset of 4332 protein structures to generate uPhi- and uPsi-based score libraries to be used within the core 3DIGARS method. The optimized weight of each component is obtained by applying GA-based optimization to three challenging decoy sets. The improved 3DIGARS3.0 significantly outperformed state-of-the-art methods on a set of independent test datasets.
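
    The scoring function described above is a weighted sum of component energies with weights tuned against decoy sets. The sketch below captures that idea, but replaces the genetic algorithm with a plain random search and uses invented component values, so it is only a schematic of the optimization loop.

        # Linear score = w . components; weights chosen to rank natives below decoys.
        import numpy as np

        def score(weights, components):
            return float(np.dot(weights, components))

        def fitness(weights, natives, decoys):
            """Count native/decoy pairs where the native scores lower (better)."""
            return sum(score(weights, n) < score(weights, d)
                       for n in natives for d in decoys)

        rng = np.random.default_rng(3)
        natives = [np.array([-10.0, -5.0, -2.0])]
        decoys  = [np.array([-8.0, -6.0, -1.0]), np.array([-9.0, -3.0, -2.5])]
        best = max((rng.random(3) for _ in range(200)),
                   key=lambda w: fitness(w, natives, decoys))
        print(best, fitness(best, natives, decoys))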

  1. An effective all-atom potential for proteins

    PubMed Central

    Irbäck, Anders; Mitternacht, Simon; Mohanty, Sandipan

    2009-01-01

    We describe and test an implicit solvent all-atom potential for simulations of protein folding and aggregation. The potential is developed through studies of structural and thermodynamic properties of 17 peptides with diverse secondary structure. Results obtained using the final form of the potential are presented for all these peptides. The same model, with unchanged parameters, is furthermore applied to a heterodimeric coiled-coil system, a mixed α/β protein and a three-helix-bundle protein, with very good results. The computational efficiency of the potential makes it possible to investigate the free-energy landscape of these 49–67-residue systems with high statistical accuracy, using only modest computational resources by today's standards. PACS Codes: 87.14.E-, 87.15.A-, 87.15.Cc PMID:19356242

  2. Parameterization of backbone flexibility in a coarse-grained force field for proteins (COFFDROP) derived from all-atom explicit-solvent molecular dynamics simulations of all possible two-residue peptides

    PubMed Central

    Frembgen-Kesner, Tamara; Andrews, Casey T.; Li, Shuxiang; Ngo, Nguyet Anh; Shubert, Scott A.; Jain, Aakash; Olayiwola, Oluwatoni; Weishaar, Mitch R.; Elcock, Adrian H.

    2015-01-01

    Recently, we reported the parameterization of a set of coarse-grained (CG) nonbonded potential functions, derived from all-atom explicit-solvent molecular dynamics (MD) simulations of amino acid pairs, and designed for use in (implicit-solvent) Brownian dynamics (BD) simulations of proteins; this force field was named COFFDROP (COarse-grained Force Field for Dynamic Representations Of Proteins). Here, we describe the extension of COFFDROP to include bonded backbone terms derived from fitting to results of explicit-solvent MD simulations of all possible two-residue peptides containing the 20 standard amino acids, with histidine modeled in both its protonated and neutral forms. The iterative Boltzmann inversion (IBI) method was used to optimize new CG potential functions for backbone-related terms by attempting to reproduce angle, dihedral and distance probability distributions generated by the MD simulations. In a simple test of the transferability of the extended force field, the angle, dihedral and distance probability distributions obtained from BD simulations of 56 three-residue peptides were compared to results from corresponding explicit-solvent MD simulations. In a more challenging test of the COFFDROP force field, it was used to simulate eight intrinsically disordered proteins and was shown to quite accurately reproduce the experimental hydrodynamic radii (Rhydro), provided that the favorable nonbonded interactions of the force field were uniformly scaled downwards in magnitude. Overall, the results indicate that the COFFDROP force field is likely to find use in modeling the conformational behavior of intrinsically disordered proteins and multi-domain proteins connected by flexible linkers. PMID:26574429
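
    The iterative Boltzmann inversion step used above for the bonded terms has a compact update rule. The sketch below performs one such update on synthetic distributions; the temperature factor and histograms are placeholders.

        # IBI update: V_new(x) = V_old(x) + kT * ln(P_current(x) / P_target(x)),
        # where P_target comes from the all-atom reference and P_current from the CG run.
        import numpy as np

        def ibi_update(v_old, p_current, p_target, kT=0.593):  # kcal/mol near 298 K
            eps = 1e-12                                        # guard against log(0)
            return v_old + kT * np.log((p_current + eps) / (p_target + eps))

        x = np.linspace(0.0, np.pi, 50)                 # e.g. a CG angle coordinate
        p_target = np.exp(-((x - 1.9) / 0.3) ** 2)
        p_target /= p_target.sum()
        p_current = np.exp(-((x - 1.7) / 0.4) ** 2)
        p_current /= p_current.sum()
        v_new = ibi_update(np.zeros_like(x), p_current, p_target)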

  3. Learning through Computer Simulations.

    ERIC Educational Resources Information Center

    Braun, Ludwig

    Prior to the relatively easy access to computers which began in the mid-1960's, simulation was a tool only of researchers. Even now, students are frequently excluded from direct laboratory experiences for many reasons. However, computer simulation can open up these experiences, providing a powerful teaching tool for individuals, for small and…

  4. Computer Modeling and Simulation

    SciTech Connect

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to a model’s relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model’s general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  5. Computer simulation of earthquakes

    NASA Technical Reports Server (NTRS)

    Cohen, S. C.

    1977-01-01

    In a computer simulation study of earthquakes, a seismically active strike-slip fault is represented by coupled mechanical blocks which are driven by a moving plate and which slide on a friction surface. Elastic forces and time-independent friction are used to generate main shock events, while viscoelastic forces and time-dependent friction add aftershock features. The study reveals that the size, length, and time and place of event occurrence are strongly influenced by the magnitude and degree of homogeneity in the elastic, viscous, and friction parameters of the fault region. For example, periodically reoccurring similar events are observed in simulations with near-homogeneous parameters along the fault, whereas seismic gaps are a common feature of simulations employing large variations in the fault parameters. The study also reveals correlations between strain energy release and fault length and average displacement and between main shock and aftershock displacements.
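
    A toy block-and-spring fault of the kind described above can be written in a few lines: blocks coupled to their neighbours and to a slowly advancing plate slip whenever the elastic force exceeds static friction. All constants below are arbitrary illustrative values, not those of the study.

        # 1D spring-block fault: count slip events under slow plate loading.
        import numpy as np

        n, kc, kp, v_plate, dt = 10, 1.0, 0.5, 1e-3, 1.0
        f_static = np.full(n, 1.0)      # static friction thresholds
        x = np.zeros(n)                 # block positions
        plate, events = 0.0, []

        for step in range(20000):
            plate += v_plate * dt
            coupling = np.zeros(n)
            coupling[1:]  += kc * (x[:-1] - x[1:])
            coupling[:-1] += kc * (x[1:] - x[:-1])
            force = coupling + kp * (plate - x)
            slipping = np.abs(force) > f_static
            if slipping.any():
                x[slipping] += force[slipping] / (2.0 * kc + kp)  # relax toward zero force
                events.append((step, int(slipping.sum())))

        print(len(events), "slip events")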

  6. Computer simulation of earthquakes

    NASA Technical Reports Server (NTRS)

    Cohen, S. C.

    1976-01-01

    Two computer simulation models of earthquakes were studied for the dependence of the pattern of events on the model assumptions and input parameters. Both models represent the seismically active region by mechanical blocks which are connected to one another and to a driving plate. The blocks slide on a friction surface. In the first model, elastic forces and time-independent friction were employed to simulate main shock events. The size, length, and time and place of event occurrence were influenced strongly by the magnitude and degree of homogeneity in the elastic and friction parameters of the fault region. Periodically reoccurring similar events were frequently observed in simulations with near-homogeneous parameters along the fault, whereas seismic gaps were a common feature of simulations employing large variations in the fault parameters. The second model incorporated viscoelastic forces and time-dependent friction to account for aftershock sequences. The periods between aftershock events increased with time and the aftershock region was confined to that which moved in the main event.

  7. Local order parameters for use in driving homogeneous ice nucleation with all-atom models of water.

    PubMed

    Reinhardt, Aleks; Doye, Jonathan P K; Noya, Eva G; Vega, Carlos

    2012-11-21

    We present a local order parameter based on the standard Steinhardt-Ten Wolde approach that is capable both of tracking and of driving homogeneous ice nucleation in simulations of all-atom models of water. We demonstrate that it is capable of forcing the growth of ice nuclei in supercooled liquid water simulated using the TIP4P/2005 model using over-biased umbrella sampling Monte Carlo simulations. However, even with such an order parameter, the dynamics of ice growth in deeply supercooled liquid water in all-atom models of water are shown to be very slow, and so the computation of free energy landscapes and nucleation rates remains extremely challenging.
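
    For reference, a Steinhardt-type local order parameter for a single particle can be computed from the spherical harmonics of its neighbour directions. The sketch below uses synthetic neighbour vectors and omits neighbour searching and periodic boundaries, so it is only a schematic of the quantity being biased.

        # q_l(i) = sqrt(4*pi/(2l+1) * sum_m |<Y_lm>|^2), averaged over neighbours of i.
        import numpy as np
        from scipy.special import sph_harm

        def steinhardt_ql(neighbour_vecs, l=6):
            r = np.linalg.norm(neighbour_vecs, axis=1)
            polar = np.arccos(neighbour_vecs[:, 2] / r)                 # [0, pi]
            azimuth = np.arctan2(neighbour_vecs[:, 1], neighbour_vecs[:, 0])
            qlm = np.array([np.mean(sph_harm(m, l, azimuth, polar))
                            for m in range(-l, l + 1)])
            return float(np.sqrt(4.0 * np.pi / (2 * l + 1) * np.sum(np.abs(qlm)**2)))

        rng = np.random.default_rng(4)
        vecs = rng.standard_normal((12, 3))   # synthetic neighbour directions
        print(steinhardt_ql(vecs))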

  8. All-Atom Molecular Dynamics of Virus Capsids as Drug Targets

    PubMed Central

    2016-01-01

    Virus capsids are protein shells that package the viral genome. Although their morphology and biological functions can vary markedly, capsids often play critical roles in regulating viral infection pathways. A detailed knowledge of virus capsids, including their dynamic structure, interactions with cellular factors, and the specific roles that they play in the replication cycle, is imperative for the development of antiviral therapeutics. The following Perspective introduces an emerging area of computational biology that focuses on the dynamics of virus capsids and capsid–protein assemblies, with particular emphasis on the effects of small-molecule drug binding on capsid structure, stability, and allosteric pathways. When performed at chemical detail, molecular dynamics simulations can reveal subtle changes in virus capsids induced by drug molecules a fraction of their size. Here, the current challenges of performing all-atom capsid–drug simulations are discussed, along with an outlook on the applicability of virus capsid simulations to reveal novel drug targets. PMID:27128262

  9. All-atom force field for molecular dynamics simulations on organotransition metal solids and liquids. Application to M(CO)(n) (M = Cr, Fe, Ni, Mo, Ru, or W) compounds.

    PubMed

    Bernardes, Carlos E S; Canongia Lopes, José N; Minas da Piedade, Manuel E

    2013-10-31

    A previously developed OPLS-based all-atom force field for organometallic compounds was extended to a series of first-, second-, and third-row transition metals based on the study of M(CO)n (M = Cr, Fe, Ni, Mo, Ru, or W) complexes. For materials that are solid at ambient temperature and pressure (M = Cr, Mo, W) the validation of the force field was based on reported structural data and on the standard molar enthalpies of sublimation at 298.15 K, experimentally determined by Calvet-drop microcalorimetry using samples corresponding to a specific and well-characterized crystalline phase: ΔsubHm° = 72.6 ± 0.3 kJ·mol-1 for Cr(CO)6, 73.4 ± 0.3 kJ·mol-1 for Mo(CO)6, and 77.8 ± 0.3 kJ·mol-1 for W(CO)6. For liquids, where problems of polymorphism or phase mixtures are absent, critically analyzed literature data were used. The force field was able to reproduce the volumetric properties of the test set (density and unit cell volume) with average deviations smaller than 2% and the experimentally determined enthalpies of sublimation and vaporization with an accuracy better than 2.3 kJ·mol-1. The Lennard-Jones (12-6) potential function parameters used to calculate the repulsive and dispersion contributions of the metals within the framework of the force field were found to be transferable between chromium, iron, and nickel (first row) and between molybdenum and ruthenium (second row). PMID:24079472

  10. Grid computing and biomolecular simulation.

    PubMed

    Woods, Christopher J; Ng, Muan Hong; Johnston, Steven; Murdock, Stuart E; Wu, Bing; Tai, Kaihsu; Fangohr, Hans; Jeffreys, Paul; Cox, Simon; Frey, Jeremy G; Sansom, Mark S P; Essex, Jonathan W

    2005-08-15

    Biomolecular computer simulations are now widely used not only in an academic setting to understand the fundamental role of molecular dynamics on biological function, but also in the industrial context to assist in drug design. In this paper, two applications of Grid computing to this area will be outlined. The first, involving the coupling of distributed computing resources to dedicated Beowulf clusters, is targeted at simulating protein conformational change using the Replica Exchange methodology. In the second, the rationale and design of a database of biomolecular simulation trajectories is described. Both applications illustrate the increasingly important role modern computational methods are playing in the life sciences.

  11. Software simulator for multiple computer simulation system

    NASA Technical Reports Server (NTRS)

    Ogrady, E. P.

    1983-01-01

    A description is given of the structure and use of a computer program that simulates the operation of a parallel processor simulation system. The program is part of an investigation to determine algorithms that are suitable for simulating continuous systems on a parallel processor configuration. The simulator is designed to accurately simulate the problem-solving phase of a simulation study. Care has been taken to ensure the integrity and correctness of data exchanges and to correctly sequence periods of computation and periods of data exchange. It is pointed out that the functions performed during a problem-setup phase or a reset phase are not simulated. In particular, there is no attempt to simulate the downloading process that loads object code into the local, transfer, and mapping memories of processing elements or the memories of the run control processor and the system control processor. The main program of the simulator carries out some problem-setup functions of the system control processor in that it requests the user to enter values for simulation system parameters and problem parameters. The method by which these values are transferred to the other processors, however, is not simulated.

  12. A simple and transferable all-atom/coarse-grained hybrid model to study membrane processes.

    PubMed

    Genheden, Samuel; Essex, Jonathan W

    2015-10-13

    We present an efficient all-atom/coarse-grained hybrid model and apply it to membrane processes. This model is an extension of the all-atom/ELBA model applied previously to processes in water. Here, we improve the efficiency of the model by implementing a multiple-time step integrator that allows the atoms and the coarse-grained beads to be propagated at different timesteps. Furthermore, we fine-tune the interaction between the atoms and the coarse-grained beads by computing the potential of mean force of amino acid side chain analogs along the membrane normal and comparing to atomistic simulations. The model was independently validated on the calculation of small-molecule partition coefficients. Finally, we apply the model to membrane peptides. We studied the tilt angle of the Walp23 and Kalp23 helices in two different model membranes and the stability of the glycophorin A dimer. The model is efficient, accurate, and straightforward to use, as it does not require any extra interaction particles, layers of atomistic solvent molecules or tabulated potentials, thus offering a novel, simple approach to study membrane processes. PMID:26574264
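
    The multiple-time-step idea mentioned above can be illustrated with a RESPA-style loop in which fast (atomistic) forces are integrated with a small step inside each large step of the slow (coarse-grained) forces. The force functions and parameters below are placeholders, not the published model.

        # RESPA-style integrator: n_inner fast sub-steps per slow step.
        import numpy as np

        def respa_step(x, v, f_fast, f_slow, mass, dt_slow, n_inner):
            dt_fast = dt_slow / n_inner
            v += 0.5 * dt_slow * f_slow(x) / mass        # slow half kick
            for _ in range(n_inner):                     # inner velocity-Verlet loop
                v += 0.5 * dt_fast * f_fast(x) / mass
                x += dt_fast * v
                v += 0.5 * dt_fast * f_fast(x) / mass
            v += 0.5 * dt_slow * f_slow(x) / mass        # slow half kick
            return x, v

        x, v = np.ones(3), np.zeros(3)
        for _ in range(100):
            x, v = respa_step(x, v, lambda r: -10.0 * r, lambda r: -0.1 * r,
                              1.0, 0.01, 4)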

  13. Computer Simulation of Mutagenesis.

    ERIC Educational Resources Information Center

    North, J. C.; Dent, M. T.

    1978-01-01

    A FORTRAN program is described which simulates point-substitution mutations in the DNA strands of typical organisms. Its objective is to help students to understand the significance and structure of the genetic code, and the mechanisms and effect of mutagenesis. (Author/BB)
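
    A minimal sketch of this kind of exercise (Python rather than the original FORTRAN; the example sequence is hypothetical) substitutes one random base, translates the wild-type and mutant sequences with the standard genetic code, and classifies the mutation as silent, missense, or nonsense:

```python
import random

bases = "TCAG"
# standard genetic code, codons ordered T, C, A, G at each position
aa = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
codon_table = {b1 + b2 + b3: aa[i]
               for i, (b1, b2, b3) in enumerate(
                   (b1, b2, b3) for b1 in bases for b2 in bases for b3 in bases)}

def translate(dna):
    return "".join(codon_table[dna[i:i + 3]] for i in range(0, len(dna) - 2, 3))

def point_mutation(dna):
    """Substitute one random base and classify its effect on the protein."""
    pos = random.randrange(len(dna))
    new_base = random.choice([b for b in bases if b != dna[pos]])
    mutant = dna[:pos] + new_base + dna[pos + 1:]
    wt_prot, mut_prot = translate(dna), translate(mutant)
    if wt_prot == mut_prot:
        effect = "silent"
    elif "*" in mut_prot and "*" not in wt_prot:
        effect = "nonsense"
    else:
        effect = "missense"
    return mutant, effect

gene = "ATGGCTTGGAAATTTCCC"   # hypothetical coding sequence
print(point_mutation(gene))
```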

  14. Composite Erosion by Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2006-01-01

    Composite degradation is evaluated by computational simulation when the erosion degradation occurs on a ply-by-ply basis and the degrading medium (device) is normal to the ply. The computational simulation is performed with a multi-factor interaction model and an available multi-scale, multi-physics computer code. The erosion process degrades both the fiber and the matrix simultaneously in the same slice (ply). Both the fiber volume ratio and the matrix volume ratio approach zero while the void volume ratio increases as the ply degrades. The multi-factor interaction model simulates the erosion degradation, provided that the exponents and factor ratios are selected judiciously. Results obtained by computational composite mechanics show that most composite characterization properties degrade monotonically and approach "zero" as the ply degrades completely.

  15. Computer Simulation of Aircraft Aerodynamics

    NASA Technical Reports Server (NTRS)

    Inouye, Mamoru

    1989-01-01

    The role of Ames Research Center in conducting basic aerodynamics research through computer simulations is described. The computer facilities, including supercomputers and peripheral equipment that represent the state of the art, are described. The methodology of computational fluid dynamics is explained briefly. Fundamental studies of turbulence and transition are being pursued to understand these phenomena and to develop models that can be used in the solution of the Reynolds-averaged Navier-Stokes equations. Four applications of computer simulations for aerodynamics problems are described: subsonic flow around a fuselage at high angle of attack, subsonic flow through a turbine stator-rotor stage, transonic flow around a flexible swept wing, and transonic flow around a wing-body configuration that includes an inlet and a tail.

  16. Taxis through Computer Simulation Programs.

    ERIC Educational Resources Information Center

    Park, David

    1983-01-01

    Describes a sequence of five computer programs (listings for Apple II available from author) on tactic responses (oriented movement of a cell, cell group, or whole organism in response to stimuli). The simulation programs are useful in helping students examine mechanisms at work in real organisms. (JN)

  17. Computer Simulation of Diffraction Patterns.

    ERIC Educational Resources Information Center

    Dodd, N. A.

    1983-01-01

    Describes an Apple computer program (listing available from author) which simulates Fraunhofer and Fresnel diffraction using vector addition techniques (vector chaining) and allows user to experiment with different shaped multiple apertures. Graphics output include vector resultants, phase difference, diffraction patterns, and the Cornu spiral…
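
    The vector-chaining idea can be sketched as follows (Python/NumPy rather than the original Apple BASIC; the slit geometry and wavelength are illustrative): each aperture element contributes a unit phasor, and the resultant of the chained phasors gives the Fraunhofer intensity at each observation angle:

```python
import numpy as np

wavelength = 550e-9                      # metres, illustrative
k = 2 * np.pi / wavelength

# sample points across a double slit (two open apertures), illustrative geometry
slit_width, separation, n = 40e-6, 250e-6, 200
x = np.concatenate([
    np.linspace(-separation/2 - slit_width/2, -separation/2 + slit_width/2, n),
    np.linspace(+separation/2 - slit_width/2, +separation/2 + slit_width/2, n)])

theta = np.linspace(-0.02, 0.02, 2001)   # observation angles (radians)
# chain a unit phasor from every aperture element and take the resultant
phase = k * np.outer(np.sin(theta), x)
resultant = np.exp(1j * phase).sum(axis=1)
intensity = np.abs(resultant) ** 2
intensity /= intensity.max()

print(theta[np.argmax(intensity)], intensity[1000])  # central maximum at theta ~ 0
```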

  18. Quantum Mechanics/Molecular Mechanics Method Combined with Hybrid All-Atom and Coarse-Grained Model: Theory and Application on Redox Potential Calculations.

    PubMed

    Shen, Lin; Yang, Weitao

    2016-04-12

    We developed a new multiresolution method that spans three levels of resolution with quantum mechanical, atomistic molecular mechanical, and coarse-grained models. The resolution-adapted all-atom and coarse-grained water model, in which an all-atom structural description of the entire system is maintained during the simulations, is combined with the ab initio quantum mechanics and molecular mechanics method. We apply this model to calculate the redox potentials of the aqueous ruthenium and iron complexes by using the fractional number of electrons approach and thermodynamic integration simulations. The redox potentials are recovered in excellent accordance with the experimental data. The speed-up of the hybrid all-atom and coarse-grained water model renders it computationally more attractive. The accuracy depends on the hybrid all-atom and coarse-grained water model used in the combined quantum mechanical and molecular mechanical method. We have used another multiresolution model, in which an atomic-level layer of water molecules around redox center is solvated in supramolecular coarse-grained waters for the redox potential calculations. Compared with the experimental data, this alternative multilayer model leads to less accurate results when used with the coarse-grained polarizable MARTINI water or big multipole water model for the coarse-grained layer.
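
    The thermodynamic-integration step of such a calculation can be sketched generically as below (Python; the lambda windows and <dU/dlambda> samples are synthetic stand-ins, not data from the paper): the window averages of dU/dlambda are integrated over the coupling parameter to give the free-energy difference between the two oxidation states:

```python
import numpy as np

def thermodynamic_integration(lambdas, dU_dlambda_samples):
    """Free-energy difference from <dU/dlambda> averaged in each lambda window,
    integrated over lambda with the trapezoidal rule."""
    means = np.array([np.mean(s) for s in dU_dlambda_samples])
    return float(np.sum(0.5 * (means[1:] + means[:-1]) * np.diff(lambdas)))

# illustrative synthetic data: 11 windows interpolating between two redox states
lambdas = np.linspace(0.0, 1.0, 11)
rng = np.random.default_rng(0)
samples = [rng.normal(loc=-150.0 + 40.0 * lam, scale=5.0, size=2000) for lam in lambdas]

dA = thermodynamic_integration(lambdas, samples)   # toy units (e.g. kJ/mol)
print(f"Delta A = {dA:.1f}")
```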

  19. Multiscale modeling and computer simulation of polyhedral oligomeric silsesquioxane assemblies

    NASA Astrophysics Data System (ADS)

    Chan, Elaine R.

    Self-assembly offers a promising strategy for manipulating the bottom-up assembly of nanometer-scale objects into useful structures for many diverse applications. Polyhedral oligomeric silsesquioxane (POSS) molecules are nanoscale building blocks with immense potential for constructing hybrid organic/inorganic materials with superior physical properties. The silicon corners of the inorganic nanocubes can be functionalized with a variety of organic tethers to precisely tailor assembly of the molecules into specific structures. To successfully control fabrication of POSS-based materials requires an understanding of the atomic- and nanoscale processes that occur during the assembly process. In conjunction with ongoing experiments, computer simulations and theory can provide fundamental insight into the self-assembly process, and are valuable tools for identifying and efficiently mapping the vast parameter space of complex POSS/polymer assemblies. The objective of this dissertation is to elucidate the self-assembly properties of polymer-tethered POSS at large length (˜100 nanometers) and time (˜10--100 nanoseconds) scales. These length and time scales are often difficult to assess experimentally. Simulation studies of self-assembly in these regimes require sufficiently large numbers of molecules, and coarse-grained mesoscale models have been developed based on electronic structure calculations and all-atom simulations of small numbers of molecules to reduce overall computation time. Model molecules are initially developed that capture the essential features of connectivity and interaction specificity of mono- and tetratethered POSS nanoparticles functionalized with block copolymer and homopolymer chains. Simulations of these model molecules are conducted over wide ranges of temperature and concentration to probe the influence of tether chemical composition, molecular weight, and number on self-assembly. The tethered POSS systems are predicted to exhibit several of

  20. Folding of proteins with an all-atom Go-model.

    PubMed

    Wu, L; Zhang, J; Qin, M; Liu, F; Wang, W

    2008-06-21

    The Go-like potential at the residue level has been successfully applied to the folding of proteins in many previous works. However, taking into consideration more detailed structural information at the atomic level, the definition of contacts used in these traditional Go-models may not be suitable for all-atom simulations. In this work, we develop a rational definition of contacts that accounts for the screening effect in the crowded intramolecular environment. In such a scheme, a large number of screened atom pairs are excluded and the number of contacts is decreased compared with the traditional definition. Contacts defined in this way are compatible with the all-atom representation of protein structures. To verify the rationality of the new definition, the folding of proteins CI2 and SH3 is simulated by all-atom molecular dynamics simulations. A high folding cooperativity and a good correlation of the simulated Phi-values with those obtained experimentally, especially for CI2, are found. This suggests that the all-atom Go-model is improved over the traditional Go-model. Based on the comparison of the Phi-values, the roles of side chains in folding are discussed, and it is concluded that the side-chain structures are more important for local contacts in determining the transition-state structures. Moreover, the relations between side chain and backbone orderings are also discussed.
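
    One plausible form of such a screened contact definition is sketched below (Python; the cutoff and the geometric screening criterion — dropping a pair when some third atom subtends an obtuse angle with it — are illustrative choices and may differ from the criterion actually used in the paper):

```python
import numpy as np

def native_contacts(coords, cutoff=4.5):
    """Return candidate contact pairs within `cutoff` (Angstroms), discarding
    pairs screened by an intervening atom. Screening here: a pair (i, j) is
    dropped if a third atom k lies inside the sphere whose diameter is the
    i-j segment (angle i-k-j > 90 degrees); this is an illustrative rule."""
    n = len(coords)
    contacts = []
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(coords[j] - coords[i]) > cutoff:
                continue
            screened = False
            for k in range(n):
                if k in (i, j):
                    continue
                vki = coords[i] - coords[k]
                vkj = coords[j] - coords[k]
                if np.dot(vki, vkj) < 0.0:      # k sits "between" i and j
                    screened = True
                    break
            if not screened:
                contacts.append((i, j))
    return contacts

coords = np.random.default_rng(1).uniform(0, 10, size=(30, 3))  # toy coordinates
print(len(native_contacts(coords)))
```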

  1. Deriving Coarse-Grained Charges from All-Atom Systems: An Analytic Solution.

    PubMed

    McCullagh, Peter; Lake, Peter T; McCullagh, Martin

    2016-09-13

    An analytic method to assign optimal coarse-grained charges based on electrostatic potential matching is presented. This solution is the infinite size and density limit of grid-integration charge-fitting and is computationally more efficient by several orders of magnitude. The solution is also minimized with respect to coarse-grained positions which proves to be an extremely important step in reproducing the all-atom electrostatic potential. The joint optimal-charge optimal-position coarse-graining procedure is applied to a number of aggregating proteins using single-site per amino acid resolution. These models provide a good estimate of both the vacuum and Debye-Hückel screened all-atom electrostatic potentials in the vicinity and in the far-field of the protein. Additionally, these coarse-grained models are shown to approximate the all-atom dimerization electrostatic potential energy of 10 aggregating proteins with good accuracy.
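
    The finite-grid analogue of this charge-fitting problem can be sketched as an ordinary least-squares fit (Python; the positions, charges, and grid points below are random toy data, and the paper's analytic solution corresponds to the limit of an infinitely fine and extended grid):

```python
import numpy as np

def fit_cg_charges(aa_pos, aa_q, cg_pos, grid, total_charge=None):
    """Least-squares coarse-grained charges that best reproduce the all-atom
    electrostatic potential on a set of exterior grid points (1/r kernel)."""
    # potential at each grid point from a unit charge on each CG site
    A = 1.0 / np.linalg.norm(grid[:, None, :] - cg_pos[None, :, :], axis=2)
    # target: all-atom potential at the same grid points
    b = (aa_q[None, :] /
         np.linalg.norm(grid[:, None, :] - aa_pos[None, :, :], axis=2)).sum(axis=1)
    q, *_ = np.linalg.lstsq(A, b, rcond=None)
    if total_charge is not None:            # crude renormalization of the net charge
        q += (total_charge - q.sum()) / len(q)
    return q

rng = np.random.default_rng(2)
aa_pos = rng.normal(0, 2, (50, 3))
aa_q = rng.normal(0, 0.2, 50)
cg_pos = rng.normal(0, 2, (5, 3))
dirs = rng.normal(size=(2000, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
grid = dirs * rng.uniform(8.0, 15.0, size=(2000, 1))   # shell outside the "protein"

print(fit_cg_charges(aa_pos, aa_q, cg_pos, grid, total_charge=aa_q.sum()))
```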

  2. Computer simulation of liquid metals

    NASA Astrophysics Data System (ADS)

    Belashchenko, D. K.

    2013-12-01

    Methods for and the results of the computer simulation of liquid metals are reviewed. Two basic methods, classical molecular dynamics with known interparticle potentials and the ab initio method, are considered. Most attention is given to the simulated results obtained using the embedded atom model (EAM). The thermodynamic, structural, and diffusion properties of liquid metal models under normal and extreme (shock) pressure conditions are considered. Liquid-metal simulated results for the Groups I - IV elements, a number of transition metals, and some binary systems (Fe - C, Fe - S) are examined. Possibilities for the simulation to account for the thermal contribution of delocalized electrons to energy and pressure are considered. Solidification features of supercooled metals are also discussed.

  3. Computer simulation of martensitic transformations

    SciTech Connect

    Xu, Ping

    1993-11-01

    The characteristics of martensitic transformations in solids are largely determined by the elastic strain that develops as martensite particles grow and interact. To study the development of microstructure, a finite-element computer simulation model was constructed to mimic the transformation process. The transformation is athermal and simulated at each incremental step by transforming the cell which maximizes the decrease in the free energy. To determine the free energy change, the elastic energy developed during martensite growth is calculated from the theory of linear elasticity for elastically homogeneous media, and updated as the transformation proceeds.
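
    The incremental selection rule described here can be sketched with a toy energy model (Python; the chemical driving force, quenched disorder, and neighbour penalty below merely stand in for the full linear-elasticity calculation of the model): at each step the untransformed cell with the most negative free-energy change is transformed, until no transformation lowers the energy:

```python
import numpy as np

def greedy_martensite(n=20, g_chem=1.0, k_elastic=0.3, seed=0):
    """Greedy athermal growth on an n x n grid of cells. `delta_g` is a toy
    free-energy change: chemical driving force, local disorder, and a penalty
    proportional to the number of already-transformed neighbours."""
    rng = np.random.default_rng(seed)
    transformed = np.zeros((n, n), dtype=bool)
    local_barrier = rng.uniform(0.0, 0.5, (n, n))       # illustrative disorder

    def delta_g(i, j):
        nbrs = transformed[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2].sum()
        return -g_chem + local_barrier[i, j] + k_elastic * nbrs

    while True:
        candidates = [(delta_g(i, j), i, j)
                      for i in range(n) for j in range(n) if not transformed[i, j]]
        if not candidates:
            break
        dg, i, j = min(candidates)
        if dg >= 0.0:                                   # no energy-lowering move left
            break
        transformed[i, j] = True
    return transformed

print(greedy_martensite().sum(), "cells transformed")
```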

  4. Computer simulation and scientific visualization

    SciTech Connect

    Weber, D.P.; Moszur, F.M.

    1990-01-01

    The simulation of processes in engineering and the physical sciences has progressed rapidly over the last several years. With rapid developments in supercomputers, parallel processing, numerical algorithms and software, scientists and engineers are now positioned to quantitatively simulate systems requiring many billions of arithmetic operations. The need to understand and assimilate such massive amounts of data has been a driving force in the development of both hardware and software to create visual representations of the underlying physical systems. In this paper, and the accompanying videotape, the evolution and development of the visualization process in scientific computing will be reviewed. Specific applications and associated imaging hardware and software technology illustrate both the computational needs and the evolving trends. 6 refs.

  5. Biomes computed from simulated climatologies

    NASA Astrophysics Data System (ADS)

    Claussen, Martin; Esch, Monika

    1994-01-01

    The biome model of Prentice et al. (1992a) is used to predict global patterns of potential natural plant formations, or biomes, from climatologies simulated by ECHAM, a model used for climate simulations at the Max-Planck-Institut fur Meteorologie. This study is undertaken in order to show the advantage of this biome model in diagnosing the performance of a climate model and assessing effects of past and future climate changes predicted by a climate model. Good overall agreement is found between global patterns of biomes computed from observed and simulated data of present climate. But there are also major discrepancies indicated by a difference in biomes in Australia, in the Kalahari Desert, and in the Middle West of North America. These discrepancies can be traced back to failures in simulated rainfall as well as summer or winter temperatures. Global patterns of biomes computed from an ice age simulation reveal that North America, Europe, and Siberia should have been covered largely by tundra and taiga, whereas only small differences are seen for the tropical rain forests. A potential northeast shift of biomes is expected from a simulation with enhanced CO2 concentration according to the IPCC Scenario A. Little change is seen in the tropical rain forest and the Sahara. Since the biome model used is not capable of predicting changes in vegetation patterns due to a rapid climate change, the latter simulation has to be taken as a prediction of changes in conditions favourable for the existence of certain biomes, not as a prediction of a future distribution of biomes.

  6. Biomes computed from simulated climatologies

    SciTech Connect

    Claussen, M.; Esch, M.

    1994-01-01

    The biome model of Prentice et al. is used to predict global patterns of potential natural plant formations, or biomes, from climatologies simulated by ECHAM, a model used for climate simulations at the Max-Planck-Institut fuer Meteorologie. This study is undertaken in order to show the advantage of this biome model in diagnosing the performance of a climate model and assessing effects of past and future climate changes predicted by a climate model. Good overall agreement is found between global patterns of biomes computed from observed and simulated data of present climate. But there are also major discrepancies indicated by a difference in biomes in Australia, in the Kalahari Desert, and in the Middle West of North America. These discrepancies can be traced back to failures in simulated rainfall as well as summer or winter temperatures. Global patterns of biomes computed from an ice age simulation reveal that North America, Europe, and Siberia should have been covered largely by tundra and taiga, whereas only small differences are seen for the tropical rain forests. A potential northeast shift of biomes is expected from a simulation with enhanced CO{sub 2} concentration according to the IPCC Scenario A. Little change is seen in the tropical rain forest and the Sahara. Since the biome model used is not capable of predicting changes in vegetation patterns due to a rapid climate change, the latter simulation has to be taken as a prediction of changes in conditions favourable for the existence of certain biomes, not as a prediction of a future distribution of biomes. 15 refs., 8 figs., 2 tabs.

  7. Computer simulation of nonequilibrium processes

    SciTech Connect

    Wallace, D.C.

    1985-07-01

    The underlying concepts of nonequilibrium statistical mechanics, and of irreversible thermodynamics, will be described. The question at hand is then, how are these concepts to be realized in computer simulations of many-particle systems. The answer will be given for dissipative deformation processes in solids, on three hierarchical levels: heterogeneous plastic flow, dislocation dynamics, and molecular dynamics. Application to the shock process will be discussed.

  8. Inversion based on computational simulations

    SciTech Connect

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-09-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling including the optimization of models to achieve a desired design goal.
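
    The adjoint idea can be illustrated on a one-parameter toy inversion (Python; the 1-D diffusion model, grid, and names are illustrative, not the optical-tomography code discussed here): a single backward sweep yields dJ/dD at roughly the cost of one extra forward solve, and the result is checked against a finite difference. The (J, gradient) pair could then be handed to any gradient-based optimizer.

```python
import numpy as np

def make_laplacian(n):
    """1-D second-difference operator with zero (Dirichlet) boundary values."""
    return -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)

def forward(D, u0, L, dt, nsteps):
    """Explicit-Euler diffusion: u^{n+1} = u^n + dt * D * L u^n; returns all states."""
    traj = [u0.copy()]
    u = u0.copy()
    for _ in range(nsteps):
        u = u + dt * D * (L @ u)
        traj.append(u.copy())
    return traj

def objective_and_gradient(D, u0, data, L, dt, nsteps):
    """Misfit J = 0.5 ||u^N - data||^2 and dJ/dD by the adjoint (reverse)
    recursion: one backward sweep gives the gradient, at a cost that does not
    grow with the number of unknown parameters."""
    traj = forward(D, u0, L, dt, nsteps)
    misfit = traj[-1] - data
    J = 0.5 * float(misfit @ misfit)
    lam = misfit.copy()                              # adjoint state at the final time
    grad = 0.0
    for n in range(nsteps - 1, -1, -1):
        grad += dt * float(lam @ (L @ traj[n]))      # contribution of step n to dJ/dD
        lam = lam + dt * D * (L.T @ lam)             # lam^n = (I + dt D L)^T lam^{n+1}
    return J, grad

# synthetic inverse problem: recover D from final-time "measurements"
n, dt, nsteps = 50, 0.01, 200
L = make_laplacian(n)
u0 = np.exp(-((np.arange(n) - n / 2) ** 2) / 25.0)
data = forward(0.8, u0, L, dt, nsteps)[-1]           # generated with D_true = 0.8

J, g_adjoint = objective_and_gradient(0.3, u0, data, L, dt, nsteps)
eps = 1e-6
J_plus, _ = objective_and_gradient(0.3 + eps, u0, data, L, dt, nsteps)
print(g_adjoint, (J_plus - J) / eps)                 # the two estimates should agree
```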

  9. An FFT-based method for modeling protein folding and binding under crowding: benchmarking on ellipsoidal and all-atom crowders.

    PubMed

    Qin, Sanbo; Zhou, Huan-Xiang

    2013-10-01

    It is now well recognized that macromolecular crowding can exert significant effects on protein folding and binding stability. In order to calculate such effects in direct simulations of proteins mixed with bystander macromolecules, the latter (referred to as crowders) are usually modeled as spheres and the proteins represented at a coarse-grained level. Our recently developed postprocessing approach allows the proteins to be represented at the all-atom level but, for computational efficiency, has only been implemented for spherical crowders. Modeling crowder molecules in cellular environments and in vitro experiments as spheres may distort their effects on protein stability. Here we present a new method that is capable of treating aspherical crowders. The idea, borrowed from protein-protein docking, is to calculate the excess chemical potential of the proteins in crowded solution by fast Fourier transform (FFT). As the first application, we studied the effects of ellipsoidal crowders on the folding and binding free energies of all-atom proteins, and found, in agreement with previous direct simulations with coarse-grained protein models, that the aspherical crowders exert greater stabilization effects than spherical crowders of the same volume. Moreover, as demonstrated here, the FFT-based method has the important property that its computational cost does not increase strongly even when the level of detail in representing the crowders is increased all the way to all-atom, thus significantly accelerating realistic modeling of protein folding and binding in cell-like environments. PMID:24187527
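
    The core FFT trick — evaluating the insertion statistics for every translation of the protein at once — can be sketched for hard-core interactions on a periodic grid (Python/NumPy; the crowder and protein occupancy grids below are crude random stand-ins):

```python
import numpy as np

def clash_free_fraction(crowder_grid, protein_grid):
    """Fraction of (periodic) translations of the protein that overlap no
    crowder, obtained for all translations at once from one FFT correlation."""
    overlaps = np.fft.irfftn(
        np.fft.rfftn(crowder_grid) * np.conj(np.fft.rfftn(protein_grid)),
        s=crowder_grid.shape)
    return float(np.mean(overlaps < 0.5))    # < 0.5 means zero overlapping voxels

rng = np.random.default_rng(3)
box = np.zeros((64, 64, 64))
for _ in range(30):                           # crude hard-core "crowders"
    c = rng.integers(8, 56, 3)
    box[c[0]-4:c[0]+4, c[1]-4:c[1]+4, c[2]-4:c[2]+4] = 1.0

probe = np.zeros_like(box)
probe[:6, :6, :6] = 1.0                       # occupancy map of the test protein

f = clash_free_fraction(box, probe)
kT = 1.0
print(f, -kT * np.log(f))                     # insertion probability, excess chemical potential
```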

  10. Displaying Computer Simulations Of Physical Phenomena

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1991-01-01

    Paper discusses computer simulation as means of experiencing and learning to understand physical phenomena. Covers both present simulation capabilities and major advances expected in near future. Visual, aural, tactile, and kinesthetic effects used to teach such physical sciences as dynamics of fluids. Recommends classrooms in universities, government, and industry be linked to advanced computing centers so computer simulations integrated into education process.

  11. A hierarchical coarse-grained (all-atom to all residue) approach to peptides (P1, P2) binding with a graphene sheet

    NASA Astrophysics Data System (ADS)

    Pandey, Ras; Kuang, Zhifeng; Farmer, Barry; Kim, Sang; Naik, Rajesh

    2012-02-01

    Recently, Kim et al. [1] found that peptides P1: HSSYWYAFNNKT and P2: EPLQLKM bind selectively to graphene surfaces and edges, respectively, which is critical in modulating both the mechanical and the electronic transport properties of graphene. Such distinctions in binding sites (edge versus surface) observed in electron micrographs were verified by computer simulation with an all-atom model that captures the pi-pi bonding. We propose a hierarchical approach in which input from the all-atom Molecular Dynamics (MD) study (with atomistic detail) feeds a coarse-grained Monte Carlo simulation, extending the study to larger scales. The binding energy of a free amino acid with the graphene sheet from the all-atom simulation is used in the interaction parameter for the coarse-grained approach. The peptide chain executes its stochastic motion with the Metropolis algorithm. We investigate a number of local and global physical quantities and find that peptide P1 is likely to bind more strongly to the graphene sheet than P2 and that it is anchored by three residues ^4Y^5W^6Y. [1] S.N. Kim et al., J. Am. Chem. Soc. 133, 14480 (2011).
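
    The coarse-grained Metropolis stage of such a hierarchical scheme can be sketched as below (Python; the lattice, the move set, and in particular the per-residue contact energies are illustrative placeholders for values that would come from the all-atom MD input):

```python
import random, math

# residue-graphene contact energies (arbitrary units); in the hierarchical
# scheme these would come from short all-atom MD of free amino acids on
# graphene -- the numbers here are purely illustrative placeholders
contact_energy = {"H": -2.0, "S": -1.0, "Y": -4.5, "W": -5.5, "A": -1.5,
                  "F": -4.0, "N": -1.8, "K": -1.2, "T": -1.4, "E": -1.6,
                  "P": -1.7, "L": -2.2, "Q": -1.8, "M": -2.4}

def energy(z, seq):
    # a residue in the first lattice layer above the sheet (z == 1) is "bound"
    return sum(contact_energy[s] for s, zi in zip(seq, z) if zi == 1)

def metropolis(seq, kT=0.6, n_steps=100000, z_max=20):
    z = [z_max // 2] * len(seq)                       # start detached from the sheet
    e = energy(z, seq)
    for _ in range(n_steps):
        i = random.randrange(len(seq))
        znew = z[i] + random.choice((-1, 1))
        if znew < 1 or znew > z_max:
            continue
        # keep the chain connected: neighbours may differ by at most one layer
        if i > 0 and abs(znew - z[i - 1]) > 1:
            continue
        if i < len(seq) - 1 and abs(znew - z[i + 1]) > 1:
            continue
        trial = z[:i] + [znew] + z[i + 1:]
        de = energy(trial, seq) - e
        if de <= 0 or random.random() < math.exp(-de / kT):   # Metropolis criterion
            z, e = trial, e + de
    return z, e

print(metropolis("HSSYWYAFNNKT"))   # P1
print(metropolis("EPLQLKM"))        # P2
```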

  12. Computer simulation of polymer surfaces

    NASA Astrophysics Data System (ADS)

    Jang, Jee Hwan

    One of the main objectives of computer simulation is to isolate the effect of a specific variable in a physical or chemical system of interest whose experimental interpretation is ambiguous. The area of polymer surfaces and interfaces contains such an ambiguity due to the absence of a major thermodynamic driving force and the difficulty of completely controlling the experimental design. Considering the length and time scales that define a phenomenon observed in polymeric systems, the appropriate choice of a method among the methodologies currently available in computational chemistry, which have been developed mostly for small molecules, is very demanding because of the limitation of computational resources. In this study, a computationally efficient Monte Carlo simulation on a high coordination lattice, employing the RIS scheme for short-range interactions and a Lennard-Jones potential for long-range interactions, has been applied to various boundary situations which define the material status and distinguish the properties of the material at an interface or surface from those in the bulk state. The polymer surfaces of interest in this study include a free polymer surface, a surface near an attractive solid substrate, a polymer surface generated by compression between two repulsive hard walls, and a polymer-polymer interface. The focus is on changes in the static and dynamic properties at the interfaces, including density profiles, the distribution of a specific constituent of a polymer chain at the interfaces, chain orientation, local conformational states, and chain diffusivity. Each property at an interface is greatly affected by the characteristic of the imposed heterogeneity. One common feature is that the chains are confined at an interface along the direction normal to a surface regardless of the detailed nature of the surface. In addition, the effect of a surface gradually diminishes toward the bulk region and each property has its own effective

  13. Computer simulation of microstructural dynamics

    SciTech Connect

    Grest, G.S.; Anderson, M.P.; Srolovitz, D.J.

    1985-01-01

    Since many of the physical properties of materials are determined by their microstructure, it is important to be able to predict and control microstructural development. A number of approaches have been taken to study this problem, but they assume that the grains can be described as spherical or hexagonal and that growth occurs in an average environment. We have developed a new technique to bridge the gap between the atomistic interactions and the macroscopic scale by discretizing the continuum system such that the microstructure retains its topological connectedness, yet is amenable to computer simulations. Using this technique, we have studied grain growth in polycrystalline aggregates. The temporal evolution and grain morphology of our model are in excellent agreement with experimental results for metals and ceramics.

  14. Priority Queues for Computer Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1998-01-01

    The present invention is embodied in new priority queue data structures for event list management of computer simulations, and includes a new priority queue data structure and an improved event horizon applied to priority queue data structures. The new priority queue data structure is a Qheap and is made out of linked lists for robust, fast, reliable, and stable event list management; it uses a temporary unsorted list to store all items until one of the items is needed. Then the list is sorted, the highest priority item is removed, and the rest of the list is inserted in the Qheap. Also, an event horizon is applied to binary tree and splay tree priority queue data structures to form the improved event horizon for event management.
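
    The deferred-sorting behaviour described above can be sketched as follows (Python; a binary heap stands in for the linked-list Qheap of the invention, and the interface names are illustrative):

```python
import heapq

class QheapLike:
    """Sketch of the two-stage event list described above: new events go into a
    temporary unsorted buffer; only when an event must be dequeued is the buffer
    sorted, its best item returned, and the rest merged into the main structure."""
    def __init__(self):
        self._heap = []       # main (ordered) structure
        self._buffer = []     # temporary unsorted list of newly inserted events

    def insert(self, timestamp, event):
        self._buffer.append((timestamp, event))        # O(1), no ordering work yet

    def pop(self):
        if self._buffer:
            self._buffer.sort()                        # order the pending items
            best = self._buffer[0]
            for item in self._buffer[1:]:              # merge the rest into the heap
                heapq.heappush(self._heap, item)
            self._buffer.clear()
            if self._heap and self._heap[0] < best:
                heapq.heappush(self._heap, best)
                return heapq.heappop(self._heap)
            return best
        return heapq.heappop(self._heap)

q = QheapLike()
for t, name in [(5.0, "a"), (1.0, "b"), (3.0, "c")]:
    q.insert(t, name)
print(q.pop(), q.pop(), q.pop())   # events come out in timestamp order
```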

  15. Computer simulations of liquid crystals

    NASA Astrophysics Data System (ADS)

    Smondyrev, Alexander M.

    Liquid crystal physics is an exciting interdisciplinary field of research with important practical applications. Their complexity and the presence of strong translational and orientational fluctuations require a computational approach, especially in studies of nonequilibrium phenomena. In this dissertation we present the results of computer simulation studies of liquid crystals using the molecular dynamics technique. We employed the Gay-Berne phenomenological model of liquid crystals to describe the interaction between the molecules. Both equilibrium and non-equilibrium phenomena were studied. In the first case we studied the flow properties of the liquid crystal system in equilibrium as well as the dynamics of the director. We measured the viscosities of the Gay-Berne model in the nematic and isotropic phases. The temperature dependence of the rotational and shear viscosities, including the nonmonotonic behavior of one shear viscosity, is in good agreement with experimental data. The bulk viscosities are significantly larger than the shear viscosities, again in agreement with experiment. The director motion was found to be ballistic at short times and diffusive at longer times. The second class of problems we focused on is the properties of a system rapidly quenched to very low temperatures from the nematic phase. We find a glass transition to a metastable phase with nematic order and frozen translational and orientational degrees of freedom. For fast quench rates the local structure is nematic-like, while for slower quench rates smectic order is present as well. Finally, we considered a system in the isotropic phase which is then cooled to temperatures below the isotropic-nematic transition temperature. We expect topological defects to play a central role in the subsequent equilibration of the system. To identify and study these defects we require a simulation of a system with several thousand particles. We present the results of large

  16. The Shuttle Mission Simulator computer generated imagery

    NASA Technical Reports Server (NTRS)

    Henderson, T. H.

    1984-01-01

    Equipment available in the primary training facility for the Space Transportation System (STS) flight crews includes the Fixed Base Simulator, the Motion Base Simulator, the Spacelab Simulator, and the Guidance and Navigation Simulator. The Shuttle Mission Simulator (SMS) consists of the Fixed Base Simulator and the Motion Base Simulator. The SMS utilizes four visual Computer Generated Image (CGI) systems. The Motion Base Simulator has a forward crew station with six-degrees of freedom motion simulation. Operation of the Spacelab Simulator is planned for the spring of 1983. The Guidance and Navigation Simulator went into operation in 1982. Aspects of orbital visual simulation are discussed, taking into account the earth scene, payload simulation, the generation and display of 1079 stars, the simulation of sun glare, and Reaction Control System jet firing plumes. Attention is also given to landing site visual simulation, and night launch and landing simulation.

  17. Simulating Drosophila Genetics with the Computer.

    ERIC Educational Resources Information Center

    Small, James W., Jr.; Edwards, Kathryn L.

    1979-01-01

    Presents some techniques developed to help improve student understanding of Mendelian principles through the use of a computer simulation model by the genetic system of the fruit fly. Includes discussion and evaluation of this computer assisted program. (MA)

  18. Protocols for Handling Messages Between Simulation Computers

    NASA Technical Reports Server (NTRS)

    Balcerowski, John P.; Dunnam, Milton

    2006-01-01

    Practical Simulator Network (PSimNet) is a set of data-communication protocols designed especially for use in handling messages between computers that are engaging cooperatively in real-time or nearly-real-time training simulations. In a typical application, computers that provide individualized training at widely dispersed locations would communicate, by use of PSimNet, with a central host computer that would provide a common computational-simulation environment and common data. Originally intended for use in supporting interfaces between training computers and computers that simulate the responses of spacecraft scientific payloads, PSimNet could be especially well suited for a variety of other applications -- for example, group automobile-driver training in a classroom. Another potential application might lie in networking of automobile-diagnostic computers at repair facilities to a central computer that would compile the expertise of numerous technicians and engineers and act as an expert consulting technician.

  19. Monte Carlo Computer Simulation of a Rainbow.

    ERIC Educational Resources Information Center

    Olson, Donald; And Others

    1990-01-01

    Discusses making a computer-simulated rainbow using principles of physics, such as reflection and refraction. Provides BASIC program for the simulation. Appends a program illustrating the effects of dispersion of the colors. (YP)
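
    A compact version of such a simulation (Python/NumPy rather than the original BASIC; the droplet refractive index and ray count are illustrative) samples random impact parameters, applies Snell's law on entry with one internal reflection, and recovers the pile-up of rays near the primary rainbow angle:

```python
import numpy as np

def rainbow_angles(n_rays=200000, n_refr=1.333, seed=0):
    """Monte Carlo primary-rainbow simulation: parallel rays strike a spherical
    droplet at random impact parameters; each refracts in, reflects once inside,
    and refracts out. Returns each ray's scattering angle measured from the
    antisolar direction; the histogram piles up near ~42 degrees for water."""
    rng = np.random.default_rng(seed)
    b = np.sqrt(rng.random(n_rays))                    # impact parameter, uniform over the disc
    theta_i = np.arcsin(b)                             # angle of incidence on the sphere
    theta_r = np.arcsin(np.sin(theta_i) / n_refr)      # refraction angle inside the drop
    deviation = np.pi + 2 * theta_i - 4 * theta_r      # total deviation, one internal reflection
    return np.degrees(np.pi - deviation)

angles = rainbow_angles()
hist, edges = np.histogram(angles, bins=90, range=(0, 45))
print("most populated angle bin starts at", edges[np.argmax(hist)], "deg")
```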

  20. Computer Based Simulation of Laboratory Experiments.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…

  1. An all-atom force field developed for Zn₄O(RCO₂)₆ metal organic frameworks.

    PubMed

    Sun, Yingxin; Sun, Huai

    2014-03-01

    An all-atom force field is developed for metal organic frameworks Zn₄O(RCO₂)₆ by fitting to quantum mechanics data. Molecular simulations are conducted to validate the force field by calculating thermal expansion coefficients, crystal bulk and Young's moduli, power spectra, self-diffusion coefficients, and activation energies of self-diffusions for benzene and n-hexane. The calculated results are in good agreement with available experimental data. The proposed force field is suitable for simulations of adsorption or diffusion of organic molecules with flexible frameworks. PMID:24562858

  2. Computer Simulation in Chemical Kinetics

    ERIC Educational Resources Information Center

    Anderson, Jay Martin

    1976-01-01

    Discusses the use of the System Dynamics technique in simulating a chemical reaction for kinetic analysis. Also discusses the use of simulation modelling in biology, ecology, and the social sciences, where experimentation may be impractical or impossible. (MLH)
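
    The System Dynamics view — concentrations as stocks, rate laws as flows — can be sketched for a consecutive reaction A -> B -> C (Python; rate constants and time step are illustrative):

```python
import numpy as np

def simulate_kinetics(k1=0.5, k2=0.2, a0=1.0, dt=0.01, t_end=20.0):
    """Stock-and-flow style simulation of A -> B -> C: concentrations are the
    stocks, the first-order rate laws are the flows, integrated with Euler."""
    n = int(t_end / dt)
    a, b, c = a0, 0.0, 0.0
    history = np.empty((n, 4))
    for i in range(n):
        r1 = k1 * a          # flow A -> B
        r2 = k2 * b          # flow B -> C
        a += -r1 * dt
        b += (r1 - r2) * dt
        c += r2 * dt
        history[i] = ((i + 1) * dt, a, b, c)
    return history

traj = simulate_kinetics()
t_peak = traj[np.argmax(traj[:, 2]), 0]
print(f"[B] peaks at t = {t_peak:.2f}")   # analytic value ln(k1/k2)/(k1-k2) ~ 3.05
```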

  3. Computer Simulation in Undergraduate Instruction: A Symposium.

    ERIC Educational Resources Information Center

    Street, Warren R.; And Others

    These symposium papers discuss the instructional use of computers in psychology, with emphasis on computer-produced simulations. The first, by Rich Edwards, briefly outlines LABSIM, a general purpose system of FORTRAN programs which simulate data collection in more than a dozen experimental models in psychology and are designed to train students…

  4. Computer simulation of inhibitor application -- A review

    SciTech Connect

    Banerjee, G.; Vasanth, K.L.

    1997-12-01

    The rapid development of powerful software as well as hardware in computer technology has changed the traditional approach to all areas of science and technology. In the field of corrosion inhibitors, computers are used to model, simulate, analyze and monitor inhibitor applications in both laboratory and industrial environments. This paper will present an up-to-date critical review of such simulation studies.

  5. Recent advances in computer image generation simulation.

    PubMed

    Geltmacher, H E

    1988-11-01

    An explosion in flight simulator technology over the past 10 years is revolutionizing U.S. Air Force (USAF) operational training. The single, most important development has been in computer image generation. However, other significant advances are being made in simulator handling qualities, real-time computation systems, and electro-optical displays. These developments hold great promise for achieving high fidelity combat mission simulation. This article reviews the progress to date and predicts its impact, along with that of new computer science advances such as very high speed integrated circuits (VHSIC), on future USAF aircrew simulator training. Some exciting possibilities are multiship, full-mission simulators at replacement training units, miniaturized unit level mission rehearsal training simulators, onboard embedded training capability, and national scale simulator networking.

  6. Computer simulation of nonequilibrium processes

    SciTech Connect

    Hoover, W.G.; Moran, B.; Holian, B.L.; Posch, H.A.; Bestiale, S.

    1987-01-01

    Recent atomistic simulations of irreversible macroscopic hydrodynamic flows are illustrated. An extension of Nose's reversible atomistic mechanics makes it possible to simulate such non-equilibrium systems with completely reversible equations of motion. The new techniques show that macroscopic irreversibility is a natural inevitable consequence of time-reversible Lyapunov-unstable microscopic equations of motion.

  7. Filtration theory using computer simulations

    SciTech Connect

    Bergman, W.; Corey, I.

    1997-08-01

    We have used commercially available fluid dynamics codes based on Navier-Stokes theory and the Langevin particle equation of motion to compute the particle capture efficiency and pressure drop through selected two- and three-dimensional fiber arrays. The approach we used was to first compute the air velocity vector field throughout a defined region containing the fiber matrix. The particle capture in the fiber matrix is then computed by superimposing the Langevin particle equation of motion over the flow velocity field. Using the Langevin equation combines the particle Brownian motion, inertia and interception mechanisms in a single equation. In contrast, most previous investigations treat the different capture mechanisms separately. We have computed the particle capture efficiency and the pressure drop through one 2-D and two 3-D fiber matrix elements. 5 refs., 11 figs.
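
    The superposition step can be sketched in two dimensions (Python/NumPy; the potential-flow field around a single cylindrical fiber and all particle parameters are illustrative, not the commercial-code setup used by the authors): a Langevin update combines drag toward the local fluid velocity, inertia, and Brownian kicks, and a particle is counted as captured when it touches the fiber:

```python
import numpy as np

def flow_around_fiber(pos, U=1.0, R=1.0):
    """Potential-flow velocity around a cylindrical fiber of radius R at the
    origin, free-stream speed U along +x (stand-in for the CFD velocity field)."""
    x, y = pos
    r2 = x * x + y * y
    u = U * (1.0 - R * R * (x * x - y * y) / r2 ** 2)
    v = -2.0 * U * R * R * x * y / r2 ** 2
    return np.array([u, v])

def track_particle(y0, m=1e-2, gamma=1.0, kT=0.01, dt=1e-3, t_max=20.0, R=1.0, seed=0):
    """Langevin equation superimposed on the flow field: drag towards the local
    fluid velocity, inertia, and Brownian kicks in one update. Returns True if
    the particle is intercepted by (touches) the fiber."""
    rng = np.random.default_rng(seed)
    pos = np.array([-5.0, y0])
    vel = flow_around_fiber(pos)
    kick = np.sqrt(2.0 * gamma * kT * dt) / m
    for _ in range(int(t_max / dt)):
        accel = -gamma * (vel - flow_around_fiber(pos)) / m
        vel = vel + accel * dt + kick * rng.standard_normal(2)
        pos = pos + vel * dt
        if np.hypot(*pos) <= R:              # interception / capture
            return True
        if pos[0] > 5.0:                     # escaped downstream without capture
            return False
    return False

offsets = np.linspace(-1.5, 1.5, 31)
captured = sum(track_particle(y0, seed=i) for i, y0 in enumerate(offsets))
print(f"capture efficiency ~ {captured / len(offsets):.2f}")
```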

  8. Filtration theory using computer simulations

    SciTech Connect

    Bergman, W.; Corey, I.

    1997-01-01

    We have used commercially available fluid dynamics codes based on Navier-Stokes theory and the Langevin particle equation of motion to compute the particle capture efficiency and pressure drop through selected two- and three-dimensional fiber arrays. The approach we used was to first compute the air velocity vector field throughout a defined region containing the fiber matrix. The particle capture in the fiber matrix is then computed by superimposing the Langevin particle equation of motion over the flow velocity field. Using the Langevin equation combines the particle Brownian motion, inertia and interception mechanisms in a single equation. In contrast, most previous investigations treat the different capture mechanisms separately. We have computed the particle capture efficiency and the pressure drop through one 2-D and two 3-D fiber matrix elements.

  9. Evaluation of Visual Computer Simulator for Computer Architecture Education

    ERIC Educational Resources Information Center

    Imai, Yoshiro; Imai, Masatoshi; Moritoh, Yoshio

    2013-01-01

    This paper presents trial evaluation of a visual computer simulator in 2009-2011, which has been developed to play some roles of both instruction facility and learning tool simultaneously. And it illustrates an example of Computer Architecture education for University students and usage of e-Learning tool for Assembly Programming in order to…

  10. Computational Spectrum of Agent Model Simulation

    SciTech Connect

    Perumalla, Kalyan S

    2010-01-01

    The study of human social behavioral systems is finding renewed interest in military, homeland security and other applications. Simulation is the most generally applied approach to studying complex scenarios in such systems. Here, we outline some of the important considerations that underlie the computational aspects of simulation-based study of human social systems. The fundamental imprecision underlying questions and answers in social science makes it necessary to carefully distinguish among different simulation problem classes and to identify the most pertinent set of computational dimensions associated with those classes. We identify a few such classes and present their computational implications. The focus is then shifted to the most challenging combinations in the computational spectrum, namely, large-scale entity counts at moderate to high levels of fidelity. Recent developments in furthering the state-of-the-art in these challenging cases are outlined. A case study of large-scale agent simulation is provided in simulating large numbers (millions) of social entities at real-time speeds on inexpensive hardware. Recent computational results are identified that highlight the potential of modern high-end computing platforms to push the envelope with respect to speed, scale and fidelity of social system simulations. Finally, the problem of shielding the modeler or domain expert from the complex computational aspects is discussed and a few potential solution approaches are identified.

  11. Folding peptides and proteins with all-atom physics: methods and applications

    NASA Astrophysics Data System (ADS)

    Shell, M. Scott

    2008-03-01

    Computational methods offer powerful tools for investigating proteins and peptides at the molecular-level; however, it has proven challenging to reproduce the long time scale folding processes of these molecules at a level that is both faithful to the atomic driving forces and attainable with modern commodity cluster computing. Alternatively, the past decade has seen significant progress in using bioinformatics-based approaches to infer the three dimensional native structures of proteins, drawing upon extensive knowledge databases of known protein structures [1]. These methods work remarkably well when a homologous protein can be found to provide a structural template for a candidate sequence. However, in cases where homology to database proteins is low, where the folding pathway is of interest, or where conformational flexibility is substantial---as in many emerging protein and peptide technologies---bioinformatics methods perform poorly. There is therefore great interest in seeing purely physics-based approaches succeed. We discuss a purely physics-based, database-free folding method, relying on proper thermal sampling (replica exchange molecular dynamics) and molecular potential energy functions. In order to surmount the tremendous computational demands of all-atom folding simulations, our approach implements a conformational search strategy based on a putative protein folding mechanism called zipping and assembly [2-4]. That is, we explicitly seek out potential folding pathways inferred from short simulations, and iteratively pursue all such routes by coaxing a polypeptide chain along them. The method is called the Zipping and Assembly Method (ZAM) and it works in two parts: (1) the full polypeptide chain is broken into small fragments that are first simulated independently and then successively re-assembled into larger segments with further sampling, and (2) consistently stable structure in fragments is detected and locked into place, in order to avoid re

  12. Computer Simulation and ESL Reading.

    ERIC Educational Resources Information Center

    Wu, Mary A.

    It is noted that although two approaches to second language instruction--the communicative approach emphasizing genuine language use and computer assisted instruction--have come together in the form of some lower level reading instruction materials for English as a second language (ESL), advanced level ESL reading materials using computer…

  13. Fiber Composite Sandwich Thermostructural Behavior: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Aiello, R. A.; Murthy, P. L. N.

    1986-01-01

    Several computational levels of progressive sophistication/simplification are described to computationally simulate composite sandwich hygral, thermal, and structural behavior. The computational levels of sophistication include: (1) three-dimensional detailed finite element modeling of the honeycomb, the adhesive and the composite faces; (2) three-dimensional finite element modeling of the honeycomb assumed to be an equivalent continuous, homogeneous medium, the adhesive and the composite faces; (3) laminate theory simulation where the honeycomb (metal or composite) is assumed to consist of plies with equivalent properties; and (4) derivations of approximate, simplified equations for thermal and mechanical properties by simulating the honeycomb as an equivalent homogeneous medium. The approximate equations are combined with composite hygrothermomechanical and laminate theories to provide a simple and effective computational procedure for simulating the thermomechanical/thermostructural behavior of fiber composite sandwich structures.

  14. Augmented Reality Simulations on Handheld Computers

    ERIC Educational Resources Information Center

    Squire, Kurt; Klopfer, Eric

    2007-01-01

    Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

  15. Computer simulation of engine systems

    NASA Technical Reports Server (NTRS)

    Fishbach, L. H.

    1980-01-01

    The use of computerized simulations of the steady state and transient performance of jet engines throughout the flight regime is discussed. In addition, installation effects on thrust and specific fuel consumption are accounted for, as well as engine weight, dimensions and cost. The availability throughout government and industry of analytical methods for calculating these quantities is pointed out.

  16. Astronomy Simulation with Computer Graphics.

    ERIC Educational Resources Information Center

    Thomas, William E.

    1982-01-01

    "Planetary Motion Simulations" is a system of programs designed for students to observe motions of a superior planet (one whose orbit lies outside the orbit of the earth). Programs run on the Apple II microcomputer and employ high-resolution graphics to present the motions of Saturn. (Author/JN)

  17. Economic Analysis. Computer Simulation Models.

    ERIC Educational Resources Information Center

    Sterling Inst., Washington, DC. Educational Technology Center.

    A multimedia course in economic analysis was developed and used in conjunction with the United States Naval Academy. (See ED 043 790 and ED 043 791 for final reports of the project evaluation and development model.) This volume of the text discusses the simulation of behavioral relationships among variable elements in an economy and presents…

  18. Psychology on Computers: Simulations, Experiments and Projects.

    ERIC Educational Resources Information Center

    Belcher, Duane M.; Smith, Stephen D.

    PSYCOM is a unique mixed media package which combines high interest projects on the computer with a written text of expository material. It goes beyond most computer-assisted instruction which emphasizes drill and practice and testing of knowledge. A project might consist of a simulation or an actual experiment, or it might be a demonstration, a…

  19. Teaching Environmental Systems Modelling Using Computer Simulation.

    ERIC Educational Resources Information Center

    Moffatt, Ian

    1986-01-01

    A computer modeling course in environmental systems and dynamics is presented. The course teaches senior undergraduates to analyze a system of interest, construct a system flow chart, and write computer programs to simulate real world environmental processes. An example is presented along with a course evaluation, figures, tables, and references.…

  20. Computer Simulation Of A Small Turboshaft Engine

    NASA Technical Reports Server (NTRS)

    Ballin, Mark G.

    1991-01-01

    Component-type mathematical model of small turboshaft engine developed for use in real-time computer simulations of dynamics of helicopter flight. Yields shaft speeds, torques, fuel-consumption rates, and other operating parameters with sufficient accuracy for use in real-time simulation of maneuvers involving large transients in power and/or severe accelerations.

  1. Salesperson Ethics: An Interactive Computer Simulation

    ERIC Educational Resources Information Center

    Castleberry, Stephen

    2014-01-01

    A new interactive computer simulation designed to teach sales ethics is described. Simulation learner objectives include gaining a better understanding of legal issues in selling; realizing that ethical dilemmas do arise in selling; realizing the need to be honest when selling; seeing that there are conflicting demands from a salesperson's…

  2. Simulations of Probabilities for Quantum Computing

    NASA Technical Reports Server (NTRS)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  3. Reservoir Thermal Recovery Simulation on Parallel Computers

    NASA Astrophysics Data System (ADS)

    Li, Baoyan; Ma, Yuanle

    The rapid development of parallel computers has provided a hardware basis for massive, finely resolved reservoir simulation. However, the lack of parallel reservoir simulation software has blocked the application of parallel computers to reservoir simulation. Although a variety of parallel methods have been studied and applied to black oil, compositional, and chemical model numerical simulations, there has been limited parallel software available for reservoir simulation. In particular, the parallelization of reservoir thermal recovery simulation has not been fully carried out because of the complexity of its models and algorithms. The authors make use of the message passing interface (MPI) standard communication library, the domain decomposition method, the block Jacobi iteration algorithm, and the dynamic memory allocation technique to parallelize their serial thermal recovery simulation software NUMSIP, which is being used in the petroleum industry in China. The parallel software PNUMSIP was tested on both IBM SP2 and Dawn 1000A distributed-memory parallel computers. The experimental results show that the parallelization of I/O has a great effect on the efficiency of the parallel software PNUMSIP; the data communication bandwidth is also an important factor influencing software efficiency. Keywords: domain decomposition method, block Jacobi iteration algorithm, reservoir thermal recovery simulation, distributed-memory parallel computer
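
    The block Jacobi / domain decomposition scheme can be sketched serially on a 1-D model problem (Python; in the parallel code each block would live on its own MPI rank and the "ghost" boundary values would travel in halo messages):

```python
import numpy as np

def block_jacobi_poisson(n=100, n_blocks=4, tol=1e-8, max_outer=500):
    """Block Jacobi solution of the 1-D Poisson problem -u'' = f, u(0)=u(1)=0:
    the domain is split into sub-domains (one per process in the parallel
    version); each block is solved exactly with neighbour boundary values
    frozen at the previous outer iteration, then ghost values are exchanged."""
    h = 1.0 / (n + 1)
    f = np.ones(n)
    u = np.zeros(n)
    bounds = np.linspace(0, n, n_blocks + 1).astype(int)
    for _ in range(max_outer):
        u_old = u.copy()
        for b in range(n_blocks):
            lo, hi = bounds[b], bounds[b + 1]
            m = hi - lo
            # local tridiagonal system: (-u[i-1] + 2 u[i] - u[i+1]) / h^2 = f[i]
            A = (np.diag(2.0 * np.ones(m)) - np.diag(np.ones(m - 1), 1)
                 - np.diag(np.ones(m - 1), -1)) / h ** 2
            rhs = f[lo:hi].copy()
            if lo > 0:                       # ghost value from the left neighbour
                rhs[0] += u_old[lo - 1] / h ** 2
            if hi < n:                       # ghost value from the right neighbour
                rhs[-1] += u_old[hi] / h ** 2
            u[lo:hi] = np.linalg.solve(A, rhs)
        if np.max(np.abs(u - u_old)) < tol:
            break
    return u

u = block_jacobi_poisson()
print(u.max(), "(exact max of u = x(1-x)/2 is 0.125)")
```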

  4. A coarse-grained protein-protein potential derived from an all-atom force field.

    PubMed

    Basdevant, Nathalie; Borgis, Daniel; Ha-Duong, Tap

    2007-08-01

    In order to study protein-protein nonbonded interactions, we present the development of a new reduced protein model that represents each amino acid residue with one to three coarse grains, whose physical properties are derived in a consistent bottom-up procedure from the higher-resolution all-atom AMBER force field. The resulting potential energy function is pairwise additive and includes distinct van-der-Waals and Coulombic terms. The van-der-Waals effective interactions are deduced from preliminary molecular dynamics simulations of all possible amino acid homodimers. They are best represented by a soft 1/r6 repulsion and a Gaussian attraction, with parameters obeying Lorentz-Berthelot mixing rules. For the Coulombic interaction, coarse grain charges are optimized for each separate protein in order to best represent the all-atom electrostatic potential outside the protein core. This approach leaves the possibility of using any implicit solvent model to describe solvation effects and electrostatic screening. The coarse-grained force field is tested carefully for a small homodimeric complex, the magainin. It is shown to reproduce satisfactorily the specificity of the all-atom underlying potential, in particular within a PB/SA solvation model. The coarse-grained potential is applied to the redocking prediction of three different protein-protein complexes: the magainin dimer, the barnase-barstar, and the trypsin-BPTI complexes. It is shown to provide per se an efficient and discriminating scoring energy function for the protein-protein docking problem that remains pertinent at both the global and refinement stage. PMID:17616119
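
    The functional form described for the effective van der Waals term can be sketched as follows (Python; the parameter names and all numerical values are illustrative, not the published parameter set):

```python
import numpy as np

def cg_pair_energy(r, a_rep, eps, r0, w):
    """Effective CG van der Waals interaction of the form described above:
    a soft 1/r^6 repulsion plus a Gaussian attraction."""
    return a_rep / r ** 6 - eps * np.exp(-((r - r0) ** 2) / (2.0 * w ** 2))

def lorentz_berthelot(p_i, p_j):
    """Combine per-grain parameters with Lorentz-Berthelot-style mixing rules:
    geometric means for energy-like parameters, arithmetic means for sizes."""
    return {"a_rep": np.sqrt(p_i["a_rep"] * p_j["a_rep"]),
            "eps":   np.sqrt(p_i["eps"] * p_j["eps"]),
            "r0":    0.5 * (p_i["r0"] + p_j["r0"]),
            "w":     0.5 * (p_i["w"] + p_j["w"])}

# hypothetical parameters for two grain types (e.g. a LEU and an ASP side-chain grain)
leu = {"a_rep": 8.0e3, "eps": 1.2, "r0": 6.0, "w": 1.0}
asp = {"a_rep": 5.0e3, "eps": 0.8, "r0": 5.5, "w": 0.9}

mixed = lorentz_berthelot(leu, asp)
r = np.linspace(3.0, 12.0, 10)
print(cg_pair_energy(r, **mixed))
```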

  6. Computer simulation of gear tooth manufacturing processes

    NASA Technical Reports Server (NTRS)

    Mavriplis, Dimitri; Huston, Ronald L.

    1990-01-01

    The use of computer graphics to simulate gear tooth manufacturing procedures is discussed. An analytical basis for the simulation is established for spur gears. The simulation itself, however, is developed not only for spur gears, but for straight bevel gears as well. The applications of the developed procedure extend from the development of finite element models of heretofore intractable geometrical forms, to exploring the fabrication of nonstandard tooth forms.

  7. All-atom molecular dynamics analysis of multi-peptide systems reproduces peptide solubility in line with experimental observations

    PubMed Central

    Kuroda, Yutaka; Suenaga, Atsushi; Sato, Yuji; Kosuda, Satoshi; Taiji, Makoto

    2016-01-01

    In order to investigate the contribution of individual amino acids to protein and peptide solubility, we carried out 100 ns molecular dynamics (MD) simulations of 10^6 Å^3 cubic boxes containing ~3 × 10^4 water molecules and 27 tetra-peptides regularly positioned at 23 Å from each other and composed of a single amino acid type for all natural amino acids but cysteine and glycine. The calculations were performed using Amber with a standard force field on a special purpose MDGRAPE-3 computer, without introducing any “artificial” hydrophobic interactions. Tetra-peptides composed of I, V, L, M, N, Q, F, W, Y, and H formed large amorphous clusters, and those containing A, P, S, and T formed smaller ones. Tetra-peptides made of D, E, K, and R did not cluster at all. These observations correlated well with experimental solubility tendencies as well as hydrophobicity scales with correlation coefficients of 0.5 to > 0.9. Repulsive Coulomb interactions were dominant in ensuring high solubility, whereas both Coulomb and van der Waals (vdW) energies contributed to the aggregation of low-solubility amino acids. Overall, this very first all-atom molecular dynamics simulation of a multi-peptide system appears to reproduce the basic properties of peptide solubility, essentially in line with experimental observations. PMID:26817663

  8. Cluster computing software for GATE simulations

    SciTech Connect

    Beenhouwer, Jan de; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R.

    2007-06-15

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied with an on-the-fly generated cluster specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values.

  9. Computer Code for Nanostructure Simulation

    NASA Technical Reports Server (NTRS)

    Filikhin, Igor; Vlahovic, Branislav

    2009-01-01

    Due to their small size, nanostructures can have stress and thermal gradients that are larger than any macroscopic analogue. These gradients can lead to specific regions that are susceptible to failure via processes such as plastic deformation by dislocation emission, chemical debonding, and interfacial alloying. A program has been developed that rigorously simulates and predicts optoelectronic properties of nanostructures of virtually any geometrical complexity and material composition. It can be used in simulations of energy level structure, wave functions, density of states of spatially configured phonon-coupled electrons, excitons in quantum dots, quantum rings, quantum ring complexes, and more. The code can be used to calculate stress distributions and thermal transport properties for a variety of nanostructures and interfaces, transport and scattering at nanoscale interfaces and surfaces under various stress states, and alloy compositional gradients. The code allows users to perform modeling of charge transport processes through quantum-dot (QD) arrays as functions of inter-dot distance, array order versus disorder, QD orientation, shape, size, and chemical composition for applications in photovoltaics and physical properties of QD-based biochemical sensors. The code can be used to study the hot exciton formation/relaxation dynamics in arrays of QDs of different shapes and sizes at different temperatures. It also can be used to understand the relation among the deposition parameters and inherent stresses, strain deformation, heat flow, and failure of nanostructures.

  10. Computer simulation of bubble formation.

    SciTech Connect

    Insepov, Z.; Bazhirov, T.; Norman, G.; Stegailov, V.; Mathematics and Computer Science; Institute for High Energy Densities of Joint Institute for High Temperatures of RAS

    2007-01-01

    Properties of liquid metals (Li, Pb, Na) containing nanoscale cavities were studied by atomistic Molecular Dynamics (MD). Two atomistic models of cavity simulation were developed that cover a wide area in the phase diagram with negative pressure. In the first model, the thermodynamics of cavity formation, stability and the dynamics of cavity evolution in bulk liquid metals have been studied. Radial densities, pressures, surface tensions, and work functions of nano-scale cavities of various radii were calculated for liquid Li, Na, and Pb at various temperatures and densities, and at small negative pressures near the liquid-gas spinodal, and the work functions for cavity formation in liquid Li were calculated and compared with the available experimental data. The cavitation rate can further be obtained by using the classical nucleation theory (CNT). The second model is based on the stability study and on the kinetics of cavitation of the stretched liquid metals. An MD method was used to simulate cavitation in metastable Pb and Li melts and determine the stability limits. States at temperatures below critical (T < 0.5Tc) and large negative pressures were considered. The kinetic boundary of liquid phase stability was shown to be different from the spinodal. The kinetics and dynamics of cavitation were studied. The pressure dependences of cavitation frequencies were obtained for several temperatures. The results of MD calculations were compared with estimates based on classical nucleation theory.
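
    For orientation, classical nucleation theory gives the cavitation rate from the surface tension and the magnitude of the negative pressure; a minimal sketch with illustrative parameter values of the right order for a generic liquid metal, not the Li/Na/Pb data of the study:

      # Classical nucleation theory (CNT) sketch: homogeneous cavitation rate of
      # a stretched liquid. Parameter values are illustrative placeholders.
      import math

      k_B = 1.380649e-23      # Boltzmann constant, J/K

      def cnt_cavitation_rate(sigma, delta_p, T, n, m):
          """Nucleation rate per unit volume (1/(m^3 s)).

          sigma   : surface tension (N/m)
          delta_p : pressure difference driving cavitation, ~|negative pressure| (Pa)
          T       : temperature (K)
          n       : number density of the liquid (1/m^3)
          m       : atomic mass (kg)
          """
          w_star = 16.0 * math.pi * sigma**3 / (3.0 * delta_p**2)  # critical work
          prefactor = n * math.sqrt(2.0 * sigma / (math.pi * m))   # common kinetic prefactor
          return prefactor * math.exp(-w_star / (k_B * T))

      rate = cnt_cavitation_rate(sigma=0.4, delta_p=1.0e9, T=800.0,
                                 n=4.5e28, m=1.15e-26)
      print(f"CNT cavitation rate ~ {rate:.3e} m^-3 s^-1")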

  11. Polymer Composites Corrosive Degradation: A Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon

    2007-01-01

    A computational simulation of polymer composites corrosive durability is presented. The corrosive environment is assumed to manage the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature and moisture which vary parabolically for voids and linearly for temperature and moisture through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined stress failure and laminate theories. This accounts for starting the simulation from constitutive material properties and up to the laminate scale which exposes the laminate to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply degradation degrades the laminate to the last one or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.
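
    As an illustration of the assumed through-thickness environment, a short sketch that generates parabolic void and linear temperature/moisture profiles on a per-ply basis; the end-point values are placeholders, not taken from the record above.

      # Sketch: through-thickness environmental profiles for ply-by-ply degradation,
      # with voids varying parabolically (assumed maximal at the exposed surfaces)
      # and temperature/moisture varying linearly. Values are placeholders.
      import numpy as np

      n_plies = 16
      z = np.linspace(0.0, 1.0, n_plies)        # normalized thickness coordinate

      voids_surface, voids_mid = 0.08, 0.01     # void fraction at surface / mid-plane
      temp_top, temp_bottom = 400.0, 300.0      # K
      moist_top, moist_bottom = 0.02, 0.005     # moisture weight fraction

      voids = voids_mid + (voids_surface - voids_mid) * (2.0 * z - 1.0) ** 2
      temperature = temp_top + (temp_bottom - temp_top) * z
      moisture = moist_top + (moist_bottom - moist_top) * z

      for i, (v, t, m) in enumerate(zip(voids, temperature, moisture), start=1):
          print(f"ply {i:2d}: voids={v:.3f}  T={t:6.1f} K  moisture={m:.4f}")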

  12. An Evolutionary Strategy for All-Atom Folding of the 60-Amino-Acid Bacterial Ribosomal Protein L20

    PubMed Central

    Schug, A.; Wenzel, W.

    2006-01-01

    We have investigated an evolutionary algorithm for de novo all-atom folding of the bacterial ribosomal protein L20. We report results of two simulations that converge to near-native conformations of this 60-amino-acid, four-helix protein. We observe a steady increase of “native content” in both simulated ensembles and a large number of near-native conformations in their final populations. We argue that these structures represent a significant fraction of the low-energy metastable conformations, which characterize the folding funnel of this protein. These data validate our all-atom free-energy force field PFF01 for tertiary structure prediction of a previously inaccessible structural family of proteins. We also compare folding simulations of the evolutionary algorithm with the basin-hopping technique for the Trp-cage protein. We find that the evolutionary algorithm generates a dynamic memory in the simulated population, which leads to faster overall convergence. PMID:16565067
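
    A generic sketch of the evolutionary strategy itself (population of conformations, energy-based selection, perturbation of the survivors); the toy energy function and Gaussian moves below stand in for the PFF01 force field and a real conformational move set.

      # Generic evolutionary-strategy sketch for minimizing a conformational energy.
      # The toy energy landscape and Gaussian "mutations" are stand-ins for an
      # all-atom free-energy force field and dihedral moves.
      import numpy as np

      rng = np.random.default_rng(0)

      def energy(x):
          # rugged toy landscape standing in for a protein free-energy function
          return np.sum(x**2) + 2.0 * np.sum(np.sin(3.0 * x))

      dim, pop_size, n_generations = 20, 50, 200
      population = rng.normal(0.0, 2.0, size=(pop_size, dim))

      for gen in range(n_generations):
          energies = np.array([energy(ind) for ind in population])
          survivors = population[np.argsort(energies)[: pop_size // 2]]  # keep best half
          offspring = survivors + rng.normal(0.0, 0.3, size=survivors.shape)
          population = np.vstack([survivors, offspring])

      best = min(population, key=energy)
      print("best energy found:", round(float(energy(best)), 3))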

  13. Computer simulation of space station computer steered high gain antenna

    NASA Technical Reports Server (NTRS)

    Beach, S. W.

    1973-01-01

    The mathematical modeling and programming of a complete simulation program for a space station computer-steered high gain antenna are described. The program provides for reading input data cards, numerically integrating up to 50 first order differential equations, and monitoring up to 48 variables on printed output and on plots. The program system consists of a high gain antenna, an antenna gimbal control system, an on board computer, and the environment in which all are to operate.
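
    A compact sketch of the numerical core such a program needs: fixed-step integration of a system of first-order ODEs with selected variables monitored for printing or plotting. The two-state example system below is purely illustrative.

      # Sketch: fixed-step RK4 integration of a system of first-order ODEs with
      # simple monitoring of selected state variables. The example system is a
      # toy second-order plant, not the actual antenna/gimbal model.
      import numpy as np

      def rk4_step(f, t, y, h):
          k1 = f(t, y)
          k2 = f(t + 0.5 * h, y + 0.5 * h * k1)
          k3 = f(t + 0.5 * h, y + 0.5 * h * k2)
          k4 = f(t + h, y + h * k3)
          return y + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

      def toy_plant(t, y):
          # angle' = rate, rate' = -2*zeta*wn*rate - wn^2*angle + command
          angle, rate = y
          wn, zeta, command = 1.5, 0.4, 0.2
          return np.array([rate, -2 * zeta * wn * rate - wn**2 * angle + command])

      t, y, h = 0.0, np.array([0.0, 0.0]), 0.01
      for step in range(1000):
          y = rk4_step(toy_plant, t, y, h)
          t += h
          if step % 200 == 0:                   # monitor selected variables
              print(f"t={t:5.2f}  angle={y[0]: .4f}  rate={y[1]: .4f}")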

  14. An All-Atom Model of the Structure of Human Copper Transporter 1

    PubMed Central

    Sharikov, Yuriy; Greenberg, Jerry P.; Miller, Mark A.; Kouznetsova, Valentina L.; Larson, Christopher A.; Howell, Stephen B.

    2013-01-01

    Human copper transporter 1 (hCTR1) is the major high affinity copper influx transporter in mammalian cells that also mediates uptake of the cancer chemotherapeutic agent cisplatin. A low resolution structure of hCTR1 determined by cryoelectron microscopy was recently published. Several protein structure simulation techniques were used to create an all-atom model of this important transporter using the low resolution structure as a starting point. The all-atom model provides new insights into the roles of specific residues of the N-terminal extracellular domain, the intracellular loop, and C-terminal region in metal ion transport. In particular, the model demonstrates that the central region of the pore contains four sets of methionine triads in the intramembranous region. The structure confirms that two triads of methionine residues delineate the intramembranous region of the transporter, and further identifies two additional methionine triads that are located in the extracellular N-terminal part of the transporter. Together, the four triads create a structure that promotes stepwise transport of metal ions into and then through the intramembranous channel of the transporter via transient thioether bonds to methionine residues. Putative copper-binding sites in the hCTR1 trimer were identified by a program developed by us for prediction of metal-binding sites. These sites correspond well with the known effects of mutations on the ability of the protein to transport copper and cisplatin. PMID:22569840

  15. Airport Simulations Using Distributed Computational Resources

    NASA Technical Reports Server (NTRS)

    McDermott, William J.; Maluf, David A.; Gawdiak, Yuri; Tran, Peter; Clancy, Daniel (Technical Monitor)

    2002-01-01

    The Virtual National Airspace Simulation (VNAS) will improve the safety of Air Transportation. In 2001, using simulation and information management software running over a distributed network of super-computers, researchers at NASA Ames, Glenn, and Langley Research Centers developed a working prototype of a virtual airspace. This VNAS prototype modeled daily operations of the Atlanta airport by integrating measured operational data and simulation data on up to 2,000 flights a day. The concepts and architecture developed by NASA for this prototype are integral to the National Airspace Simulation to support the development of strategies improving aviation safety, identifying precursors to component failure.

  16. Computer Series, 108. Computer Simulation of Chemical Equilibrium.

    ERIC Educational Resources Information Center

    Cullen, John F., Jr.

    1989-01-01

    Presented is a computer simulation called "The Great Chemical Bead Game" which can be used to teach the concepts of equilibrium and kinetics to introductory chemistry students more clearly than through an experiment. Discussed are the rules of the game, the application of rate laws and graphical analysis. (CW)

  17. Computer simulation of breathing systems for divers

    SciTech Connect

    Sexton, P.G.; Nuckols, M.L.

    1983-02-01

    A powerful new tool for the analysis and design of underwater breathing gas systems is being developed. A versatile computer simulator is described which makes possible the modular "construction" of any conceivable breathing gas system from computer memory-resident components. The analysis of a typical breathing gas system is demonstrated using this simulation technique, and the effects of system modifications on performance of the breathing system are shown. This modeling technique will ultimately serve as the foundation for a proposed breathing system simulator under development by the Navy. The marriage of this computer modeling technique with an interactive graphics system will provide the designer with an efficient, cost-effective tool for the development of new and improved diving systems.

  18. Enabling Computational Technologies for Terascale Scientific Simulations

    SciTech Connect

    Ashby, S.F.

    2000-08-24

    We develop scalable algorithms and object-oriented code frameworks for terascale scientific simulations on massively parallel processors (MPPs). Our research in multigrid-based linear solvers and adaptive mesh refinement enables Laboratory programs to use MPPs to explore important physical phenomena. For example, our research aids stockpile stewardship by making practical detailed 3D simulations of radiation transport. The need to solve large linear systems arises in many applications, including radiation transport, structural dynamics, combustion, and flow in porous media. These systems result from discretizations of partial differential equations on computational meshes. Our first research objective is to develop multigrid preconditioned iterative methods for such problems and to demonstrate their scalability on MPPs. Scalability describes how total computational work grows with problem size; it measures how effectively additional resources can help solve increasingly larger problems. Many factors contribute to scalability: computer architecture, parallel implementation, and choice of algorithm. Scalable algorithms have been shown to decrease simulation times by several orders of magnitude.
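
    As a minimal illustration of preconditioned iterative solution for such discretized systems, the sketch below runs conjugate gradients with a simple Jacobi (diagonal) preconditioner standing in for the multigrid preconditioners developed in the project; the 1D test matrix is illustrative.

      # Sketch: preconditioned conjugate gradient (PCG) on a sparse SPD system.
      # A Jacobi preconditioner stands in here for a true multigrid preconditioner.
      import numpy as np
      import scipy.sparse as sp

      n = 200                                   # small 1D Poisson-like test problem
      A = sp.diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
      b = np.ones(n)
      M_inv = 1.0 / A.diagonal()                # Jacobi preconditioner M^-1 = D^-1

      x = np.zeros(n)
      r = b - A @ x
      z = M_inv * r
      p = z.copy()
      rz = r @ z
      for it in range(1000):
          Ap = A @ p
          alpha = rz / (p @ Ap)
          x += alpha * p
          r -= alpha * Ap
          if np.linalg.norm(r) < 1e-10:
              break
          z = M_inv * r
          rz_new = r @ z
          p = z + (rz_new / rz) * p
          rz = rz_new

      print(f"converged in {it + 1} iterations, residual {np.linalg.norm(r):.2e}")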

  19. Simulation methods for advanced scientific computing

    SciTech Connect

    Booth, T.E.; Carlson, J.A.; Forster, R.A.

    1998-11-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of the project was to create effective new algorithms for solving N-body problems by computer simulation. The authors concentrated on developing advanced classical and quantum Monte Carlo techniques. For simulations of phase transitions in classical systems, they produced a framework generalizing the famous Swendsen-Wang cluster algorithms for Ising and Potts models. For spin-glass-like problems, they demonstrated the effectiveness of an extension of the multicanonical method for the two-dimensional, random bond Ising model. For quantum mechanical systems, they generated a new method to compute the ground-state energy of systems of interacting electrons. They also improved methods to compute excited states when the diffusion quantum Monte Carlo method is used and to compute longer time dynamics when the stationary phase quantum Monte Carlo method is used.
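
    For reference, a compact sketch of the original Swendsen-Wang cluster update for the 2D Ising model, the algorithm that the framework described above generalizes; lattice size and coupling are arbitrary choices.

      # Sketch: Swendsen-Wang cluster updates for the 2D Ising model (J = 1).
      import numpy as np

      def find(parent, i):                       # union-find with path halving
          while parent[i] != i:
              parent[i] = parent[parent[i]]
              i = parent[i]
          return i

      def sw_sweep(spins, beta, rng):
          L = spins.shape[0]
          p_bond = 1.0 - np.exp(-2.0 * beta)     # bond probability for aligned spins
          parent = np.arange(L * L)
          for x in range(L):
              for y in range(L):
                  i = x * L + y
                  for dx, dy in ((1, 0), (0, 1)):          # right / down neighbors
                      nx, ny = (x + dx) % L, (y + dy) % L  # periodic boundaries
                      if spins[x, y] == spins[nx, ny] and rng.random() < p_bond:
                          ri, rj = find(parent, i), find(parent, nx * L + ny)
                          if ri != rj:
                              parent[rj] = ri
          roots = [find(parent, i) for i in range(L * L)]
          flip = {r: rng.random() < 0.5 for r in set(roots)}   # flip clusters at random
          for i, r in enumerate(roots):
              if flip[r]:
                  spins[i // L, i % L] *= -1
          return spins

      rng = np.random.default_rng(1)
      L, beta = 32, 0.44                         # near the critical coupling
      spins = rng.choice([-1, 1], size=(L, L))
      for _ in range(100):
          spins = sw_sweep(spins, beta, rng)
      print("magnetization per spin:", abs(int(spins.sum())) / L**2)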

  20. Software Engineering for Scientific Computer Simulations

    NASA Astrophysics Data System (ADS)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.

  1. Automatic temperature computation for realistic IR simulation

    NASA Astrophysics Data System (ADS)

    Le Goff, Alain; Kersaudy, Philippe; Latger, Jean; Cathala, Thierry; Stolte, Nilo; Barillot, Philippe

    2000-07-01

    Polygon temperature computation in 3D virtual scenes is fundamental for IR image simulation. This article describes in detail the temperature calculation software and its current extensions, briefly presented in [1]. This software, called MURET, is used by the simulation workshop CHORALE of the French DGA. MURET is a one-dimensional thermal software package that accurately takes into account the material thermal attributes of the three-dimensional scene and the variation of the environment characteristics (atmosphere) as a function of time. Concerning the environment, absorbed incident fluxes are computed wavelength by wavelength, every half hour, during the 24 hours preceding the time of the simulation. For each polygon, incident fluxes are composed of direct solar fluxes and sky illumination (including diffuse solar fluxes). Concerning the materials, classical thermal attributes such as conductivity, absorption, spectral emissivity, density, specific heat, thickness, and convection coefficients are associated with the several layers of each material. In the future, MURET will be able to simulate permeable natural materials (water influence) and natural vegetation materials (woods). This model of thermal attributes yields a very accurate polygon temperature computation for the complex 3D databases often found in CHORALE simulations. The kernel of MURET consists of an efficient ray tracer, which computes the history (over 24 hours) of the shadowed parts of the 3D scene, and a library responsible for the thermal computations. The great originality concerns the way the heating fluxes are computed: using ray tracing, the flux received at each 3D point of the scene accurately takes into account the masking (hidden surfaces) between objects. This library also supplies other thermal modules, such as a thermal shadows computation tool.
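
    A schematic of the one-dimensional thermal core of such a tool: explicit finite-difference conduction through a slab driven by a time-varying absorbed flux at the exposed face, with convection to the air. Material values and the diurnal flux profile are placeholders, not MURET data.

      # Sketch: explicit 1D heat conduction through a single-material slab with a
      # diurnally varying absorbed flux and surface convection. All values are
      # placeholders standing in for real layered-material attributes.
      import numpy as np

      k, rho, cp = 1.0, 2000.0, 900.0           # conductivity, density, specific heat
      thickness, n = 0.10, 50                   # slab thickness (m), grid points
      h_conv, T_air = 10.0, 290.0               # convection coefficient, air temperature

      dx = thickness / (n - 1)
      alpha = k / (rho * cp)
      dt = 0.3 * dx**2 / alpha                  # stable explicit time step

      def absorbed_flux(t_seconds):
          """Placeholder diurnal absorbed flux (W/m^2): daytime half-sine, zero at night."""
          hour = (t_seconds / 3600.0) % 24.0
          return 600.0 * max(0.0, np.sin(np.pi * (hour - 6.0) / 12.0))

      T = np.full(n, 290.0)
      t, t_end = 0.0, 24 * 3600.0
      while t < t_end:
          Tn = T.copy()
          T[1:-1] = Tn[1:-1] + alpha * dt / dx**2 * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2])
          q = absorbed_flux(t) - h_conv * (Tn[0] - T_air)        # surface energy balance
          T[0] = Tn[0] + dt / (rho * cp * 0.5 * dx) * (q + k * (Tn[1] - Tn[0]) / dx)
          T[-1] = T[-2]                          # insulated back face
          t += dt
      print(f"surface temperature after 24 h: {T[0]:.1f} K")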

  2. Structural Composites Corrosive Management by Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon

    2006-01-01

    A simulation of corrosive management on polymer composites durability is presented. The corrosive environment is assumed to manage the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature, and moisture which vary parabolically for voids and linearly for temperature and moisture through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined stress failure, and laminate theories. This accounts for starting the simulation from constitutive material properties and up to the laminate scale which exposes the laminate to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply managed degradation degrades the laminate to the last one or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.

  3. Enabling computational technologies for subsurface simulations

    SciTech Connect

    Falgout, R D

    1999-02-22

    We collaborated with Environmental Programs to develop and apply advanced computational methodologies for simulating multiphase flow through heterogeneous porous media. The primary focus was on developing a fast accurate advection scheme using a new temporal subcycling technique and on the scalable and efficient solution of the nonlinear Richards' equation used to model two-phase (variably saturated) flow. The resulting algorithms can be orders-of-magnitude faster than existing methods. Our computational technologies were applied to the simulation of subsurface fluid flow and chemical transport in the context of two important applications: water resource management and groundwater remediation.

  4. Task simulation in computer-based training

    SciTech Connect

    Gardner, P.R.

    1988-02-01

    Westinghouse Hanford Company (WHC) makes extensive use of job-task simulations in company-developed computer-based training (CBT) courseware. This courseware is different from most others because it does not simulate process control machinery or other computer programs; instead, the WHC Exercises model day-to-day tasks such as physical work preparations, progress, and incident handling. These Exercises provide a higher level of motivation and enable the testing of more complex patterns of behavior than those typically measured by multiple-choice and short questions. Examples from the WHC Radiation Safety and Crane Safety courses will be used as illustrations. 3 refs.

  5. Computer simulation: A modern day crystal ball?

    NASA Technical Reports Server (NTRS)

    Sham, Michael; Siprelle, Andrew

    1994-01-01

    It has long been the desire of managers to be able to look into the future and predict the outcome of decisions. With the advent of computer simulation and the tremendous capability provided by personal computers, that desire can now be realized. This paper presents an overview of computer simulation and modeling, and discusses the capabilities of Extend. Extend is an iconic-driven Macintosh-based software tool that brings the power of simulation to the average computer user. An example of an Extend based model is presented in the form of the Space Transportation System (STS) Processing Model. The STS Processing Model produces eight shuttle launches per year, yet it takes only about ten minutes to run. In addition, statistical data such as facility utilization, wait times, and processing bottlenecks are produced. The addition or deletion of resources, such as orbiters or facilities, can be easily modeled and their impact analyzed. Through the use of computer simulation, it is possible to look into the future to see the impact of today's decisions.

  6. Computer simulation in sport and industry.

    PubMed

    Hubbard, M

    1993-01-01

    The last several decades have brought decreases in the specific cost of computer memory and increases in processor throughput. As a result, simulation has become correspondingly more important as a component of industrial design and as a method for the study of general biomechanics and sports techniques. This paper illustrates, by way of examples, several of the more important aspects of the application of computer simulation to dynamic problems. Topics include (1) the ideas of suitable model complexity and its tradeoff with interpretability; (2) the sequential and iterative nature of model building and the importance of experimental data in the modelling and validation process; (3) the essential role of user-friendly software and graphical interfaces in the interchange of information between simulation programs and the users; and (4) the role of computer simulation in learning feedback loops, both in the field and in the computer laboratory. Most industrial use of simulation is in the design process. A similar approach is equally valid in biomechanics and sport applications through the incorporation of design variables, which may be easily changed in the model experiment.

  7. Computer simulation of the threshold sensitivity determinations

    NASA Technical Reports Server (NTRS)

    Gayle, J. B.

    1974-01-01

    A computer simulation study was carried out to evaluate various methods for determining threshold stimulus levels for impact sensitivity tests. In addition, the influence of a number of variables (initial stimulus level, particular stimulus response curve, and increment size) on the apparent threshold values and on the corresponding population response levels was determined. Finally, a critical review of previous assumptions regarding the stimulus response curve for impact testing is presented in the light of the simulation results.
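
    A sketch of the kind of Monte Carlo experiment described: a Bruceton-style up-and-down test simulated against an assumed logistic stimulus-response curve, repeated many times so that the influence of the starting level and increment size on the apparent threshold can be examined. All parameter values are illustrative.

      # Sketch: simulate up-and-down (Bruceton) sensitivity tests against an assumed
      # logistic stimulus-response curve. Parameters are illustrative placeholders.
      import numpy as np

      rng = np.random.default_rng(42)

      def p_response(level, median=50.0, slope=0.3):
          """Assumed probability of a positive response at a given stimulus level."""
          return 1.0 / (1.0 + np.exp(-slope * (level - median)))

      def up_down_test(start, step, n_trials=50):
          level, levels = start, []
          for _ in range(n_trials):
              levels.append(level)
              if rng.random() < p_response(level):   # reaction: lower the stimulus
                  level -= step
              else:                                  # no reaction: raise the stimulus
                  level += step
          return np.mean(levels[10:])                # crude threshold estimate

      estimates = [up_down_test(start=30.0, step=5.0) for _ in range(200)]
      print(f"apparent threshold: {np.mean(estimates):.1f} "
            f"(assumed true 50% level: 50.0), spread: {np.std(estimates):.1f}")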

  8. Computation Simulation Of Autonomous Vehicle Navigation

    NASA Astrophysics Data System (ADS)

    Meystel, A.; Koch, E.

    1984-06-01

    A concept of navigation based upon heuristic search is simulated. A mobile robot with a vision system navigates with an unknown or unclear map. The range of vision is limited, which forces repeated judgments comparing alternative motions. The frequency of the decision-making procedure is limited by a definite computation time. The system is simulated with a number of maps and the results of navigation are compared.
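
    A compact sketch of heuristic (A*) search on a small occupancy grid, the kind of planning step such a navigation loop would repeat as new terrain comes into view; the map, costs, and heuristic are illustrative choices.

      # Sketch: A* heuristic search on an occupancy grid ('#' = obstacle).
      import heapq

      grid = ["..........",
              "..####....",
              "..#..#....",
              "..#..#..#.",
              ".....#..#.",
              "........#."]
      rows, cols = len(grid), len(grid[0])
      start, goal = (0, 0), (5, 9)

      def h(a, b):                                  # Manhattan-distance heuristic
          return abs(a[0] - b[0]) + abs(a[1] - b[1])

      open_set = [(h(start, goal), 0, start, None)]
      came_from, g_cost = {}, {start: 0}
      while open_set:
          _, g, node, parent = heapq.heappop(open_set)
          if node in came_from:                     # already expanded
              continue
          came_from[node] = parent
          if node == goal:
              break
          r, c = node
          for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
              if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] != "#":
                  ng = g + 1
                  if ng < g_cost.get((nr, nc), float("inf")):
                      g_cost[(nr, nc)] = ng
                      heapq.heappush(open_set,
                                     (ng + h((nr, nc), goal), ng, (nr, nc), node))

      path, node = [], goal
      while node is not None:                       # reconstruct the path
          path.append(node)
          node = came_from[node]
      print("path length:", len(path) - 1, "steps")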

  9. Reproducible In-Silico Folding of a Four Helix 60 Amino Acid Protein in a Transferable All-Atom Forcefield

    NASA Astrophysics Data System (ADS)

    Schug, Alexander

    2005-03-01

    For predicting protein tertiary structure, one approach describes the native state of a protein as the global minimum of an appropriate free-energy forcefield. We have recently developed such an all-atom protein forcefield (PFF01). The major challenge remains the search for the global minimum, for which we developed efficient methods. Using these we were able to predict the structure of helical proteins from different families, ranging in size from 20 to 60 amino acids, starting with random configurations. For the four-helix, 60-amino-acid protein bacterial ribosomal protein L20 (PDB code: 1GYZ) we used a simple client-master model for distributed computing. Starting from a set of random structures, three phases of different folding simulations refined this set to a final one with 50 configurations. During this process the amount of native-like structures increased strongly. Six of the ten structures lowest in energy approached the native structure within 5 Å backbone RMSD. The conformation with the lowest energy had a backbone RMSD value of 4.6 Å, therefore correctly predicting the tertiary structure of 1GYZ. References: A. Schug et al., Phys. Rev. Lett. 91:158102, 2003; A. Schug et al., J. Am. Chem. Soc. (in press), 2004.

  10. Perspective: Computer simulations of long time dynamics

    PubMed Central

    Elber, Ron

    2016-01-01

    Atomically detailed computer simulations of complex molecular events attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade significant advances were made in different directions (theory, software, and hardware) that significantly expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances. PMID:26874473

  11. Uncertainty and error in computational simulations

    SciTech Connect

    Oberkampf, W.L.; Diegert, K.V.; Alvin, K.F.; Rutherford, B.M.

    1997-10-01

    The present paper addresses the question: "What are the general classes of uncertainty and error sources in complex, computational simulations?" This is the first step of a two-step process to develop a general methodology for quantitatively estimating the global modeling and simulation uncertainty in computational modeling and simulation. The second step is to develop a general mathematical procedure for representing, combining and propagating all of the individual sources through the simulation. The authors develop a comprehensive view of the general phases of modeling and simulation. The phases proposed are: conceptual modeling of the physical system, mathematical modeling of the system, discretization of the mathematical model, computer programming of the discrete model, numerical solution of the model, and interpretation of the results. This new view is built upon combining phases recognized in the disciplines of operations research and numerical solution methods for partial differential equations. The characteristics and activities of each of these phases are discussed in general, but examples are given for the fields of computational fluid dynamics and heat transfer. They argue that a clear distinction should be made between uncertainty and error that can arise in each of these phases. The present definitions for uncertainty and error are inadequate and, therefore, they propose comprehensive definitions for these terms. Specific classes of uncertainty and error sources are then defined that can occur in each phase of modeling and simulation. The numerical sources of error considered apply regardless of whether the discretization procedure is based on finite elements, finite volumes, or finite differences. To better explain the broad types of sources of uncertainty and error, and the utility of their categorization, they discuss a coupled-physics example simulation.

  12. Quantitative computer simulations of extraterrestrial processing operations

    NASA Technical Reports Server (NTRS)

    Vincent, T. L.; Nikravesh, P. E.

    1989-01-01

    The automation of a small, solid propellant mixer was studied. Temperature control is under investigation. A numerical simulation of the system is under development and will be tested using different control options. Control system hardware is currently being put into place. The construction of mathematical models and simulation techniques for understanding various engineering processes is also studied. Computer graphics packages were utilized for better visualization of the simulation results. The mechanical mixing of propellants is examined. Simulation of the mixing process is being done to study how one can control for chaotic behavior to meet specified mixing requirements. An experimental mixing chamber is also being built. It will allow visual tracking of particles under mixing. The experimental unit will be used to test ideas from chaos theory, as well as to verify simulation results. This project has applications to extraterrestrial propellant quality and reliability.

  13. Progress in Computational Simulation of Earthquakes

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay; Lyzenga, Gregory; Judd, Michele; Li, P. Peggy; Norton, Charles; Tisdale, Edwin; Granat, Robert

    2006-01-01

    GeoFEST(P) is a computer program written for use in the QuakeSim project, which is devoted to development and improvement of means of computational simulation of earthquakes. GeoFEST(P) models interacting earthquake fault systems from the fault-nucleation to the tectonic scale. The development of GeoFEST(P) has involved coupling of two programs: GeoFEST and the Pyramid Adaptive Mesh Refinement Library. GeoFEST is a message-passing-interface-parallel code that utilizes a finite-element technique to simulate evolution of stress, fault slip, and plastic/elastic deformation in realistic materials like those of faulted regions of the crust of the Earth. The products of such simulations are synthetic observable time-dependent surface deformations on time scales from days to decades. Pyramid Adaptive Mesh Refinement Library is a software library that facilitates the generation of computational meshes for solving physical problems. In an application of GeoFEST(P), a computational grid can be dynamically adapted as stress grows on a fault. Simulations on workstations using a few tens of thousands of stress and displacement finite elements can now be expanded to multiple millions of elements with greater than 98-percent scaled efficiency on many hundreds of parallel processors.

  14. A School Finance Computer Simulation Model

    ERIC Educational Resources Information Center

    Boardman, Gerald R.

    1974-01-01

    Presents a description of the computer simulation model developed by the National Educational Finance Project for use by States in planning and evaluating alternative approaches for State support programs. Provides a general introduction to the model, a program operation overview, a sample run, and some conclusions. (Author/WM)

  15. Factors Promoting Engaged Exploration with Computer Simulations

    ERIC Educational Resources Information Center

    Podolefsky, Noah S.; Perkins, Katherine K.; Adams, Wendy K.

    2010-01-01

    This paper extends prior research on student use of computer simulations (sims) to engage with and explore science topics, in this case wave interference. We describe engaged exploration; a process that involves students actively interacting with educational materials, sense making, and exploring primarily via their own questioning. We analyze…

  16. Assessing Moderator Variables: Two Computer Simulation Studies.

    ERIC Educational Resources Information Center

    Mason, Craig A.; And Others

    1996-01-01

    A strategy is proposed for conceptualizing moderating relationships based on their type (strictly correlational and classically correlational) and form, whether continuous, noncontinuous, logistic, or quantum. Results of computer simulations comparing three statistical approaches for assessing moderator variables are presented, and advantages of…

  17. Decision Making in Computer-Simulated Experiments.

    ERIC Educational Resources Information Center

    Suits, J. P.; Lagowski, J. J.

    A set of interactive, computer-simulated experiments was designed to respond to the large range of individual differences in aptitude and reasoning ability generally exhibited by students enrolled in first-semester general chemistry. These experiments give students direct experience in the type of decision making needed in an experimental setting.…

  18. GENMAP--A Microbial Genetics Computer Simulation.

    ERIC Educational Resources Information Center

    Day, M. J.; And Others

    1985-01-01

    An interactive computer program in microbial genetics is described. The simulation allows students to work at their own pace and develop understanding of microbial techniques as they choose donor bacterial strains, specify selective media, and interact with demonstration experiments. Sample questions and outputs are included. (DH)

  19. Designing Online Scaffolds for Interactive Computer Simulation

    ERIC Educational Resources Information Center

    Chen, Ching-Huei; Wu, I-Chia; Jen, Fen-Lan

    2013-01-01

    The purpose of this study was to examine the effectiveness of online scaffolds in computer simulation to facilitate students' science learning. We first introduced online scaffolds to assist and model students' science learning and to demonstrate how a system embedded with online scaffolds can be designed and implemented to help high…

  20. Spiking network simulation code for petascale computers

    PubMed Central

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M.; Plesser, Hans E.; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today. PMID:25346682

  3. The Australian Computational Earth Systems Simulator

    NASA Astrophysics Data System (ADS)

    Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.

    2001-12-01

    Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government together with a consortium of Universities and research institutions have funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator or computational virtual earth will provide the research infrastructure to the Australian earth systems science community required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete elements/lattice solid model, particle-in-cell large deformation finite-element method, stress reconstruction models, multi-scale continuum models etc) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global scale dynamics and mineralisation processes, crustal scale processes including plate tectonics, mountain building, interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic

  4. Two Computer Simulations for Astronomy Education

    NASA Astrophysics Data System (ADS)

    Stoner, Ronald

    1997-05-01

    Two-dimensional media, such as transparencies and textbook illustrations, are often inadequate for representing three-dimensional phenomena. Computer simulation using animation and interactive graphics can solve the pedagogic problem of allowing students to visualize inherently 3-D phenomena in physics and astronomy. This paper demonstrates two such computer simulations intended for use in introductory astronomy courses. The first permits visualization of astronomical structures on several different size scales by converting catalogs of astronomical objects at known distances (stars, star clusters, galaxies, etc.) to 3-D arrays of color-coded points that can be rotated in simulation to reveal 3-D structure. The second simulates the apparent motion of the sun in the sky of an arbitrary planet, simultaneously with the combined rotational and orbital motion of the planet that is responsible for it. These simulations were written in Borland Pascal for MS-DOS computers using the utilities package distributed with CUPS software (Educational software packages produced by the Consortium on Upper-level Physics Software (CUPS) are available from John Wiley & Sons, Inc.).

  6. Computational algorithms for simulations in atmospheric optics.

    PubMed

    Konyaev, P A; Lukin, V P

    2016-04-20

    A computer simulation technique for atmospheric and adaptive optics based on parallel programing is discussed. A parallel propagation algorithm is designed and a modified spectral-phase method for computer generation of 2D time-variant random fields is developed. Temporal power spectra of Laguerre-Gaussian beam fluctuations are considered as an example to illustrate the applications discussed. Implementation of the proposed algorithms using Intel MKL and IPP libraries and NVIDIA CUDA technology is shown to be very fast and accurate. The hardware system for the computer simulation is an off-the-shelf desktop with an Intel Core i7-4790K CPU operating at a turbo-speed frequency up to 5 GHz and an NVIDIA GeForce GTX-960 graphics accelerator with 1024 1.5 GHz processors. PMID:27140113
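
    A brief sketch of the basic spectral (FFT-based) generation of a 2D random phase screen, the building block that spectral-phase methods of this kind elaborate on; the von Karman-like spectrum, its parameters, and the overall normalization are schematic assumptions, not the cited modified method.

      # Sketch: FFT-based 2D random phase screen with a von-Karman-like spectrum.
      # Parameters and normalization are schematic, for illustration only.
      import numpy as np

      rng = np.random.default_rng(7)
      N, dx = 256, 0.01                        # grid size, sample spacing (m)
      r0, L0 = 0.1, 25.0                       # Fried parameter, outer scale (m)

      fx = np.fft.fftfreq(N, d=dx)
      fxx, fyy = np.meshgrid(fx, fx)
      f2 = fxx**2 + fyy**2 + (1.0 / L0) ** 2   # outer scale regularizes f = 0

      psd = 0.023 * r0 ** (-5.0 / 3.0) * f2 ** (-11.0 / 6.0)  # phase PSD (assumed form)

      delta_f = 1.0 / (N * dx)
      c = (rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))) / np.sqrt(2.0)
      c *= np.sqrt(psd) * delta_f              # weight spectrum by sqrt(PSD)
      screen = np.real(np.fft.ifft2(c)) * N * N   # overall scaling kept schematic
      print(f"phase screen RMS: {screen.std():.2f} rad")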

  7. Computer simulations of WIGWAM underwater experiment

    SciTech Connect

    Kamegai, Minao; White, J.W.

    1993-11-01

    We performed computer simulations of the WIGWAM underwater experiment with a 2-D hydro-code, CALE. First, we calculated the bubble pulse and the signal strength at the closest gauge in one-dimensional geometry. The calculation shows excellent agreement with the measured data. Next, we made two-dimensional simulations of WIGWAM applying the gravity over-pressure, and calculated the signals at three selected gauge locations where measurements were recorded. The computed peak pressures at those gauge locations come well within the 15% experimental error bars. The signal at the farthest gauge is of the order of 200 bars. This is significant, because at this pressure the CALE output can be linked to a hydro-acoustics computer program, NPE Code (Nonlinear Progressive Wave-equation Code), to analyze the long distance propagation of acoustical signals from the underwater explosions on a global scale.

  8. Cosmological Simulations on a Grid of Computers

    NASA Astrophysics Data System (ADS)

    Depardon, Benjamin; Caron, Eddy; Desprez, Frédéric; Blaizot, Jérémy; Courtois, Hélène

    2010-06-01

    The work presented in this paper aims at restricting the input parameter values of the semi-analytical model used in GALICS and MOMAF, so as to derive which parameters influence the results the most, e.g., star formation, feedback and halo recycling efficiencies, etc. Our approach is to proceed empirically: we run many simulations and derive the correct ranges of values. The computation time needed is so large that we need to run on a grid of computers. Hence, we model GALICS and MOMAF execution time and output file sizes, and run the simulation using a grid middleware: DIET. All the complexity of accessing resources, scheduling simulations and managing data is harnessed by DIET and hidden behind a web portal accessible to the users.

  9. Computational Simulation of Composite Structural Fatigue

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Chamis, Christos C. (Technical Monitor)

    2005-01-01

    Progressive damage and fracture of composite structures subjected to monotonically increasing static, tension-tension cyclic, pressurization, and flexural cyclic loading are evaluated via computational simulation. Constituent material properties, stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for composites. Damage initiation, growth, accumulation, and propagation to fracture due to monotonically increasing static and cyclic loads are included in the simulations. Results show the number of cycles to failure at different temperatures and the damage progression sequence during different degradation stages. A procedure is outlined for use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of results with insight for design decisions.

  10. Computational Simulation of Composite Structural Fatigue

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon

    2004-01-01

    Progressive damage and fracture of composite structures subjected to monotonically increasing static, tension-tension cyclic, pressurization, and flexural cyclic loading are evaluated via computational simulation. Constituent material properties, stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for composites. Damage initiation, growth, accumulation, and propagation to fracture due to monotonically increasing static and cyclic loads are included in the simulations. Results show the number of cycles to failure at different temperatures and the damage progression sequence during different degradation stages. A procedure is outlined for use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of results with insight for design decisions.

  11. High-performance computing MRI simulations.

    PubMed

    Stöcker, Tony; Vahedipour, Kaveh; Pflugfelder, Daniel; Shah, N Jon

    2010-07-01

    A new open-source software project is presented, JEMRIS, the Jülich Extensible MRI Simulator, which provides an MRI sequence development and simulation environment for the MRI community. The development was driven by the desire to achieve generality of simulated three-dimensional MRI experiments reflecting modern MRI systems hardware. The accompanying computational burden is overcome by means of parallel computing. Many aspects are covered that have not hitherto been simultaneously investigated in general MRI simulations such as parallel transmit and receive, important off-resonance effects, nonlinear gradients, and arbitrary spatiotemporal parameter variations at different levels. The latter can be used to simulate various types of motion, for instance. The JEMRIS user interface is very simple to use, but nevertheless it presents few limitations. MRI sequences with arbitrary waveforms and complex interdependent modules are modeled in a graphical user interface-based environment requiring no further programming. This manuscript describes the concepts, methods, and performance of the software. Examples of novel simulation results in active fields of MRI research are given.

  12. Predicting transcription factor specificity with all-atom models.

    PubMed

    Jamal Rahi, Sahand; Virnau, Peter; Mirny, Leonid A; Kardar, Mehran

    2008-11-01

    The binding of a transcription factor (TF) to a DNA operator site can initiate or repress the expression of a gene. Computational prediction of sites recognized by a TF has traditionally relied upon knowledge of several cognate sites, rather than an ab initio approach. Here, we examine the possibility of using structure-based energy calculations that require no knowledge of bound sites but rather start with the structure of a protein-DNA complex. We study the PurR Escherichia coli TF, and explore to which extent atomistic models of protein-DNA complexes can be used to distinguish between cognate and noncognate DNA sites. Particular emphasis is placed on systematic evaluation of this approach by comparing its performance with bioinformatic methods, by testing it against random decoys and sites of homologous TFs. We also examine a set of experimental mutations in both DNA and the protein. Using our explicit estimates of energy, we show that the specificity for PurR is dominated by direct protein-DNA interactions, and weakly influenced by bending of DNA.

  13. Computational Challenges in Nuclear Weapons Simulation

    SciTech Connect

    McMillain, C F; Adams, T F; McCoy, M G; Christensen, R B; Pudliner, B S; Zika, M R; Brantley, P S; Vetter, J S; May, J M

    2003-08-29

    After a decade of experience, the Stockpile Stewardship Program continues to ensure the safety, security and reliability of the nation's nuclear weapons. The Advanced Simulation and Computing (ASCI) program was established to provide leading edge, high-end simulation capabilities needed to meet the program's assessment and certification requirements. The great challenge of this program lies in developing the tools and resources necessary for the complex, highly coupled, multi-physics calculations required to simulate nuclear weapons. This paper describes the hardware and software environment we have applied to fulfill our nuclear weapons responsibilities. It also presents the characteristics of our algorithms and codes, especially as they relate to supercomputing resource capabilities and requirements. It then addresses impediments to the development and application of nuclear weapon simulation software and hardware and concludes with a summary of observations and recommendations on an approach for working with industry and government agencies to address these impediments.

  14. Computer Simulations Improve University Instructional Laboratories

    PubMed Central

    2004-01-01

    Laboratory classes are commonplace and essential in biology departments but can sometimes be cumbersome, unreliable, and a drain on time and resources. As university intakes increase, pressure on budgets and staff time can often lead to reduction in practical class provision. Frequently, the ability to use laboratory equipment, mix solutions, and manipulate test animals are essential learning outcomes, and “wet” laboratory classes are thus appropriate. In others, however, interpretation and manipulation of the data are the primary learning outcomes, and here, computer-based simulations can provide a cheaper, easier, and less time- and labor-intensive alternative. We report the evaluation of two computer-based simulations of practical exercises: the first in chromosome analysis, the second in bioinformatics. Simulations can provide significant time savings to students (by a factor of four in our first case study) without affecting learning, as measured by performance in assessment. Moreover, under certain circumstances, performance can be improved by the use of simulations (by 7% in our second case study). We concluded that the introduction of these simulations can significantly enhance student learning where consideration of the learning outcomes indicates that it might be appropriate. In addition, they can offer significant benefits to teaching staff. PMID:15592599

  15. Ionic velocities in an ionic liquid under high electric fields using all-atom and coarse-grained force field molecular dynamics

    NASA Astrophysics Data System (ADS)

    Daily, John W.; Micci, Michael M.

    2009-09-01

    Molecular dynamics has been used to estimate ionic velocities and electrical conductivity in the ionic liquid 1-ethyl-3-methylimidazolium/tetrafluoroborate (EMIM-BF4). Both all-atom and coarse-grained force fields were explored. The simulations were carried out at high electric fields, where one might expect the Wien effect to become important in conventional electrolytes, and that effect is observed. While the original Wilson theory used to explain the Wien effect in conventional electrolytes does not work well for ionic liquids, a minor modification of the theory allowed it to be used to qualitatively describe the data. The two coarse-graining methods were noisier, as expected, but resulted in significant savings in computational cost.
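
    For context, the reported quantities follow from simple relations once trajectories are available: a linear fit of the field-direction displacement gives the drift velocity of each species, and the conductivity follows from sigma = n*q*|v_drift|/E summed over species. The sketch below uses synthetic displacement data in place of real MD trajectories; all numbers are placeholders, not the EMIM-BF4 results.

      # Sketch: drift velocity and conductivity from (synthetic) ion displacements
      # under an applied field. All parameter values are placeholders.
      import numpy as np

      rng = np.random.default_rng(3)
      E = 5.0e8                       # applied field (V/m), high-field regime
      n = 3.0e27                      # number density of ion pairs (1/m^3)
      q = 1.602e-19                   # elementary charge (C)
      dt, n_steps = 2.0e-15, 50000    # time step (s) and number of steps

      v_true = {"cation": 12.0, "anion": -9.0}        # assumed drift velocities (m/s)
      sigma = 0.0
      for species, v in v_true.items():
          steps = v * dt + rng.normal(0.0, 1.0e-12, size=n_steps)  # drift + noise
          displacement = np.cumsum(steps)
          t = np.arange(1, n_steps + 1) * dt
          v_d = np.polyfit(t, displacement, 1)[0]     # slope = drift velocity
          sigma += n * q * abs(v_d) / E               # each species contributes
          print(f"{species}: estimated drift ~ {v_d:.1f} m/s")
      print(f"estimated conductivity ~ {sigma:.1f} S/m")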

  16. Memory interface simulator: A computer design aid

    NASA Technical Reports Server (NTRS)

    Taylor, D. S.; Williams, T.; Weatherbee, J. E.

    1972-01-01

    Results are presented of a study conducted with a digital simulation model being used in the design of the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. The model simulates the activity involved as instructions are fetched from random access memory for execution in one of the system central processing units. A series of model runs measured instruction execution time under various assumptions pertaining to the CPU's and the interface between the CPU's and RAM. Design tradeoffs are presented in the following areas: Bus widths, CPU microprogram read only memory cycle time, multiple instruction fetch, and instruction mix.

  17. Introduction to computational oral absorption simulation.

    PubMed

    Sugano, Kiyohiko

    2009-03-01

    Computational oral absorption simulation (COAS) is anticipated to be a powerful tool in improving the productivity of drug discovery and development. This article reviews the theories of pharmaceutical sciences that consist of COAS. Although most of these theories are classical, they are revisited from the context of modern drug discovery and development. The theories of solubility, diffusion, dissolution, precipitation, intestinal membrane permeation and gastrointestinal transit are comprehensively described. Prediction strategy is then discussed based on the biopharmaceutical classification system. In the final part, good simulation practice is proposed and many frequently asked questions answered.

  18. Computer Simulation of the VASIMR Engine

    NASA Technical Reports Server (NTRS)

    Garrison, David

    2005-01-01

    The goal of this project is to develop a magneto-hydrodynamic (MHD) computer code for simulation of the VASIMR engine. This code is designed to be easy to modify and use. We achieve this using the Cactus framework, a system originally developed for research in numerical relativity. Since its release, Cactus has become an extremely powerful and flexible open source framework. The development of the code will be done in stages, starting with a basic fluid dynamic simulation and working towards a more complex MHD code. Once developed, this code can be used by students and researchers in order to further test and improve the VASIMR engine.

  19. Computer Simulation for Emergency Incident Management

    SciTech Connect

    Brown, D L

    2004-12-03

    This report describes the findings and recommendations resulting from the Department of Homeland Security (DHS) Incident Management Simulation Workshop held by the DHS Advanced Scientific Computing Program in May 2004. This workshop brought senior representatives of the emergency response and incident-management communities together with modeling and simulation technologists from Department of Energy laboratories. The workshop provided an opportunity for incident responders to describe the nature and substance of the primary personnel roles in an incident response, to identify current and anticipated roles of modeling and simulation in support of incident response, and to begin a dialog between the incident response and simulation technology communities that will guide and inform planned modeling and simulation development for incident response. This report provides a summary of the discussions at the workshop as well as a summary of simulation capabilities that are relevant to incident-management training, and recommendations for the use of simulation in both incident management and in incident management training, based on the discussions at the workshop. In addition, the report discusses areas where further research and development will be required to support future needs in this area.

  20. Fiber Composite Sandwich Thermostructural Behavior - Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Aiello, Robert A.; Murthy, Pappu L. N.

    1988-01-01

    Four computational simulation methods with different levels of sophistication were used to simulate the thermal behavior and structural changes of composite sandwich panels with a honeycomb core subjected to a variety of environmental effects. The models on which these methods are based include three-dimensional finite-element modeling, three-dimensional finite-element modeling assuming a homogeneous core, laminate theory, and simple equations for predicting the equivalent properties of the honeycomb core. A procedure was developed and embedded in a composite mechanics computer code, which made it possible to conduct parametric studies to determine 'optimum' composite sandwich configurations for specific applications. The procedure was applied to evaluate composite sandwich behavior at the global, local, laminate, ply, and micromechanics levels when the composite sandwich is subjected to hygral, thermal, and mechanical loading environments.

  1. Metal matrix composites microfracture: Computational simulation

    NASA Technical Reports Server (NTRS)

    Mital, Subodh K.; Caruso, John J.; Chamis, Christos C.

    1990-01-01

    Fiber/matrix fracture and fiber-matrix interface debonding in a metal matrix composite (MMC) are computationally simulated. These simulations are part of a research activity to develop computational methods for microfracture, microfracture propagation and fracture toughness of metal matrix composites. The three-dimensional finite element model used in the simulation consists of a group of nine unidirectional fibers in a three-by-three unit cell array of SiC/Ti15 metal matrix composite with a fiber volume ratio of 0.35. This computational procedure is used to predict the fracture process and establish the hierarchy of fracture modes based on strain energy release rate. It is also used to predict stress redistribution to the surrounding matrix and fibers due to initial and progressive fracture of the fiber/matrix and due to debonding of the fiber-matrix interface. Microfracture results for various loading cases such as longitudinal, transverse, shear and bending are presented and discussed. Step-by-step procedures are outlined to evaluate composite microfracture for a given composite system.

  2. Integrated computer simulation on FIR FEL dynamics

    SciTech Connect

    Furukawa, H.; Kuruma, S.; Imasaki, K.

    1995-12-31

    An integrated computer simulation code has been developed to analyze RF-Linac FEL dynamics. First, a simulation code for the electron beam acceleration and transport processes in the RF-Linac (LUNA) was developed to analyze the characteristics of the electron beam and to optimize the RF-Linac parameters. Second, a space-time dependent 3D FEL simulation code (Shipout) was developed. Total RF-Linac FEL simulations have been performed by using the electron beam data from LUNA in Shipout. The number of particles used in an RF-Linac FEL total simulation is approximately 1000, and the CPU time for simulating one round trip is about 1.5 minutes. At ILT/ILE, Osaka, an 8.5 MeV RF-Linac with a photo-cathode RF gun is used for FEL oscillation experiments. Using a 2 cm wiggler, FEL oscillation at a wavelength of approximately 46 μm is investigated. Simulations using LUNA with the parameters of an ILT/ILE experiment estimate the pulse shape and energy spectra of the electron beam at the end of the linac; the pulse shape has a sharp rise and then decays slowly as a function of time. Total RF-Linac FEL simulations with the parameters of an ILT/ILE experiment estimate how the start-up of the FEL oscillations depends on the pulse shape of the electron beam at the end of the linac. Coherent spontaneous emission effects and a quick start-up of FEL oscillations have been observed in these total simulations.

  3. Accelerating Climate Simulations Through Hybrid Computing

    NASA Technical Reports Server (NTRS)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

    Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPUs) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for the connection, we identified two challenges: (1) an identical MPI implementation is required in both systems, and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors and two IBM QS22 Cell blades, connected with InfiniBand), allowing compute-intensive functions to be seamlessly offloaded to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.

  4. Bridging between NMA and Elastic Network Models: Preserving All-Atom Accuracy in Coarse-Grained Models.

    PubMed

    Na, Hyuntae; Jernigan, Robert L; Song, Guang

    2015-10-01

    Dynamics can provide deep insights into the functional mechanisms of proteins and protein complexes. For large protein complexes such as GroEL/GroES with more than 8,000 residues, obtaining a fine-grained all-atom description of its normal mode motions can be computationally prohibitive and is often unnecessary. For this reason, coarse-grained models have been used successfully. However, most existing coarse-grained models use extremely simple potentials to represent the interactions within the coarse-grained structures and as a result, the dynamics obtained for the coarse-grained structures may not always be fully realistic. There is a gap between the quality of the dynamics of the coarse-grained structures given by all-atom models and that by coarse-grained models. In this work, we resolve an important question in protein dynamics computations--how can we efficiently construct coarse-grained models whose description of the dynamics of the coarse-grained structures remains as accurate as that given by all-atom models? Our method takes advantage of the sparseness of the Hessian matrix and achieves a high efficiency with a novel iterative matrix projection approach. The result is highly significant since it can provide descriptions of normal mode motions at an all-atom level of accuracy even for the largest biomolecular complexes. The application of our method to GroEL/GroES offers new insights into the mechanism of this biologically important chaperonin, such as that the conformational transitions of this protein complex in its functional cycle are even more strongly connected to the first few lowest frequency modes than with other coarse-grained models.
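
    The underlying idea, shown here in a dense, brute-force form that is not the authors' sparse iterative matrix projection, is to eliminate the non-coarse-grained degrees of freedom from the all-atom Hessian with a Schur complement so that the reduced model retains the all-atom description of the motions of the kept sites.

```python
import numpy as np

# Dense, brute-force illustration of the underlying idea (not the authors' sparse
# iterative matrix projection): eliminate the non-coarse-grained degrees of
# freedom from an all-atom Hessian with a Schur complement, so the reduced model
# reproduces the all-atom normal-mode behaviour of the retained sites.

def effective_hessian(hessian, cg_dof):
    """Effective Hessian on the coarse-grained degrees of freedom.

    hessian -- (3N, 3N) all-atom Hessian, symmetric
    cg_dof  -- indices of the retained (coarse-grained) degrees of freedom
    """
    all_dof = np.arange(hessian.shape[0])
    env_dof = np.setdiff1d(all_dof, cg_dof)
    h_cc = hessian[np.ix_(cg_dof, cg_dof)]
    h_ce = hessian[np.ix_(cg_dof, env_dof)]
    h_ee = hessian[np.ix_(env_dof, env_dof)]
    # Schur complement: H_cc - H_ce H_ee^{-1} H_ec
    return h_cc - h_ce @ np.linalg.solve(h_ee, h_ce.T)

# Tiny synthetic example: a positive-definite 12x12 "Hessian" (4 atoms),
# coarse-grained onto the first 2 atoms (6 degrees of freedom).
rng = np.random.default_rng(0)
a = rng.normal(size=(12, 12))
h_all = a @ a.T + 12.0 * np.eye(12)
h_eff = effective_hessian(h_all, cg_dof=np.arange(6))
print("reduced eigenvalues:", np.round(np.linalg.eigvalsh(h_eff), 2))
```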

  5. Neural network computer simulation of medical aerosols.

    PubMed

    Richardson, C J; Barlow, D J

    1996-06-01

    Preliminary investigations have been conducted to assess the potential for using artificial neural networks to simulate aerosol behaviour, with a view to employing this type of methodology in the evaluation and design of pulmonary drug-delivery systems. Details are presented of the general purpose software developed for these tasks; it implements a feed-forward back-propagation algorithm with weight decay and connection pruning, the user having complete run-time control of the network architecture and mode of training. A series of exploratory investigations is then reported in which different network structures and training strategies are assessed in terms of their ability to simulate known patterns of fluid flow in simple model systems. The first of these involves simulations of cellular automata-generated data for fluid flow through a partially obstructed two-dimensional pipe. The artificial neural networks are shown to be highly successful in simulating the behaviour of this simple linear system, but with important provisos relating to the information content of the training data and the criteria used to judge when the network is properly trained. A second set of investigations is then reported in which similar networks are used to simulate patterns of fluid flow through aerosol generation devices, using training data furnished through rigorous computational fluid dynamics modelling. These more complex three-dimensional systems are modelled with equal success. It is concluded that carefully tailored, well trained networks could provide valuable tools not just for predicting but also for analysing the spatial dynamics of pharmaceutical aerosols.
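
    A minimal sketch of two of the named ingredients, a feed-forward network trained by back-propagation with L2 weight decay, is given below; connection pruning and the run-time control of the network architecture in the actual software are not reproduced, and the toy fitting target is invented for the example.

```python
import numpy as np

# Minimal feed-forward / back-propagation sketch with L2 weight decay, two of the
# ingredients named in the abstract. Connection pruning and the run-time control
# of the network architecture in the actual software are not reproduced here.

rng = np.random.default_rng(1)

def train(x, y, hidden=8, lr=0.1, decay=1e-4, epochs=5000):
    w1 = rng.normal(scale=0.5, size=(x.shape[1], hidden))
    w2 = rng.normal(scale=0.5, size=(hidden, y.shape[1]))
    for _ in range(epochs):
        h = np.tanh(x @ w1)               # forward pass: tanh hidden layer
        pred = h @ w2                     # linear output layer
        err = pred - y
        g_w2 = h.T @ err / len(x)         # back-propagated gradients
        g_h = (err @ w2.T) * (1.0 - h ** 2)
        g_w1 = x.T @ g_h / len(x)
        w1 -= lr * (g_w1 + decay * w1)    # gradient step with weight decay
        w2 -= lr * (g_w2 + decay * w2)
    return w1, w2

# Toy target: a parabolic "flow profile" across a channel section.
x = np.linspace(-1.0, 1.0, 50)[:, None]
y = 1.0 - x ** 2
w1, w2 = train(x, y)
fit = np.tanh(x @ w1) @ w2
print("max absolute fit error:", round(float(np.abs(fit - y).max()), 3))
```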

  6. Utility of computer simulations in landscape genetics.

    PubMed

    Epperson, Bryan K; McRae, Brad H; Scribner, Kim; Cushman, Samuel A; Rosenberg, Michael S; Fortin, Marie-Josée; James, Patrick M A; Murphy, Melanie; Manel, Stéphanie; Legendre, Pierre; Dale, Mark R T

    2010-09-01

    Population genetics theory is primarily based on mathematical models in which spatial complexity and temporal variability are largely ignored. In contrast, the field of landscape genetics expressly focuses on how population genetic processes are affected by complex spatial and temporal environmental heterogeneity. It is spatially explicit and relates patterns to processes by combining complex and realistic life histories, behaviours, landscape features and genetic data. Central to landscape genetics is the connection of spatial patterns of genetic variation to the usually highly stochastic space-time processes that create them over both historical and contemporary time periods. The field should benefit from a shift to computer simulation approaches, which enable incorporation of demographic and environmental stochasticity. A key role of simulations is to show how demographic processes such as dispersal or reproduction interact with landscape features to affect probability of site occupancy, population size, and gene flow, which in turn determine spatial genetic structure. Simulations could also be used to compare various statistical methods and determine which have correct type I error or the highest statistical power to correctly identify spatio-temporal and environmental effects. Simulations may also help in evaluating how specific spatial metrics may be used to project future genetic trends. This article summarizes some of the fundamental aspects of spatial-temporal population genetic processes. It discusses the potential use of simulations to determine how various spatial metrics can be rigorously employed to identify features of interest, including contrasting locus-specific spatial patterns due to micro-scale environmental selection.

  7. A computer simulation of chromosomal instability

    NASA Astrophysics Data System (ADS)

    Goodwin, E.; Cornforth, M.

    The transformation of a normal cell into a cancerous growth can be described as a process of mutation and selection occurring within the context of clonal expansion. Radiation, in addition to initial DNA damage, induces a persistent and still poorly understood genomic instability process that contributes to the mutational burden. It will be essential to include a quantitative description of this phenomenon in any attempt at science-based risk assessment. Monte Carlo computer simulations are a relatively simple way to model processes that are characterized by an element of randomness. A properly constructed simulation can capture the essence of a phenomenon that, as is often the case in biology, can be extraordinarily complex, and can do so even though the phenomenon itself is incompletely understood. A simple computer simulation of one manifestation of genomic instability known as chromosomal instability will be presented. The model simulates clonal expansion of a single chromosomally unstable cell into a colony. Instability is characterized by a single parameter, the rate of chromosomal rearrangement. With each new chromosome aberration, a unique subclone arises (subclones are defined as having a unique karyotype). The subclone initially has just one cell, but it can expand with cell division if the aberration is not lethal. The computer program automatically keeps track of the number of subclones within the expanding colony, and the number of cells within each subclone. Because chromosome aberrations kill some cells during colony growth, colonies arising from unstable cells tend to be smaller than those arising from stable cells. For any chosen level of instability, the computer program calculates the mean number of cells per colony averaged over many runs. These outputs should prove useful for investigating how such radiobiological phenomena as slow growth colonies, increased doubling time, and delayed cell death depend on chromosomal instability. Also of
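
    A minimal Monte Carlo sketch in the spirit of the model described above is given below; the rearrangement rate and lethal fraction are illustrative values, not parameters from the study.

```python
import random

# Minimal Monte Carlo sketch in the spirit of the model described above: a colony
# grows from one chromosomally unstable cell, each division may generate a new
# aberration (and hence a new subclone), and some aberrations are lethal.
# The rate and lethality values are illustrative, not taken from the study.

REARRANGEMENT_RATE = 0.05    # probability of a new aberration per daughter cell
LETHAL_FRACTION = 0.4        # fraction of new aberrations that are lethal

def grow_colony(divisions=10, seed=None):
    rng = random.Random(seed)
    subclones = {0: 1}                 # karyotype id -> cell count; 0 = founder
    next_karyotype = 1
    for _ in range(divisions):         # synchronous rounds of division
        grown = {}
        for karyotype, count in subclones.items():
            for _ in range(count):
                for _daughter in range(2):
                    k = karyotype
                    if rng.random() < REARRANGEMENT_RATE:
                        if rng.random() < LETHAL_FRACTION:
                            continue               # lethal aberration: cell dies
                        k = next_karyotype         # viable new subclone
                        next_karyotype += 1
                    grown[k] = grown.get(k, 0) + 1
        subclones = grown
    return subclones

colony = grow_colony(divisions=10, seed=42)
print("cells:", sum(colony.values()), "| subclones:", len(colony))
```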

  8. New Computer Simulations of Macular Neural Functioning

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.; Doshay, D.; Linton, S.; Parnas, B.; Montgomery, K.; Chimento, T.

    1994-01-01

    We use high performance graphics workstations and supercomputers to study the functional significance of the three-dimensional (3-D) organization of gravity sensors. These sensors have a prototypic architecture foreshadowing more complex systems. Scaled-down simulations run on a Silicon Graphics workstation and scaled-up, 3-D versions run on a Cray Y-MP supercomputer. A semi-automated method of reconstruction of neural tissue from serial sections studied in a transmission electron microscope has been developed to eliminate tedious conventional photography. The reconstructions use a mesh as a step in generating a neural surface for visualization. Two meshes are required to model calyx surfaces. The meshes are connected and the resulting prisms represent the cytoplasm and the bounding membranes. A finite volume analysis method is employed to simulate voltage changes along the calyx in response to synapse activation on the calyx or on calyceal processes. The finite volume method insures that charge is conserved at the calyx-process junction. These and other models indicate that efferent processes act as voltage followers, and that the morphology of some afferent processes affects their functioning. In a final application, morphological information is symbolically represented in three dimensions in a computer. The possible functioning of the connectivities is tested using mathematical interpretations of physiological parameters taken from the literature. Symbolic, 3-D simulations are in progress to probe the functional significance of the connectivities. This research is expected to advance computer-based studies of macular functioning and of synaptic plasticity.

  9. Computer simulations of the Ni2MnGa alloys

    NASA Astrophysics Data System (ADS)

    Breczko, Teodor M.; Nelayev, Vladislav; Dovzhik, Krishna; Najbuk, Miroslaw

    2008-07-01

    This article reports computer simulations of the physical properties of the Heusler alloy Ni2MnGa. The simulations are devoted to the austenite phase. The chemical composition of the specimens studied gives rise to both martensite and austenite phases.

  10. Investigation of Carbohydrate Recognition via Computer Simulation

    SciTech Connect

    Johnson, Quentin R.; Lindsay, Richard J.; Petridis, Loukas; Shen, Tongye

    2015-04-28

    Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Interest has arisen due to potential protein and drug design and future bioengineering applications. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. Here, we focus on the aspect of utilizing computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition in this review. With increasing computational resources, better algorithms and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years.

  11. Parallel Proximity Detection for Computer Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)

    1998-01-01

    The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.

  12. Parallel Proximity Detection for Computer Simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)

    1997-01-01

    The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.

  13. Investigation of Carbohydrate Recognition via Computer Simulation.

    PubMed

    Johnson, Quentin R; Lindsay, Richard J; Petridis, Loukas; Shen, Tongye

    2015-01-01

    Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Recently, interest has arisen due to potential protein and drug design and future bioengineering applications. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. We focus on the aspect of utilizing computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition in this review. With increasing computational resources, better algorithms and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years. PMID:25927900

  14. Investigation of Carbohydrate Recognition via Computer Simulation

    DOE PAGES

    Johnson, Quentin R.; Lindsay, Richard J.; Petridis, Loukas; Shen, Tongye

    2015-04-28

    Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Interest has arisen due to potential protein and drug design and future bioengineering applications. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. Here, we focus on the aspect of utilizing computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition in this review. With increasing computational resources, better algorithms and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years.

  15. Investigation of Carbohydrate Recognition via Computer Simulation.

    PubMed

    Johnson, Quentin R; Lindsay, Richard J; Petridis, Loukas; Shen, Tongye

    2015-04-28

    Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Recently, interest has arisen due to potential protein and drug design and future bioengineering applications. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. We focus on the aspect of utilizing computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition in this review. With increasing computational resources, better algorithms and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years.

  16. Computer simulation of spacecraft/environment interaction.

    PubMed

    Krupnikov, K K; Makletsov, A A; Mileev, V N; Novikov, L S; Sinolits, V V

    1999-10-01

    This report presents some examples of a computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for an estimation of spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for a calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the virtual reality markup language. PMID:11542669

  17. Computer simulation of spacecraft/environment interaction.

    PubMed

    Krupnikov, K K; Makletsov, A A; Mileev, V N; Novikov, L S; Sinolits, V V

    1999-10-01

    This report presents some examples of a computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for an estimation of spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for a calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the virtual reality markup language.

  18. Multidimensional computer simulation of Stirling cycle engines

    NASA Technical Reports Server (NTRS)

    Hall, C. A.; Porsching, T. A.; Medley, J.; Tew, R. C.

    1990-01-01

    The computer code ALGAE (algorithms for the gas equations) treats incompressible, thermally expandable, or locally compressible flows in complicated two-dimensional flow regions. The solution method, finite differencing schemes, and basic modeling of the field equations in ALGAE are applicable to engineering design settings of the type found in Stirling cycle engines. The use of ALGAE to model multiple components of the space power research engine (SPRE) is reported. Videotape computer simulations of the transient behavior of the working gas (helium) in the heater-regenerator-cooler complex of the SPRE demonstrate the usefulness of such a program in providing information on thermal and hydraulic phenomena in multiple component sections of the SPRE.

  19. Multiscale approach for the construction of equilibrated all-atom models of a poly(ethylene glycol)-based hydrogel.

    PubMed

    Li, Xianfeng; Murthy, N Sanjeeva; Becker, Matthew L; Latour, Robert A

    2016-06-24

    A multiscale modeling approach is presented for the efficient construction of an equilibrated all-atom model of a cross-linked poly(ethylene glycol) (PEG)-based hydrogel using the all-atom polymer consistent force field (PCFF). The final equilibrated all-atom model was built with a systematic simulation toolset consisting of three consecutive parts: (1) building a global cross-linked PEG-chain network at experimentally determined cross-link density using an on-lattice Monte Carlo method based on the bond fluctuation model, (2) recovering the local molecular structure of the network by transitioning from the lattice model to an off-lattice coarse-grained (CG) model parameterized from PCFF, followed by equilibration using high performance molecular dynamics methods, and (3) recovering the atomistic structure of the network by reverse mapping from the equilibrated CG structure, hydrating the structure with explicitly represented water, followed by final equilibration using PCFF parameterization. The developed three-stage modeling approach has application to a wide range of other complex macromolecular hydrogel systems, including the integration of peptide, protein, and/or drug molecules as side-chains within the hydrogel network for the incorporation of bioactivity for tissue engineering, regenerative medicine, and drug delivery applications.
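
    As a highly simplified, single-chain illustration of the on-lattice bond fluctuation model used in stage (1) (the actual toolset builds a cross-linked PEG network at the experimental cross-link density and couples it to the CG and all-atom PCFF stages, none of which is reproduced here), the sketch below performs athermal Monte Carlo moves subject to the model's excluded-volume and allowed-bond-length constraints.

```python
import random

# Highly simplified single-chain sketch of the on-lattice bond fluctuation model
# (BFM) used in stage (1) of the workflow. The actual toolset builds a cross-linked
# PEG network at the experimental cross-link density and feeds it into CG and
# all-atom PCFF stages; none of that is reproduced here.

ALLOWED_BOND_SQ = {4, 5, 6, 9, 10, 13}   # allowed squared bond lengths in the BFM
MOVES = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def cube_sites(p):
    """The 2x2x2 block of lattice sites occupied by a monomer at corner p."""
    x, y, z = p
    return {(x + i, y + j, z + k) for i in (0, 1) for j in (0, 1) for k in (0, 1)}

def bond_ok(a, b):
    return sum((a[i] - b[i]) ** 2 for i in range(3)) in ALLOWED_BOND_SQ

def mc_sweep(chain, occupied, rng):
    """One Monte Carlo sweep of single-monomer moves on an athermal chain."""
    n = len(chain)
    for _ in range(n):
        i = rng.randrange(n)
        dx, dy, dz = rng.choice(MOVES)
        new = (chain[i][0] + dx, chain[i][1] + dy, chain[i][2] + dz)
        old_sites = cube_sites(chain[i])
        # excluded volume: newly claimed sites must not belong to other monomers
        if (cube_sites(new) - old_sites) & occupied:
            continue
        # bond-length restrictions with the neighbours along the chain
        if i > 0 and not bond_ok(new, chain[i - 1]):
            continue
        if i < n - 1 and not bond_ok(new, chain[i + 1]):
            continue
        occupied -= old_sites
        occupied |= cube_sites(new)
        chain[i] = new

rng = random.Random(7)
chain = [(2 * i, 0, 0) for i in range(20)]     # straight initial chain, bond length 2
occupied = set().union(*(cube_sites(p) for p in chain))
for _ in range(1000):
    mc_sweep(chain, occupied, rng)
end_to_end_sq = sum((chain[-1][k] - chain[0][k]) ** 2 for k in range(3))
print("squared end-to-end distance after relaxation:", end_to_end_sq)
```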

  20. Bridging between NMA and Elastic Network Models: Preserving All-Atom Accuracy in Coarse-Grained Models

    PubMed Central

    Na, Hyuntae; Jernigan, Robert L.; Song, Guang

    2015-01-01

    Dynamics can provide deep insights into the functional mechanisms of proteins and protein complexes. For large protein complexes such as GroEL/GroES with more than 8,000 residues, obtaining a fine-grained all-atom description of its normal mode motions can be computationally prohibitive and is often unnecessary. For this reason, coarse-grained models have been used successfully. However, most existing coarse-grained models use extremely simple potentials to represent the interactions within the coarse-grained structures and as a result, the dynamics obtained for the coarse-grained structures may not always be fully realistic. There is a gap between the quality of the dynamics of the coarse-grained structures given by all-atom models and that by coarse-grained models. In this work, we resolve an important question in protein dynamics computations—how can we efficiently construct coarse-grained models whose description of the dynamics of the coarse-grained structures remains as accurate as that given by all-atom models? Our method takes advantage of the sparseness of the Hessian matrix and achieves a high efficiency with a novel iterative matrix projection approach. The result is highly significant since it can provide descriptions of normal mode motions at an all-atom level of accuracy even for the largest biomolecular complexes. The application of our method to GroEL/GroES offers new insights into the mechanism of this biologically important chaperonin, such as that the conformational transitions of this protein complex in its functional cycle are even more strongly connected to the first few lowest frequency modes than with other coarse-grained models. PMID:26473491

  1. Computer Simulation Studies of Gramicidin Channel

    NASA Astrophysics Data System (ADS)

    Song, Hyundeok; Beck, Thomas

    2009-04-01

    Ion channels are large membrane proteins whose function is to facilitate the passage of ions across biological membranes. Recently, Dr. John Cuppoletti's group at UC showed that the gramicidin channel could function at high temperatures (360-390 K) with significant currents. This finding may have large implications for fuel cell technology. In this project, we will examine the experimental system by computer simulation. We will investigate how temperature affects the current, and the differences in magnitude of the currents between two forms of Gramicidin, A and D. This research will help to elucidate the underlying molecular mechanism of this promising new technology.

  2. Computer simulation of solder joint failure

    SciTech Connect

    Burchett, S.N.; Frear, D.R.; Rashid, M.M.

    1997-04-01

    The thermomechanical fatigue failure of solder joints is increasingly becoming an important reliability issue for electronic packages. The purpose of this Laboratory Directed Research and Development (LDRD) project was to develop computational tools for simulating the behavior of solder joints under strain and temperature cycling, taking into account the microstructural heterogeneities that exist in as-solidified near eutectic Sn-Pb joints, as well as subsequent microstructural evolution. The authors present two computational constitutive models, a two-phase model and a single-phase model, that were developed to predict the behavior of near eutectic Sn-Pb solder joints under fatigue conditions. Unique metallurgical tests provide the fundamental input for the constitutive relations. The two-phase model mathematically predicts the heterogeneous coarsening behavior of near eutectic Sn-Pb solder. The finite element simulations with this model agree qualitatively with experimental thermomechanical fatigue tests. The simulations show that the presence of an initial heterogeneity in the solder microstructure could significantly degrade the fatigue lifetime. The single-phase model was developed to predict solder joint behavior using materials data for constitutive relation constants that could be determined through straightforward metallurgical experiments. Special thermomechanical fatigue tests were developed to give fundamental materials input to the models, and an in situ SEM thermomechanical fatigue test system was developed to characterize microstructural evolution and the mechanical behavior of solder joints during the test. A shear/torsion test sample was developed to impose strain in two different orientations. Materials constants were derived from these tests. The simulation results from the two-phase model showed good fit to the experimental test results.

  3. Computer Simulation of Fracture in Aerogels

    NASA Technical Reports Server (NTRS)

    Good, Brian S.

    2006-01-01

    Aerogels are of interest to the aerospace community primarily for their thermal properties, notably their low thermal conductivities. While the gels are typically fragile, recent advances in the application of conformal polymer layers to these gels have made them potentially useful as lightweight structural materials as well. In this work, we investigate the strength and fracture behavior of silica aerogels using a molecular statics-based computer simulation technique. The gels' structure is simulated via a Diffusion Limited Cluster Aggregation (DLCA) algorithm, which produces fractal structures representing experimentally observed aggregates of so-called secondary particles, themselves composed of amorphous silica primary particles an order of magnitude smaller. We have performed multi-length-scale simulations of fracture in silica aerogels, in which the interaction between two secondary particles is assumed to be described by a Morse pair potential parameterized such that the potential range is much smaller than the secondary particle size. These Morse parameters are obtained by atomistic simulation of models of the experimentally observed amorphous silica "bridges," with the fracture behavior of these bridges modeled via molecular statics using a Morse/Coulomb potential for silica. We consider the energetics of the fracture, and compare qualitative features of low- and high-density gel fracture.
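
    The Morse pair potential mentioned above has a simple closed form; the sketch below evaluates it and its force with illustrative parameters (in the work, the parameters were obtained by fitting to atomistic simulations of the amorphous silica bridges).

```python
import numpy as np

# Sketch of the Morse pair potential used for the secondary-particle interaction.
# The parameter values here are illustrative; in the work they were obtained from
# atomistic simulations of the amorphous silica "bridges" between particles.

def morse(r, depth=1.0, alpha=20.0, r_eq=1.0):
    """Return V(r) = depth*(1 - exp(-alpha*(r - r_eq)))**2 - depth and F = -dV/dr."""
    x = np.exp(-alpha * (r - r_eq))
    potential = depth * (1.0 - x) ** 2 - depth
    force = -2.0 * depth * alpha * x * (1.0 - x)
    return potential, force

# A large alpha makes the interaction range short compared with the (unit-radius)
# secondary particle size, as described in the abstract.
r = np.linspace(0.95, 1.25, 7)
v, f = morse(r)
print(np.round(v, 3))
```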

  4. Multiscale Computer Simulation of Failure in Aerogels

    NASA Technical Reports Server (NTRS)

    Good, Brian S.

    2008-01-01

    Aerogels have been of interest to the aerospace community primarily for their thermal properties, notably their low thermal conductivities. While such gels are typically fragile, recent advances in the application of conformal polymer layers to these gels have made them potentially useful as lightweight structural materials as well. We have previously performed computer simulations of aerogel thermal conductivity and tensile and compressive failure, with results that are in qualitative, and sometimes quantitative, agreement with experiment. However, recent experiments in our laboratory suggest that gels having similar densities may exhibit substantially different properties. In this work, we extend our original diffusion limited cluster aggregation (DLCA) model for gel structure to incorporate additional variation in DLCA simulation parameters, with the aim of producing DLCA clusters of similar densities that nevertheless have different fractal dimension and secondary particle coordination. We perform particle statics simulations of gel strain on these clusters, and consider the effects of differing DLCA simulation conditions, and the resultant differences in fractal dimension and coordination, on gel strain properties.

  5. Mapping lava flow hazards using computer simulation

    NASA Astrophysics Data System (ADS)

    Wadge, G.; Young, P. A. V.; McKendrick, I. J.

    1994-01-01

    Computer simulations of the paths of flowing lava are achieved using a program, FLOWFRONT, that describes the behavior of flow and digital models of the terrain. Two methods of application of simulations of the hazards posed by lava flows are described. The first, deterministic, method requires that program parameters such as vent position, minimum flow thickness, and thickness/slope relationship be based on the ambient eruptive conditions so that the future course of a specific lava flow can be simulated. This is illustrated using retrospective modeling of the first 21 days of the eruption of an andesitic lava flow at Lonquimay volcano, Chile, in 1988-1989. The usefulness of this method for real-time predictive modeling is likely to be limited by the lack of accurate field data on flow characteristics, the simple nature of the model, and the sensitivity to parameter choice of the final planimetric form of the model flow. The second application is probabilistic in nature and creates a map of the likelihood of inundation by lava flows that is useful for long-term land use planning. This method uses the historical record of past eruptions to constrain a series of Monte Carlo simulations and is illustrated using data from Etna volcano in Sicily. A multivariate statistical analysis of nine parameters for the 1763-1989 eruption catalog using simulated annealing permitted a classification of Etna's flank eruptions into two types: A and B. Type A eruptions are short-lived and produce linear lava flows; type B eruptions are long-lived, and produce lava flows that are much broader in shape, and their vents are restricted to the eastern flank of the volcano. The simulation method consists of creating a probability surface of the location of future eruption vents and segmenting the region according to the most likely historical eruption on which to base the simulation. Analysis of the autocorrelation of the historical eruptions shows that type A eruptions are strongly

  6. Computer Simulations in Science Education: Implications for Distance Education

    ERIC Educational Resources Information Center

    Sahin, Sami

    2006-01-01

    This paper is a review of literature about the use of computer simulations in science education. This review examines types and examples of computer simulations. The literature review indicated that although computer simulations cannot replace science classroom and laboratory activities completely, they offer various advantages both for classroom…

  7. The Learning Effects of Computer Simulations in Science Education

    ERIC Educational Resources Information Center

    Rutten, Nico; van Joolingen, Wouter R.; van der Veen, Jan T.

    2012-01-01

    This article reviews the (quasi)experimental research of the past decade on the learning effects of computer simulations in science education. The focus is on two questions: how use of computer simulations can enhance traditional education, and how computer simulations are best used in order to improve learning processes and outcomes. We report on…

  8. Computational simulation of liquid fuel rocket injectors

    NASA Technical Reports Server (NTRS)

    Landrum, D. Brian

    1994-01-01

    A major component of any liquid propellant rocket is the propellant injection system. Issues of interest include the degree of liquid vaporization and its impact on the combustion process, the pressure and temperature fields in the combustion chamber, and the cooling of the injector face and chamber walls. The Finite Difference Navier-Stokes (FDNS) code is a primary computational tool used in the MSFC Computational Fluid Dynamics Branch. The branch has dedicated a significant amount of resources to development of this code for prediction of both liquid and solid fuel rocket performance. The FDNS code is currently being upgraded to include the capability to model liquid/gas multi-phase flows for fuel injection simulation. An important aspect of this effort is benchmarking the code capabilities to predict existing experimental injection data. The objective of this MSFC/ASEE Summer Faculty Fellowship term was to evaluate the capabilities of the modified FDNS code to predict flow fields with liquid injection. Comparisons were made between code predictions and existing experimental data. A significant portion of the effort included a search for appropriate validation data. Also, code simulation deficiencies were identified.

  9. Preformed template fluctuations promote fibril formation: Insights from lattice and all-atom models

    NASA Astrophysics Data System (ADS)

    Kouza, Maksim; Co, Nguyen Truong; Nguyen, Phuong H.; Kolinski, Andrzej; Li, Mai Suan

    2015-04-01

    Fibril formation resulting from protein misfolding and aggregation is a hallmark of several neurodegenerative diseases such as Alzheimer's and Parkinson's diseases. Despite the fact that the fibril formation process is very slow and thus poses a significant challenge for theoretical and experimental studies, a number of alternative pictures of molecular mechanisms of amyloid fibril formation have been recently proposed. What seems to be common for the majority of the proposed models is that fibril elongation involves the formation of pre-nucleus seeds prior to the creation of a critical nucleus. Once the size of the pre-nucleus seed reaches the critical nucleus size, its thermal fluctuations are expected to be small and the resulting nucleus provides a template for sequential (one-by-one) accommodation of added monomers. The effect of template fluctuations on fibril formation rates has not been explored either experimentally or theoretically so far. In this paper, we make the first attempt at solving this problem by two sets of simulations. To mimic small template fluctuations, in one set, monomers of the preformed template are kept fixed, while in the other set they are allowed to fluctuate. The kinetics of addition of a new peptide onto the template is explored using all-atom simulations with explicit water and the GROMOS96 43a1 force field and simple lattice models. Our result demonstrates that preformed template fluctuations can modulate protein aggregation rates and pathways. The association of a nascent monomer with the template obeys the kinetics partitioning mechanism where the intermediate state occurs in a fraction of routes to the protofibril. It was shown that template immobility greatly increases the time of incorporating a new peptide into the preformed template compared to the fluctuating template case. This observation has also been confirmed by simulation using lattice models and may be invoked to understand the role of template fluctuations in

  10. Preformed template fluctuations promote fibril formation: Insights from lattice and all-atom models

    SciTech Connect

    Kouza, Maksim; Kolinski, Andrzej; Co, Nguyen Truong; Nguyen, Phuong H.; Li, Mai Suan

    2015-04-14

    Fibril formation resulting from protein misfolding and aggregation is a hallmark of several neurodegenerative diseases such as Alzheimer’s and Parkinson’s diseases. Despite the fact that the fibril formation process is very slow and thus poses a significant challenge for theoretical and experimental studies, a number of alternative pictures of molecular mechanisms of amyloid fibril formation have been recently proposed. What seems to be common for the majority of the proposed models is that fibril elongation involves the formation of pre-nucleus seeds prior to the creation of a critical nucleus. Once the size of the pre-nucleus seed reaches the critical nucleus size, its thermal fluctuations are expected to be small and the resulting nucleus provides a template for sequential (one-by-one) accommodation of added monomers. The effect of template fluctuations on fibril formation rates has not been explored either experimentally or theoretically so far. In this paper, we make the first attempt at solving this problem by two sets of simulations. To mimic small template fluctuations, in one set, monomers of the preformed template are kept fixed, while in the other set they are allowed to fluctuate. The kinetics of addition of a new peptide onto the template is explored using all-atom simulations with explicit water and the GROMOS96 43a1 force field and simple lattice models. Our result demonstrates that preformed template fluctuations can modulate protein aggregation rates and pathways. The association of a nascent monomer with the template obeys the kinetics partitioning mechanism where the intermediate state occurs in a fraction of routes to the protofibril. It was shown that template immobility greatly increases the time of incorporating a new peptide into the preformed template compared to the fluctuating template case. This observation has also been confirmed by simulation using lattice models and may be invoked to understand the role of template fluctuations in

  11. Reduction of All-Atom Protein Folding Dynamics to One-Dimensional Diffusion.

    PubMed

    Zheng, Wenwei; Best, Robert B

    2015-12-10

    Theoretical models have often described protein folding dynamics as diffusion on a low-dimensional free energy surface, a remarkable simplification. However, the accuracy of such an approximation and the number of dimensions required were not clear. For all-atom folding simulations of ten small proteins in explicit solvent, we show that the folding dynamics can indeed be accurately described as diffusion on just a single coordinate, the fraction of native contacts (Q). The diffusion models reproduce both folding rates and finer details such as transition-path durations and diffusive propagators. The Q-averaged diffusion coefficients decrease with chain length, as anticipated from energy landscape theory. Although the Q-diffusion model does not capture transition-path durations for the protein NuG2, we show that this can be accomplished by designing an improved coordinate Qopt. Overall, one-dimensional diffusion on a suitable coordinate turns out to be a remarkably faithful model for the dynamics of the proteins considered.
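
    The coordinate Q has a simple operational definition; the sketch below computes a fraction of native contacts with a smooth switching function of the kind commonly paired with this coordinate, using a cutoff, smoothing constant, and toy reference structure that are illustrative rather than the paper's exact definitions.

```python
import numpy as np

# Sketch of a fraction-of-native-contacts coordinate Q, using a smooth switching
# function of the kind commonly paired with this coordinate. The cutoff, smoothing
# and contact definitions are illustrative and not necessarily those of the paper.

BETA = 5.0        # smoothing (1/Angstrom)
LAMBDA = 1.8      # tolerance factor applied to the native distance
CUTOFF = 4.5      # native contact cutoff (Angstrom)

def native_contacts(native_xyz, min_seq_sep=3):
    """Pairs (i, j) within CUTOFF in the reference structure, plus their distances."""
    pairs, dists = [], []
    n = len(native_xyz)
    for i in range(n):
        for j in range(i + min_seq_sep, n):
            d = float(np.linalg.norm(native_xyz[i] - native_xyz[j]))
            if d < CUTOFF:
                pairs.append((i, j))
                dists.append(d)
    return np.array(pairs), np.array(dists)

def fraction_native(xyz, pairs, native_d):
    """Q = <1 / (1 + exp(BETA * (r_ij - LAMBDA * r_ij^0)))> over native pairs."""
    r = np.linalg.norm(xyz[pairs[:, 0]] - xyz[pairs[:, 1]], axis=1)
    return float(np.mean(1.0 / (1.0 + np.exp(BETA * (r - LAMBDA * native_d)))))

# Toy reference structure: a compact 3x3x3 cluster of sites 3 Angstroms apart.
ref = 3.0 * np.array([[i, j, k] for i in range(3) for j in range(3) for k in range(3)],
                     dtype=float)
pairs, d0 = native_contacts(ref)
print(len(pairs), "native contacts; Q of the reference itself =",
      round(fraction_native(ref, pairs, d0), 2))
```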

  12. All-atom molecular dynamics calculation study of entire poliovirus empty capsids in solution

    SciTech Connect

    Andoh, Y.; Yoshii, N.; Yamada, A.; Kojima, H.; Mizutani, K.; Okazaki, S.; Fujimoto, K.; Nakagawa, A.; Nomoto, A.

    2014-10-28

    Small viruses that belong, for example, to the Picornaviridae, such as poliovirus and foot-and-mouth disease virus, consist simply of capsid proteins and a single-stranded RNA (ssRNA) genome. The capsids are quite stable in solution to protect the genome from the environment. Here, based on long-time and large-scale 6.5 × 10^6 all-atom molecular dynamics calculations for the Mahoney strain of poliovirus, we show microscopic properties of the viral capsids at a molecular level. First, we found equilibrium rapid exchange of water molecules across the capsid. The exchange rate is so high that all water molecules inside the capsid (about 200 000) can leave the capsid and be replaced by water molecules from the outside in about 25 μs. This explains the capsid's tolerance to high pressures and deactivation by exsiccation. In contrast, the capsid did not exchange ions, at least within the present simulation time of 200 ns. This implies that the capsid can function, in principle, as a semipermeable membrane. We also found that, similar to the xylem of trees, the pressure of the solution inside the capsid without the genome was negative. This is caused by coulombic interaction of the solution inside the capsid with the capsid excess charges. The negative pressure may be compensated by positive osmotic pressure by the solution-soluble ssRNA and the counter ions introduced into it.

  13. Refined OPLS all-atom force field for saturated phosphatidylcholine bilayers at full hydration.

    PubMed

    Maciejewski, Arkadiusz; Pasenkiewicz-Gierula, Marta; Cramariuc, Oana; Vattulainen, Ilpo; Rog, Tomasz

    2014-05-01

    We report parametrization of dipalmitoyl-phosphatidylcholine (DPPC) in the framework of the Optimized Parameters for Liquid Simulations all-atom (OPLS-AA) force field. We chose DPPC as it is one of the most studied phospholipid species and thus has plenty of experimental data necessary for model validation, and it is also one of the highly important and abundant lipid types, e.g., in lung surfactant. Overall, PCs have not been previously parametrized in the OPLS-AA force field; thus, there is a need to derive their bonding and nonbonding parameters for both the polar and nonpolar parts of the molecule. In the present study, we determined the parameters for torsion angles in the phosphatidylcholine and glycerol moieties and in the acyl chains, as well as the partial atomic charges. In these calculations, we used three methods: (1) Hartree-Fock (HF), (2) second order Møller-Plesset perturbation theory (MP2), and (3) density functional theory (DFT). We also tested the effect of the polar environment by using the polarizable continuum model (PCM), and for acyl chains the van der Waals parameters were also adjusted. In effect, six parameter sets were generated and tested on a DPPC bilayer. Out of these six sets, only one was found to be able to satisfactorily reproduce experimental data for the lipid bilayer. The successful DPPC model was obtained from MP2 calculations in an implicit polar environment (PCM). PMID:24745688

  14. All-atom molecular dynamics calculation study of entire poliovirus empty capsids in solution

    NASA Astrophysics Data System (ADS)

    Andoh, Y.; Yoshii, N.; Yamada, A.; Fujimoto, K.; Kojima, H.; Mizutani, K.; Nakagawa, A.; Nomoto, A.; Okazaki, S.

    2014-10-01

    Small viruses that belong, for example, to the Picornaviridae, such as poliovirus and foot-and-mouth disease virus, consist simply of capsid proteins and a single-stranded RNA (ssRNA) genome. The capsids are quite stable in solution to protect the genome from the environment. Here, based on long-time and large-scale 6.5 × 10^6 all-atom molecular dynamics calculations for the Mahoney strain of poliovirus, we show microscopic properties of the viral capsids at a molecular level. First, we found equilibrium rapid exchange of water molecules across the capsid. The exchange rate is so high that all water molecules inside the capsid (about 200 000) can leave the capsid and be replaced by water molecules from the outside in about 25 μs. This explains the capsid's tolerance to high pressures and deactivation by exsiccation. In contrast, the capsid did not exchange ions, at least within the present simulation time of 200 ns. This implies that the capsid can function, in principle, as a semipermeable membrane. We also found that, similar to the xylem of trees, the pressure of the solution inside the capsid without the genome was negative. This is caused by coulombic interaction of the solution inside the capsid with the capsid excess charges. The negative pressure may be compensated by positive osmotic pressure by the solution-soluble ssRNA and the counter ions introduced into it.

  15. Refined OPLS all-atom force field for saturated phosphatidylcholine bilayers at full hydration.

    PubMed

    Maciejewski, Arkadiusz; Pasenkiewicz-Gierula, Marta; Cramariuc, Oana; Vattulainen, Ilpo; Rog, Tomasz

    2014-05-01

    We report parametrization of dipalmitoyl-phosphatidylcholine (DPPC) in the framework of the Optimized Parameters for Liquid Simulations all-atom (OPLS-AA) force field. We chose DPPC as it is one of the most studied phospholipid species and thus has plenty of experimental data necessary for model validation, and it is also one of the highly important and abundant lipid types, e.g., in lung surfactant. Overall, PCs have not been previously parametrized in the OPLS-AA force field; thus, there is a need to derive their bonding and nonbonding parameters for both the polar and nonpolar parts of the molecule. In the present study, we determined the parameters for torsion angles in the phosphatidylcholine and glycerol moieties and in the acyl chains, as well as the partial atomic charges. In these calculations, we used three methods: (1) Hartree-Fock (HF), (2) second order Møller-Plesset perturbation theory (MP2), and (3) density functional theory (DFT). We also tested the effect of the polar environment by using the polarizable continuum model (PCM), and for acyl chains the van der Waals parameters were also adjusted. In effect, six parameter sets were generated and tested on a DPPC bilayer. Out of these six sets, only one was found to be able to satisfactorily reproduce experimental data for the lipid bilayer. The successful DPPC model was obtained from MP2 calculations in an implicit polar environment (PCM).

  16. Mapping lava flow hazards using computer simulation

    SciTech Connect

    Wadge, G.; Young, P.A.V.; Mckendrick, I.J.

    1994-01-01

    Computer simulations of the paths of flowing lava are achieved using a program, FLOWFRONT, that describes the behavior of flow and digital models of the terrain. Two methods of application of simulations of the hazards posed by lava flows are described. The first, deterministic, method requires that program parameters such as vent position, minimum flow thickness, and thickness/slope relationship be based on the ambient eruptive conditions so that the future course of a specific lava flow can be simulated. This is illustrated using retrospective modeling of the first 21 days of the eruption of an andesitic lava flow at Lonquimay volcano, Chile, in 1988-1989. The usefulness of this method for real-time predictive modeling is likely to be limited by the lack of accurate field data on flow characteristics, the simple nature of the model, and the sensitivity to parameter choice of the final planimetric form of the model flow. The second application is probabilistic in nature and creates a map of the likelihood of inundation by lava flows that is useful for long-term land use planning. This method uses the historical record of past eruptions to constrain a series of Monte Carlo simulations and is illustrated using data from Etna volcano in Sicily. A multivariate statistical analysis of nine parameters for the 1763-1989 eruption catalog using simulated annealing permitted a classification of Etna's flank eruptions into two types: A and B. Type A eruptions are short-lived and produce linear lava flows; type B eruptions are long-lived, and produce lava flows that are much broader in shape, and their vents are restricted to the eastern flank of the volcano.

  17. Computational simulation of concurrent engineering for aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    Results are summarized of an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure is available in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties, all fundamental to developing such methods. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  18. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure is available in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties, all fundamental to developing such methods. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  19. Infrared Flight Simulation Using Computer Generated Imagery

    NASA Astrophysics Data System (ADS)

    Weathersby, Marshall R.; Finlay, W. Mark

    1985-01-01

    A sophisticated deterministic interactive software model for computer generation of three-dimensionally projected infrared scenes has been developed. Scenes can be produced using either a self-emission or near infrared reflectance model. The software allows for generation of flight paths through a data base consisting of both feature and topography and near real-time display of stored precomputed images. The emphasis in the model development has been in computer generation of infrared scenes which accurately reproduce the characteristics of real-world imagery. The software combines computer graphics and infrared physics to produce synthetic scenes with the statistical properties of real scenes. Options exist for generation of images in near-infrared, 3-5 or 8-12 micron spectral bands including atmospheric attenuation effects. The three-dimensional projection algorithms allow for viewing of the scenes from any geometry and include concave and convex surfaces as well as hidden objects. Features exist for insertion of additional objects into the three-dimensional scenes. Thus targets, buildings, and other natural or man-made objects can be inserted with any orientation anywhere in the scenes. This allows full simulation of varying depression angles, range closure, and fly-over. The three-dimensional infrared background clutter model is an evaluation tool capable of both assessing system performance in clutter and increasing our understanding of clutter itself. The model in its current form represents a powerful tool for the fundamental understanding of infrared clutter. Possible applications include, but are most certainly not limited to, sensor operator training in the area of target discrimination with dynamic imagery, evaluation of automatic target recognizer (ATR) algorithms, and simulations allowing pilots to pre-fly missions.

  20. Computer simulation of surface and film processes

    NASA Technical Reports Server (NTRS)

    Tiller, W. A.; Halicioglu, M. T.

    1984-01-01

    All of the investigations performed employed, in one way or another, a computer simulation technique based on atomistic-level considerations. In general, three types of simulation methods were used for modeling systems with discrete particles that interact via well-defined potential functions: molecular dynamics (a general method for solving the classical equations of motion of a model system); Monte Carlo (the use of a Markov-chain ensemble-averaging technique to model equilibrium properties of a system); and molecular statics (which provides properties of a system at T = 0 K). The effects of three-body forces on the vibrational frequencies of triatomic clusters were investigated. The multilayer relaxation phenomena for low-index planes of an fcc crystal were also analyzed as a function of the three-body interactions. Various surface properties for the Si and SiC systems were calculated. Results obtained from static simulation calculations for slip formation were presented. The more elaborate molecular dynamics calculations on the propagation of cracks in two-dimensional systems were outlined.

  1. Computer simulation of fatigue under diametrical compression

    SciTech Connect

    Carmona, H. A.; Kun, F.; Andrade, J. S. Jr.; Herrmann, H. J.

    2007-04-15

    We study the fatigue fracture of disordered materials by means of computer simulations of a discrete element model. We extend a two-dimensional fracture model to capture the microscopic mechanisms relevant for fatigue, and we simulate the diametric compression of a disc-shaped specimen under a constant external force. The model allows us to follow the development of the fracture process at the macrolevel and microlevel while varying the relative influence of the mechanisms of damage accumulation over the load history and of healing of microcracks. As a specific example we consider recent experimental results on the fatigue fracture of asphalt. Our numerical simulations show that for intermediate applied loads the lifetime of the specimen exhibits power-law behavior. Under the effect of healing, which is more prominent for loads that are small compared to the tensile strength of the material, the lifetime of the sample increases and a fatigue limit emerges below which no macroscopic failure occurs. The numerical results are in good qualitative agreement with the experimental findings.
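
    The power-law lifetime regime and the healing-induced fatigue limit reported above can be summarized in a Basquin-type form; the expression below is a generic sketch consistent with the abstract, with the exponent γ and the fatigue limit σ_l as assumed symbols rather than fitted values from the study:

        t_f \propto \left(\frac{\sigma_0}{\sigma_t}\right)^{-\gamma} \quad \text{(intermediate loads)}, \qquad t_f \to \infty \ \text{as} \ \sigma_0 \to \sigma_l \ \text{(with healing active)},

    where σ_0 is the applied load and σ_t the tensile strength of the material.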

  2. The equilibrium properties and folding kinetics of an all-atom Go model of the Trp-cage.

    PubMed

    Linhananta, Apichart; Boer, Jesse; MacKay, Ian

    2005-03-15

    The ultrafast-folding 20-residue Trp-cage protein is quickly becoming a new benchmark for molecular dynamics studies. Already several all-atom simulations have probed its equilibrium and kinetic properties. In this work an all-atom Go model is used to accurately represent the side-chain packing and native atomic contacts of the Trp-cage. The model reproduces the hallmark thermodynamic cooperativity of small proteins. Folding simulations show that in the dominant fast-folding pathway, partial alpha-helical structure forms before hydrophobic core collapse. In the slow-folding secondary pathway, partial core collapse occurs before helical structure forms. The slow folding rate of the secondary pathway is attributed to the loss of side-chain rotational freedom caused by the early core collapse, which impedes helix formation. A major finding is the observation of a low-temperature kinetic intermediate stabilized by a salt bridge between residues Asp-9 and Arg-16. Similar observations [R. Zhou, Proc. Natl. Acad. Sci. U.S.A. 100, 13280 (2003)] were reported in a recent study using an all-atom model of the Trp-cage in explicit water, in which the salt-bridge-stabilized intermediate was hypothesized to be the origin of the ultrafast-folding mechanism. A theoretical mutation that eliminates the Asp-9-Arg-16 salt bridge, but leaves the residues intact, is performed. Folding simulations of the mutant Trp-cage show a two-state free-energy landscape with no kinetic intermediate and a significant decrease in the folding rate, in support of the hypothesis.

  3. All-atom molecular dynamics studies of the full-length β-amyloid peptides

    NASA Astrophysics Data System (ADS)

    Luttmann, Edgar; Fels, Gregor

    2006-03-01

    β-Amyloid peptides are believed to play an essential role in Alzheimer's disease (AD), due to their sedimentation in the form of β-amyloid aggregates in the brains of AD patients and the in vitro neurotoxicity of oligomeric aggregates. The monomeric peptides come in different lengths of 39-43 residues, of which the 42-residue alloform seems to be most strongly associated with AD symptoms. Structural information on these peptides to date comes from NMR studies in acidic solutions, organic solvents, or on shorter fragments of the peptide. In addition, X-ray and solid-state NMR investigations of amyloid fibrils yield insight into the structure of the final aggregate and therefore define the endpoint of any conformational change of an Aβ monomer along the aggregation process. The conformational changes necessary to connect the experimentally known conformations are not yet understood, and this process is an active field of research. In this paper, we report results from all-atom molecular dynamics simulations based on experimental data from four different peptides of 40 amino acids and two peptides consisting of 42 amino acids. The simulations allow for the analysis of intramolecular interactions and the role of structural features. In particular, they show the appearance of a β-turn in the region between amino acids 21 and 33, forming a hook-like shape as is known to exist in the fibrillar Aβ structures. This folding does not depend on the formation of a salt bridge between Asp-23 and Lys-28 but requires Aβ(1-42), as such a structure was not observed in the shorter Aβ(1-40) system.

  4. A Mass Spectrometer Simulator in Your Computer

    NASA Astrophysics Data System (ADS)

    Gagnon, Michel

    2012-12-01

    Introduced to study components of ionized gas, the mass spectrometer has evolved into a highly accurate device now used in many undergraduate and research laboratories. Unfortunately, despite their importance in the formation of future scientists, mass spectrometers remain beyond the financial reach of many high schools and colleges. As a result, it is not possible for instructors to take full advantage of this equipment. Therefore, to facilitate accessibility to this tool, we have developed a realistic computer-based simulator. Using this software, students are able to practice their ability to identify the components of the original gas, thereby gaining a better understanding of the underlying physical laws. The software is available as a free download.

  5. Computational simulation methods for composite fracture mechanics

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.

    1988-01-01

    Structural integrity, durability, and damage tolerance of advanced composites are assessed by studying damage initiation at various scales (micro, macro, and global) and accumulation and growth leading to global failure, quantitatively and qualitatively. In addition, various fracture toughness parameters associated with a typical damage and its growth must be determined. Computational structural analysis codes to aid the composite design engineer in performing these tasks were developed. CODSTRAN (COmposite Durability STRuctural ANalysis) is used to qualitatively and quantitatively assess the progressive damage occurring in composite structures due to mechanical and environmental loads. Next, methods are covered that are currently being developed and used at Lewis to predict interlaminar fracture toughness and related parameters of fiber composites given a prescribed damage. The general purpose finite element code MSC/NASTRAN was used to simulate the interlaminar fracture and the associated individual as well as mixed-mode strain energy release rates in fiber composites.

  6. Miller experiments in atomistic computer simulations

    PubMed Central

    Saitta, Antonino Marco; Saija, Franz

    2014-01-01

    The celebrated Miller experiments reported on the spontaneous formation of amino acids from a mixture of simple molecules reacting under an electric discharge, giving birth to the research field of prebiotic chemistry. However, the chemical reactions involved in those experiments have never been studied at the atomic level. Here we report on, to our knowledge, the first ab initio computer simulations of Miller-like experiments in the condensed phase. Our study, based on the recent method of treatment of aqueous systems under electric fields and on metadynamics analysis of chemical reactions, shows that glycine spontaneously forms from mixtures of simple molecules once an electric field is switched on and identifies formic acid and formamide as key intermediate products of the early steps of the Miller reactions, and the crucible of formation of complex biological molecules. PMID:25201948

  7. Engineering Fracking Fluids with Computer Simulation

    NASA Astrophysics Data System (ADS)

    Shaqfeh, Eric

    2015-11-01

    There are no comprehensive simulation-based tools for engineering the flows of viscoelastic fluid-particle suspensions in fully three-dimensional geometries. On the other hand, the need for such a tool in engineering applications is immense. Suspensions of rigid particles in viscoelastic fluids play key roles in many energy applications. For example, in oil drilling the ``drilling mud'' is a very viscous, viscoelastic fluid designed to shear-thin during drilling, but thicken at stoppage so that the ``cuttings'' can remain suspended. In a related application known as hydraulic fracturing, suspensions of solids called ``proppant'' are used to prop open the fracture by pumping them into the well. It is well known that particle flow and settling in a viscoelastic fluid can be quite different from that which is observed in Newtonian fluids. First, it is now well known that the ``fluid particle split'' at bifurcation cracks is controlled by fluid rheology in a manner that is not understood. Second, in Newtonian fluids, the presence of an imposed shear flow in the direction perpendicular to gravity (which we term a cross or orthogonal shear flow) has no effect on the settling of a spherical particle in Stokes flow (i.e., at vanishingly small Reynolds number). By contrast, in a non-Newtonian liquid, the complex rheological properties induce a nonlinear coupling between the sedimentation and shear flow. Recent experimental data have shown that both the shear thinning and the elasticity of the suspending polymeric solutions significantly affect the fluid-particle split at bifurcations, as well as the settling rate of the solids. In the present work, we use the Immersed Boundary Method to develop computer simulations of viscoelastic flow in suspensions of spheres to study these problems. These simulations allow us to understand the detailed physical mechanisms for the remarkable physical behavior seen in practice, and actually suggest design rules for creating new fluid recipes.

  8. Duality quantum computer and the efficient quantum simulations

    NASA Astrophysics Data System (ADS)

    Wei, Shi-Jie; Long, Gui-Lu

    2016-03-01

    Duality quantum computing is a new mode of quantum computation that simulates a moving quantum computer passing through a multi-slit. It exploits the particle-wave duality property for computing. A quantum computer with n qubits and a qudit simulates a moving quantum computer with n qubits passing through a d-slit. Duality quantum computing can realize an arbitrary sum of unitaries and therefore a general quantum operator, which is called a generalized quantum gate. All linear bounded operators can be realized by the generalized quantum gates, and unitary operators are just the extreme points of the set of generalized quantum gates. Duality quantum computing provides flexibility and a clear physical picture in designing quantum algorithms, and serves as a powerful bridge between quantum and classical algorithms. In this paper, after a brief review of the theory of duality quantum computing, we will concentrate on the applications of duality quantum computing in simulations of Hamiltonian systems. We will show that duality quantum computing can efficiently simulate quantum systems by providing descriptions of the recent efficient quantum simulation algorithm of Childs and Wiebe (Quantum Inf Comput 12(11-12):901-924, 2012) for the fast simulation of quantum systems with a sparse Hamiltonian, and the quantum simulation algorithm by Berry et al. (Phys Rev Lett 114:090502, 2015), which provides exponential improvement in precision for simulating systems with a sparse Hamiltonian.
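
    The "arbitrary sum of unitaries" that defines a generalized quantum gate can be written compactly in the standard linear-combination-of-unitaries form; the normalization constraint shown is the usual one and is stated here as background rather than quoted from the paper:

        |\psi\rangle \;\longmapsto\; \sum_{i=0}^{d-1} c_i U_i\, |\psi\rangle, \qquad \sum_{i=0}^{d-1} |c_i| \le 1,

    with one unitary U_i attached to each of the d slits; ordinary unitary gates are recovered as the extreme points where a single c_i equals 1 and the rest vanish.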

  9. Problems in Conducting Research on Computer-Based Simulation.

    ERIC Educational Resources Information Center

    Crawford, Alice M.

    Computer-based simulation (CBS) represents a unique utilization of computers for instruction that combines some of the best features of the technologies of simulation and computer assisted instruction (CAI). CBS grew out of an interest in testing the application of CAI to procedural and perceptual motor skills. With the sophisticated graphics…

  10. Computer-aided simulation study of photomultiplier tubes

    NASA Technical Reports Server (NTRS)

    Zaghloul, Mona E.; Rhee, Do Jun

    1989-01-01

    A computer model that simulates the response of photomultiplier tubes (PMTs) and the associated voltage divider circuit is developed. An equivalent circuit that approximates the operation of the device is derived and then used to develop a computer simulation of the PMT. Simulation results are presented and discussed.

  11. Using Computational Simulations to Confront Students' Mental Models

    ERIC Educational Resources Information Center

    Rodrigues, R.; Carvalho, P. Simeão

    2014-01-01

    In this paper we show an example of how to use a computational simulation to obtain visual feedback for students' mental models, and compare their predictions with the simulated system's behaviour. Additionally, we use the computational simulation to incrementally modify the students' mental models in order to accommodate new data,…

  12. Computer simulation of industrial hazards1

    PubMed Central

    Knox, E. G.

    1973-01-01

    Knox, E. G. (1973). Brit. J. industr. Med., 30, 54-63. Computer simulation of industrial hazards. A computer simulation system for a range of industrial hazards provided for model experiments which manipulated (a) the sub-structure of an exposed population in terms of age-distributions and levels of exposure, (b) the nature of the dose/response relationship, (c) the latent interval and its variability, (d) normal life-table expectations, and (e) employment turnover rates. The development of the system led to clarification of terms and concepts with ambiguous current usages, notably in relation to latency. Distinction is made between the notions of `biological' and `observable' latent intervals. Hypothetical exercises with the model tested its technical validity and at the same time demonstrated in quantitative terms the relationships between `biological' and `observable' latent intervals, employment turnover rates, total mortalities, and the distribution of illnesses and death between those currently employed in the exposing industry, those employed elsewhere, and those retired. Prospects of success for personnel engineering techniques, which manipulate age-distributions of exposed work people in relation to diseases with long latent intervals, were examined. Published asbestos cancer data were used as a basis for specific model fitting and resulted in a numerical formulation of the exposure/response relationships. Severe exposure results in an increment of risk of death of about 0·02 unit per person per annum for those exposed for around six years, but with higher rates for shorter exposures and lower rates for longer ones. The mean biological latent interval was about 25 years with a coefficient of variation of about 25%. These suppositions explained a range of published data comprehensively and at the same time predicted that (a) persons exposed at severe levels for a working lifetime of 50 years have a 40% risk of dying from asbestos cancer, and (b) industrial

  13. A computer simulation study of racemic mixtures

    NASA Astrophysics Data System (ADS)

    Largo, J.; Vega, C.; MacDowell, L. G.; Solana, J. R.

    A simple model for a chiral molecule is proposed. The model consists of a central atom bonded to four different atoms in tetrahedral coordination. Two different potentials were used to describe the pair potentials between atoms: the hard sphere potential and the Lennard-Jones potential. For both the hard sphere and the Lennard-Jones chiral models, computer simulations have been performed for the pure enantiomers and also for the racemic mixture. The racemic mixture consisted of an equimolar mixture of the two optically active enantiomers. It is found that the equations of state are the same, within statistical uncertainty, for the pure enantiomer fluid and for the racemic mixture. Only at high pressures does the racemic mixture seem to have a higher density, for a given pressure, than the pure enantiomer. Concerning the structure, no difference is found in the site-site correlation functions between like and unlike molecules in the racemic mixture either at low or at high densities. However, small differences are found for the site-site correlations of the pure enantiomer and those of the racemic mixtures. In the Lennard-Jones model, similar conclusions are drawn. The extension of Wertheim's first-order perturbation theory, denoted bonded hard sphere theory (Archer, A. L., and Jackson, G., 1991, Molec. Phys., 73, 881; Amos, M. D., and Jackson, G., 1992, J. Chem. Phys., 96, 4604), successfully reproduces the simulation results for the hard chiral model. Virial coefficients of the hard chiral model up to the fourth have also been evaluated. Again, no differences are found between virial coefficients of the pure fluid and of the racemic mixture. All the results of this work illustrate the quasi-ideal behaviour of racemic mixtures in the fluid phase.

  14. Computer simulation of nanocube self-assemblies

    NASA Astrophysics Data System (ADS)

    Zhang, Xi

    Self-assembly of nanoscale building blocks and molecules into ordered nanostructures is a promising avenue for bottom-up materials design. A wide variety of nanoparticles with unique shapes and uniform sizes have been successfully synthesized. However, organizing these nanoparticles into desired, predefined nanostructures is a formidable challenge now facing the materials community. For example, simple 2-D arrays and 3-D superlattices are the prevalent structures from most nanocube self-assemblies. Two practical strategies to impart anisotropy onto nanocubes, namely, attaching polymer tethers to nanoparticle surfaces and introducing directional dipolar interactions, can be applied to achieve more complex assembled structures. In this dissertation, we conduct computer simulations of nanocube self-assemblies induced by polymer tethers and directional dipole interactions, to examine the various parameters involved in such complicated self-assembly processes, including temperature, concentration, solvent condition, cube size, tether length, tether topology, tether placement, tether number, dipole direction, dipole strength and polydispersity, in order to understand how the packing geometry and interactions between nanocubes can be manipulated to confer precise control over the assembled structures and the phase behavior. First, we simulate monotethered nanocubes and find that the nanocubes favor face-to-face packing in poor solvents, stabilizing the lamellar phases. Next, we simulate different architectures of tethered nanocubes and demonstrate that the steric influence of tether beads can be manipulated to interfere with the face-to-face packing of nanocubes and alter the phase behaviors. We also study the self-assembly of nanocubes with dipoles. We find that the head-to-tail alignment of dipoles, coupled with the face-to-face close packing of nanocubes, dictates the assembled structures. The face-face attraction between nanocubes can also be utilized to control the

  15. Computational simulation of liquid rocket injector anomalies

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Singhal, A. K.; Tam, L. T.; Davidian, K.

    1986-01-01

    A computer model has been developed to analyze the three-dimensional two-phase reactive flows in liquid-fueled rocket combustors. The model is designed to study the influence of liquid propellant injection nonuniformities on the flow pattern, combustion, and heat transfer within the combustor. The Eulerian-Lagrangian approach for simulating polydisperse spray flow, evaporation, and combustion has been used. Full coupling between the phases is accounted for. A nonorthogonal, body-fitted coordinate system along with a conservative control-volume formulation is employed. The physical models built into the code include a kappa-epsilon turbulence model, a two-step chemical reaction, and the six-flux radiation model. Semiempirical models are used to describe all interphase coupling terms as well as chemical reaction rates. The purpose of this study was to demonstrate an analytical capability to predict the effects of reactant injection nonuniformities (injection anomalies) on combustion and heat transfer within the rocket combustion chamber. The results show promising application of the model to comprehensive modeling of liquid propellant rocket engines.

  16. Computer simulation of FCC riser reactors.

    SciTech Connect

    Chang, S. L.; Golchert, B.; Lottes, S. A.; Petrick, M.; Zhou, C. Q.

    1999-04-20

    A three-dimensional computational fluid dynamics (CFD) code, ICRKFLO, was developed to simulate the multiphase reacting flow system in a fluid catalytic cracking (FCC) riser reactor. The code solves flow properties based on fundamental conservation laws of mass, momentum, and energy for gas, liquid, and solid phases. Useful phenomenological models were developed to represent the controlling FCC processes, including droplet dispersion and evaporation, particle-solid interactions, and interfacial heat transfer between gas, droplets, and particles. Techniques were also developed to facilitate numerical calculations. These techniques include a hybrid flow-kinetic treatment to include detailed kinetic calculations, a time-integral approach to overcome numerical stiffness problems of chemical reactions, and a sectional coupling and blocked-cell technique for handling complex geometry. The copyrighted ICRKFLO software has been validated with experimental data from pilot- and commercial-scale FCC units. The code can be used to evaluate the impacts of design and operating conditions on the production of gasoline and other oil products.

  17. Extensive all-atom Monte Carlo sampling and QM/MM corrections in the SAMPL4 hydration free energy challenge.

    PubMed

    Genheden, Samuel; Cabedo Martinez, Ana I; Criddle, Michael P; Essex, Jonathan W

    2014-03-01

    We present our predictions for the SAMPL4 hydration free energy challenge. Extensive all-atom Monte Carlo simulations were employed to sample the compounds in explicit solvent. While the focus of our study was to demonstrate well-converged and reproducible free energies, we attempted to address the deficiencies in the general Amber force field with a simple QM/MM correction. We show that by using multiple independent simulations, including different starting configurations, and enhanced sampling with parallel tempering, we can obtain well-converged hydration free energies. Additional analysis using dihedral angle distributions, torsion root-mean-square deviation plots, and thermodynamic cycles supports this assertion. We obtain a mean absolute deviation of 1.7 kcal mol^-1 and a Kendall's τ of 0.65 compared with experiment. PMID:24488307
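
    For context, the two summary statistics quoted above, mean absolute deviation and Kendall's τ, are standard paired comparisons of predicted and experimental hydration free energies; a minimal sketch of how they are computed, using placeholder numbers rather than SAMPL4 data, is:

        import numpy as np
        from scipy.stats import kendalltau

        # Placeholder predicted and experimental hydration free energies (kcal/mol);
        # illustrative values only, not the SAMPL4 compound set.
        predicted = np.array([-3.2, -5.1, -9.8, -4.4, -7.0])
        experimental = np.array([-2.5, -4.9, -11.1, -3.8, -6.2])

        mad = np.mean(np.abs(predicted - experimental))     # mean absolute deviation
        tau, p_value = kendalltau(predicted, experimental)  # rank-order agreement

        print(f"MAD = {mad:.2f} kcal/mol, Kendall's tau = {tau:.2f}")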

  18. Computational investigations on polymerase actions in gene transcription and replication: Combining physical modeling and atomistic simulations

    NASA Astrophysics Data System (ADS)

    Jin, Yu

    2016-01-01

    Polymerases are protein enzymes that move along nucleic acid chains and catalyze template-based polymerization reactions during gene transcription and replication. The polymerases also substantially improve transcription or replication fidelity through the non-equilibrium enzymatic cycles. We briefly review computational efforts that have been made toward understanding the mechano-chemical coupling and fidelity control mechanisms of polymerase elongation. The polymerases are regarded as molecular information motors during the elongation process. Understanding the full polymerase functional cycle requires a spectrum of computational approaches spanning multiple time and length scales. We stay away from quantum mechanics-based approaches to polymerase catalysis, which have been surveyed extensively elsewhere, and instead address statistical physics modeling approaches along with all-atom molecular dynamics simulation studies. We organize this review around our own modeling and simulation practices on the single-subunit T7 RNA polymerase, and summarize comparable studies on structurally similar DNA polymerases as well. For the multi-subunit RNA polymerases that have been actively studied in recent years, we leave systematic reviews of the simulation achievements to recent computational chemistry surveys, covering only representative studies published very recently, including our own work modeling the structure-based elongation kinetics of yeast RNA polymerase II. Finally, we briefly review physical modeling of the elongation pauses and backtracking activities of the multi-subunit RNAPs. We emphasize the fluctuation and control mechanisms of polymerase actions, highlight the non-equilibrium nature of the system's operation, and try to build some perspective toward understanding polymerase impacts from the single-molecule level to the genome-wide scale. Project supported by the National Natural Science Foundation (Grant No. 11275022).

  19. Simulation of reliability in multiserver computer networks

    NASA Astrophysics Data System (ADS)

    Minkevičius, Saulius

    2012-11-01

    The performance in terms of reliability of computer multiserver networks motivates this paper. The probability limit theorem is derived on the extreme queue length in open multiserver queueing networks in heavy traffic and applied to a reliability model for multiserver computer networks where we relate the time of failure of a multiserver computer network to the system parameters.

  20. Reconciling structural and thermodynamic predictions using all-atom and coarse-grain force fields: the case of charged oligo-arginine translocation into DMPC bilayers.

    PubMed

    Hu, Yuan; Sinha, Sudipta Kumar; Patel, Sandeep

    2014-10-16

    Using the translocation of short, charged cationic oligo-arginine peptides (mono-, di-, and triarginine) from bulk aqueous solution into model DMPC bilayers, we explore the question of the similarity of thermodynamic and structural predictions obtained from molecular dynamics simulations using all-atom and Martini coarse-grain force fields. Specifically, we estimate potentials of mean force associated with translocation using standard all-atom (CHARMM36 lipid) and polarizable and nonpolarizable Martini force fields, as well as a series of modified Martini-based parameter sets. We find that we are able to reproduce qualitative features of potentials of mean force of single amino acid side-chain analogues into model bilayers. In particular, modifications of peptide-water and peptide-membrane interactions allow prediction of free energy minima at the bilayer-water interface as obtained with all-atom force fields. In the case of oligo-arginine peptides, the modified parameter sets predict interfacial free energy minima as well as free energy barriers in almost quantitative agreement with all-atom force field based simulations. Interfacial free energy minima predicted by a modified coarse-grained parameter set are -2.51, -4.28, and -5.42 for mono-, di-, and triarginine; corresponding values from all-atom simulations are -0.83, -3.33, and -3.29, respectively, all in units of kcal/mol. We found that a stronger interaction between oligo-arginine and the membrane components and a weaker interaction between oligo-arginine and water are crucial for producing such minima in PMFs using the polarizable CG model. The differences between bulk aqueous and bilayer-center states predicted by the modified coarse-grain force field are 11.71, 14.14, and 16.53 kcal/mol, and those from the all-atom model are 6.94, 8.64, and 12.80 kcal/mol; these are of the same order of magnitude. Our simulations also demonstrate a remarkable similarity in the structural aspects of the ensemble of

  1. Computational simulations of vorticity enhanced diffusion

    NASA Astrophysics Data System (ADS)

    Vold, Erik L.

    1999-11-01

    Computer simulations are used to investigate a phenomenon of vorticity enhanced diffusion (VED), a net transport and mixing of a passive scalar across a prescribed vortex flow field driven by a background gradient in the scalar quantity. The central issue under study here is the increase in scalar flux down the gradient and across the vortex field. The numerical scheme uses cylindrical coordinates centered with the vortex flow, which allows an exact advective solution and 1D or 2D diffusion using simple numerical methods. In the results, the ratio of transport across a localized vortex region in the presence of the vortex flow over that expected for diffusion alone is evaluated as a measure of VED. This ratio is seen to increase dramatically while the absolute flux across the vortex decreases slowly as the diffusion coefficient is decreased. Similar results are found and compared for varying diffusion coefficient, D, or vortex rotation time, τ_v, for a constant background gradient in the transported scalar vs an interface in the transported quantity, and for vortex flow fields constant in time vs flow which evolves in time from an initial state, with a Schmidt number of order unity. A simple analysis shows that for a small diffusion coefficient, the flux-ratio measure of VED scales as the vortex radius over the thickness of the mass-diffusion shear layer within the vortex, which is characterized by (Dτ_v)^(1/2). The phenomenon is linear as investigated here and suggests that a significant enhancement of mixing in fluids may be a relatively simple linear process. Discussion touches on how this vorticity enhanced diffusion may be related to mixing in nonlinear turbulent flows.
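
    The scaling argument in the closing sentences can be condensed into a single relation; writing R for the vortex radius, the flux-ratio measure of VED behaves as (the Péclet-number restatement is added here for compactness and is not notation from the abstract):

        \frac{\Phi_{\mathrm{vortex}}}{\Phi_{\mathrm{diffusion}}} \sim \frac{R}{\sqrt{D\,\tau_v}} \sim \mathrm{Pe}^{1/2}, \qquad \mathrm{Pe} = \frac{R^2}{D\,\tau_v},

    so the enhancement grows as the diffusion coefficient D decreases, consistent with the trend described above.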

  2. Computer simulator for a mobile telephone system

    NASA Technical Reports Server (NTRS)

    Schilling, D. L.; Ziegler, C.

    1983-01-01

    A software simulator was developed to help NASA in the design of the LMSS (Land Mobile Satellite Service). The simulator will be used to study the characteristics and implementation requirements of the LMSS configuration, with specifications as outlined by NASA.

  3. New Pedagogies on Teaching Science with Computer Simulations

    ERIC Educational Resources Information Center

    Khan, Samia

    2011-01-01

    Teaching science with computer simulations is a complex undertaking. This case study examines how an experienced science teacher taught chemistry using computer simulations and the impact of his teaching on his students. Classroom observations over 3 semesters, teacher interviews, and student surveys were collected. The data was analyzed for (1)…

  4. How Effective Is Instructional Support for Learning with Computer Simulations?

    ERIC Educational Resources Information Center

    Eckhardt, Marc; Urhahne, Detlef; Conrad, Olaf; Harms, Ute

    2013-01-01

    The study examined the effects of two different instructional interventions as support for scientific discovery learning using computer simulations. In two well-known categories of difficulty, data interpretation and self-regulation, instructional interventions for learning with computer simulations on the topic "ecosystem water" were developed…

  5. Computer-Based Simulation Models for Community College Business Students.

    ERIC Educational Resources Information Center

    Kahl, James

    Instructors at Lower Columbia College in Longview, Washington use computer-based simulation models in lower level business administration courses. Prior to use, teachers must select and obtain a simulation, discuss it with campus computer personnel, set an operations schedule, obtain the necessary supplementary material, and test run the program.…

  6. Explore Effective Use of Computer Simulations for Physics Education

    ERIC Educational Resources Information Center

    Lee, Yu-Fen; Guo, Yuying

    2008-01-01

    The dual purpose of this article is to provide a synthesis of the findings related to the use of computer simulations in physics education and to present implications for teachers and researchers in science education. We try to establish a conceptual framework for the utilization of computer simulations as a tool for learning and instruction in…

  7. Evaluation of Computer Simulations for Teaching Apparel Merchandising Concepts.

    ERIC Educational Resources Information Center

    Jolly, Laura D.; Sisler, Grovalynn

    1988-01-01

    The study developed and evaluated computer simulations for teaching apparel merchandising concepts. Evaluation results indicated that teaching method (computer simulation versus case study) does not significantly affect cognitive learning. Student attitudes varied, however, according to topic (profitable merchandising analysis versus retailing…

  8. Cognitive Effects from Process Learning with Computer-Based Simulations.

    ERIC Educational Resources Information Center

    Breuer, Klaus; Kummer, Ruediger

    1990-01-01

    Discusses content learning versus process learning, describes process learning with computer-based simulations, and highlights an empirical study on the effects of process learning with problem-oriented, computer-managed simulations in technical vocational education classes in West Germany. Process learning within a model of the cognitive system…

  9. A thermodynamic study of Abeta(16-21) dissociation from a fibril using computer simulations

    NASA Astrophysics Data System (ADS)

    Dias, Cristiano; Mahmoudinobar, Farbod; Su, Zhaoqian

    Here, I will discuss recent all-atom molecular dynamics simulations with explicit water in which we studied the thermodynamic properties of Abeta(16-21) dissociation from an amyloid fibril. Changes in thermodynamics quantities, e.g., entropy, enthalpy, and volume, are computed from the temperature dependence of the free-energy computed using the umbrella sampling method. We find similarities and differences between the thermodynamics of peptide dissociation and protein unfolding. Similarly to protein unfolding, Abeta(16-21) dissociation is characterized by an unfavorable change in enthalpy, a favorable change in the entropic energy, and an increase in the heat capacity. A main difference is that peptide dissociation is characterized by a weak enthalpy-entropy compensation. We characterize dock and lock states of the peptide based on the solvent accessible surface area. The Lennard-Jones energy of the system is observed to increase continuously in lock and dock states as the peptide dissociates. The electrostatic energy increases in the lock state and it decreases in the dock state as the peptide dissociates. These results will be discussed as well as their implication for fibril growth.
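
    The decomposition described above follows from standard thermodynamic relations once the dissociation free energy ΔG(T) has been computed (here via umbrella sampling) at several temperatures, with the temperature derivatives evaluated numerically:

        \Delta S(T) = -\frac{\partial \Delta G}{\partial T}, \qquad \Delta H(T) = \Delta G(T) + T\,\Delta S(T), \qquad \Delta C_p = \frac{\partial \Delta H}{\partial T} = -T\,\frac{\partial^2 \Delta G}{\partial T^2}.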

  10. Computers for real time flight simulation: A market survey

    NASA Technical Reports Server (NTRS)

    Bekey, G. A.; Karplus, W. J.

    1977-01-01

    An extensive computer market survey was made to determine those available systems suitable for current and future flight simulation studies at Ames Research Center. The primary requirement is for the computation of relatively high frequency content (5 Hz) math models representing powered lift flight vehicles. The Rotor Systems Research Aircraft (RSRA) was used as a benchmark vehicle for computation comparison studies. The general nature of helicopter simulations and a description of the benchmark model are presented, and some of the sources of simulation difficulties are examined. A description of various applicable computer architectures is presented, along with detailed discussions of leading candidate systems and comparisons between them.

  11. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
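
    The modeling approach described above, stochastic service-request demand competing for a constrained pool of computing resources, can be illustrated with a minimal discrete event simulation; the distributions, parameter values, and metric below are placeholders, not those of the cited framework:

        import heapq, random

        def simulate(n_servers=4, n_requests=1000, arrival_rate=1.0, mean_service=3.0):
            """Mean wait time for requests served by a fixed pool of servers."""
            random.seed(0)
            free_at = [0.0] * n_servers          # time at which each server is next free
            heapq.heapify(free_at)
            t, waits = 0.0, []
            for _ in range(n_requests):
                t += random.expovariate(arrival_rate)      # next request arrives
                start = max(t, heapq.heappop(free_at))     # earliest available server
                service = random.expovariate(1.0 / mean_service)
                heapq.heappush(free_at, start + service)
                waits.append(start - t)
            return sum(waits) / len(waits)

        print("mean wait:", simulate())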

  12. CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.

    ERIC Educational Resources Information Center

    Skrein, Dale

    1994-01-01

    CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)

  13. GPU-accelerated micromagnetic simulations using cloud computing

    NASA Astrophysics Data System (ADS)

    Jermain, C. L.; Rowlands, G. E.; Buhrman, R. A.; Ralph, D. C.

    2016-03-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics.

  14. Case Studies in Computer Adaptive Test Design through Simulation.

    ERIC Educational Resources Information Center

    Eignor, Daniel R.; And Others

    The extensive computer simulation work done in developing the computer adaptive versions of the Graduate Record Examinations (GRE) Board General Test and the College Board Admissions Testing Program (ATP) Scholastic Aptitude Test (SAT) is described in this report. Both the GRE General and SAT computer adaptive tests (CATs), which are fixed length…

  15. Computational Electromagnetics (CEM) Laboratory: Simulation Planning Guide

    NASA Technical Reports Server (NTRS)

    Khayat, Michael A.

    2011-01-01

    The simulation process, milestones and inputs are unknowns to first-time users of the CEM Laboratory. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  16. Creating Science Simulations through Computational Thinking Patterns

    ERIC Educational Resources Information Center

    Basawapatna, Ashok Ram

    2012-01-01

    Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction.…

  17. All-Atom Internal Coordinate Mechanics (ICM) Force Field for Hexopyranoses and Glycoproteins

    PubMed Central

    2016-01-01

    We present an extension of the all-atom internal-coordinate force field, ICMFF, that allows for simulation of heterogeneous systems including hexopyranose saccharides and glycan chains in addition to proteins. A library of standard glycan geometries containing α- and β-anomers of the most common hexopyranoses, i.e., D-galactose, D-glucose, D-mannose, D-xylose, L-fucose, N-acetylglucosamine, N-acetylgalactosamine, sialic, and glucuronic acids, is created based on the analysis of the saccharide structures reported in the Cambridge Structural Database. The new force field parameters include molecular electrostatic potential-derived partial atomic charges and the torsional parameters derived from quantum mechanical data for a collection of minimal molecular fragments and related molecules. The ϕ/ψ torsional parameters for different types of glycosidic linkages are developed using model compounds containing the key atoms in the full carbohydrates, i.e., glycosidic-linked tetrahydropyran–cyclohexane dimers. Target data for parameter optimization include two-dimensional energy surfaces corresponding to the ϕ/ψ glycosidic dihedral angles in the disaccharide analogues, as determined by quantum mechanical MP2/6-31G** single-point energies on HF/6-31G** optimized structures. To achieve better agreement with the observed geometries of glycosidic linkages, the bond angles at the O-linkage atoms are added to the internal variable set and the corresponding bond-bending energy term is parametrized using quantum mechanical data. The resulting force field is validated on glycan chains of 1–12 residues from a set of high-resolution X-ray glycoprotein structures, based on the heavy-atom root-mean-square deviations of the lowest-energy glycan conformations generated by biased probability Monte Carlo (BPMC) molecular mechanics simulations from the native structures. The appropriate BPMC distributions for monosaccharide–monosaccharide and protein–glycan linkages are derived

  18. Genetic Crossing vs Cloning by Computer Simulation

    NASA Astrophysics Data System (ADS)

    Dasgupta, Subinay

    We perform Monte Carlo simulation using Penna's bit string model, and compare the process of asexual reproduction by cloning with that by genetic crossover. We find them to be comparable as regards survival of a species, and also if a natural disaster is simulated.

  19. Spatial Learning and Computer Simulations in Science

    ERIC Educational Resources Information Center

    Lindgren, Robb; Schwartz, Daniel L.

    2009-01-01

    Interactive simulations are entering mainstream science education. Their effects on cognition and learning are often framed by the legacy of information processing, which emphasized amodal problem solving and conceptual organization. In contrast, this paper reviews simulations from the vantage of research on perception and spatial learning,…

  20. Genetic crossing vs cloning by computer simulation

    SciTech Connect

    Dasgupta, S.

    1997-06-01

    We perform Monte Carlo simulation using Penna's bit string model, and compare the process of asexual reproduction by cloning with that by genetic crossover. We find them to be comparable as regards survival of a species, and also if a natural disaster is simulated.

  1. Parallel Computing Environments and Methods for Power Distribution System Simulation

    SciTech Connect

    Lu, Ning; Taylor, Zachary T.; Chassin, David P.; Guttromson, Ross T.; Studham, Scott S.

    2005-11-10

    The development of cost-effective high-performance parallel computing on multiprocessor supercomputers makes it attractive to port excessively time-consuming simulation software from personal computers (PCs) to supercomputers. The power distribution system simulator (PDSS) takes a bottom-up approach and simulates load at the appliance level, where detailed thermal models for appliances are used. This approach works well for a small power distribution system consisting of a few thousand appliances. When the number of appliances increases, the simulation uses up the PC memory and its run time increases to a point where the approach is no longer feasible for modeling a practical large power distribution system. This paper presents an effort made to port the PC-based PDSS to a 128-processor shared-memory supercomputer. The paper offers an overview of the parallel computing environment and a description of the modifications made to the PDSS model. The performance of the PDSS running on a standalone PC and on the supercomputer is compared. Future research directions for utilizing parallel computing in power distribution system simulation are also addressed.

  2. Use of advanced computers for aerodynamic flow simulation

    NASA Technical Reports Server (NTRS)

    Bailey, F. R.; Ballhaus, W. F.

    1980-01-01

    The current and projected use of advanced computers for large-scale aerodynamic flow simulation applied to engineering design and research is discussed. The design use of mature codes run on conventional, serial computers is compared with the fluid research use of new codes run on parallel and vector computers. The role of flow simulations in design is illustrated by the application of a three dimensional, inviscid, transonic code to the Sabreliner 60 wing redesign. Research computations that include a more complete description of the fluid physics by use of Reynolds averaged Navier-Stokes and large-eddy simulation formulations are also presented. Results of studies for a numerical aerodynamic simulation facility are used to project the feasibility of design applications employing these more advanced three dimensional viscous flow simulations.

  3. Radiotherapy Monte Carlo simulation using cloud computing technology.

    PubMed

    Poole, C M; Cornelius, I; Trapp, J V; Langton, C M

    2012-12-01

    Cloud computing allows for vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows for Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware, as a proof of principle.
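
    The reported cost behavior is what per-machine-hour billing implies: a workload of T machine-hours finishes in wall-clock time T/n on n machines, while the billed cost is n times the ceiling of T/n hours, which equals T only when n divides T. A small sketch of that accounting (hourly billing granularity is an assumption, not a detail from the paper):

        import math

        def completion_time(T_hours, n):
            return T_hours / n                           # ideal 1/n speed-up

        def relative_cost(T_hours, n):
            return n * math.ceil(T_hours / n) / T_hours  # billed hours / ideal hours

        T = 24
        for n in (4, 5, 6, 7, 8):
            print(n, completion_time(T, n), round(relative_cost(T, n), 3))
        # relative cost is exactly 1.0 when n divides T (n = 4, 6, 8), larger otherwise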

  4. Some theoretical issues on computer simulations

    SciTech Connect

    Barrett, C.L.; Reidys, C.M.

    1998-02-01

    The subject of this paper is the development of mathematical foundations for a theory of simulation. Sequentially updated cellular automata (sCA) over arbitrary graphs are employed as a paradigmatic framework. In the development of the theory, the authors focus on the properties of causal dependencies among local mappings in a simulation. The main object of study is the mapping between a graph representing the dependencies among entities of a simulation and a graph representing the equivalence classes of systems obtained by all possible updates.

  5. Computer simulation of water reclamation processors

    NASA Technical Reports Server (NTRS)

    Fisher, John W.; Hightower, T. M.; Flynn, Michael T.

    1991-01-01

    The development of detailed simulation models of water reclamation processors based on the ASPEN PLUS simulation program is discussed. Individual models have been developed for vapor compression distillation, vapor phase catalytic ammonia removal, and supercritical water oxidation. These models are used for predicting the process behavior. Particular attention is given to the methodology used to complete this work and to the insights gained from this type of model development.

  6. Computer simulation of a few common process control systems

    SciTech Connect

    Muncy, M.P.

    1986-06-01

    This paper shows how to simulate five common process control systems on an IBM PC with a commercially available software package named TUTSIM. All steps involved in producing and checking each simulation are described as clearly as possible. Complete computer listings and output line plots are included to fully document each simulation. Sufficient information is provided so that readers of this paper can duplicate each simulation if they desire to do so. 10 refs., 13 figs., 11 tbls.

  7. Computer Simulation Performed for Columbia Project Cooling System

    NASA Technical Reports Server (NTRS)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,024 Intel Itanium processors) system. The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses of any modern computer room.

  8. Icing simulation: A survey of computer models and experimental facilities

    NASA Technical Reports Server (NTRS)

    Potapczuk, M. G.; Reinmann, J. J.

    1991-01-01

    A survey of the current methods for simulation of the response of an aircraft or aircraft subsystem to an icing encounter is presented. The topics discussed include computer-code modeling of aircraft icing and performance degradation, an evaluation of experimental facility simulation capabilities, and ice protection system evaluation tests in simulated icing conditions. Current research focused on upgrading the simulation fidelity of both experimental and computational methods is discussed. The need for increased understanding of the physical processes governing ice accretion, ice shedding, and iced airfoil aerodynamics is examined.

  9. Two inviscid computational simulations of separated flow about airfoils

    NASA Technical Reports Server (NTRS)

    Barnwell, R. W.

    1976-01-01

    Two inviscid computational simulations of separated flow about airfoils are described. The basic computational method is the line relaxation finite-difference method. Viscous separation is approximated with inviscid free-streamline separation. The point of separation is specified, and the pressure in the separation region is calculated. In the first simulation, the empiricism of constant pressure in the separation region is employed. This empiricism is easier to implement with the present method than with singularity methods. In the second simulation, acoustic theory is used to determine the pressure in the separation region. The results of both simulations are compared with experiment.

  10. Biocellion: accelerating computer simulation of multicellular biological system models

    PubMed Central

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-01-01

    Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Availability and implementation: Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572

  11. In Silico Folding of a Three Helix Protein and Characterization of Its Free-Energy Landscape in an All-Atom Force Field

    NASA Astrophysics Data System (ADS)

    Herges, T.; Wenzel, W.

    2005-01-01

    We report the reproducible first-principles folding of the 40 amino-acid, three-helix headpiece of the HIV accessory protein in a recently developed all-atom free-energy force field. Six of 20 simulations using an adapted basin-hopping method converged to better than 3Å backbone rms deviation to the experimental structure. Using over 60 000 low-energy conformations of this protein, we constructed a decoy tree that completely characterizes its folding funnel.
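
    The basin-hopping strategy mentioned above alternates random perturbations with local minimization and accepts moves by a Metropolis criterion; a toy illustration of the idea on a rugged one-dimensional surrogate energy (not the protein free-energy force field of the study) is:

        import numpy as np
        from scipy.optimize import basinhopping

        def energy(x):
            # Rugged surrogate landscape with many local minima.
            return 0.1 * x[0] ** 2 + np.sin(3.0 * x[0])

        result = basinhopping(energy, x0=[4.0], niter=200, stepsize=1.0, seed=1)
        print("lowest minimum found:", result.x, result.fun)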

  12. Computer simulator for a mobile telephone system

    NASA Technical Reports Server (NTRS)

    Schilling, D. L.

    1981-01-01

    A software simulator was developed to assist NASA in the design of the land mobile satellite service. Structured programming techniques were used: the algorithm was developed in an ALGOL-like pseudo-language and then encoded in FORTRAN IV. The basic input data to the system is a sine wave signal, although future plans call for actual sampled voice as the input signal. The simulator is capable of studying all the possible combinations of types and modes of calls through the use of five communication scenarios: single-hop systems; double-hop, single-gateway systems; double-hop, double-gateway systems; mobile-to-wireline systems; and wireline-to-mobile systems. The transmitter, fading channel, and interference source simulations are also discussed.

  13. Numerical simulation of supersonic wake flow with parallel computers

    SciTech Connect

    Wong, C.C.; Soetrisno, M.

    1995-07-01

    Simulating a supersonic wake flow field behind a conical body is a computing-intensive task. It requires a large number of computational cells to capture the dominant flow physics and a robust numerical algorithm to obtain a reliable solution. High-performance parallel computers, with their distributed processing and data storage capabilities, can meet this need. They have larger computational memory and faster computing time than conventional vector computers. We apply the PINCA Navier-Stokes code to simulate a wind-tunnel supersonic wake experiment on Intel Gamma, Intel Paragon, and IBM SP2 parallel computers. These simulations are performed to study the mean flow in the near-wake region of a sharp, 7-degree half-angle, adiabatic cone at a Mach number of 4.3 and a freestream Reynolds number of 40,600. Overall, the numerical solutions capture the general features of the hypersonic laminar wake flow and compare favorably with the wind tunnel data. With a refined and clustered grid distribution in the recirculation zone, the calculated location of the rear stagnation point is consistent with the 2D axisymmetric and 3D experiments. In this study, we also demonstrate the importance of having a large local memory capacity within a computer node, and of effectively utilizing the number of computer nodes, to achieve good parallel performance when simulating a complex, large-scale wake flow problem.

  14. Computer Simulation of Electric Field Lines.

    ERIC Educational Resources Information Center

    Kirkup, L.

    1985-01-01

    Describes a computer program which plots electric field lines. Includes program listing, sample diagrams produced on a BBC model B microcomputer (which could be produced on other microcomputers by modifying the program), and a discussion of the properties of field lines. (JN)
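
    The BBC BASIC listing itself is not reproduced in the record; the sketch below is a hypothetical Python rendering of the underlying idea: superpose the Coulomb fields of a set of point charges and follow the field direction in small fixed-length steps to trace one line. The charge values, positions and step size are arbitrary.

        # Trace one electric field line of a two-charge system (illustrative values).
        import numpy as np

        charges = [(+1.0, np.array([-1.0, 0.0])),   # (charge, position)
                   (-1.0, np.array([+1.0, 0.0]))]

        def e_field(p):
            """Superposed Coulomb field of the point charges at position p (k = 1)."""
            E = np.zeros(2)
            for q, pos in charges:
                r = p - pos
                E += q * r / np.linalg.norm(r) ** 3
            return E

        def trace_line(start, step=0.01, n_steps=2000):
            """Follow the local field direction with fixed-length Euler steps."""
            pts = [np.array(start, dtype=float)]
            for _ in range(n_steps):
                E = e_field(pts[-1])
                norm = np.linalg.norm(E)
                if norm < 1e-12:                  # stalled at a field zero
                    break
                nxt = pts[-1] + step * E / norm
                # stop once the line runs into a charge
                if min(np.linalg.norm(nxt - pos) for _, pos in charges) < 2 * step:
                    break
                pts.append(nxt)
            return np.array(pts)

        line = trace_line([-0.95, 0.05])
        print(line.shape)            # points along one field line, ready for plotting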

  15. How Real Is a Computer Simulation?

    ERIC Educational Resources Information Center

    Higgins, John J.

    Two keywords, "input" and "get," in the BASIC programming language provide a metaphor for the processes of response and intervention in a dialogue situation. Computer teaching activities can be programmed using one or both of these commands. There are at least five main types: the quiz or overt teaching program, the text processing program, the…

  16. Permeability Coefficients of Lipophilic Compounds Estimated by Computer Simulations.

    PubMed

    Ghaemi, Zhaleh; Alberga, Domenico; Carloni, Paolo; Laio, Alessandro; Lattanzi, Gianluca

    2016-08-01

    The ability of a drug to cross the intestine-blood barrier is a key quantity for drug design and employment and is normally quantified by the permeability coefficient P, often evaluated in the so-called Caco-2 assay. This assay is based on measuring the initial growth rate of the concentration of the drug beyond the cellular barrier but not its steady-state flux through the membrane. This might lead to confusion since, in the case of lipophilic drugs, the initial slope is strongly affected by the retention of the drug in the membrane. This effect is well known but seldom considered in the assay. Here, we exploit all-atom molecular dynamics and bias exchange metadynamics to calculate the concentration of two lipophilic drugs across a model membrane as a function of time. This allows estimating both the steady-state flux and the initial slope of the concentration growth and comparing Caco-2 and steady-state estimates of P. We show that our computational procedure is able to reproduce the experimental values, although these may differ from the permeability coefficients by orders of magnitude. Our findings are generalized by a simplified one-dimensional model of the permeation process that may act as a roadmap to assess which measure of membrane permeability would be more appropriate and, consequently, whether retention corrections should be included in estimates based on Caco-2 assays. PMID:27392273
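
    The authors' one-dimensional permeation model is not reproduced in the abstract. The toy donor/membrane/receiver kinetic sketch below, with made-up rate constants, only illustrates the point at issue: when a lipophilic solute is strongly retained by the membrane (entry much faster than exit), the initial slope of the receiver concentration underestimates the steady-state flux.

        # Toy retention model (not the authors' model); all parameters are invented.
        import numpy as np

        k_in, k_out = 5.0, 0.05     # entry into / exit from the membrane (lipophilic: k_in >> k_out)
        dt, t_end = 0.01, 400.0
        donor, membrane, receiver = 1.0, 0.0, 0.0   # donor held effectively constant

        times, receiver_trace = [], []
        t = 0.0
        while t < t_end:
            j_in = k_in * donor - k_out * membrane  # net flux donor -> membrane
            j_out = k_out * membrane                # flux membrane -> receiver
            membrane += (j_in - j_out) * dt
            receiver += j_out * dt
            t += dt
            times.append(t)
            receiver_trace.append(receiver)

        times, receiver_trace = np.array(times), np.array(receiver_trace)
        initial_slope = np.polyfit(times[times < 5.0], receiver_trace[times < 5.0], 1)[0]
        steady_flux = k_out * membrane              # flux once the membrane has loaded
        print(f"initial slope {initial_slope:.3f} vs steady-state flux {steady_flux:.3f}")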

  17. An Exercise in Biometrical Genetics Based on a Computer Simulation.

    ERIC Educational Resources Information Center

    Murphy, P. J.

    1983-01-01

    Describes an exercise in biometrical genetics based on the noninteractive use of a computer simulation of a wheat hybridization program. Advantages of using the material in this way are also discussed. (Author/JN)

  18. MINEXP, A Computer-Simulated Mineral Exploration Program

    ERIC Educational Resources Information Center

    Smith, Michael J.; And Others

    1978-01-01

    This computer simulation is designed to put students into a realistic decision making situation in mineral exploration. This program can be used with different exploration situations such as ore deposits, petroleum, ground water, etc. (MR)

  19. Parallel solvers for reservoir simulation on MIMD computers

    SciTech Connect

    Piault, E.; Willien, F.; Roux, F.X.

    1995-12-01

    We have investigated parallel solvers for reservoir simulation. We compare different solvers and preconditioners using T3D and SP1 parallel computers. We use a block diagonal domain decomposition preconditioner with non-overlapping sub-domains.
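
    As a rough serial illustration of a block diagonal (block-Jacobi) domain decomposition preconditioner with non-overlapping sub-domains, the NumPy sketch below applies pre-factored diagonal blocks inside a preconditioned conjugate gradient loop. It is an assumed stand-in for the idea, not the distributed T3D/SP1 solvers compared in the paper.

        # Block-Jacobi preconditioned conjugate gradients on a random SPD test matrix.
        import numpy as np

        def make_spd(n, seed=0):
            rng = np.random.default_rng(seed)
            A = rng.standard_normal((n, n))
            return A @ A.T + n * np.eye(n)           # symmetric positive definite test matrix

        def block_jacobi_factors(A, block_size):
            """Cholesky-factor each diagonal block (one block per sub-domain)."""
            blocks = []
            for i in range(0, A.shape[0], block_size):
                j = min(i + block_size, A.shape[0])
                blocks.append((slice(i, j), np.linalg.cholesky(A[i:j, i:j])))
            return blocks

        def apply_preconditioner(blocks, r):
            z = np.empty_like(r)
            for sl, L in blocks:
                y = np.linalg.solve(L, r[sl])        # forward substitution
                z[sl] = np.linalg.solve(L.T, y)      # backward substitution
            return z

        def pcg(A, b, blocks, tol=1e-8, max_iter=500):
            x = np.zeros_like(b)
            r = b - A @ x
            z = apply_preconditioner(blocks, r)
            p, rz = z.copy(), r @ z
            for _ in range(max_iter):
                Ap = A @ p
                alpha = rz / (p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                if np.linalg.norm(r) < tol:
                    break
                z = apply_preconditioner(blocks, r)
                rz_new = r @ z
                p = z + (rz_new / rz) * p
                rz = rz_new
            return x

        A, b = make_spd(64), np.ones(64)
        x = pcg(A, b, block_jacobi_factors(A, block_size=16))
        print(np.linalg.norm(A @ x - b))             # residual of the converged solution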

  20. Advances in Monte Carlo computer simulation

    NASA Astrophysics Data System (ADS)

    Swendsen, Robert H.

    2011-03-01

    Since the invention of the Metropolis method in 1953, Monte Carlo methods have been shown to provide an efficient, practical approach to the calculation of physical properties in a wide variety of systems. In this talk, I will discuss some of the advances in the MC simulation of thermodynamic systems, with an emphasis on optimization to obtain a maximum of useful information.
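
    For orientation, the 1953 Metropolis recipe itself fits in a few lines: propose a random move, accept it with probability min(1, exp(-dE/kT)), and average over the resulting chain of states. The sketch below samples a one-dimensional harmonic oscillator and checks <x^2> = kT/k_spring; it is a toy example, not one of the optimized schemes discussed in the talk.

        # Plain Metropolis sampling of a 1-D harmonic oscillator (toy example).
        import numpy as np

        def metropolis_harmonic(kT=1.0, k_spring=1.0, step=1.0, n_steps=100_000, seed=1):
            rng = np.random.default_rng(seed)
            x, energy = 0.0, 0.0
            samples = np.empty(n_steps)
            for i in range(n_steps):
                x_new = x + rng.uniform(-step, step)
                e_new = 0.5 * k_spring * x_new ** 2
                if e_new <= energy or rng.random() < np.exp(-(e_new - energy) / kT):
                    x, energy = x_new, e_new        # accept the move
                samples[i] = x                      # a rejected move recounts the old state
            return samples

        s = metropolis_harmonic()
        print(s[10_000:].var(), "vs exact", 1.0)    # discard burn-in, compare with kT/k_spring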

  1. Bodies Falling with Air Resistance: Computer Simulation.

    ERIC Educational Resources Information Center

    Vest, Floyd

    1982-01-01

    Two models are presented. The first assumes that air resistance is proportional to the velocity of the falling body. The second assumes that air resistance is proportional to the square of the velocity. A program written in BASIC that simulates the second model is presented. (MP)
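
    The article's BASIC listing is not reproduced in the record; the sketch below is a hypothetical Python rendering of the second model, m dv/dt = mg - cv^2, integrated with simple Euler steps and compared against the terminal velocity sqrt(mg/c). The parameter values are illustrative.

        # Euler integration of a falling body with quadratic air resistance.
        g, m, c = 9.8, 1.0, 0.05                 # gravity, mass, drag coefficient (illustrative)
        dt, v, y = 0.01, 0.0, 0.0                # time step, velocity, distance fallen
        v_terminal = (m * g / c) ** 0.5

        for step in range(1, 2001):              # 20 simulated seconds
            a = g - (c / m) * v * abs(v)         # quadratic drag opposes the motion
            v += a * dt
            y += v * dt
            if step % 100 == 0:                  # report once per simulated second
                print(f"t={step * dt:5.2f} s  v={v:6.2f} m/s  (terminal {v_terminal:.2f} m/s)")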

  2. Advanced Simulation and Computing Business Plan

    SciTech Connect

    Rummel, E.

    2015-07-09

    To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with the industry partners upon whom it relies for today’s and tomorrow’s high performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.

  3. Student Ecosystems Problem Solving Using Computer Simulation.

    ERIC Educational Resources Information Center

    Howse, Melissa A.

    The purpose of this study was to determine the procedural knowledge brought to, and created within, a pond ecology simulation by students. Environmental Decision Making (EDM) is an ecosystems modeling tool that allows users to pose their own problems and seek satisfying solutions. Of specific interest was the performance of biology majors who had…

  4. Understanding Islamist political violence through computational social simulation

    SciTech Connect

    Watkins, Jennifer H; Mackerrow, Edward P; Patelli, Paolo G; Eberhardt, Ariane; Stradling, Seth G

    2008-01-01

    Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.

  5. All-Atom Structural Models of the Transmembrane Domains of Insulin and Type 1 Insulin-Like Growth Factor Receptors

    PubMed Central

    Mohammadiarani, Hossein; Vashisth, Harish

    2016-01-01

    The receptor tyrosine kinase superfamily comprises many cell-surface receptors including the insulin receptor (IR) and type 1 insulin-like growth factor receptor (IGF1R) that are constitutively homodimeric transmembrane glycoproteins. Therefore, these receptors require ligand-triggered domain rearrangements rather than receptor dimerization for activation. Specifically, binding of peptide ligands to receptor ectodomains transduces signals across the transmembrane domains for trans-autophosphorylation in cytoplasmic kinase domains. The molecular details of these processes are poorly understood in part due to the absence of structures of full-length receptors. Using MD simulations and enhanced conformational sampling algorithms, we present all-atom structural models of peptides containing 51 residues from the transmembrane and juxtamembrane regions of IR and IGF1R. In our models, the transmembrane regions of both receptors adopt helical conformations with kinks at Pro961 (IR) and Pro941 (IGF1R), but the C-terminal residues corresponding to the juxtamembrane region of each receptor adopt unfolded and flexible conformations in IR as opposed to a helix in IGF1R. We also observe that the N-terminal residues in IR form a kinked-helix sitting at the membrane–solvent interface, while homologous residues in IGF1R are unfolded and flexible. These conformational differences result in a larger tilt-angle of the membrane-embedded helix in IGF1R in comparison to IR to compensate for interactions with water molecules at the membrane–solvent interfaces. Our metastable/stable states for the transmembrane domain of IR, observed in a lipid bilayer, are consistent with a known NMR structure of this domain determined in detergent micelles, and similar states in IGF1R are consistent with a previously reported model of the dimerized transmembrane domains of IGF1R. Our all-atom structural models suggest potentially unique structural organization of kinase domains in each receptor. PMID

  6. Monte Carlo simulations on SIMD computer architectures

    SciTech Connect

    Burmester, C.P.; Gronsky, R.; Wille, L.T.

    1992-03-01

    Algorithmic considerations regarding the implementation of various materials science applications of the Monte Carlo technique to single instruction multiple data (SIMD) computer architectures are presented. In particular, implementation of the Ising model with nearest, next nearest, and long range screened Coulomb interactions on the SIMD architecture MasPar MP-1 (DEC mpp-12000) series of massively parallel computers is demonstrated. Methods of code development which optimize processor array use and minimize inter-processor communication are presented including lattice partitioning and the use of processor array spanning tree structures for data reduction. Both geometric and algorithmic parallel approaches are utilized. Benchmarks in terms of Monte Carlo updates per second for the MasPar architecture are presented and compared to values reported in the literature from comparable studies on other architectures.
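
    One form of the geometric parallelism described above is the checkerboard decomposition: spins of one sublattice share no nearest-neighbour bonds, so they can all be updated at once. The NumPy sketch below illustrates the idea for the two-dimensional nearest-neighbour Ising model; it is not the MasPar MPL code, and the lattice size and temperature are arbitrary.

        # Vectorized checkerboard Metropolis sweep for the 2-D Ising model.
        import numpy as np

        rng = np.random.default_rng(0)
        L, beta = 64, 0.44                        # lattice size, inverse temperature
        spins = rng.choice([-1, 1], size=(L, L))
        parity = np.add.outer(np.arange(L), np.arange(L)) % 2   # checkerboard mask

        def sweep(spins):
            for colour in (0, 1):                 # update one sublattice at a time
                nn_sum = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0) +
                          np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
                dE = 2.0 * spins * nn_sum         # energy change if each spin flips
                accept = (parity == colour) & (rng.random((L, L)) < np.exp(-beta * dE))
                spins = np.where(accept, -spins, spins)
            return spins

        for _ in range(200):
            spins = sweep(spins)
        print("magnetisation per spin:", spins.mean())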

  7. Computer Simulation of the Beating Human Heart

    NASA Astrophysics Data System (ADS)

    Peskin, Charles S.; McQueen, David M.

    2001-06-01

    The mechanical function of the human heart couples together the fluid mechanics of blood and the soft tissue mechanics of the muscular heart walls and flexible heart valve leaflets. We discuss a unified mathematical formulation of this problem in which the soft tissue looks like a specialized part of the fluid in which additional forces are applied. This leads to a computational scheme known as the Immersed Boundary (IB) method for solving the coupled equations of motion of the whole system. The IB method is used to construct a three-dimensional Virtual Heart, including representations of all four chambers of the heart and all four valves, in addition to the large arteries and veins that connect the heart to the rest of the circulation. The chambers, valves, and vessels are all modeled as collections of elastic (and where appropriate, actively contractile) fibers immersed in viscous incompressible fluid. Results are shown as a computer-generated video animation of the beating heart.
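
    One ingredient of the IB method is spreading the forces carried by the Lagrangian fibre points onto the Eulerian fluid grid through a regularized delta function. The sketch below does only that, in two dimensions, using Peskin's 4-point kernel and two equal and opposite point forces; the fluid solve, velocity interpolation, fibre mechanics and the third dimension of the heart model are all omitted, and the grid and forces are invented.

        # Spread Lagrangian point forces onto a periodic Eulerian grid (IB ingredient).
        import numpy as np

        def phi(r):
            """Peskin's 4-point one-dimensional delta kernel (grid spacing = 1)."""
            r = abs(r)
            if r < 1.0:
                return (3.0 - 2.0 * r + np.sqrt(1.0 + 4.0 * r - 4.0 * r * r)) / 8.0
            if r < 2.0:
                return (5.0 - 2.0 * r - np.sqrt(-7.0 + 12.0 * r - 4.0 * r * r)) / 8.0
            return 0.0

        def spread_forces(points, forces, n, h):
            """Spread point forces at Lagrangian positions onto an n x n periodic grid."""
            f_grid = np.zeros((n, n, 2))
            for (x, y), f in zip(points, forces):
                ic, jc = int(np.floor(x / h)), int(np.floor(y / h))
                for di in range(-2, 3):
                    for dj in range(-2, 3):
                        i, j = (ic + di) % n, (jc + dj) % n
                        w = phi(x / h - (ic + di)) * phi(y / h - (jc + dj))
                        f_grid[i, j] += f * w / (h * h)   # force density on the grid
            return f_grid

        n, h = 32, 1.0 / 32
        points = np.array([[0.50, 0.5], [0.52, 0.5]])
        forces = np.array([[0.0, 1.0], [0.0, -1.0]])
        F = spread_forces(points, forces, n, h)
        print(F.sum(axis=(0, 1)) * h * h)   # integrates back to the net point force (zero here)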

  8. Launch Site Computer Simulation and its Application to Processes

    NASA Technical Reports Server (NTRS)

    Sham, Michael D.

    1995-01-01

    This paper provides an overview of computer simulation, the Lockheed developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon driven model that uses commercial off the shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.

  9. Conceptually enhanced simulations: A computer tool for science teaching

    NASA Astrophysics Data System (ADS)

    Snir, Joseph; Smith, Carol; Grosslight, Lorraine

    1993-06-01

    In this paper, we consider a way computer simulations can be used to address the problem of teaching for conceptual change and understanding. After identifying three levels of understanding of a natural phenomenon (concrete, conceptual, and metaconceptual) that need to be addressed in school science, and classifying computer model systems and simulations more generally in terms of the design choices facing the programmer, we argue that there are ways to design computer simulations that can make them more powerful than laboratory models. In particular, computer simulations that provide an explicit representation for a set of interrelated concepts allow students to perceive what cannot be directly observed in laboratory experiments: representations for the concepts and ideas used for interpreting the experiment. Further, by embedding the relevant physical laws directly into the program code, these simulations allow for genuine discoveries. We describe how we applied these ideas in developing a computer simulation for a particular set of purposes: to help students grasp the distinction between mass and density and to understand the phenomenon of flotation in terms of these concepts. Finally, we reflect on the kinds of activities such conceptually enhanced simulations allow that may be important in bringing about the desired conceptual change.

  10. Computational Aerothermodynamic Simulation Issues on Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.; White, Jeffery A.

    2004-01-01

    The synthesis of physical models for gas chemistry and turbulence from the structured grid codes LAURA and VULCAN into the unstructured grid code FUN3D is described. A directionally Symmetric, Total Variation Diminishing (STVD) algorithm and an entropy fix (eigenvalue limiter) keyed to local cell Reynolds number are introduced to improve solution quality for hypersonic aeroheating applications. A simple grid-adaptation procedure is incorporated within the flow solver. Simulations of flow over an ellipsoid (perfect gas, inviscid), Shuttle Orbiter (viscous, chemical nonequilibrium) and comparisons to the structured grid solvers LAURA (cylinder, Shuttle Orbiter) and VULCAN (flat plate) are presented to show current capabilities. The quality of heating in 3D stagnation regions is very sensitive to algorithm options. In general, high aspect ratio tetrahedral elements complicate the simulation of high Reynolds number, viscous flow as compared to locally structured meshes aligned with the flow.

  11. Computer simulations of particle-surface dynamics

    SciTech Connect

    Karo, A.M.; Hiskes, J.R.; DeBoni, T.M.

    1986-10-01

    Our simulations of particle-surface dynamics use the molecular dynamics codes that we have developed over the past several years. The initial state of a molecule and the parameters defining the incoming trajectory can be specifically described or randomly selected. Statistical analyses of the states of the particles and their trajectories following wall collisions are carried out by the code. We have carried out calculations at high center-of-mass energies and low incidence angles and have examined the survival fraction of molecules and the dependence upon the incoming trajectory. We report also on preliminary efforts that are being made to simulate sputtering and recombinant desorption processes, since the recombinant desorption of hydrogen from typical wall materials may be an important source for vibrationally-excited hydrogen in volume sources; for surface sources the presence of occluded hydrogen may affect the concentration of atomic species.

  12. Computer simulation of surface and film processes

    NASA Technical Reports Server (NTRS)

    Tiller, W. A.

    1981-01-01

    A molecular dynamics technique based upon Lennard-Jones type pair interactions is used to investigate time-dependent as well as equilibrium properties. The case study deals with systems containing Si and O atoms. In this case a more involved potential energy function (PEF) is employed and the system is simulated via a Monte-Carlo procedure. This furnishes the equilibrium properties of the system at its interfaces and surfaces as well as in the bulk.
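
    For reference, the Lennard-Jones 12-6 pair interaction mentioned above has a simple closed form for both energy and force; the sketch below evaluates them and checks the minimum at r = 2^(1/6) sigma. The Si-O case study in the abstract uses a more involved potential energy function than this.

        # Lennard-Jones 12-6 pair energy and force in reduced units.
        import numpy as np

        def lj_energy_and_force(r_vec, epsilon=1.0, sigma=1.0):
            """Energy of the pair and force on particle i, with r_vec = r_i - r_j."""
            r2 = np.dot(r_vec, r_vec)
            sr6 = (sigma * sigma / r2) ** 3
            sr12 = sr6 * sr6
            energy = 4.0 * epsilon * (sr12 - sr6)
            # F_i = -dU/dr_i = 24 eps (2 (sigma/r)^12 - (sigma/r)^6) r_vec / r^2
            force = 24.0 * epsilon * (2.0 * sr12 - sr6) / r2 * r_vec
            return energy, force

        e_min, f_min = lj_energy_and_force(np.array([2.0 ** (1.0 / 6.0), 0.0, 0.0]))
        print(e_min, f_min)   # energy -epsilon and near-zero force at the potential minimum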

  13. Computer simulation of a geomagnetic substorm

    NASA Technical Reports Server (NTRS)

    Lyon, J. G.; Brecht, S. H.; Huba, J. D.; Fedder, J. A.; Palmadesso, P. J.

    1981-01-01

    A global two-dimensional simulation of a substormlike process occurring in earth's magnetosphere is presented. The results are consistent with an empirical substorm model - the neutral-line model. Specifically, the introduction of a southward interplanetary magnetic field forms an open magnetosphere. Subsequently, a substorm neutral line forms at about 15 earth radii or closer in the magnetotail, and plasma sheet thinning and plasma acceleration occur. Eventually the substorm neutral line moves tailward toward its presubstorm position.

  14. The Use of Computer Simulations in High School Curricula.

    ERIC Educational Resources Information Center

    Visich, Marian, Jr.; Braun, Ludwig

    The Huntington Computer Project has developed 17 simulation games which can be used for instructional purposes in high schools. These games were designed to run on digital computers and to deal with material from either biology, physics, or social studies. Distribution was achieved through the Digital Equipment Corporation, which disseminated…

  15. Evaluation of a Computer Simulation in a Therapeutics Case Discussion.

    ERIC Educational Resources Information Center

    Kinkade, Raenel E.; And Others

    1995-01-01

    A computer program was used to simulate a case presentation in pharmacotherapeutics. Students (n=24) used their knowledge of the disease (glaucoma) and various topical agents on the computer program's formulary to "treat" the patient. Comparison of results with a control group found the method as effective as traditional case presentation on…

  16. Application Of Computer Simulation To The Entertainment Industry

    NASA Astrophysics Data System (ADS)

    Mittelman, Phillip S.

    1983-10-01

    Images generated by computer have started to appear in feature films (TRON, Star Trek II), in television commercials and in animated films. Of particular interest is the use of computer generated imagery which simulates the images which a real camera might have made if the imaged objects had been real.

  17. A Digital Computer Simulation of Cardiovascular and Renal Physiology.

    ERIC Educational Resources Information Center

    Tidball, Charles S.

    1979-01-01

    Presents the physiological MACPEE, one of a family of digital computer simulations used in Canada and Great Britain. A general description of the model is provided, along with a sample of computer output format, options for making interventions, advanced capabilities, an evaluation, and technical information for running a MAC model. (MA)

  18. Frontiers in the Teaching of Physiology. Computer Literacy and Simulation.

    ERIC Educational Resources Information Center

    Tidball, Charles S., Ed.; Shelesnyak, M. C., Ed.

    Provided is a collection of papers on computer literacy and simulation originally published in The Physiology Teacher, supplemented by additional papers and a glossary of terms relevant to the field. The 12 papers are presented in five sections. An affirmation of conventional physiology laboratory exercises, coping with computer terminology, and…

  19. COFLO: A Computer Aid for Teaching Ecological Simulation.

    ERIC Educational Resources Information Center

    Levow, Roy B.

    A computer-assisted course was designed to provide students with an understanding of modeling and simulation techniques in quantitative ecology. It deals with continuous systems and has two segments. One develops mathematical and computer tools, beginning with abstract systems and their relation to physical systems. Modeling principles are next…

  20. Remote access of the ILLIAC 4. [computer flow distribution simulations

    NASA Technical Reports Server (NTRS)

    Stevens, K. G., Jr.

    1975-01-01

    The ILLIAC-4 hardware is described. The Illiac system, the Advanced Research Projects Agency computer network, and IMLAC PDS-1 are included. The space shuttle flow simulation is demonstrated to show the feasibility of using an advanced computer from a remote location.

  1. Cardiovascular Physiology Teaching: Computer Simulations vs. Animal Demonstrations.

    ERIC Educational Resources Information Center

    Samsel, Richard W.; And Others

    1994-01-01

    At the introductory level, the computer provides an effective alternative to using animals for laboratory teaching. Computer software can simulate the operation of multiple organ systems. Advantages of software include alteration of variables that are not easily changed in vivo, repeated interventions, and cost-effective hands-on student access.…

  2. Use of Computer Simulations in Microbial and Molecular Genetics.

    ERIC Educational Resources Information Center

    Wood, Peter

    1984-01-01

    Describes five computer programs: four simulations of genetic and physical mapping experiments and one interactive learning program on the genetic coding mechanism. The programs were originally written in BASIC for the VAX-11/750 V.3. mainframe computer and have been translated into Applesoft BASIC for Apple IIe microcomputers. (JN)

  3. Coached, Interactive Computer Simulations: A New Technology for Training.

    ERIC Educational Resources Information Center

    Hummel, Thomas J.

    This paper provides an overview of a prototype simulation-centered intelligent computer-based training (CBT) system--implemented using expert system technology--which provides: (1) an environment in which trainees can learn and practice complex skills; (2) a computer-based coach or mentor to critique performance, suggest improvements, and provide…

  4. The Design, Development, and Evaluation of an Evaluative Computer Simulation.

    ERIC Educational Resources Information Center

    Ehrlich, Lisa R.

    This paper discusses evaluation design considerations for a computer based evaluation simulation developed at the University of Iowa College of Medicine in Cardiology to assess the diagnostic skills of primary care physicians and medical students. The simulation developed allows for the assessment of diagnostic skills of physicians in the…

  5. Computer Simulation of Laboratory Experiments: An Unrealized Potential.

    ERIC Educational Resources Information Center

    Magin, D. J.; Reizes, J. A.

    1990-01-01

    Discussion of the use of computer simulation for laboratory experiments in undergraduate engineering education focuses on work at the University of New South Wales in the instructional design and software development of a package simulating a heat exchange device. The importance of integrating theory, design, and experimentation is also discussed.…

  6. Students from Many Countries Use Computers to Simulate International Negotiations.

    ERIC Educational Resources Information Center

    Wilson, David L.

    1991-01-01

    College students around the world confer by computer in the International Communications and Negotiation Simulation. The simulation is offered by schools within the curriculum or as an extracurricular activity, with faculty as coordinators. Student teams are given scenarios and country assignments, prepare a position paper, and participate in the…

  7. Investigating the Effectiveness of Computer Simulations for Chemistry Learning

    ERIC Educational Resources Information Center

    Plass, Jan L.; Milne, Catherine; Homer, Bruce D.; Schwartz, Ruth N.; Hayward, Elizabeth O.; Jordan, Trace; Verkuilen, Jay; Ng, Florrie; Wang, Yan; Barrientos, Juan

    2012-01-01

    Are well-designed computer simulations an effective tool to support student understanding of complex concepts in chemistry when integrated into high school science classrooms? We investigated scaling up the use of a sequence of simulations of kinetic molecular theory and associated topics of diffusion, gas laws, and phase change, which we designed…

  8. Computer Simulation of the Population Growth (Schizosaccharomyces Pombe) Experiment.

    ERIC Educational Resources Information Center

    Daley, Michael; Hillier, Douglas

    1981-01-01

    Describes a computer program (available from authors) developed to simulate "Growth of a Population (Yeast) Experiment." Students actively revise the counting techniques with realistically simulated haemocytometer or eye-piece grid and are reminded of the necessary dilution technique. Program can be modified to introduce such variables as…

  9. Computer Simulation as an Aid to Managers of Training.

    ERIC Educational Resources Information Center

    Wagner, Harold; Butler, Patrick J.

    Research investigated computer simulations of a hypothetical self-paced training program to determine the utility of this technique as a planning aid for Army training program managers. The General Purpose Simulation System (GPSS) was selected as the programing language and the study was divided into three stages. In Stage I, the daily number of…

  10. Effectiveness of an Endodontic Diagnosis Computer Simulation Program.

    ERIC Educational Resources Information Center

    Fouad, Ashraf F.; Burleson, Joseph A.

    1997-01-01

    Effectiveness of a computer simulation to teach endodontic diagnosis was assessed using three groups (n=34,32,24) of dental students. All were lectured on diagnosis, pathology, and radiographic interpretation. One group then used the simulation, another had a seminar on the same material, and the third group had no further instruction. Results…

  11. Enhancing Computer Science Education with a Wireless Intelligent Simulation Environment

    ERIC Educational Resources Information Center

    Cook, Diane J.; Huber, Manfred; Yerraballi, Ramesh; Holder, Lawrence B.

    2004-01-01

    The goal of this project is to develop a unique simulation environment that can be used to increase students' interest and expertise in Computer Science curriculum. Hands-on experience with physical or simulated equipment is an essential ingredient for learning, but many approaches to training develop a separate piece of equipment or software for…

  12. Design Model for Learner-Centered, Computer-Based Simulations.

    ERIC Educational Resources Information Center

    Hawley, Chandra L.; Duffy, Thomas M.

    This paper presents a model for designing computer-based simulation environments within a constructivist framework for the K-12 school setting. The following primary criteria for the development of simulations are proposed: (1) the problem needs to be authentic; (2) the cognitive demand in learning should be authentic; (3) scaffolding supports a…

  13. Computer simulation program is adaptable to industrial processes

    NASA Technical Reports Server (NTRS)

    Schultz, F. E.

    1966-01-01

    The Reaction kinetics ablation program /REKAP/, developed to simulate ablation of various materials, provides mathematical formulations for computer programs which can simulate certain industrial processes. The programs are based on the use of nonsymmetrical difference equations that are employed to solve complex partial differential equation systems.

  14. A computer simulator for development of engineering system design methodologies

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Sobieszczanski-Sobieski, J.

    1987-01-01

    A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.

  15. Computer Models Simulate Fine Particle Dispersion

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  16. Computer Simulations of Coupled Piano Strings

    NASA Astrophysics Data System (ADS)

    Albert, Destiny L.

    1997-03-01

    The behavior of coupled piano strings is studied using a finite difference scheme. The coupling of the strings produces motion in two transverse directions, parallel and perpendicular to the soundboard. The induced sound shows two decay rates, a rapid decay followed by a slow decay. These effects are in agreement with experimental results (Weinreich, Gabriel. "The Coupled Motion of Piano Strings." Scientific American, January 1979). Our simulations suggest that the motion of the end supports contributes to the elliptical motion of the strings. Furthermore, multiple strings contribute to the quality of the sound produced by a piano.
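
    The finite difference scheme for the coupled strings is not given in the record; the sketch below integrates a single damped string, u_tt = c^2 u_xx - 2 b u_t, with an explicit scheme and fixed ends, which is the basic building block such a simulation extends (coupling between strings, the two polarizations and the moving end supports are omitted, and the parameter values are illustrative).

        # Explicit finite-difference integration of one damped string with fixed ends.
        import numpy as np

        N, c, b = 200, 1.0, 0.5            # grid intervals, wave speed, damping (illustrative)
        dx = 1.0 / N
        dt = 0.5 * dx / c                  # respects the CFL stability limit
        r2 = (c * dt / dx) ** 2

        x = np.linspace(0.0, 1.0, N + 1)
        u_prev = np.sin(np.pi * x)         # initial displacement, zero initial velocity
        u = u_prev.copy()

        for step in range(4000):
            u_next = np.empty_like(u)
            u_next[1:-1] = (2.0 * u[1:-1] - u_prev[1:-1]
                            + r2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
                            - 2.0 * b * dt * (u[1:-1] - u_prev[1:-1]))
            u_next[0] = u_next[-1] = 0.0   # fixed ends
            u_prev, u = u, u_next

        print("remaining amplitude:", np.abs(u).max())   # decays roughly as exp(-b t)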

  17. Advanced computer graphic techniques for laser range finder (LRF) simulation

    NASA Astrophysics Data System (ADS)

    Bedkowski, Janusz; Jankowski, Stanislaw

    2008-11-01

    This paper presents advanced computer graphics techniques for laser range finder (LRF) simulation. The LRF is a common sensor for unmanned ground vehicles, autonomous mobile robots and security applications. Because the cost of the measurement system is extremely high, a simulation tool was designed. The simulation makes it possible to exercise algorithms such as obstacle avoidance [1], SLAM for robot localization [2], detection of vegetation and water obstacles in the surroundings of the robot chassis [3], and LRF measurement in a crowd of people [1]. The Axis Aligned Bounding Box (AABB) approach and an alternative technique based on CUDA (NVIDIA Compute Unified Device Architecture) are presented.
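
    A core primitive of an AABB-based range-finder simulator is the ray/box intersection test; the slab-method sketch below returns the entry distance of a beam into an axis-aligned box, which would serve as the simulated range reading. It is a generic illustration, not code from the paper.

        # Slab-method ray / axis-aligned-bounding-box intersection test.
        import numpy as np

        def ray_aabb(origin, direction, box_min, box_max):
            """Return the entry distance along the ray, or None if the box is missed."""
            t_near, t_far = 0.0, np.inf
            for axis in range(3):
                if abs(direction[axis]) < 1e-12:
                    # Ray parallel to this slab: miss unless the origin lies inside it.
                    if origin[axis] < box_min[axis] or origin[axis] > box_max[axis]:
                        return None
                    continue
                t1 = (box_min[axis] - origin[axis]) / direction[axis]
                t2 = (box_max[axis] - origin[axis]) / direction[axis]
                t_near = max(t_near, min(t1, t2))
                t_far = min(t_far, max(t1, t2))
                if t_near > t_far:
                    return None
            return t_near

        hit = ray_aabb(np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]),
                       np.array([2.0, -1.0, -1.0]), np.array([3.0, 1.0, 1.0]))
        print(hit)   # 2.0: simulated range reading for a beam along +x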

  18. Computer simulation of vasectomy for wolf control

    USGS Publications Warehouse

    Haight, R.G.; Mech, L.D.

    1997-01-01

    Recovering gray wolf (Canis lupus) populations in the Lake Superior region of the United States are prompting state management agencies to consider strategies to control population growth. In addition to wolf removal, vasectomy has been proposed. To predict the population effects of different sterilization and removal strategies, we developed a simulation model of wolf dynamics using simple rules for demography and dispersal. Simulations suggested that the effects of vasectomy and removal in a disjunct population depend largely on the degree of annual immigration. With low immigration, periodic sterilization reduced pup production and resulted in lower rates of territory recolonization. Consequently, average pack size, number of packs, and population size were significantly less than those for an untreated population. Periodically removing a proportion of the population produced roughly the same trends as did sterilization; however, more than twice as many wolves had to be removed than sterilized. With high immigration, periodic sterilization reduced pup production but not territory recolonization and produced only moderate reductions in population size relative to an untreated population. Similar reductions in population size were obtained by periodically removing large numbers of wolves. Our analysis does not address the possible effects of vasectomy on larger wolf populations, but it suggests that the subject should be considered through modeling or field testing.

  19. A heterogeneous computing environment for simulating astrophysical fluid flows

    NASA Technical Reports Server (NTRS)

    Cazes, J.

    1994-01-01

    In the Concurrent Computing Laboratory in the Department of Physics and Astronomy at Louisiana State University we have constructed a heterogeneous computing environment that permits us to routinely simulate complicated three-dimensional fluid flows and to readily visualize the results of each simulation via three-dimensional animation sequences. An 8192-node MasPar MP-1 computer with 0.5 GBytes of RAM provides 250 MFlops of execution speed for our fluid flow simulations. Utilizing the parallel virtual machine (PVM) language, at periodic intervals data is automatically transferred from the MP-1 to a cluster of workstations where individual three-dimensional images are rendered for inclusion in a single animation sequence. Work is underway to replace executions on the MP-1 with simulations performed on the 512-node CM-5 at NCSA and to simultaneously gain access to more potent volume rendering workstations.

  20. Positive Wigner Functions Render Classical Simulation of Quantum Computation Efficient

    NASA Astrophysics Data System (ADS)

    Mari, A.; Eisert, J.

    2012-12-01

    We show that quantum circuits where the initial state and all the following quantum operations can be represented by positive Wigner functions can be classically efficiently simulated. This is true both for continuous-variable as well as discrete variable systems in odd prime dimensions, two cases which will be treated on entirely the same footing. Noting the fact that Clifford and Gaussian operations preserve the positivity of the Wigner function, our result generalizes the Gottesman-Knill theorem. Our algorithm provides a way of sampling from the output distribution of a computation or a simulation, including the efficient sampling from an approximate output distribution in the case of sampling imperfections for initial states, gates, or measurements. In this sense, this work highlights the role of the positive Wigner function as separating classically efficiently simulable systems from those that are potentially universal for quantum computing and simulation, and it emphasizes the role of negativity of the Wigner function as a computational resource.

  1. Computational methods for coupling microstructural and micromechanical materials response simulations

    SciTech Connect

    HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.; FANG,HUEI ELIOT; RINTOUL,MARK DANIEL; VEDULA,VENKATA R.; GLASS,S. JILL; KNOROVSKY,GERALD A.; NEILSEN,MICHAEL K.; WELLMAN,GERALD W.; SULSKY,DEBORAH; SHEN,YU-LIN; SCHREYER,H. BUCK

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

  2. Computer Simulations of Supercooled Liquids and Glasses

    NASA Astrophysics Data System (ADS)

    Kob, Walter

    Glasses are materials that are ubiquitous in our daily life. We find them in such diverse items as window panes, optical fibers, computer chips, ceramics, all of which are oxide glasses, as well as in food, foams, polymers, gels, which are mainly of organic nature. Roughly speaking, glasses are solid materials that have no translational or orientational order on the scale beyond O(10) diameters of the constituent particles (atoms, colloids, …) [1]. Note that these materials are not necessarily homogeneous since, e.g., alkali-glasses such as Na2O-SiO2 show (disordered!) structural features on the length scale of 6-10 Å (compare to the interatomic distance of 1-2 Å) and gels can have structural inhomogeneities that extend up to macroscopic length scales.

  3. Computing abstraction hierarchies by numerical simulation

    SciTech Connect

    Bundy, A.; Giunchiglia, F.; Sebastiani, R.; Walsh, T.

    1996-12-31

    We present a novel method for building ABSTRIPS-style abstraction hierarchies in planning. The aim of this method is to minimize the amount of backtracking between abstraction levels. Previous approaches have determined the criticality of operator preconditions by reasoning about plans directly. Here, we adopt a simpler and faster approach where we use numerical simulation of the planning process. We demonstrate the theoretical advantages of our approach by identifying some simple properties lacking in previous approaches but possessed by our method. We demonstrate the empirical advantages of our approach by a set of four benchmark experiments using the ABTWEAK system. We compare the quality of the abstraction hierarchies generated with those built by the ALPINE and HIGHPOINT algorithms.

  4. Computer simulations of adsorbed liquid crystal films

    NASA Astrophysics Data System (ADS)

    Wall, Greg D.; Cleaver, Douglas J.

    2003-01-01

    The structures adopted by adsorbed thin films of Gay-Berne particles in the presence of a coexisting vapour phase are investigated by molecular dynamics simulation. The films are adsorbed at a flat substrate which favours planar anchoring, whereas the nematic-vapour interface favours normal alignment. On cooling, a system with a high molecule-substrate interaction strength exhibits substrate-induced planar orientational ordering and considerable stratification is observed in the density profiles. In contrast, a system with weak molecule-substrate coupling adopts a director orientation orthogonal to the substrate plane, owing to the increased influence of the nematic-vapour interface. There are significant differences between the structures adopted at the two interfaces, in contrast with the predictions of density functional treatments of such systems.

  5. Computational Simulations and the Scientific Method

    NASA Technical Reports Server (NTRS)

    Kleb, Bil; Wood, Bill

    2005-01-01

    As scientific simulation software becomes more complicated, the scientific-software implementor's need for component tests from new model developers becomes more crucial. The community's ability to follow the basic premise of the Scientific Method requires independently repeatable experiments, and model innovators are in the best position to create these test fixtures. Scientific software developers also need to quickly judge the value of the new model, i.e., its cost-to-benefit ratio in terms of gains provided by the new model and implementation risks such as cost, time, and quality. This paper asks two questions. The first is whether other scientific software developers would find published component tests useful, and the second is whether model innovators think publishing test fixtures is a feasible approach.

  6. Osmosis : a molecular dynamics computer simulation study

    NASA Astrophysics Data System (ADS)

    Lion, Thomas

    Osmosis is a phenomenon of critical importance in a variety of processes ranging from the transport of ions across cell membranes and the regulation of blood salt levels by the kidneys to the desalination of water and the production of clean energy using potential osmotic power plants. However, despite its importance and over one hundred years of study, there is ongoing confusion concerning the nature of the microscopic dynamics of the solvent particles in their transfer across the membrane. In this thesis the microscopic dynamical processes underlying osmotic pressure and concentration gradients are investigated using molecular dynamics (MD) simulations. I first present a new derivation for the local pressure that can be used for determining osmotic pressure gradients. Using this result, the steady-state osmotic pressure is studied in a minimal model for an osmotic system and the steady-state density gradients are explained using a simple mechanistic hopping model for the solvent particles. The simulation setup is then modified, allowing us to explore the timescales involved in the relaxation dynamics of the system in the period preceding the steady state. Further consideration is also given to the relative roles of diffusive and non-diffusive solvent transport in this period. Finally, in a novel modification to the classic osmosis experiment, the solute particles are driven out-of-equilibrium by the input of energy. The effect of this modification on the osmotic pressure and the osmotic flow is studied and we find that active solute particles can cause reverse osmosis to occur. The possibility of defining a new "osmotic effective temperature" is also considered and compared to the results of diffusive and kinetic temperatures.

  7. Computation simulation of the nonlinear response of suspension bridges

    SciTech Connect

    McCallen, D.B.; Astaneh-Asl, A.

    1997-10-01

    Accurate computational simulation of the dynamic response of long-span bridges presents one of the greatest challenges facing the earthquake engineering community. The size of these structures, in terms of physical dimensions and number of main load bearing members, makes computational simulation of transient response an arduous task. Discretization of a large bridge with general purpose finite element software often results in a computational model of such size that excessive computational effort is required for three dimensional nonlinear analyses. The aim of the current study was the development of efficient, computationally based methodologies for the nonlinear analysis of cable supported bridge systems which would allow accurate characterization of a bridge with a relatively small number of degrees of freedom. This work has led to the development of a special purpose software program for the nonlinear analysis of cable supported bridges and the methodologies and software are described and illustrated in this paper.

  8. Derivation and Systematic Validation of a Refined All-Atom Force Field for Phosphatidylcholine Lipids

    PubMed Central

    2012-01-01

    An all-atomistic force field (FF) has been developed for fully saturated phospholipids. The parametrization has been largely based on high-level ab initio calculations in order to keep the empirical input to a minimum. Parameters for the lipid chains have been developed based on knowledge about bulk alkane liquids, for which thermodynamic and dynamic data are excellently reproduced. The FF's ability to simulate lipid bilayers in the liquid crystalline phase in a tensionless ensemble was tested in simulations of three lipids: 1,2-dilauroyl-sn-glycero-3-phosphocholine (DLPC), 1,2-dimyristoyl-sn-glycero-3-phosphocholine (DMPC), and 1,2-dipalmitoyl-sn-glycero-3-phosphocholine (DPPC). Computed areas and volumes per lipid, and three different kinds of bilayer thicknesses, have been investigated. Most importantly, NMR order parameters and scattering form factors agree in an excellent manner with experimental data under a range of temperatures. Further, the compatibility with the AMBER FF for biomolecules as well as the ability to simulate bilayers in gel phase was demonstrated. Overall, the FF presented here provides the important balance between the hydrophilic and hydrophobic forces present in lipid bilayers and therefore can be used for more complicated studies of realistic biological membranes with protein insertions. PMID:22352995

  9. Assessment methodology for computer-based instructional simulations.

    PubMed

    Koenig, Alan; Iseli, Markus; Wainess, Richard; Lee, John J

    2013-10-01

    Computer-based instructional simulations are becoming more and more ubiquitous, particularly in military and medical domains. As the technology that drives these simulations grows ever more sophisticated, the underlying pedagogical models for how instruction, assessment, and feedback are implemented within these systems must evolve accordingly. In this article, we review some of the existing educational approaches to medical simulations, and present pedagogical methodologies that have been used in the design and development of games and simulations at the University of California, Los Angeles, Center for Research on Evaluation, Standards, and Student Testing. In particular, we present a methodology for how automated assessments of computer-based simulations can be implemented using ontologies and Bayesian networks, and discuss their advantages and design considerations for pedagogical use.

  10. Macroevolution simulated with autonomously replicating computer programs.

    PubMed

    Yedid, Gabriel; Bell, Graham

    The process of adaptation occurs on two timescales. In the short term, natural selection merely sorts the variation already present in a population, whereas in the longer term genotypes quite different from any that were initially present evolve through the cumulation of new mutations. The first process is described by the mathematical theory of population genetics. However, this theory begins by defining a fixed set of genotypes and cannot provide a satisfactory analysis of the second process because it does not permit any genuinely new type to arise. The evolutionary outcome of selection acting on novel variation arising over long periods is therefore difficult to predict. The classical problem of this kind is whether 'replaying the tape of life' would invariably lead to the familiar organisms of the modern biota. Here we study the long-term behaviour of populations of autonomously replicating computer programs and find that the same type, introduced into the same simple environment, evolves on any given occasion along a unique trajectory towards one of many well-adapted end points.

  11. Computer simulation of normal and pathological copper metabolism in man.

    PubMed

    Blincoe, C

    1993-01-01

    A digital computer simulation was used to model normal and pathological human copper metabolism. The simulation agrees well with the normal data extant. Wilson's disease (hepatolenticular degeneration) and Menkes' disease (steely-hair syndrome) were simulated. Simulating the unavailability of accumulated liver copper reproduced Wilson's disease if it was assumed that the increased urinary excretion was due to induction of an enzymic mechanism for enhanced excretion. This would be consistent with the genetic defect causing only the sequestering of unavailable copper in the liver. Other genetic defects need not be present. Menkes' disease, also a genetic disease affecting the newborn, was simulated successfully as a defect in absorption of copper from the gastrointestinal tract.

  12. High field electrophoresis—computer simulations

    NASA Astrophysics Data System (ADS)

    Krawczyk, M. J.; Kułakowski, K.

    2004-11-01

    We describe for the first time the results, obtained by means of a new two-dimensional version of a cellular automaton (2DA), designed for the simulation of the gel electrophoresis at high fields. The calculations are performed up to N=442 reptons. The results are compared with those from a modified version of the one-dimensional automaton (1DA), which has been constructed previously. The modification is that the movements of different parts of a molecule of DNA are treated as statistically independent events. This approach is applied also for 2DA. Main results are: (i) for long molecules (N≫1) the velocity v tends to a constant both for 1DA and 2DA; (ii) the diffusion coefficient D for 2DA increases with N; (iii) 2DA enables the formation of so-called hernias, i.e. fragments of DNA locally perpendicular to the molecule, and (iv) a direct observation of the geometration effect. The results (i) and (ii) mimic the experimental behavior at high electric fields. We also calculate a dimensionless quantity y=D/(Lv), where L=Na is the molecule length and a is the stiffness length. The discussion of y reveals the role of the length fluctuations.

  13. Computer simulation of polypeptides in a confinement.

    PubMed

    Sikorski, Andrzej; Romiszowski, Piotr

    2007-02-01

    A coarse-grained model of polypeptide chains confined in a slit formed by two parallel impenetrable surfaces was studied. The chains were flexible heteropolymers (polypeptides) built of two kinds of united atoms: hydrophobic and hydrophilic. The positions of the united atoms were restricted to the vertices of a [310] lattice. The force field consisted of a rigorous excluded volume, a long-distance potential between a pair of amino-acid residues and a local preference for forming secondary structure (helices). The properties of the chains were studied over a wide range of temperatures from good to bad solvent conditions. Monte Carlo simulations were carried out using an algorithm based on local conformational changes of the chain and employing the Replica Exchange technique. The influence of the chain length, the distances between the confining surfaces, the temperature and the force field on the dimensions and the structure of the chains was studied. It was shown that the presence of the confinement complicates the process of chain collapse to low-temperature structures. For some conditions, one can find a rapid decrease of chain size and a second transition indicated by the rapid decrease of the total energy of the system.
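
    The Replica Exchange step mentioned above has a standard acceptance rule: neighbouring temperatures swap configurations with probability min(1, exp[(beta_i - beta_j)(E_i - E_j)]). The sketch below applies one pass of such swaps to placeholder energies and configurations; in the actual study the energies would come from the lattice-model force field.

        # One pass of replica-exchange swap attempts between neighbouring temperatures.
        import numpy as np

        def attempt_swaps(energies, betas, configs, rng):
            for i in range(len(betas) - 1):
                delta = (betas[i] - betas[i + 1]) * (energies[i] - energies[i + 1])
                if delta >= 0.0 or rng.random() < np.exp(delta):
                    configs[i], configs[i + 1] = configs[i + 1], configs[i]
                    energies[i], energies[i + 1] = energies[i + 1], energies[i]
            return energies, configs

        rng = np.random.default_rng(3)
        betas = np.array([1.0, 0.8, 0.6, 0.4])             # inverse temperatures
        energies = np.array([-50.0, -46.0, -41.0, -35.0])  # placeholder replica energies
        configs = ["c0", "c1", "c2", "c3"]                 # stand-ins for chain conformations
        print(attempt_swaps(energies, betas, configs, rng))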

  14. Structure and function of photosystem I–[FeFe] hydrogenase protein fusions: An all-atom molecular dynamics study

    SciTech Connect

    Harris, Bradley J.; Cheng, Xiaolin; Frymier, Paul

    2015-12-15

    All-atom molecular dynamics (MD) simulation was used to study the solution dynamics and protein-protein interactions of protein fusions of photosystem I (PSI) from Thermosynechococcus elongatus and an [FeFe]-hydrogenase (FeFe H2ase) from Clostridium pasteurianum, a unique complex capable of photocatalytic hydrogen production. This study involved fusions of these two proteins via dithiol linkers of different lengths, including decanedithiol, octanedithiol, and hexanedithiol, for which experimental data had previously been obtained. Evaluation of root-mean-squared deviations (RMSDs) relative to the respective crystal structures of PSI and the FeFe H2ase shows that these fusion complexes approach stable equilibrium conformations during the MD simulations. Investigating protein mobility via root-mean-squared fluctuations (RMSFs) reveals that tethering via the shortest hexanedithiol linker results in increased atomic fluctuations of both PSI and the hydrogenase in these fusion complexes. Furthermore, evaluation of the inter- and intraprotein electron transfer distances in these fusion complexes indicates that the structural changes in the FeFe H2ase arising from ligation to PSI via the shortest hexanedithiol linker may hinder electron transport in the hydrogenase, thus providing a molecular level explanation for the observation that the medium-length octanedithiol linker gives the highest hydrogen production rate.

  15. Structure and function of photosystem I–[FeFe] hydrogenase protein fusions: An all-atom molecular dynamics study

    DOE PAGES

    Harris, Bradley J.; Cheng, Xiaolin; Frymier, Paul

    2015-12-15

    All-atom molecular dynamics (MD) simulation was used to study the solution dynamics and protein-protein interactions of protein fusions of photosystem I (PSI) from Thermosynechococcus elongatus and an [FeFe]-hydrogenase (FeFe H2ase) from Clostridium pasteurianum, a unique complex capable of photocatalytic hydrogen production. This study involved fusions of these two proteins via dithiol linkers of different lengths, including decanedithiol, octanedithiol, and hexanedithiol, for which experimental data had previously been obtained. Evaluation of root-mean-squared deviations (RMSDs) relative to the respective crystal structures of PSI and the FeFe H2ase shows that these fusion complexes approach stable equilibrium conformations during the MD simulations. Investigating protein mobility via root-mean-squared fluctuations (RMSFs) reveals that tethering via the shortest hexanedithiol linker results in increased atomic fluctuations of both PSI and the hydrogenase in these fusion complexes. Furthermore, evaluation of the inter- and intraprotein electron transfer distances in these fusion complexes indicates that the structural changes in the FeFe H2ase arising from ligation to PSI via the shortest hexanedithiol linker may hinder electron transport in the hydrogenase, thus providing a molecular level explanation for the observation that the medium-length octanedithiol linker gives the highest hydrogen production rate.

  16. Computational algorithms to simulate the steel continuous casting

    NASA Astrophysics Data System (ADS)

    Ramírez-López, A.; Soto-Cortés, G.; Palomar-Pardavé, M.; Romero-Romo, M. A.; Aguilar-López, R.

    2010-10-01

    Computational simulation is a very powerful tool for analyzing industrial processes in order to reduce operating risks and improve the profitability of equipment. The present work describes the development of computational algorithms, based on numerical methods, to create a simulator for the continuous casting process, which is the most popular method of producing steel products for the metallurgical industry. The kinematics of industrial processing was reproduced computationally using logically programmed subroutines. The steel cast by each strand was calculated using an iterative method nested in the main loop. The process was repeated at each time step (Δt) to compute the casting time; simultaneously, the steel billets produced were counted and stored. The subroutines were used to create a computational representation of a continuous casting plant (CCP) and to display the simulation of the steel displacement through the CCP. These algorithms were developed to create a simulator using the programming language C++. Algorithms for computer animation of the continuous casting process were created using a graphical user interface (GUI). Finally, the simulator functionality was shown and validated by comparison with industrial data on the steel production of three casters.

  17. Combining high performance simulation, data acquisition, and graphics display computers

    NASA Technical Reports Server (NTRS)

    Hickman, Robert J.

    1989-01-01

    Issues involved in the continuing development of an advanced simulation complex are discussed. This approach provides the capability to perform the majority of tests on advanced systems non-destructively. The controlled test environments can be replicated to examine the response of the systems under test to alternative treatments of the system control design, or to test the function and qualification of specific hardware. Field tests verify that the elements simulated in the laboratories are sufficient. The digital simulation is hosted by a Digital Equipment Corp. MicroVAX computer, with an Aptec Computer Systems Model 24 I/O computer performing the communication function. An Applied Dynamics International AD100 performs the high-speed simulation computing, and an Evans and Sutherland PS350 performs on-line graphics display. A Scientific Computer Systems SCS40 acts as a high-performance FORTRAN program processor to support the complex by generating, from programs coded in FORTRAN, the numerous large files required for the real-time processing. Four programming languages are involved in the process: FORTRAN, ADSIM, ADRIO, and STAPLE. FORTRAN is employed on the MicroVAX host to initialize and terminate the simulation runs on the system. The generation of the data files on the SCS40 is also performed with FORTRAN programs. ADSIM and ADRIO are used to program the processing elements of the AD100 and its IOCP processor. STAPLE is used to program the Aptec DIP and DIA processors.

  18. Modeling and Computer Simulation: Molecular Dynamics and Kinetic Monte Carlo

    SciTech Connect

    Wirth, B.D.; Caturla, M.J.; Diaz de la Rubia, T.

    2000-10-10

    Recent years have witnessed tremendous advances in the realistic multiscale simulation of complex physical phenomena, such as irradiation and aging effects of materials, made possible by the enormous progress achieved in computational physics for calculating reliable, yet tractable interatomic potentials and the vast improvements in computational power and parallel computing. As a result, computational materials science is emerging as an important complement to theory and experiment to provide fundamental materials science insight. This article describes the atomistic modeling techniques of molecular dynamics (MD) and kinetic Monte Carlo (KMC), and an example of their application to radiation damage production and accumulation in metals. It is important to note at the outset that the primary objective of atomistic computer simulation should be obtaining physical insight into atomic-level processes. Classical molecular dynamics is a powerful method for obtaining insight about the dynamics of physical processes that occur on relatively short time scales. Current computational capability allows treatment of atomic systems containing as many as 10^9 atoms for times on the order of 100 ns (10^-7 s). The main limitation of classical MD simulation is the relatively short times accessible. Kinetic Monte Carlo provides the ability to reach macroscopic times by modeling diffusional processes and time-scales rather than individual atomic vibrations. Coupling MD and KMC has developed into a powerful, multiscale tool for the simulation of radiation damage in metals.
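
    As an illustration of the event-driven time stepping that lets KMC reach macroscopic times, the sketch below implements a generic residence-time (Gillespie/BKL-style) step: pick one event with probability proportional to its rate, then advance the clock by an exponentially distributed waiting time. The event rates are invented placeholders, not a real defect-diffusion catalogue.

```python
# Minimal residence-time kinetic Monte Carlo step: choose an event in
# proportion to its rate, advance time by an exponential waiting time.
import math
import random

def kmc_step(rates, t, rng=random):
    """Pick one event proportionally to its rate and advance the clock."""
    total = sum(rates)
    r = rng.random() * total
    acc = 0.0
    for i, rate in enumerate(rates):
        acc += rate
        if r < acc:
            chosen = i
            break
    dt = -math.log(1.0 - rng.random()) / total   # exponential waiting time
    return chosen, t + dt

t = 0.0
rates = [1.0e6, 5.0e5, 2.0e4]   # e.g. hop, rotate, dissociate (assumed values)
for _ in range(5):
    event, t = kmc_step(rates, t)
    print(f"event {event} fired, t = {t:.3e} s")
```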

  19. Embedding quantum simulators for quantum computation of entanglement.

    PubMed

    Di Candia, R; Mejia, B; Castillo, H; Pedernales, J S; Casanova, J; Solano, E

    2013-12-13

    We introduce the concept of embedding quantum simulators, a paradigm allowing the efficient quantum computation of a class of bipartite and multipartite entanglement monotones. It consists in the suitable encoding of a simulated quantum dynamics in the enlarged Hilbert space of an embedding quantum simulator. In this manner, entanglement monotones are conveniently mapped onto physical observables, overcoming the necessity of full tomography and reducing drastically the experimental requirements. Furthermore, this method is directly applicable to pure states and, assisted by classical algorithms, to the mixed-state case. Finally, we expect that the proposed embedding framework paves the way for a general theory of enhanced one-to-one quantum simulators.

  20. Computer simulation tests of optimized neutron powder diffractometer configurations

    NASA Astrophysics Data System (ADS)

    Cussen, L. D.; Lieutenant, K.

    2016-06-01

    Recent work has developed a new mathematical approach to optimally choose beam elements for constant-wavelength neutron powder diffractometers. This article compares Monte Carlo computer simulations of existing instruments with simulations of instruments using configurations chosen using the new approach. The simulations show that large performance improvements over current best practice are possible. The tests here are limited to instruments optimized for samples with a cubic structure, which differs from the optimization for triclinic-structure samples. A novel primary spectrometer design is discussed, and simulation tests show that it performs as expected and allows a single instrument to operate flexibly over a wide range of measurement resolution.

  1. Executive Summary: Special Section on Credible Computational Fluid Dynamics Simulations

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B.

    1998-01-01

    This summary presents the motivation for the Special Section on the credibility of computational fluid dynamics (CFD) simulations, its objective, its background and context, its content, and its major conclusions. Verification and validation (V&V) are the processes for establishing the credibility of CFD simulations. Validation assesses whether correct things are performed and verification assesses whether they are performed correctly. Various aspects of V&V are discussed. Progress is made in verification of simulation models. Considerable effort is still needed for developing a systematic validation method that can assess the credibility of simulated reality.

  2. HPC Infrastructure for Solid Earth Simulation on Parallel Computers

    NASA Astrophysics Data System (ADS)

    Nakajima, K.; Chen, L.; Okuda, H.

    2004-12-01

    Recently, various types of parallel computers with various types of architectures and processing elements (PE) have emerged, including PC clusters and the Earth Simulator. Moreover, users can easily access these computer resources through networks in a Grid environment. It is well known that thorough tuning is required for programmers to achieve excellent performance on each computer. The method for tuning strongly depends on the type of PE and architecture. Optimization by tuning is very tough work, especially for developers of applications. Moreover, parallel programming using a message-passing library such as MPI is another big task for application programmers. In the GeoFEM project (http://gefeom.tokyo.rist.or.jp), the authors have developed a parallel FEM platform for solid earth simulation on the Earth Simulator, which supports parallel I/O, parallel linear solvers, and parallel visualization. This platform can efficiently hide complicated procedures for parallel programming and optimization on vector processors from application programmers. This type of infrastructure is very useful: source code developed on a single-processor PC is easily optimized for a massively parallel computer by linking it to the parallel platform installed on the target computer. This parallel platform, called HPC Infrastructure, will provide dramatic efficiency, portability, and reliability in the development of scientific simulation codes. For example, the number of lines of source code is expected to be less than 10,000, and porting legacy codes to a parallel computer takes 2 or 3 weeks. The original GeoFEM platform supports only I/O, linear solvers, and visualization. In the present work, further development for adaptive mesh refinement (AMR) and dynamic load balancing (DLB) has been carried out. In this presentation, examples of large-scale solid earth simulation using the Earth Simulator will be demonstrated. Moreover, recent results of a parallel computational steering tool using an

  3. Computer simulation for optimization of offshore platform evacuation

    SciTech Connect

    Soma, H.; Drager, K.H.; Bjoerdal, P.

    1996-12-31

    A method for optimizing the evacuation system on offshore platforms, in which computer simulation provides a main contribution, is presented. The use of computer simulation in offshore projects is explained, and its contribution with respect to input to the Quantitative Risk Analyses (QRA) and to the engineering is also presented. In order to design an optimum evacuation system on offshore platforms, detailed analyses and sensitivity calculations are required. With computer programs and simulation tools, the work load is no longer prohibitive for comprehensive optimization calculations to be performed. The evacuation system can accordingly be designed based on engineering considerations, rather than mainly relying on the preferences of the design team involved in the project. A description of three computer programs which perform stochastic reliability analyses of evacuation operations is presented: Evacuation Simulations (EVACSIM) simulates the evacuation (egress) of personnel on the platform, Lifeboat Launch for Conventional lifeboats (LBL-C) simulates the launch and escape operation of davit-launched lifeboats, and Lifeboat Launch for Free fall lifeboats (LBL-F) simulates the launch and escape operation of slide-launched or vertical-drop free-fall lifeboats. Other computer programs that analyze parts of the evacuation process, such as Offshore Rescue Simulation (ORS), are mentioned. The result of this synthesis is an estimate of the yearly number of lives lost during evacuation of a platform, which is a suitable parameter for optimizing the evacuation system and deciding on improvements. The impact of changing design parameters is found by carrying out evacuation analyses for the revised design (i.e., a sensitivity) and comparing the resulting loss of lives with the Base Case results. By systematizing this approach, the evacuation system on the platform can thus be optimized.

  4. Computers vs. wind tunnels for aerodynamic flow simulations

    NASA Technical Reports Server (NTRS)

    Chapman, D. R.; Mark, H.; Pirtle, M. W.

    1975-01-01

    It is pointed out that in other fields of computational physics, such as ballistics, celestial mechanics, and neutronics, computations have already displaced experiments as the principal means of obtaining dynamic simulations. In the case of aerodynamic investigations, the complexity of the computational work involved in solving the Navier-Stokes equations is the reason that such investigations currently rely mainly on wind-tunnel testing. However, because of inherent limitations of the wind-tunnel approach and economic considerations, it appears that at some time in the future aerodynamic studies will rely chiefly on computational flow data provided by the computer. Taking into account projected development trends, it is estimated that computers with the capabilities required for a solution of the complete viscous, time-dependent Navier-Stokes equations will be available in the mid-1980s.

  5. An All-Atom Force Field for Tertiary Structure Prediction of Helical Proteins

    PubMed Central

    Herges, T.; Wenzel, W.

    2004-01-01

    We have developed an all-atom free-energy force field (PFF01) for protein tertiary structure prediction. PFF01 is based on physical interactions and was parameterized using experimental structures of a family of proteins believed to span a wide variety of possible folds. It contains empirical, although sequence-independent, terms for hydrogen bonding. Its solvent-accessible surface area solvent model was first fit to transfer energies of small peptides. The parameters of the solvent model were then further optimized to stabilize the native structure of a single protein, the autonomously folding villin headpiece, against competing low-energy decoys. Here we validate the force field for five nonhomologous helical proteins with 20–60 amino acids. For each protein, decoys with 2–3 Å backbone root-mean-square deviation and correct experimental Cβ–Cβ distance constraints emerge as those with the lowest energy. PMID:15507688

  6. Understanding amyloid fibril nucleation and aβ oligomer/drug interactions from computer simulations.

    PubMed

    Nguyen, Phuong; Derreumaux, Philippe

    2014-02-18

    the critical nucleus size might be on the order of 20 chains under physiological conditions. The transition state might be characterized by a simultaneous change from mixed antiparallel/parallel β-strands with random side-chain packing to the final antiparallel or parallel states with the steric zipper packing of the side chains. Second, we review our current computer-based knowledge of the 3D structures of inhibitors with Aβ42 monomer and oligomers, a prerequisite for developing new drugs against AD. Recent extensive all-atom simulations of Aβ42 dimers with known inhibitors such as the green tea compound epigallocatechin-3-gallate and 1,4-naphthoquinon-2-yl-l-tryptophan provide a spectrum of initial Aβ42/inhibitor structures useful for screening and drug design. We conclude by discussing future directions that may offer opportunities to fully understand nucleation and further AD drug development.

  7. High performance computing for domestic petroleum reservoir simulation

    SciTech Connect

    Zyvoloski, G.; Auer, L.; Dendy, J.

    1996-06-01

    This is the final report of a two-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory. High-performance computing offers the prospect of greatly increasing the resolution at which petroleum reservoirs can be represented in simulation models. The increases in resolution can be achieved through large increases in computational speed and memory, if machine architecture and numerical methods for solution of the multiphase flow equations can be used to advantage. Perhaps more importantly, the increased speed and size of today's computers make it possible to add physical processes to simulation codes that heretofore were too expensive in terms of computer time and memory to be practical. These factors combine to allow the development of new, more accurate methods for optimizing petroleum reservoir production.

  8. KU-Band rendezvous radar performance computer simulation model

    NASA Technical Reports Server (NTRS)

    Griffin, J. W.

    1980-01-01

    The preparation of a real-time computer simulation model of the Ku-band rendezvous radar to be integrated into the shuttle mission simulator (SMS), the shuttle engineering simulator (SES), and the shuttle avionics integration laboratory (SAIL) simulator is described. To meet crew training requirements, a radar tracking performance model and a target modeling method were developed. The parent simulation/radar simulation interface requirements and the method selected to model target scattering properties, including an application of this method to the SPAS spacecraft, are described. The radar search and acquisition mode performance model and the radar track mode signal processor model are examined and analyzed. The angle, angle rate, range, and range rate tracking loops are also discussed.

  9. High performance stream computing for particle beam transport simulations

    NASA Astrophysics Data System (ADS)

    Appleby, R.; Bailey, D.; Higham, J.; Salt, M.

    2008-07-01

    Understanding modern particle accelerators requires simulating charged particle transport through the machine elements. These simulations can be very time consuming due to the large number of particles and the need to consider many turns of a circular machine. Stream computing offers an attractive way to dramatically improve the performance of such simulations by calculating the simultaneous transport of many particles using dedicated hardware. Modern Graphics Processing Units (GPUs) are powerful and affordable stream computing devices. The results of simulations of particle transport through the booster-to-storage-ring transfer line of the DIAMOND synchrotron light source using an NVidia GeForce 7900 GPU are compared to the standard transport code MAD. It is found that particle transport calculations are suitable for stream processing and large performance increases are possible. The accuracy and potential speed gains are compared and the prospects for future work in the area are discussed.
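
    A simple way to see why particle transport maps well onto stream processors is that the same transfer map is applied independently to every particle. The sketch below applies a textbook drift/thin-quadrupole map to a large bunch with a single vectorized operation; the optics values are arbitrary assumptions and do not represent the DIAMOND transfer line or the MAD model.

```python
# Illustration of why particle transport suits stream/data-parallel hardware:
# one linear transfer map is applied independently to every particle.
import numpy as np

def drift(L):
    """2x2 transfer matrix for a drift of length L (one transverse plane)."""
    return np.array([[1.0, L], [0.0, 1.0]])

def thin_quad(f):
    """2x2 thin-lens quadrupole of focal length f."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# A toy one-pass map built as a product of elements (values are arbitrary).
M = drift(2.0) @ thin_quad(5.0) @ drift(2.0)

rng = np.random.default_rng(1)
particles = rng.normal(scale=[1e-3, 1e-4], size=(1_000_000, 2))  # (x, x')

# The whole bunch is transported with one matrix multiply; on a GPU the same
# expression runs with each particle handled by its own thread.
transported = particles @ M.T
print(transported[:3])
```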

  10. Accuracy vs. computational time: translating aortic simulations to the clinic.

    PubMed

    Brown, Alistair G; Shi, Yubing; Marzo, Alberto; Staicu, Cristina; Valverde, Isra; Beerbaum, Philipp; Lawford, Patricia V; Hose, D Rodney

    2012-02-01

    State-of-the-art simulations of aortic haemodynamics feature full fluid-structure interaction (FSI) and coupled 0D boundary conditions. Such analyses require not only significant computational resources but also weeks to months of run time, which compromises the effectiveness of their translation to a clinical workflow. This article employs three computational fluid methodologies of varying levels of complexity, with coupled 0D boundary conditions, to simulate the haemodynamics within a patient-specific aorta. The most comprehensive model is a full FSI simulation. The simplest is a rigid-walled incompressible fluid simulation, while an alternative middle-ground approach employs a compressible fluid tuned to elicit a response analogous to the compliance of the aortic wall. The results demonstrate that, in the context of certain clinical questions, the simpler analysis methods may capture the important characteristics of the flow field.

  11. Simulation of an interferometric computed tomography system for intraocular lenses

    NASA Astrophysics Data System (ADS)

    Tayag, Tristan J.; Bachim, Brent L.

    2010-08-01

    In this paper, we present a metrology system to characterize the refractive index profile of intraocular lenses (IOLs). Our system is based on interferometric optical phase computed tomography. We believe this metrology system to be a key enabling technology in the development of the next generation of IOLs. We propose a Fizeau-based optical configuration and present a simulation study on the application of computed tomography to IOL characterization.

  12. Method for simulating paint mixing on computer monitors

    NASA Astrophysics Data System (ADS)

    Carabott, Ferdinand; Lewis, Garth; Piehl, Simon

    2002-06-01

    Computer programs like Adobe Photoshop can generate a mixture of two 'computer' colors by using the Gradient control. However, the resulting colors diverge from the equivalent paint mixtures in both hue and value. This study examines why programs like Photoshop are unable to simulate paint or pigment mixtures, and offers a solution using Photoshop's existing tools. The article discusses how a library of colors, simulating paint mixtures, is created from 13 artists' colors. The mixtures can be imported into Photoshop as a color swatch palette of 1248 colors and as 78 continuous or stepped gradient files, all accessed in a new software package, Chromafile.

  13. Energy Efficient Biomolecular Simulations with FPGA-based Reconfigurable Computing

    SciTech Connect

    Hampton, Scott S; Agarwal, Pratul K

    2010-05-01

    Reconfigurable computing (RC) is being investigated as a hardware solution for improving time-to-solution for biomolecular simulations. A number of popular molecular dynamics (MD) codes are used to study various aspects of biomolecules. These codes are now capable of simulating nanosecond time-scale trajectories per day on conventional microprocessor-based hardware, but biomolecular processes often occur at the microsecond time-scale or longer. A wide gap exists between the desired and achievable simulation capability; therefore, there is considerable interest in alternative algorithms and hardware for improving the time-to-solution of MD codes. The fine-grain parallelism provided by Field Programmable Gate Arrays (FPGAs), combined with their low power consumption, makes them an attractive solution for improving the performance of MD simulations. In this work, we use an FPGA-based coprocessor to accelerate the compute-intensive calculations of LAMMPS, a popular MD code, achieving up to a 5.5-fold speed-up on the non-bonded force computations of the particle mesh Ewald method and up to a 2.2-fold speed-up in overall time-to-solution, and potentially an increase by a factor of 9 in power-performance efficiency for the pair-wise computations. The results presented here provide an example of the multi-faceted benefits to an application in a heterogeneous computing environment.

  14. Urban earthquake simulation of Tokyo metropolis using full K computer

    NASA Astrophysics Data System (ADS)

    Fujita, Kohei; Ichimura, Tsuyoshi; Hori, Muneo

    2016-04-01

    Reflecting detailed urban geographic information data in earthquake simulations of cities is expected to improve the reliability of damage estimates for future earthquakes. Such simulations require high-resolution computation of large and complex domains, and thus fast and scalable finite element solvers capable of utilizing supercomputers are needed. Targeting massively parallel scalar supercomputers, we have been developing a fast low-order unstructured finite element solver by combining multi-precision arithmetic, the multi-grid method, predictors, and techniques for utilizing the multi-cores and SIMD units of CPUs. In this talk, I will show the developed method and its scalability/performance on the K computer. I will also show some small-scale measurement results on Intel Haswell CPU servers for checking performance portability. As an application example, I will show an urban earthquake simulation targeting a 10 km by 9 km area of central Tokyo with 320 thousand structures. Here the surface ground is modeled by 33 billion elements and 133 billion degrees of freedom, and its seismic response is computed using the whole K computer with 82944 compute nodes. The fast and scalable finite element method can be applied to earthquake wave propagation problems through the earth crust or to elastic/viscoelastic crustal deformation analyses, and is expected to be useful for improving the resolution of such simulations in the future.

  15. Parallelized computation for computer simulation of electrocardiograms using personal computers with multi-core CPU and general-purpose GPU.

    PubMed

    Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong

    2010-10-01

    Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel CPU of Core (TM) 2 Quad Q6600 and a GPU of Geforce 8800GT, with software support by OpenMP and CUDA. It was tested in three parallelization device setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus 1 core of CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setting (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setting (a), 16.8 in setting (b), and 20.0 in setting (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies.

  17. Computer simulated images of radiopharmaceutical distributions in anthropomorphic phantoms

    SciTech Connect

    Not Available

    1991-05-17

    We have constructed an anatomically correct human geometry, which can be used to store radioisotope concentrations in 51 different internal organs. Each organ is associated with an index number which references its attenuating characteristics (composition and density). The initial development of Computer Simulated Images of Radiopharmaceutical Distributions in Anthropomorphic Phantoms (CSIRDAP) over the first 3 years has been very successful. All components of the simulation have been coded, made operational, and debugged.

  18. Computer Simulation of Sexual Selection on Age-Structured Populations

    NASA Astrophysics Data System (ADS)

    Martins, S. G. F.; Penna, T. J. P.

    Using computer simulations of a bit-string model for age-structured populations, we found that sexual selection of older males is advantageous, from an evolutionary point of view. These results are in opposition to a recent proposal of females choosing younger males. Our simulations are based on findings from recent studies of polygynous bird species. Since secondary sex characters are found mostly in males, we could make use of asexual populations that can be implemented in a fast and efficient way.

  19. Computer simulations of ions in radio-frequency traps

    NASA Technical Reports Server (NTRS)

    Williams, A.; Prestage, J. D.; Maleki, L.; Djomehri, J.; Harabetian, E.

    1990-01-01

    The motion of ions in a trapped-ion frequency standard affects the stability of the standard. In order to study the motion and structures of large ion clouds in a radio-frequency (RF) trap, a computer simulation of the system that incorporates the effect of thermal excitation of the ions was developed. Results are presented from the simulation for cloud sizes up to 512 ions, emphasizing cloud structures in the low-temperature regime.

  20. A Generic Scheduling Simulator for High Performance Parallel Computers

    SciTech Connect

    Yoo, B S; Choi, G S; Jette, M A

    2001-08-01

    It is well known that efficient job scheduling plays a crucial role in achieving high system utilization in large-scale high performance computing environments. A good scheduling algorithm should schedule jobs to achieve high system utilization while satisfying various user demands in an equitable fashion. Designing such a scheduling algorithm is a non-trivial task even in a static environment. In practice, the computing environment and workload are constantly changing. There are several reasons for this. First, computing platforms constantly evolve as the technology advances. For example, the availability of relatively powerful commodity off-the-shelf (COTS) components at steadily diminishing prices has made it feasible to construct ever larger massively parallel computers in recent years [1, 4]. Second, the workload imposed on the system also changes constantly. The rapidly increasing compute resources have provided many application developers with the opportunity to radically alter program characteristics and take advantage of these additional resources. New developments in software technology may also trigger changes in user applications. Finally, changes in the political climate may alter user priorities or the mission of the organization. System designers in such dynamic environments must be able to accurately forecast the effect of changes in the hardware, software, and/or policies under consideration. If the environmental changes are significant, one must also reassess scheduling algorithms. Simulation has frequently been relied upon for this analysis, because other methods such as analytical modeling or actual measurements are usually too difficult or costly. A drawback of the simulation approach, however, is that developing a simulator is a time-consuming process. Furthermore, an existing simulator cannot be easily adapted to a new environment. In this research, we attempt to develop a generic job-scheduling simulator, which facilitates the evaluation of

  1. A Computer Simulation of Community Pharmacy Practice for Educational Use

    PubMed Central

    Ling, Tristan; Bereznicki, Luke; Westbury, Juanita; Chalmers, Leanne; Peterson, Gregory; Ollington, Robert

    2014-01-01

    Objective. To provide a computer-based learning method for pharmacy practice that is as effective as paper-based scenarios, but more engaging and less labor-intensive. Design. We developed a flexible and customizable computer simulation of community pharmacy. Using it, students can work through scenarios that encapsulate the entirety of a patient presentation. We compared the traditional paper-based teaching method to our computer-based approach using equivalent scenarios. The paper-based group had 2 tutors while the computer group had none. Both groups were given a prescenario and postscenario clinical knowledge quiz and survey. Assessment. Students in the computer-based group had generally greater improvements in their clinical knowledge scores, and third-year students using the computer-based method also showed more improvement in history taking and counseling competencies. Third-year students also found the simulation fun and engaging. Conclusion. Our simulation of community pharmacy provided an educational experience as effective as the paper-based alternative, despite the lack of a human tutor. PMID:26056406

  2. The advanced computational testing and simulation toolkit (ACTS)

    SciTech Connect

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  3. Computer model to simulate testing at the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Owens, Lewis R., Jr.; Wahls, Richard A.; Hannon, Judith A.

    1995-01-01

    A computer model has been developed to simulate the processes involved in the operation of the National Transonic Facility (NTF), a large cryogenic wind tunnel at the Langley Research Center. The simulation was verified by comparing the simulated results with previously acquired data from three experimental wind tunnel test programs in the NTF. The comparisons suggest that the computer model simulates reasonably well the processes that determine the liquid nitrogen (LN2) consumption, electrical consumption, fan-on time, and the test time required to complete a test plan at the NTF. From these limited comparisons, it appears that the results from the simulation model are generally within about 10 percent of the actual NTF test results. The use of actual data acquisition times in the simulation produced better estimates of the LN2 usage, as expected. Additional comparisons are needed to refine the model constants. The model will typically produce optimistic results since the times and rates included in the model are typically the optimum values. Any deviation from the optimum values will lead to longer times or increased LN2 and electrical consumption for the proposed test plan. Computer code operating instructions and listings of sample input and output files have been included.

  4. Modeling emergency department operations using advanced computer simulation systems.

    PubMed

    Saunders, C E; Makens, P K; Leblanc, L J

    1989-02-01

    We developed a computer simulation model of emergency department operations using simulation software. This model uses multiple levels of preemptive patient priority; assigns each patient to an individual nurse and physician; incorporates all standard tests, procedures, and consultations; and allows patient service processes to proceed simultaneously, sequentially, repetitively, or a combination of these. Selected input data, including the number of physicians, nurses, and treatment beds, and the blood test turnaround time, then were varied systematically to determine their simulated effect on patient throughput time, selected queue sizes, and rates of resource utilization. Patient throughput time varied directly with laboratory service times and inversely with the number of physician or nurse servers. Resource utilization rates varied inversely with resource availability, and patient waiting time and patient throughput time varied indirectly with the level of patient acuity. The simulation can be animated on a computer monitor, showing simulated patients, specimens, and staff members moving throughout the ED. Computer simulation is a potentially useful tool that can help predict the results of changes in the ED system without actually altering it and may have implications for planning, optimizing resources, and improving the efficiency and quality of care.

  5. A compositional reservoir simulator on distributed memory parallel computers

    SciTech Connect

    Rame, M.; Delshad, M.

    1995-12-31

    This paper presents the application of distributed memory parallel computers to field-scale reservoir simulations using a parallel version of UTCHEM, The University of Texas Chemical Flooding Simulator. The model is a general-purpose, highly vectorized chemical compositional simulator that can simulate a wide range of displacement processes at both field and laboratory scales. The original simulator was modified to run on both distributed memory parallel machines (Intel iPSC/860 and Delta, Connection Machine 5, Kendall Square 1 and 2, and CRAY T3D) and a cluster of workstations. A domain decomposition approach has been taken towards parallelization of the code. A portion of the discrete reservoir model is assigned to each processor by a set-up routine that attempts a data layout as even as possible from the load-balance standpoint. Each of these subdomains is extended so that data can be shared between adjacent processors for stencil computation. The added routines that make parallel execution possible are written in a modular fashion that makes porting to new parallel platforms straightforward. Results on the distributed-memory computing performance of the parallel simulator are presented for field-scale applications such as a tracer flood and a polymer flood. A comparison of the wall-clock times for the same problems on a vector supercomputer is also presented.
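
    The domain decomposition pattern described here (assign a slab of the grid to each processor and extend it with ghost cells so stencils can be evaluated locally) can be sketched in one dimension as follows; plain array slices stand in for the message passing a real distributed-memory code would use.

```python
# Sketch of domain decomposition with ghost cells: each "processor" owns a
# slab of the grid plus one ghost value on each side so a 3-point stencil can
# be evaluated without further communication.  Slices stand in for MPI.
import numpy as np

n, n_sub = 16, 4
field = np.linspace(0.0, 1.0, n)

def split_with_ghosts(u, parts):
    size = len(u) // parts
    subs = []
    for p in range(parts):
        lo, hi = p * size, (p + 1) * size
        left = u[lo - 1] if lo > 0 else u[lo]        # ghost from left neighbour
        right = u[hi] if hi < len(u) else u[hi - 1]  # (domain edges copied)
        subs.append(np.concatenate(([left], u[lo:hi], [right])))
    return subs

def stencil(sub):
    # simple 3-point average acting only on the interior (owned) points
    return 0.5 * (sub[:-2] + sub[2:])

pieces = [stencil(s) for s in split_with_ghosts(field, n_sub)]
assembled = np.concatenate(pieces)
print(assembled.shape)   # (16,) -- same layout as the undecomposed grid
```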

  6. Using computer simulations to facilitate conceptual understanding of electromagnetic induction

    NASA Astrophysics Data System (ADS)

    Lee, Yu-Fen

    This study investigated the use of computer simulations to facilitate conceptual understanding in physics. The use of computer simulations in the present study was grounded in a conceptual framework drawn from findings related to the use of computer simulations in physics education. To achieve the goal of effective utilization of computers for physics education, I first reviewed studies pertaining to computer simulations in physics education, categorized by three different learning frameworks, and studies comparing the effects of different simulation environments. My intent was to identify the learning context and factors for successful use of computer simulations in past studies and to learn from the studies which did not obtain a significant result. Based on the analysis of the reviewed literature, I proposed effective approaches to integrating computer simulations in physics education. These approaches are consistent with well-established education principles such as those suggested by How People Learn (Bransford, Brown, Cocking, Donovan, & Pellegrino, 2000). The research-based approaches to integrating computer simulations in physics education form a learning framework called Concept Learning with Computer Simulations (CLCS) in the current study. The second component of this study was to examine the CLCS learning framework empirically. The participants were recruited from a public high school in Beijing, China. All participating students were randomly assigned to two groups, the experimental (CLCS) group and the control (TRAD) group. Research-based computer simulations developed by the physics education research group at the University of Colorado at Boulder were used to tackle common conceptual difficulties in learning electromagnetic induction. While interacting with computer simulations, CLCS students were asked to answer reflective questions designed to stimulate qualitative reasoning and explanation. After receiving model reasoning online, students were asked to submit

  7. Dynamic computer simulation of the Fort St. Vrain steam turbines

    SciTech Connect

    Conklin, J.C.

    1983-01-01

    A computer simulation is described for the dynamic response of the Fort St. Vrain nuclear reactor regenerative intermediate- and low-pressure steam turbines. The fundamental computer-modeling assumptions for the turbines and feedwater heaters are developed. A turbine heat balance specifying steam and feedwater conditions at a given generator load and the volumes of the feedwater heaters are all that are necessary as descriptive input parameters. Actual plant data for a generator load reduction from 100 to 50% power (which occurred as part of a plant transient on November 9, 1981) are compared with computer-generated predictions, with reasonably good agreement.

  8. AKSATINT - SATELLITE INTERFERENCE ANALYSIS AND SIMULATION USING PERSONAL COMPUTERS

    NASA Technical Reports Server (NTRS)

    Kantak, A.

    1994-01-01

    In the late seventies, the number of communication satellites in service increased, and interference became an increasingly important consideration in designing satellite/ground station communications systems. Satellite Interference Analysis and Simulation Using Personal Computers, AKSATINT, models the interference experienced by a generic satellite communications receiving station due to an interfering satellite. Both the desired and the interfering satellites are considered to be in elliptical orbits. The simulation contains computation of the orbital positions of both satellites using classical orbital elements, calculation of the satellite antenna look angles for both satellites and the elevation angles at the desired-satellite ground-station antenna, and computation of the Doppler effect due to the motions of the satellites and the Earth's rotation. AKSATINT also computes the interference-to-signal-power ratio, taking into account losses suffered by the links. After computing the interference-to-signal-power ratio, the program computes the statistical quantities. The statistical formulation of the interference effect is presented in the form of a histogram of the interference-to-desired-signal power ratio. The program includes a flowchart, a sample run, and results of that run. AKSATINT is expected to be of general use to system designers and frequency managers in selecting the proper frequency under an interference scenario. The AKSATINT program is written in BASIC. It was designed to operate on the IBM Personal Computer AT or compatibles, and has been implemented under MS DOS 3.2. AKSATINT was developed in 1987.
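
    The interference-to-signal power ratio at the heart of such an analysis is essentially link-budget arithmetic. The sketch below uses the standard free-space path loss formula with invented gains, distances, and frequency; it only illustrates the quantity AKSATINT tabulates and is not the program's actual BASIC computation.

```python
# Back-of-envelope interference-to-signal power ratio: received power from
# each satellite via a free-space link budget, then the ratio in dB.
# EIRPs, antenna gains, distances, and frequency below are invented.
import math

def received_power_dbw(eirp_dbw, gain_rx_dbi, dist_km, freq_ghz):
    """EIRP + receive gain - free-space path loss (all in dB)."""
    fspl_db = 20 * math.log10(dist_km) + 20 * math.log10(freq_ghz) + 92.45
    return eirp_dbw + gain_rx_dbi - fspl_db

sig = received_power_dbw(eirp_dbw=50.0, gain_rx_dbi=45.0, dist_km=38_000, freq_ghz=12.0)
intf = received_power_dbw(eirp_dbw=48.0, gain_rx_dbi=20.0, dist_km=39_500, freq_ghz=12.0)
print(f"I/S = {intf - sig:.1f} dB")   # negative means interference below signal
```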

  9. CHARMM additive all-atom force field for carbohydrate derivatives and its utility in polysaccharide and carbohydrate-protein modeling

    PubMed Central

    Guvench, Olgun; Mallajosyula, Sairam S.; Raman, E. Prabhu; Hatcher, Elizabeth; Vanommeslaeghe, Kenno; Foster, Theresa J.; Jamison, Francis W.; MacKerell, Alexander D.

    2011-01-01

    Monosaccharide derivatives such as xylose, fucose, N-acetylglucosamine (GlcNAc), N-acetylgalactosamine (GalNAc), glucuronic acid, iduronic acid, and N-acetylneuraminic acid (Neu5Ac) are important components of eukaryotic glycans. The present work details the development of force-field parameters for these monosaccharides and their covalent connections to proteins via O-linkages to serine or threonine sidechains and via N-linkages to asparagine sidechains. The force field development protocol was designed to explicitly yield parameters that are compatible with the existing CHARMM additive force field for proteins, nucleic acids, lipids, carbohydrates, and small molecules. Therefore, when combined with previously developed parameters for pyranose and furanose monosaccharides, for glycosidic linkages between monosaccharides, and for proteins, the present set of parameters enables the molecular simulation of a wide variety of biologically important molecules such as complex carbohydrates and glycoproteins. Parametrization included fitting to quantum mechanical (QM) geometries and conformational energies of model compounds, as well as to QM pair interaction energies and distances of model compounds with water. Parameters were validated in the context of crystals of relevant monosaccharides, as well as NMR and/or X-ray crystallographic data on larger systems including oligomeric hyaluronan, sialyl Lewis X, O- and N-linked glycopeptides, and a lectin:sucrose complex. As the validated parameters are an extension of the CHARMM all-atom additive biomolecular force field, they further broaden the types of heterogeneous systems accessible with a consistently developed force-field model. PMID:22125473

  10. Computer simulation of Aphis gossypii insects using Penna aging model

    NASA Astrophysics Data System (ADS)

    Giarola, L. T. P.; Martins, S. G. F.; Toledo Costa, M. C. P.

    2006-08-01

    A computer simulation was made of the population dynamics of Aphis gossypii under laboratory and field conditions. The age structure was inserted in the dynamics through the bit-string model for biological aging proposed by Penna in 1995. The influence of different host plants and of climatic factors such as temperature and precipitation was considered in the simulation, starting from experimental data. The results obtained indicate that the simulation is an appropriate instrument for understanding the population dynamics of this species and for establishing biological control strategies.
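
    For readers unfamiliar with the Penna model, the sketch below is a bare-bones version of the bit-string aging dynamics it refers to: each bit of the genome switches on a deleterious mutation at the corresponding age, and an individual dies once a threshold number of mutations is active. All parameters (threshold, mutation rate, carrying capacity) are illustrative and unrelated to the Aphis gossypii parametrization of the study.

```python
# Minimal Penna-style bit-string aging model: bit a becomes an active
# deleterious mutation at age a; death occurs at T active mutations, by old
# age, or through a crude Verhulst crowding factor.  Parameters are invented.
import random

GENOME_BITS, T, M, REPRO_AGE, NMAX = 32, 3, 1, 8, 5000

def mutate(genome):
    """Copy a genome and switch on M additional deleterious mutation bits."""
    g = genome
    for _ in range(M):
        g |= 1 << random.randrange(GENOME_BITS)
    return g

population = [{"age": 0, "genome": mutate(0)} for _ in range(500)]

for step in range(100):
    crowding = len(population) / NMAX          # Verhulst death probability
    next_pop = []
    for ind in population:
        ind["age"] += 1
        active = bin(ind["genome"] & ((1 << ind["age"]) - 1)).count("1")
        if ind["age"] >= GENOME_BITS or active >= T:
            continue                            # death: old age or mutation load
        if random.random() < crowding:
            continue                            # death: crowding
        next_pop.append(ind)
        if ind["age"] >= REPRO_AGE:             # asexual birth with new mutation
            next_pop.append({"age": 0, "genome": mutate(ind["genome"])})
    population = next_pop

print("final population size:", len(population))
```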

  11. Computer simulation of plasma and N-body problems

    NASA Technical Reports Server (NTRS)

    Harries, W. L.; Miller, J. B.

    1975-01-01

    The following FORTRAN language computer codes are presented: (1) efficient two- and three-dimensional central force potential solvers; (2) a three-dimensional simulator of an isolated galaxy which incorporates the potential solver; (3) a two-dimensional particle-in-cell simulator of the Jeans instability in an infinite self-gravitating compressible gas; and (4) a two-dimensional particle-in-cell simulator of a rotating self-gravitating compressible gaseous system of which rectangular coordinate and superior polar coordinate versions were written.

  12. Computer simulation of multigrid body dynamics and control

    NASA Technical Reports Server (NTRS)

    Swaminadham, M.; Moon, Young I.; Venkayya, V. B.

    1990-01-01

    The objective is to set up and analyze benchmark problems on multibody dynamics and to verify the predictions of two multibody computer simulation codes. TREETOPS and DISCOS have been used to run three example problems: a one-degree-of-freedom spring-mass-dashpot system, an inverted pendulum system, and a triple pendulum. To study the dynamics and control interaction, an inverted planar pendulum with an external body force and a torsional control spring was modeled as a hinge-connected two-rigid-body system. TREETOPS and DISCOS were used to carry out the time-history simulation of this problem. System state-space variables and their time derivatives from the two simulation codes were compared.

  13. Interval sampling methods and measurement error: a computer simulation.

    PubMed

    Wirth, Oliver; Slaven, James; Taylor, Matthew A

    2014-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments.
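
    The three sampling methods compared in this record can be reproduced in a few lines: place events of known duration at random in an observation period, then score each interval with momentary time sampling, partial-interval, and whole-interval recording and compare the results with the true proportion of time occupied. The durations and interval length below are arbitrary choices, not the parameter grid used in the study.

```python
# Sketch of an interval-sampling error simulation: random events of fixed
# duration are scored by momentary time sampling (MTS), partial-interval
# recording (PIR), and whole-interval recording (WIR).  All values are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
session, interval, event_len, n_events = 600.0, 10.0, 4.0, 30   # seconds

starts = rng.uniform(0.0, session - event_len, n_events)
t = np.arange(0.0, session, 0.1)                # 0.1 s occupancy timeline
occupied = np.zeros_like(t, dtype=bool)
for s in starts:
    occupied |= (t >= s) & (t < s + event_len)

true_prop = occupied.mean()
edges = np.arange(0.0, session + interval, interval)
mts = pir = wir = 0
for lo, hi in zip(edges[:-1], edges[1:]):
    window = occupied[(t >= lo) & (t < hi)]
    mts += window[-1]          # observe only the last instant of the interval
    pir += window.any()        # scored if the behaviour occurred at all
    wir += window.all()        # scored only if it filled the whole interval
n = len(edges) - 1
print(f"true {true_prop:.2f}  MTS {mts/n:.2f}  PIR {pir/n:.2f}  WIR {wir/n:.2f}")
```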

  14. A Computer Simulated Experiment in Complex Order Kinetics

    ERIC Educational Resources Information Center

    Merrill, J. C.; And Others

    1975-01-01

    Describes a computer simulation experiment in which physical chemistry students can determine all of the kinetic parameters of a reaction, such as order of the reaction with respect to each reagent, forward and reverse rate constants for the overall reaction, and forward and reverse activation energies. (MLH)

  15. Effectiveness of Computer Simulation for Enhancing Higher Order Thinking.

    ERIC Educational Resources Information Center

    Gokhale, Anu A.

    1996-01-01

    Electronics students (16 controls, 16 experimentals) designed, built, and tested an amplifier. The experimentals did so after it was designed through computer simulation (using Electronics Workbench software). The experimental group performed significantly better on problem-solving tests; both groups did the same on drill and practice tests. (SK)

  16. Computer Simulations and Problem-Solving in Probability.

    ERIC Educational Resources Information Center

    Camp, John S.

    1978-01-01

    The purpose of this paper is to present problems (and solutions) from the areas of marketing, population planning, system reliability, and mathematics to show how a computer simulation can be used as a problem-solving strategy in probability. Examples using BASIC and two methods of generating random numbers are given. (Author/MP)

  17. Social Choice in a Computer-Assisted Simulation

    ERIC Educational Resources Information Center

    Thavikulwat, Precha

    2009-01-01

    Pursuing a line of inquiry suggested by Crookall, Martin, Saunders, and Coote, the author applied, within the framework of design science, an optimal-design approach to incorporate into a computer-assisted simulation two innovative social choice processes: the multiple period double auction and continuous voting. Expectations that the…

  18. Simulations Using a Computer/Videodisc System: Instructional Design Considerations.

    ERIC Educational Resources Information Center

    Ehrlich, Lisa R.

    Instructional design considerations involved in using level four videodisc systems when designing simulations are explored. Discussion of the hardware and software system characteristics notes that computer based training offers the features of text, graphics, color, animation, and highlighting techniques, while a videodisc player offers all of…

  19. Computational Simulation of a Water-Cooled Heat Pump

    NASA Technical Reports Server (NTRS)

    Bozarth, Duane

    2008-01-01

    A Fortran-language computer program for simulating the operation of a water-cooled vapor-compression heat pump in any orientation with respect to gravity has been developed by modifying a prior general-purpose heat-pump design code used at Oak Ridge National Laboratory (ORNL).

  20. Improving a Computer Networks Course Using the Partov Simulation Engine

    ERIC Educational Resources Information Center

    Momeni, B.; Kharrazi, M.

    2012-01-01

    Computer networks courses are hard to teach as there are many details in the protocols and techniques involved that are difficult to grasp. Employing programming assignments as part of the course helps students to obtain a better understanding and gain further insight into the theoretical lectures. In this paper, the Partov simulation engine and…

  1. Advanced Simulation and Computing Co-Design Strategy

    SciTech Connect

    Ang, James A.; Hoang, Thuc T.; Kelly, Suzanne M.; McPherson, Allen; Neely, Rob

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  2. Biology Students Building Computer Simulations Using StarLogo TNG

    ERIC Educational Resources Information Center

    Smith, V. Anne; Duncan, Ishbel

    2011-01-01

    Confidence is an important issue for biology students in handling computational concepts. This paper describes a practical in which honours-level bioscience students simulate complex animal behaviour using StarLogo TNG, a freely-available graphical programming environment. The practical consists of two sessions, the first of which guides students…

  3. Robotics, Artificial Intelligence, Computer Simulation: Future Applications in Special Education.

    ERIC Educational Resources Information Center

    Moore, Gwendolyn B.; And Others

    1986-01-01

    Describes possible applications of new technologies to special education. Discusses results of a study designed to explore the use of robotics, artificial intelligence, and computer simulations to aid people with handicapping conditions. Presents several scenarios in which specific technological advances may contribute to special education…

  4. Computer Simulation of Small Group Decisions: Model Three.

    ERIC Educational Resources Information Center

    Hare, A.P.; Scheiblechner, Hartmann

    In a test of three computer models to simulate group decisions, data were used from 31 American and Austrian groups on a total of 307 trials. The task for each group was to predict a series of answers of an unknown subject on a value-orientation questionnaire, after being given a sample of his typical responses. The first model, used the mean of…

  5. Role of Computer Graphics in Simulations for Teaching Physiology.

    ERIC Educational Resources Information Center

    Modell, H. I.; And Others

    1983-01-01

    Discusses a revision of existing respiratory physiology simulations to promote active learning experiences for individual students. Computer graphics were added to aid student's conceptualization of the physiological system. Specific examples are provided, including those dealing with alveolar gas equations and effects of anatomic shunt flow on…

  6. Systematic error analysis of rotating coil using computer simulation

    SciTech Connect

    Li, Wei-chuan; Coles, M.

    1993-04-01

    This report describes a study of the systematic and random measurement uncertainties of magnetic multipoles which are due to construction errors, rotational speed variation, and electronic noise in a digitally bucked tangential coil assembly with dipole bucking windings. The sensitivities of the systematic multipole uncertainty to construction errors are estimated analytically and using a computer simulation program.

  7. Highway traffic simulation on multi-processor computers

    SciTech Connect

    Hanebutte, U.R.; Doss, E.; Tentner, A.M.

    1997-04-01

    A computer model has been developed to simulate highway traffic for various degrees of automation with a high level of fidelity in regard to driver control and vehicle characteristics. The model simulates vehicle maneuvering in a multi-lane highway traffic system and allows for the use of Intelligent Transportation System (ITS) technologies such as an Automated Intelligent Cruise Control (AICC). The structure of the computer model facilitates the use of parallel computers for the highway traffic simulation, since domain decomposition techniques can be applied in a straightforward fashion. In this model, the highway system (i.e., a network of road links) is divided into multiple regions; each region is controlled by a separate link manager residing on an individual processor. A graphical user interface augments the computer model by allowing for real-time interactive simulation control and interaction with each individual vehicle and roadside infrastructure element on each link. Average speed and traffic volume data are collected at user-specified loop detector locations. Further, as a measure of safety, the so-called Time To Collision (TTC) parameter is recorded.
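
    The Time To Collision measure mentioned above is conventionally computed as the bumper-to-bumper gap divided by the closing speed of the following vehicle, and it is undefined (infinite) when the gap is not closing; the tiny helper below makes that convention concrete (the convention is assumed here, not quoted from the report).

```python
# Conventional Time-To-Collision: gap divided by closing speed, infinite when
# the follower is not gaining on the leader.
def time_to_collision(gap_m, v_follower_mps, v_leader_mps):
    closing = v_follower_mps - v_leader_mps
    return gap_m / closing if closing > 0 else float("inf")

print(time_to_collision(25.0, 30.0, 25.0))   # 5.0 s
print(time_to_collision(25.0, 25.0, 30.0))   # inf: the gap is opening
```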

  8. Robotics, Artificial Intelligence, Computer Simulation: Future Applications in Special Education.

    ERIC Educational Resources Information Center

    Moore, Gwendolyn B.; And Others

    The report describes three advanced technologies--robotics, artificial intelligence, and computer simulation--and identifies the ways in which they might contribute to special education. A hybrid methodology was employed to identify existing technology and forecast future needs. Following this framework, each of the technologies is defined,…

  9. Accelerating sino-atrium computer simulations with graphic processing units.

    PubMed

    Zhang, Hong; Xiao, Zheng; Lin, Shien-fong

    2015-01-01

    Sino-atrial node cells (SANCs) play a significant role in rhythmic firing. To investigate their role in arrhythmia and interactions with the atrium, computer simulations based on cellular dynamic mathematical models are generally used. However, the large-scale computation usually makes research difficult, given the limited computational power of Central Processing Units (CPUs). In this paper, an accelerating approach with Graphic Processing Units (GPUs) is proposed in a simulation consisting of the SAN tissue and the adjoining atrium. By using the operator splitting method, the computational task was made parallel. Three parallelization strategies were then put forward. The strategy with the shortest running time was further optimized by considering block size, data transfer and partition. The results showed that for a simulation with 500 SANCs and 30 atrial cells, the execution time taken by the non-optimized program decreased 62% with respect to a serial program running on CPU. The execution time decreased by 80% after the program was optimized. The larger the tissue was, the more significant the acceleration became. The results demonstrated the effectiveness of the proposed GPU-accelerating methods and their promising applications in more complicated biological simulations. PMID:26406070
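
    As an illustration of the operator-splitting idea described above, the minimal Python sketch below separates a local reaction update (trivially parallel across cells, the part that maps onto GPU threads) from a diffusion update on a 1-D fiber. The FitzHugh-Nagumo kinetics and all constants are stand-ins, not the SANC/atrial model of the paper.

        # Illustrative operator-splitting sketch for a 1-D excitable fiber (NOT the
        # SANC/atrium model of the paper): the reaction step updates every cell
        # independently, and the diffusion step couples neighbouring cells.
        import numpy as np

        N, dt, D = 530, 0.02, 0.1          # cells, time step, coupling coefficient
        v = np.full(N, -1.0)               # membrane-like variable
        w = np.zeros(N)                    # recovery variable
        v[:10] = 1.0                       # stimulate one end

        def reaction_step(v, w, dt):
            # FitzHugh-Nagumo kinetics as a stand-in for the ionic model;
            # each cell is independent, so this step is embarrassingly parallel.
            dv = v - v**3 / 3.0 - w
            dw = 0.08 * (v + 0.7 - 0.8 * w)
            return v + dt * dv, w + dt * dw

        def diffusion_step(v, dt, D):
            # Explicit finite-difference Laplacian with no-flux ends.
            lap = np.zeros_like(v)
            lap[1:-1] = v[:-2] - 2.0 * v[1:-1] + v[2:]
            lap[0], lap[-1] = v[1] - v[0], v[-2] - v[-1]
            return v + dt * D * lap

        for step in range(5000):
            v, w = reaction_step(v, w, dt)  # local, per-cell update
            v = diffusion_step(v, dt, D)    # neighbour coupling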

  10. Computers With Wings: Flight Simulation and Personalized Landscapes

    NASA Astrophysics Data System (ADS)

    Oss, Stefano

    2005-03-01

    We propose, as a special way to explore the physics of flying objects, to use a flight simulator with a personalized scenery that reproduces the territory where students live. This approach increases students' participation and attention in physics classes and also creates several opportunities for side activities on topics of various kinds, from history and geography to computer science, and much more.

  11. Simulation of a National Computer Network in a Gaming Environment

    ERIC Educational Resources Information Center

    Segal, Ronald; O'Neal, Beverly

    1978-01-01

    A national computer services network simulation model was used in a 3-day gaming exercise involving 16 institutional teams who made decisions about their likely long-term network participation. Participants were able to react to others' decisions and actions, and to critical overriding political, economical, and organizational issues. (CMV)

  12. Modeling and Computer Simulation of AN Insurance Policy:

    NASA Astrophysics Data System (ADS)

    Acharyya, Muktish; Acharyya, Ajanta Bhowal

    We have developed a model for a life-insurance policy. In this model, the net gain is calculated by computer simulation for a particular type of lifetime distribution function. We observe that the net gain is maximized for a particular value of the upper age for the last premium.

  13. Computational Modelling and Simulation Fostering New Approaches in Learning Probability

    ERIC Educational Resources Information Center

    Kuhn, Markus; Hoppe, Ulrich; Lingnau, Andreas; Wichmann, Astrid

    2006-01-01

    Discovery learning in mathematics in the domain of probability based on hands-on experiments is normally limited because of the difficulty in providing sufficient materials and data volume in terms of repetitions of the experiments. Our cooperative, computational modelling and simulation environment engages students and teachers in composing and…

  14. Time Advice and Learning Questions in Computer Simulations

    ERIC Educational Resources Information Center

    Rey, Gunter Daniel

    2011-01-01

    Students (N = 101) used an introductory text and a computer simulation to learn fundamental concepts about statistical analyses (e.g., analysis of variance, regression analysis and General Linear Model). Each learner was randomly assigned to one cell of a 2 (with or without time advice) x 3 (with learning questions and corrective feedback, with…

  15. Accelerating sino-atrium computer simulations with graphic processing units.

    PubMed

    Zhang, Hong; Xiao, Zheng; Lin, Shien-fong

    2015-01-01

    Sino-atrial node cells (SANCs) play a significant role in rhythmic firing. To investigate their role in arrhythmia and interactions with the atrium, computer simulations based on cellular dynamic mathematical models are generally used. However, the large-scale computation usually makes research difficult, given the limited computational power of Central Processing Units (CPUs). In this paper, an accelerating approach with Graphic Processing Units (GPUs) is proposed in a simulation consisting of the SAN tissue and the adjoining atrium. By using the operator splitting method, the computational task was made parallel. Three parallelization strategies were then put forward. The strategy with the shortest running time was further optimized by considering block size, data transfer and partition. The results showed that for a simulation with 500 SANCs and 30 atrial cells, the execution time taken by the non-optimized program decreased 62% with respect to a serial program running on CPU. The execution time decreased by 80% after the program was optimized. The larger the tissue was, the more significant the acceleration became. The results demonstrated the effectiveness of the proposed GPU-accelerating methods and their promising applications in more complicated biological simulations.

  16. Symbolic Quantum Computation Simulation in SymPy

    NASA Astrophysics Data System (ADS)

    Cugini, Addison; Curry, Matt; Granger, Brian

    2010-10-01

    Quantum computing is an emerging field which aims to use quantum mechanics to solve difficult computational problems with greater efficiency than on a classical computer. There is a need to create software that i) helps newcomers to learn the field, ii) enables practitioners to design and simulate quantum circuits and iii) provides an open foundation for further research in the field. Towards these ends we have created a package, in the open-source symbolic computation library SymPy, that simulates the quantum circuit model of quantum computation using Dirac notation. This framework builds on the extant powerful symbolic capabilities of SymPy to perform its simulations in a fully symbolic manner. We use object-oriented design to abstract circuits as ordered collections of quantum gate and qubit objects. The gate objects can either be applied directly to the qubit objects or be represented as matrices in different bases. The package is also capable of performing the quantum Fourier transform and Shor's algorithm. A notion of measurement is made possible through the use of a non-commutative gate object. In this talk, we describe the software and show examples of quantum circuits on single- and multi-qubit states that involve common algorithms, gates and measurements.
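
    For readers unfamiliar with the package described above, a minimal symbolic-circuit example using SymPy's sympy.physics.quantum module is shown below; it is a sketch in the spirit of the abstract, not the authors' own code, and printed forms may differ slightly between SymPy versions.

        # Build a Bell state symbolically with SymPy's quantum module.
        from sympy.physics.quantum.qapply import qapply
        from sympy.physics.quantum.qubit import Qubit, measure_all
        from sympy.physics.quantum.gate import H, CNOT

        # Apply H to qubit 0, then CNOT with control 0 and target 1.
        state = qapply(CNOT(0, 1) * H(0) * Qubit('00'))
        print(state)                # roughly: sqrt(2)/2*|00> + sqrt(2)/2*|11>
        print(measure_all(state))   # roughly: [(|00>, 1/2), (|11>, 1/2)]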

  17. Computer-simulated development process of Chinese characters font cognition

    NASA Astrophysics Data System (ADS)

    Chen, Jing; Mu, Zhichun; Sun, Dehui; Hu, Dunli

    2008-10-01

    Research on the cognition of Chinese characters is an important topic in cognitive science and computer science, especially artificial intelligence. In this paper, guided by the traits of Chinese characters, a database of Chinese character font representations and a computer-simulation model of Chinese character font cognition are constructed from a cognitive-science perspective. Font cognition of Chinese characters is a gradual process involving the accumulation of knowledge, and modelling this developmental process is a central research question. The model combines a self-organizing feature map (SOFM) with an adaptive resonance theory (ART2) neural network; two sets of inputs were used for training. Through training and testing, the developmental process of Chinese character font cognition was simulated, and the results were compared with those obtained using the SOFM alone. The analysis suggests that the combined model can account for some empirical findings and can, to a degree, simulate the development of Chinese character font cognition.

  18. Computer simulations of isolated conductors in electrostatic equilibrium.

    PubMed

    Chang, Herng-Hua

    2008-11-01

    A computer simulation model is introduced to study the characteristics of isolated conductors in electrostatic equilibrium. By analogy with electrons migrating to the surface of an isolated conductor, we randomly initialize a large number of particles inside a small region at the center of the simulated conductor and advance them according to their mutual repulsion. Using optimized numerical techniques of the finite-size particle method associated with Poisson's equation, the particles are advanced quickly using a fast Fourier transform and their charge is shared efficiently using the cloud-in-cell method. The particle populations in the simulations range from 50×10³ to 1×10⁶, moving in computation domains of 128×128, 256×256, and 512×512 grids. When the particles reach electrostatic equilibrium, they lie on the boundaries of the simulated conductors, from which the equilibrium properties are obtained. Consistent with the theory of electrostatics and charged conductors, we found that the particles respond to the conductor geometry in such a way that the electrostatic energy is minimized. Good approximations to the equilibrium properties were obtained using the proposed computer simulation model.
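
    A minimal sketch of the FFT-based Poisson solve at the heart of such a finite-size-particle scheme is given below; grid deposition (cloud-in-cell) and the particle push are omitted, and the grid size and test charge are arbitrary.

        # Minimal periodic FFT Poisson solve, solving laplacian(phi) = -rho,
        # of the kind used to obtain the field that pushes the particles.
        import numpy as np

        n = 256                                  # grid points per side
        rho = np.zeros((n, n))
        rho[n // 2, n // 2] = 1.0                # a single charged cell as a test

        k = 2.0 * np.pi * np.fft.fftfreq(n)      # wavenumbers for unit grid spacing
        kx, ky = np.meshgrid(k, k, indexing='ij')
        k2 = kx**2 + ky**2
        k2[0, 0] = 1.0                           # avoid division by zero (mean mode)

        phi_hat = np.fft.fft2(rho) / k2
        phi_hat[0, 0] = 0.0                      # fix the arbitrary mean potential
        phi = np.real(np.fft.ifft2(phi_hat))

        Ex, Ey = np.gradient(-phi)               # field components on the grid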

  19. Computer Simulation Of Radiographic Screen-Film Images

    NASA Astrophysics Data System (ADS)

    Metter, Richard V.; Dillon, Peter L.; Huff, Kenneth E.; Rabbani, Majid

    1986-06-01

    A method is described for computer simulation of radiographic screen-film images. This method is based on a previously published model of the screen-film imaging process [1]. The x-ray transmittance of a test object is sampled at a pitch of 50 μm by scanning a high-resolution, low-noise direct-exposure radiograph. This transmittance is then used, along with the x-ray exposure incident upon the object, to determine the expected number of quanta per pixel incident upon the screen. The random nature of x-ray arrival and absorption, x-ray quantum to light photon conversion, and photon absorption by the film is simulated by appropriate random number generation. Standard FFT techniques are used for computing the effects of scattering. Finally, the computed film density for each pixel is produced on a high-resolution, low-noise output film by a scanning printer. The simulation allows independent specification of x-ray exposure, x-ray quantum absorption, light conversion statistics, light scattering, and film characteristics (sensitometry and granularity). Each of these parameters is independently measured for radiographic systems of interest. The simulator is tested by comparing actual radiographic images with simulated images resulting from the independently measured parameters. Images are also shown illustrating the effects of changes in these parameters on image quality. Finally, comparison is made with a "perfect" imaging system where the information content is limited only by the finite number of x-rays.
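
    A minimal sketch of the quantum-noise and scattering steps described above is shown below; the exposure level, absorption fraction, conversion gain and Gaussian point-spread function are assumptions standing in for the measured screen-film parameters.

        # Poisson x-ray arrival, binomial absorption, light conversion, and an
        # FFT blur standing in for screen scattering (all parameters assumed).
        import numpy as np

        rng = np.random.default_rng(0)
        ny, nx = 256, 256
        transmittance = np.full((ny, nx), 0.5)           # stand-in for the scanned object
        transmittance[96:160, 96:160] = 0.2              # a darker test detail

        mean_quanta = 2000.0                             # incident quanta per pixel (assumed)
        absorb_frac = 0.4                                # screen absorption probability (assumed)

        incident = rng.poisson(mean_quanta * transmittance)   # x-ray arrival statistics
        absorbed = rng.binomial(incident, absorb_frac)         # quantum absorption
        light = rng.poisson(absorbed * 300.0)                  # light-photon conversion gain (assumed)

        # Light scattering as a circular convolution with a Gaussian PSF via FFT.
        y, x = np.indices((ny, nx))
        r2 = (y - ny // 2) ** 2 + (x - nx // 2) ** 2
        psf = np.exp(-r2 / (2.0 * 2.5 ** 2))
        psf /= psf.sum()
        blurred = np.real(np.fft.ifft2(np.fft.fft2(light) * np.fft.fft2(np.fft.ifftshift(psf))))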

  20. Bibliography for Verification and Validation in Computational Simulations

    SciTech Connect

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  1. Computational simulation of high temperature metal matrix composites cyclic behavior

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Hopkins, D. A.

    1988-01-01

    A procedure was developed and is described which can be used to computationally simulate the cyclic behavior of high temperature metal matrix composites (HTMMC) and its degradation effects on the structural response. This procedure consists of HTMMC mechanics coupled with a multifactor interaction constituent material relationship and with an incremental iterative nonlinear analysis. The procedure is implemented in a computer code that can be used to computationally simulate the thermomechanical behavior of HTMMC starting from the fabrication process and proceeding through thermomechanical cycling, accounting for the interface/interphase region. Results show that combined thermal/mechanical cycling, the interphase, and in situ matrix properties have significant effects on the structural integrity of HTMMC.

  2. Computer simulation for hormones related to primary thyropathy.

    PubMed

    Hatakeyama, T; Yagi, H

    1985-01-01

    We propose a mathematical model of the human hypothalamus-anterior pituitary-thyroid system regulating basal metabolism, and use it to carry out computer simulations of primary thyropathies such as Graves' disease, hypothyroidism, T4-toxicosis and T3-toxicosis. To shed light on the properties of the system, the indicial (step) responses of the hormones T4, T3, rT3, and TSH and of thyroid gland function are computed. Medical treatments for Graves' disease and for hypothyroidism are simulated with a view to enhancing clinical significance. The simulations lead to an interesting result: when the conversion rate of blood T4 to blood T3 increases, overt T3-toxicosis occurs even though the function of the thyroid gland is normal.
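
    The paper's model is not reproduced here, but a generic two-pool sketch with entirely hypothetical rate constants shows how a T4-to-T3 conversion-rate parameter enters this kind of hormone model.

        # Generic two-pool hormone sketch (hypothetical constants, not the
        # paper's model): thyroidal T4 secretion, peripheral T4->T3 conversion,
        # and first-order clearance of both hormones.
        from scipy.integrate import solve_ivp

        def hormones(t, y, secretion, k_conv, k4, k3):
            T4, T3 = y
            dT4 = secretion - k_conv * T4 - k4 * T4     # secretion, conversion, clearance
            dT3 = k_conv * T4 - k3 * T3                 # conversion source, clearance
            return [dT4, dT3]

        # Raise the conversion rate k_conv and inspect the late-time levels.
        sol = solve_ivp(hormones, (0.0, 100.0), [1.0, 0.1],
                        args=(1.0, 0.6, 0.1, 0.5), dense_output=True)
        T4_end, T3_end = sol.y[:, -1]
        print(T4_end, T3_end)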

  3. Paediatric bed fall computer simulation model development and validation.

    PubMed

    Thompson, Angela K; Bertocci, Gina E

    2013-01-01

    Falls from beds and other household furniture are common scenarios stated to conceal child abuse. Knowledge of the biomechanics associated with short-distance falls may aid clinicians in distinguishing between abusive and accidental injuries. Computer simulation is a useful tool to investigate injury-producing events and to study the effect of altering event parameters on injury risk. In this study, a paediatric bed fall computer simulation model was developed and validated. The simulation was created using Mathematical Dynamic Modeling(®) software with a child restraint air bag interaction (CRABI) 12-month-old anthropomorphic test device (ATD) representing the fall victim. The model was validated using data from physical fall experiments of the same scenario with an instrumented CRABI ATD. Validation was conducted using both observational and statistical comparisons. Future parametric sensitivity studies using this model will lead to an improved understanding of relationships between child (fall victim) parameters, fall environment parameters and injury potential.

  4. The equilibrium properties and folding kinetics of an all-atom Gō model of the Trp-cage

    NASA Astrophysics Data System (ADS)

    Linhananta, Apichart; Boer, Jesse; MacKay, Ian

    2005-03-01

    The ultrafast-folding 20-residue Trp-cage protein is quickly becoming a new benchmark for molecular dynamics studies. Several all-atom simulations have already probed its equilibrium and kinetic properties. In this work an all-atom Gō model is used to accurately represent the side-chain packing and native atomic contacts of the Trp-cage. The model reproduces the hallmark thermodynamic cooperativity of small proteins. Folding simulations show that in the dominant fast-folding pathway, partial α-helical structure forms before hydrophobic core collapse. In the secondary slow-folding pathway, partial core collapse occurs before helical structure. The slower folding rate of the secondary pathway is attributed to the loss of side-chain rotational freedom caused by the early core collapse, which impedes helix formation. A major finding is the observation of a low-temperature kinetic intermediate stabilized by a salt bridge between residues Asp-9 and Arg-16. Similar observations [R. Zhou, Proc. Natl. Acad. Sci. U.S.A. 100, 13280 (2003)] were reported in a recent study using an all-atom model of the Trp-cage in explicit water, in which the salt-bridge-stabilized intermediate was hypothesized to be the origin of the ultrafast-folding mechanism. A theoretical mutation that eliminates the Asp-9-Arg-16 salt bridge, but leaves the residues intact, is performed. Folding simulations of the mutant Trp-cage show a two-state free-energy landscape with no kinetic intermediate and a significant decrease in the folding rate, in support of the hypothesis.
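
    The defining ingredient of a Gō-type model is an attractive potential restricted to native contacts. The sketch below shows a common 10-12 native-contact energy term for illustration only; it is not the specific all-atom parameterization used in this work.

        # Minimal Go-type native-contact energy (common 10-12 form), shown only
        # to illustrate the defining ingredient of such models.
        import numpy as np

        def go_contact_energy(coords, native_pairs, native_dists, eps=1.0):
            """Sum the 10-12 contact potential over the native pair list.

            coords       : (N, 3) current atomic coordinates
            native_pairs : (M, 2) integer indices of atoms in native contact
            native_dists : (M,) contact distances in the native structure
            """
            i, j = native_pairs[:, 0], native_pairs[:, 1]
            r = np.linalg.norm(coords[i] - coords[j], axis=1)
            x = native_dists / r
            return np.sum(eps * (5.0 * x**12 - 6.0 * x**10))

        # Tiny usage example with made-up coordinates and one native pair.
        coords = np.array([[0.0, 0.0, 0.0], [3.8, 0.0, 0.0], [7.6, 0.0, 0.0]])
        pairs = np.array([[0, 2]])
        r0 = np.array([7.6])
        print(go_contact_energy(coords, pairs, r0))   # equals -eps at the native distance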

  5. A real-time all-atom structural search engine for proteins.

    PubMed

    Gonzalez, Gabriel; Hannigan, Brett; DeGrado, William F

    2014-07-01

    Protein designers use a wide variety of software tools for de novo design, yet their repertoire still lacks a fast and interactive all-atom search engine. To solve this, we have built the Suns program: a real-time, atomic search engine integrated into the PyMOL molecular visualization system. Users build atomic-level structural search queries within PyMOL and receive a stream of search results aligned to their query within a few seconds. This instant feedback cycle enables a new "designability"-inspired approach to protein design where the designer searches for and interactively incorporates native-like fragments from proven protein structures. We demonstrate the use of Suns to interactively build protein motifs, tertiary interactions, and to identify scaffolds compatible with hot-spot residues. The official web site and installer are located at http://www.degradolab.org/suns/ and the source code is hosted at https://github.com/godotgildor/Suns (PyMOL plugin, BSD license), https://github.com/Gabriel439/suns-cmd (command line client, BSD license), and https://github.com/Gabriel439/suns-search (search engine server, GPLv2 license).

  6. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    SciTech Connect

    Sharma, Gulshan B.; Robertson, Douglas D.

    2013-07-01

    Shoulder arthroplasty success has been attributed to many factors, including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design should withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally, simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula’s material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element’s remodeling stimulus was compared to its corresponding reference stimulus and its material properties modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. Location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than

  7. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    NASA Astrophysics Data System (ADS)

    Sharma, Gulshan B.; Robertson, Douglas D.

    2013-07-01

    Shoulder arthroplasty success has been attributed to many factors, including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design should withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally, simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula's material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element's remodeling stimulus was compared to its corresponding reference stimulus and its material properties modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. Location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than actual
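
    A schematic sketch of a strain-energy-density-driven remodeling loop of the kind described above is given below; it is a standard Huiskes-style rule with placeholder numbers and a dummy finite-element step, not necessarily the authors' exact formulation.

        # Schematic density-remodeling iteration driven by a strain-energy-density
        # stimulus compared against a reference value (placeholder FE solve).
        import numpy as np

        n_elem = 1000
        rho = np.full(n_elem, 0.8)              # homogeneous starting density (g/cm^3)
        S_ref, B, dt = 0.25, 1.0, 1.0           # reference stimulus, rate constant, step
        rho_min, rho_max = 0.01, 1.8

        def fe_solve(rho):
            # Placeholder for the finite-element solve: returns a strain energy
            # density per element for the current density field.
            rng = np.random.default_rng(42)
            return S_ref * (0.5 + rng.random(n_elem)) * rho / rho.mean()

        for iteration in range(10):
            U = fe_solve(rho)                   # strain energy density per element
            stimulus = U / rho                  # specific stimulus
            rho = np.clip(rho + B * (stimulus - S_ref) * dt, rho_min, rho_max)
            # element stiffness would be updated as E = C * rho**gamma before the next solve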

  8. Computational performance of a smoothed particle hydrodynamics simulation for shared-memory parallel computing

    NASA Astrophysics Data System (ADS)

    Nishiura, Daisuke; Furuichi, Mikito; Sakaguchi, Hide

    2015-09-01

    The computational performance of a smoothed particle hydrodynamics (SPH) simulation is investigated for three types of current shared-memory parallel computer devices: many integrated core (MIC) processors, graphics processing units (GPUs), and multi-core CPUs. We are especially interested in efficient shared-memory allocation methods for each chipset, because the efficient data access patterns differ between compute unified device architecture (CUDA) programming for GPUs and OpenMP programming for MIC processors and multi-core CPUs. We first introduce several parallel implementation techniques for the SPH code, and then examine these on our target computer architectures to determine the most effective algorithms for each processor unit. In addition, we evaluate the effective computing performance and power efficiency of the SPH simulation on each architecture, as these are critical metrics for overall performance in a multi-device environment. In our benchmark test, the GPU is found to produce the best arithmetic performance as a standalone device unit, and gives the most efficient power consumption. The multi-core CPU obtains the most effective computing performance. The computational speed of the MIC processor on Xeon Phi approached that of two Xeon CPUs. This indicates that using MICs is an attractive choice for existing SPH codes on multi-core CPUs parallelized by OpenMP, as it gains computational acceleration without the need for significant changes to the source code.

  9. Computational simulation for analysis and synthesis of impact resilient structure

    NASA Astrophysics Data System (ADS)

    Djojodihardjo, Harijono

    2013-10-01

    Impact resilient structures are of great interest in many engineering applications, varying from civil, land vehicle, aircraft and space structures, to mention a few examples. To design such structures, one has to resort to fundamental principles and take into account progress in analytical and computational approaches as well as in material science and technology. With such perspectives, this work looks at a generic beam and plate structure subject to impact loading and carries out analysis and numerical simulation. The first objective of the work is to develop a computational algorithm to analyze a flat plate, as a generic structure, subjected to impact loading for numerical simulation and parametric study. The analysis is based on dynamic response analysis. Consideration is given to the elastic-plastic region. The second objective is to utilize the computational algorithm for direct numerical simulation, while in a parallel scheme a commercial off-the-shelf numerical code is utilized for parametric study, optimization and synthesis. Through such analysis and numerical simulation, effort is devoted to arriving at an optimum configuration in terms of loading, structural dimensions, material properties and composite lay-up, among others. Results are discussed in view of practical applications.

  10. Cosmic Reionization on Computers. I. Design and Calibration of Simulations

    NASA Astrophysics Data System (ADS)

    Gnedin, Nickolay Y.

    2014-09-01

    Cosmic Reionization On Computers is a long-term program of numerical simulations of cosmic reionization. Its goal is to model fully self-consistently (albeit not necessarily from the first principles) all relevant physics, from radiative transfer to gas dynamics and star formation, in simulation volumes of up to 100 comoving Mpc, and with spatial resolution approaching 100 pc in physical units. In this method paper, we describe our numerical method, the design of simulations, and the calibration of numerical parameters. Using several sets (ensembles) of simulations in 20 h⁻¹ Mpc and 40 h⁻¹ Mpc boxes with spatial resolution reaching 125 pc at z = 6, we are able to match the observed galaxy UV luminosity functions at all redshifts between 6 and 10, as well as obtain reasonable agreement with the observational measurements of the Gunn-Peterson optical depth at z < 6.

  11. COMPUTER MODEL AND SIMULATION OF A GLOVE BOX PROCESS

    SciTech Connect

    C. FOSTER; ET AL

    2001-01-01

    The development of facilities to deal with the disposition of nuclear materials at an acceptable level of Occupational Radiation Exposure (ORE) is a significant issue facing the nuclear community. One solution is to minimize the worker's exposure through the use of automated systems. However, the adoption of automated systems for these tasks is hampered by the challenging requirements that these systems must meet in order to be cost-effective solutions in the hazardous nuclear materials processing environment. Retrofitting current glove box technologies with automation systems represents potential near-term technology that can be applied to reduce worker ORE associated with work in nuclear materials processing facilities. Successful deployment of automation systems for these applications requires the development of testing and deployment strategies to ensure the highest level of safety and effectiveness. Historically, safety tests are conducted with glove box mock-ups around the finished design. This late detection of problems leads to expensive redesigns and costly deployment delays. With the widespread availability of computers and cost-effective simulation software it is possible to discover and fix problems early in the design stages. Computer simulators can easily create a complete model of the system, providing a safe medium for testing potential failures and design shortcomings. The majority of design specification is now done on computer, and moving that information to a model is relatively straightforward. With a complete model and results from a Failure Mode Effect Analysis (FMEA), redesigns can be worked early. Additional issues such as user accessibility, component replacement, and alignment problems can be tackled early in the virtual environment provided by computer simulation. In this case, a commercial simulation package is used to simulate a lathe process operation at the Los Alamos National Laboratory (LANL). The lathe process operation is indicative of

  12. Computational Aerodynamic Simulations of a Spacecraft Cabin Ventilation Fan Design

    NASA Technical Reports Server (NTRS)

    Tweedt, Daniel L.

    2010-01-01

    Quieter working environments for astronauts are needed if future long-duration space exploration missions are to be safe and productive. Ventilation and payload cooling fans are known to be dominant sources of noise, with the International Space Station being a good case in point. To address this issue cost effectively, early attention to fan design, selection, and installation has been recommended, leading to an effort by NASA to examine the potential for small-fan noise reduction by improving fan aerodynamic design. As a preliminary part of that effort, the aerodynamics of a cabin ventilation fan designed by Hamilton Sundstrand has been simulated using computational fluid dynamics codes, and the computed solutions analyzed to quantify various aspects of the fan aerodynamics and performance. Four simulations were performed at the design rotational speed: two at the design flow rate and two at off-design flow rates. Following a brief discussion of the computational codes, various aerodynamic- and performance-related quantities derived from the computed flow fields are presented along with relevant flow field details. The results show that the computed fan performance is in generally good agreement with stated design goals.

  13. Multi-dimensional computer simulation of MHD combustor hydrodynamics

    SciTech Connect

    Berry, G.F.; Chang, S.L.; Lottes, S.A.; Rimkus, W.A.

    1991-04-04

    Argonne National Laboratory is investigating the nonreacting jet-gas mixing patterns in an MHD second stage combustor by using a two-dimensional multi-phase hydrodynamics computer program and a three-dimensional single-phase hydrodynamics computer program. The computer simulations are intended to enhance the understanding of flow and mixing patterns in the combustor, which in turn may lead to improvement of the downstream MHD channel performance. A two-dimensional steady state computer model, based on mass and momentum conservation laws for multiple gas species, is used to simulate the hydrodynamics of the combustor in which a jet of oxidizer is injected into an unconfined cross-stream gas flow. A three-dimensional code is used to examine the effects of the side walls and the distributed jet flows on the non-reacting jet-gas mixing patterns. The code solves the conservation equations of mass, momentum, and energy, and a transport equation of a turbulence parameter and allows permeable surfaces to be specified for any computational cell. 17 refs., 25 figs.

  14. Development of magnetron sputtering simulator with GPU parallel computing

    NASA Astrophysics Data System (ADS)

    Sohn, Ilyoup; Kim, Jihun; Bae, Junkyeong; Lee, Jinpil

    2014-12-01

    Sputtering devices are widely used in the semiconductor and display panel manufacturing process. Currently, a number of surface treatment applications using magnetron sputtering techniques are being used to improve the efficiency of the sputtering process, through the installation of magnets outside the vacuum chamber. Within the internal space of the low-pressure chamber, the plasma generated by the combination of a rarefied gas and an electric field evolves through mutually coupled interactions. Since the quality of the sputtering and the deposition rate on the substrate are strongly dependent on the multi-physical phenomena of the plasma regime, numerical simulations using PIC-MCC (Particle In Cell, Monte Carlo Collision) should be employed to develop an efficient sputtering device. In this paper, the development of a magnetron sputtering simulator based on the PIC-MCC method and the associated numerical techniques are discussed. To solve the electric field equations in the 2-D Cartesian domain, a Poisson equation solver based on the FDM (Finite Differencing Method) is developed and coupled with the Monte Carlo Collision method to simulate the motion of gas particles influenced by an electric field. The magnetic field created by the permanent magnet installed outside the vacuum chamber is also numerically calculated using the Biot-Savart law. All numerical methods employed in the present PIC code are validated by comparison with analytical and well-known commercial engineering software results, with all of the results showing good agreement. Finally, the developed PIC-MCC code is parallelized to be suitable for general-purpose computing on graphics processing units (GPGPU) acceleration, so as to reduce the large computation time which is generally required for particle simulations. The efficiency and accuracy of the GPGPU-parallelized magnetron sputtering simulator are examined by comparison with the calculated results and computation times from the original serial code. It is found that
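
    A minimal sketch of the Monte Carlo collision step used in PIC-MCC codes is shown below: each particle collides during a time step with probability P = 1 - exp(-n*sigma*v*dt). The gas density, cross-section and time step are placeholders, and the field solve and particle push are omitted.

        # Monte Carlo collision check for an ensemble of particles (placeholder
        # constants; field solve and particle push omitted).
        import numpy as np

        rng = np.random.default_rng(1)
        n_gas = 3.0e21          # neutral gas density (m^-3), placeholder
        sigma = 1.0e-19         # constant collision cross-section (m^2), placeholder
        dt = 1.0e-10            # time step (s)

        v = rng.normal(0.0, 4.0e5, size=(100000, 3))      # particle velocities (m/s)
        speed = np.linalg.norm(v, axis=1)

        p_coll = 1.0 - np.exp(-n_gas * sigma * speed * dt)
        collided = rng.random(speed.size) < p_coll

        # Crude isotropic elastic scattering for colliding particles: redirect
        # the velocity while keeping the speed.
        n_hit = int(collided.sum())
        u = rng.normal(size=(n_hit, 3))
        u /= np.linalg.norm(u, axis=1, keepdims=True)
        v[collided] = speed[collided][:, None] * u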

  15. A distributed computing tool for generating neural simulation databases.

    PubMed

    Calin-Jageman, Robert J; Katz, Paul S

    2006-12-01

    After developing a model neuron or network, it is important to systematically explore its behavior across a wide range of parameter values or experimental conditions, or both. However, compiling a very large set of simulation runs is challenging because it typically requires both access to and expertise with high-performance computing facilities. To lower the barrier for large-scale model analysis, we have developed NeuronPM, a client/server application that creates a "screen-saver" cluster for running simulations in NEURON (Hines & Carnevale, 1997). NeuronPM provides a user-friendly way to use existing computing resources to catalog the performance of a neural simulation across a wide range of parameter values and experimental conditions. The NeuronPM client is a Windows-based screen saver, and the NeuronPM server can be hosted on any Apache/PHP/MySQL server. During idle time, the client retrieves model files and work assignments from the server, invokes NEURON to run the simulation, and returns results to the server. Administrative panels make it simple to upload model files, define the parameters and conditions to vary, and then monitor client status and work progress. NeuronPM is open-source freeware and is available for download at http://neuronpm.homeip.net . It is a useful entry-level tool for systematically analyzing complex neuron and network simulations.

  16. Computer Science Techniques Applied to Parallel Atomistic Simulation

    NASA Astrophysics Data System (ADS)

    Nakano, Aiichiro

    1998-03-01

    Recent developments in parallel processing technology and multiresolution numerical algorithms have established large-scale molecular dynamics (MD) simulations as a new research mode for studying materials phenomena such as fracture. However, this requires large system sizes and long simulated times. We have developed: i) Space-time multiresolution schemes; ii) fuzzy-clustering approach to hierarchical dynamics; iii) wavelet-based adaptive curvilinear-coordinate load balancing; iv) multilevel preconditioned conjugate gradient method; and v) spacefilling-curve-based data compression for parallel I/O. Using these techniques, million-atom parallel MD simulations are performed for the oxidation dynamics of nanocrystalline Al. The simulations take into account the effect of dynamic charge transfer between Al and O using the electronegativity equalization scheme. The resulting long-range Coulomb interaction is calculated efficiently with the fast multipole method. Results for temperature and charge distributions, residual stresses, bond lengths and bond angles, and diffusivities of Al and O will be presented. The oxidation of nanocrystalline Al is elucidated through immersive visualization in virtual environments. A unique dual-degree education program at Louisiana State University will also be discussed in which students can obtain a Ph.D. in Physics & Astronomy and a M.S. from the Department of Computer Science in five years. This program fosters interdisciplinary research activities for interfacing High Performance Computing and Communications with large-scale atomistic simulations of advanced materials. This work was supported by NSF (CAREER Program), ARO, PRF, and Louisiana LEQSF.

  17. The very local Hubble flow: Computer simulations of dynamical history

    NASA Astrophysics Data System (ADS)

    Chernin, A. D.; Karachentsev, I. D.; Valtonen, M. J.; Dolgachev, V. P.; Domozhilova, L. M.; Makarov, D. I.

    2004-02-01

    The phenomenon of the very local (≤3 Mpc) Hubble flow is studied on the basis of the data of recent precision observations. A set of computer simulations is performed to trace the trajectories of the flow galaxies back in time to the epoch of the formation of the Local Group. It is found that the ``initial conditions'' of the flow are drastically different from the linear velocity-distance relation. The simulations enable one also to recognize the major trends of the flow evolution and identify the dynamical role of universal antigravity produced by the cosmic vacuum.

  18. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    SciTech Connect

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  19. A computer simulation approach to measurement of human control strategy

    NASA Technical Reports Server (NTRS)

    Green, J.; Davenport, E. L.; Engler, H. F.; Sears, W. E., III

    1982-01-01

    Human control strategy is measured through use of a psychologically-based computer simulation which reflects a broader theory of control behavior. The simulation is called the human operator performance emulator, or HOPE. HOPE was designed to emulate control learning in a one-dimensional preview tracking task and to measure control strategy in that setting. When given a numerical representation of a track and information about current position in relation to that track, HOPE generates positions for a stick controlling the cursor to be moved along the track. In other words, HOPE generates control stick behavior corresponding to that which might be used by a person learning preview tracking.

  20. Computer simulations of the motion and decay of global strings

    SciTech Connect

    Hagmann, C.; Sikivie, P.

    1990-01-01

    Computer simulations have been carried out of the motion and decay of global strings, including spectrum analysis of the energy stored in the scalar field which describes the global string and the radiated Nambu-Goldstone bosons. We simulated relaxing pieces of bent string and collapsing loops. We find, for the string sizes investigated, that the spectrum of field energy hardens rather than softens while the string decays into Nambu-Goldstone radiation. We argue on theoretical grounds that this is indeed the most plausible general behaviour. 19 refs., 12 figs.

  1. Computer simulation of a general purpose satellite modem

    NASA Astrophysics Data System (ADS)

    Montgomery, William L., Jr.

    1992-12-01

    The performance of a digital phase shift keyed satellite modem was modeled and simulated. The probability of bit error (Pb) at different levels of energy-per-bit to noise-power ratio (Eb/N0) was the performance measure. The channel was assumed to contribute only additive white Gaussian noise. A second-order Costas loop performs demodulation in the modem and was the key part of the simulation. The Costas loop with second-order Butterworth arm filters was tested by finding the response to a phase or frequency step. The Costas loop response was found to be in agreement with theoretical predictions in the absence of noise. Finally, the effect on Pb of a rate-1/2, constraint-length-7 convolutional code with eight-level soft Viterbi decoding was demonstrated by the simulation. The simulation results were within 0.7 dB of theoretical values. All computer simulations were done at baseband to reduce simulation times. The Monte Carlo error-counting technique was used to estimate Pb. The effect of increasing the samples per bit in the simulation was demonstrated by the 0.4 dB improvement in Pb caused by doubling the number of samples.
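
    A minimal baseband Monte Carlo error-counting sketch for uncoded BPSK over AWGN is shown below, compared with the theoretical Pb = 0.5*erfc(sqrt(Eb/N0)); the Costas loop and the convolutional code discussed above are not modelled.

        # Monte Carlo error counting for uncoded BPSK over AWGN at baseband.
        import numpy as np
        from scipy.special import erfc

        rng = np.random.default_rng(0)
        ebn0_db = 6.0
        ebn0 = 10.0 ** (ebn0_db / 10.0)

        n_bits = 1_000_000
        bits = rng.integers(0, 2, n_bits)
        symbols = 2.0 * bits - 1.0                            # BPSK mapping, Eb = 1
        noise = rng.normal(0.0, np.sqrt(1.0 / (2.0 * ebn0)), n_bits)
        received = symbols + noise

        errors = np.count_nonzero((received > 0).astype(int) != bits)
        pb_sim = errors / n_bits
        pb_theory = 0.5 * erfc(np.sqrt(ebn0))
        print(f"simulated Pb = {pb_sim:.2e}, theory = {pb_theory:.2e}")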

  2. All-atom semiclassical dynamics study of quantum coherence in photosynthetic Fenna-Matthews-Olson complex.

    PubMed

    Kim, Hyun Woo; Kelly, Aaron; Park, Jae Woo; Rhee, Young Min

    2012-07-18

    Although photosynthetic pigment-protein complexes are in noisy environments, recent experimental and theoretical results indicate that their excitation energy transfer (EET) can exhibit coherent characteristics for over hundreds of femtoseconds. Despite the almost universal observations of the coherence to some degree, questions still remain regarding the detailed role of the protein and the extent of high-temperature coherence. Here we adopt a theoretical method that incorporates an all-atom description of the photosynthetic complex within a semiclassical framework in order to study EET in the Fenna-Matthews-Olson complex. We observe that the vibrational modes of the chromophore tend to diminish the coherence at the ensemble level, yet much longer-lived coherences may be observed at the single-complex level. We also observe that coherent oscillations in the site populations also commence within tens of femtoseconds even when the system is initially prepared in a non-oscillatory stationary state. We show that the protein acts to maintain the electronic couplings among the system of embedded chromophores. We also investigate the extent to which the protein's electrostatic modulation that disperses the chromophore electronic energies may affect the coherence lifetime. Further, we observe that even though mutation-induced disruptions in the protein structure may change the coupling pattern, a relatively strong level of coupling and associated coherence in the dynamics still remain. Finally, we demonstrate that thermal fluctuations in the chromophore couplings induce some redundancy in the coherent energy-transfer pathway. Our results indicate that a description of both chromophore coupling strengths and their fluctuations is crucial to better understand coherent EET processes in photosynthetic systems. PMID:22708971

  3. RNABC: forward kinematics to reduce all-atom steric clashes in RNA backbone.

    PubMed

    Wang, Xueyi; Kapral, Gary; Murray, Laura; Richardson, David; Richardson, Jane; Snoeyink, Jack

    2008-01-01

    Although accurate details in RNA structure are of great importance for understanding RNA function, the backbone conformation is difficult to determine, and most existing RNA structures show serious steric clashes (≥ 0.4 Å overlap) when hydrogen atoms are taken into account. We have developed a program called RNABC (RNA Backbone Correction) that performs local perturbations to search for alternative conformations that avoid those steric clashes or other local geometry problems. Its input is an all-atom coordinate file for an RNA crystal structure (usually from the MolProbity web service), with problem areas specified. RNABC rebuilds a suite (the unit from sugar to sugar) by anchoring the phosphorus and base positions, which are clearest in crystallographic electron density, and reconstructing the other atoms using forward kinematics. Geometric parameters are constrained within user-specified tolerance of canonical or original values, and torsion angles are constrained to ranges defined through empirical database analyses. Several optimizations reduce the time required to search the many possible conformations. The output results are clustered and presented to the user, who can choose whether to accept one of the alternative conformations. Two test evaluations show the effectiveness of RNABC, first on the S-motifs from 42 RNA structures, and second on the worst problem suites (clusters of bad clashes, or serious sugar pucker outliers) in 25 unrelated RNA structures. Among the 101 S-motifs, 88 had diagnosed problems, and RNABC produced clash-free conformations with acceptable geometry for 71 of those (about 80%). For the 154 worst problem suites, RNABC proposed alternative conformations for 72. All but 8 of those were judged acceptable after examining electron density (where available) and local conformation. Thus, even for these worst cases, nearly half the time RNABC suggested corrections suitable to initiate further crystallographic refinement. The program is

  4. MolProbity: all-atom structure validation for macromolecular crystallography

    SciTech Connect

    Chen, Vincent B.; Arendall, W. Bryan III; Headd, Jeffrey J.; Keedy, Daniel A.; Immormino, Robert M.; Kapral, Gary J.; Murray, Laura W.; Richardson, Jane S.; Richardson, David C.

    2010-01-01

    MolProbity structure validation will diagnose most local errors in macromolecular crystal structures and help to guide their correction. MolProbity is a structure-validation web service that provides broad-spectrum solidly based evaluation of model quality at both the global and local levels for both proteins and nucleic acids. It relies heavily on the power and sensitivity provided by optimized hydrogen placement and all-atom contact analysis, complemented by updated versions of covalent-geometry and torsion-angle criteria. Some of the local corrections can be performed automatically in MolProbity and all of the diagnostics are presented in chart and graphical forms that help guide manual rebuilding. X-ray crystallography provides a wealth of biologically important molecular data in the form of atomic three-dimensional structures of proteins, nucleic acids and increasingly large complexes in multiple forms and states. Advances in automation, in everything from crystallization to data collection to phasing to model building to refinement, have made solving a structure using crystallography easier than ever. However, despite these improvements, local errors that can affect biological interpretation are widespread at low resolution and even high-resolution structures nearly all contain at least a few local errors such as Ramachandran outliers, flipped branched protein side chains and incorrect sugar puckers. It is critical both for the crystallographer and for the end user that there are easy and reliable methods to diagnose and correct these sorts of errors in structures. MolProbity is the authors’ contribution to helping solve this problem and this article reviews its general capabilities, reports on recent enhancements and usage, and presents evidence that the resulting improvements are now beneficially affecting the global database.

  5. All-atom 3D structure prediction of transmembrane β-barrel proteins from sequences

    PubMed Central

    Hayat, Sikander; Sander, Chris; Marks, Debora S.

    2015-01-01

    Transmembrane β-barrels (TMBs) carry out major functions in substrate transport and protein biogenesis but experimental determination of their 3D structure is challenging. Encouraged by successful de novo 3D structure prediction of globular and α-helical membrane proteins from sequence alignments alone, we developed an approach to predict the 3D structure of TMBs. The approach combines the maximum-entropy evolutionary coupling method for predicting residue contacts (EVfold) with a machine-learning approach (boctopus2) for predicting β-strands in the barrel. In a blinded test for 19 TMB proteins of known structure that have a sufficient number of diverse homologous sequences available, this combined method (EVfold_bb) predicts hydrogen-bonded residue pairs between adjacent β-strands at an accuracy of ∼70%. This accuracy is sufficient for the generation of all-atom 3D models. In the transmembrane barrel region, the average 3D structure accuracy [template-modeling (TM) score] of top-ranked models is 0.54 (ranging from 0.36 to 0.85), with a higher (44%) number of residue pairs in correct strand–strand registration than in earlier methods (18%). Although the nonbarrel regions are predicted less accurately overall, the evolutionary couplings identify some highly constrained loop residues and, for FecA protein, the barrel including the structure of a plug domain can be accurately modeled (TM score = 0.68). Lower prediction accuracy tends to be associated with insufficient sequence information and we therefore expect increasing numbers of β-barrel families to become accessible to accurate 3D structure prediction as the number of available sequences increases. PMID:25858953

  6. Development of computer simulations for landfill methane recovery

    SciTech Connect

    Massmann, J.W.; Moore, C.A.; Sykes, R.M.

    1981-12-01

    Two- and three-dimensional finite-difference computer programs simulating methane recovery systems in landfills have been developed. These computer programs model multicomponent combined pressure and diffusional flow in porous media. Each program and the processes it models are described in this report. Examples of the capabilities of each program are also presented. The two-dimensional program was used to simulate methane recovery systems in a cylindrically shaped landfill. The effects of various pump locations, geometries, and extraction rates were determined. The three-dimensional program was used to model the Puente Hills landfill, a field test site in southern California. The biochemical and microbiological details of methane generation in landfills are also given. Effects of environmental factors, such as moisture, oxygen, temperature, and nutrients on methane generation are discussed and an analytical representation of the gas generation rate is developed.

  7. Using Computer Simulation for Neurolab 2 Mission Planning

    NASA Technical Reports Server (NTRS)

    Sanders, Betty M.

    1997-01-01

    This paper presents an overview of the procedure used in the creation of a computer simulation video generated by the Graphics Research and Analysis Facility at NASA/Johnson Space Center. The simulation was preceded by an analysis of anthropometric characteristics of crew members and workspace requirements for 13 experiments to be conducted on Neurolab 2 which is dedicated to neuroscience and behavioral research. Neurolab 2 is being carried out as a partnership among national domestic research institutes and international space agencies. The video is a tour of the Spacelab module as it will be configured for STS-90, scheduled for launch in the spring of 1998, and identifies experiments that can be conducted in parallel during that mission. Therefore, this paper will also address methods for using computer modeling to facilitate the mission planning activity.

  8. A Computational Approach for Probabilistic Analysis of Water Impact Simulations

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.

    2009-01-01

    NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many similar challenges to those worked in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.

  9. Adding computationally efficient realism to Monte Carlo turbulence simulation

    NASA Technical Reports Server (NTRS)

    Campbell, C. W.

    1985-01-01

    Frequently in aerospace vehicle flight simulation, random turbulence is generated using the assumption that the craft is small compared to the length scales of turbulence. The turbulence is presumed to vary only along the flight path of the vehicle but not across the vehicle span. The addition of the realism of three-dimensionality is a worthy goal, but any such attempt will not gain acceptance in the simulator community unless it is computationally efficient. A concept for adding three-dimensional realism with a minimum of computational complexity is presented. The concept involves the use of close rational approximations to irrational spectra and cross-spectra so that systems of stable, explicit difference equations can be used to generate the turbulence.
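
    A one-dimensional illustration of the idea above is a first-order rational spectrum realized as a stable explicit difference equation driven by white noise; the Dryden-style scale and intensity values below are placeholders, and the spanwise cross-spectra discussed in the abstract are not included.

        # First-order shaping filter generating a gust time series along the
        # flight path (placeholder airspeed, length scale and intensity).
        import numpy as np

        rng = np.random.default_rng(0)
        V, L, sigma = 150.0, 300.0, 2.0     # airspeed (m/s), length scale (m), rms gust (m/s)
        dt = 0.01                           # simulator frame time (s)
        tau = L / V                         # filter time constant

        a = np.exp(-dt / tau)               # discrete pole of the shaping filter
        b = sigma * np.sqrt(1.0 - a * a)    # gain chosen so the output variance is sigma^2

        n = 10000
        u = np.zeros(n)                     # longitudinal gust samples
        w = rng.normal(size=n)              # unit white noise
        for k in range(1, n):
            u[k] = a * u[k - 1] + b * w[k - 1]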

  10. Computer simulations for the adsorption of polymers onto surfaces

    SciTech Connect

    Balazs, A.C.

    1993-01-01

    Polymer-surface interactions are important in every stage of oil and coal production, production of new energy-efficient composite materials, and in medicine. Therefore, it is important to isolate the factors that influence the interfacial activity of polymer chains. We developed theoretical models and computer simulations to determine effects of polymer architecture, solvent quality, and surface morphology on properties of chains at penetrable and impenetrable interfaces. 7 figs, 27 refs.

  11. pV3-Gold Visualization Environment for Computer Simulations

    NASA Technical Reports Server (NTRS)

    Babrauckas, Theresa L.

    1997-01-01

    A new visualization environment, pV3-Gold, can be used during and after a computer simulation to extract and visualize the physical features in the results. This environment, which is an extension of the pV3 visualization environment developed at the Massachusetts Institute of Technology with guidance and support by researchers at the NASA Lewis Research Center, features many tools that allow users to display data in various ways.

  12. Para: a computer simulation code for plasma driven electromagnetic launchers

    SciTech Connect

    Thio, Y.-C.

    1983-03-01

    A computer code for simulation of rail-type accelerators utilizing a plasma armature has been developed and is described in detail. Some time varying properties of the plasma are taken into account in this code thus allowing the development of a dynamical model of the behavior of a plasma in a rail-type electromagnetic launcher. The code is being successfully used to predict and analyse experiments on small calibre rail-gun launchers.

  13. Ku-Band rendezvous radar performance computer simulation model

    NASA Technical Reports Server (NTRS)

    Magnusson, H. G.; Goff, M. F.

    1984-01-01

    All work performed on the Ku-band rendezvous radar performance computer simulation model program since the release of the preliminary final report is summarized. Developments on the program fall into three distinct categories: (1) modifications to the existing Ku-band radar tracking performance computer model; (2) the addition of a highly accurate, nonrealtime search and acquisition performance computer model to the total software package developed on this program; and (3) development of radar cross section (RCS) computation models for three additional satellites. All changes in the tracking model involved improvements in the automatic gain control (AGC) and the radar signal strength (RSS) computer models. Although the search and acquisition computer models were developed under the auspices of the Hughes Aircraft Company Ku-Band Integrated Radar and Communications Subsystem program office, they have been supplied to NASA as part of the Ku-band radar performance computer model package. Their purpose is to predict Ku-band acquisition performance for specific satellite targets on specific missions. The RCS models were developed for three satellites: the Long Duration Exposure Facility (LDEF) spacecraft, the Solar Maximum Mission (SMM) spacecraft, and the Space Telescopes.

  14. Cane Toad or Computer Mouse? Real and Computer-Simulated Laboratory Exercises in Physiology Classes

    ERIC Educational Resources Information Center

    West, Jan; Veenstra, Anneke

    2012-01-01

    Traditional practical classes in many countries are being rationalised to reduce costs. The challenge for university educators is to provide students with the opportunity to reinforce theoretical concepts by running something other than a traditional practical program. One alternative is to replace wet labs with comparable computer simulations.…

  15. Improving computational efficiency of Monte Carlo simulations with variance reduction

    SciTech Connect

    Turner, A.

    2013-07-01

    CCFE perform Monte-Carlo transport simulations on large and complex tokamak models such as ITER. Such simulations are challenging since streaming and deep penetration effects are equally important. In order to make such simulations tractable, both variance reduction (VR) techniques and parallel computing are used. It has been found that the application of VR techniques in such models significantly reduces the efficiency of parallel computation due to 'long histories'. VR in MCNP can be accomplished using energy-dependent weight windows. The weight window represents an 'average behaviour' of particles, and large deviations in the arriving weight of a particle give rise to extreme amounts of splitting being performed and a long history. When running on parallel clusters, a long history can have a detrimental effect on the parallel efficiency - if one process is computing the long history, the other CPUs complete their batch of histories and wait idle. Furthermore some long histories have been found to be effectively intractable. To combat this effect, CCFE has developed an adaptation of MCNP which dynamically adjusts the WW where a large weight deviation is encountered. The method effectively 'de-optimises' the WW, reducing the VR performance but this is offset by a significant increase in parallel efficiency. Testing with a simple geometry has shown the method does not bias the result. This 'long history method' has enabled CCFE to significantly improve the performance of MCNP calculations for ITER on parallel clusters, and will be beneficial for any geometry combining streaming and deep penetration effects. (authors)
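
    A toy sketch of the splitting/roulette step behind the 'long history' problem: when a particle arrives far above the weight-window upper bound, the uncapped split factor can explode, so capping it (locally relaxing the window) bounds the history length at the cost of some variance-reduction performance. This is an illustration of the idea only, not CCFE's actual MCNP modification; the cap parameter and window bounds are hypothetical.

```python
import random

def apply_weight_window(weight, w_low, w_high, max_split=20):
    """Return a list of post-window particle weights.

    Splitting is capped at max_split copies; if the uncapped split factor
    would exceed the cap, the window is locally relaxed (the weight is
    carried by fewer, heavier copies), trading variance reduction for a
    bounded history length. Below w_low, Russian roulette is played.
    """
    if weight > w_high:
        n = int(weight / w_high) + 1
        if n > max_split:          # relax the window instead of splitting further
            n = max_split
        return [weight / n] * n
    if weight < w_low:
        survival = weight / w_low
        return [w_low] if random.random() < survival else []
    return [weight]

# Example: a particle arriving with a weight far above the window.
print(apply_weight_window(weight=5000.0, w_low=0.5, w_high=2.0))
```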

  16. Computer Simulation of Intergranular Stress Corrosion Cracking via Hydrogen Embrittlement

    SciTech Connect

    Smith, R.W.

    2000-04-01

    Computer simulation has been applied to the investigation of intergranular stress corrosion cracking in Ni-based alloys based on a hydrogen embrittlement mechanism. The simulation employs computational modules that address (a) transport and reactions of aqueous species giving rise to hydrogen generation at the liquid-metal interface, (b) solid state transport of hydrogen via intergranular and transgranular diffusion pathways, and (c) fracture due to the embrittlement of metallic bonds by hydrogen. A key focus of the computational model development has been the role of materials microstructure (precipitate particles and grain boundaries) on hydrogen transport and embrittlement. Simulation results reveal that intergranular fracture is enhanced as grain boundaries are weakened and that microstructures with grains elongated perpendicular to the stress axis are more susceptible to cracking. The presence of intergranular precipitates may be expected to either enhance or impede cracking depending on the relative distribution of hydrogen between the grain boundaries and the precipitate-matrix interfaces. Calculations of hydrogen outgassing and ingassing demonstrate a strong effect of charging method on the fracture behavior.

  17. Scalable High Performance Computing: Direct and Large-Eddy Turbulent Flow Simulations Using Massively Parallel Computers

    NASA Technical Reports Server (NTRS)

    Morgan, Philip E.

    2004-01-01

    This final report contains reports of research related to the tasks "Scalable High Performance Computing: Direct and Large-Eddy Turbulent Flow Simulations Using Massively Parallel Computers" and "Develop High-Performance Time-Domain Computational Electromagnetics Capability for RCS Prediction, Wave Propagation in Dispersive Media, and Dual-Use Applications." The discussion of Scalable High Performance Computing reports on three objectives: validate, assess the scalability of, and apply two parallel flow solvers for three-dimensional Navier-Stokes flows; develop and validate a high-order parallel solver for Direct Numerical Simulation (DNS) and Large Eddy Simulation (LES) problems; and investigate and develop a high-order Reynolds-averaged Navier-Stokes turbulence model. The discussion of High-Performance Time-Domain Computational Electromagnetics reports on five objectives: enhancement of an electromagnetics code (CHARGE) to be able to effectively model antenna problems; application of lessons learned from the high-order/spectral solution of swirling 3D jets to the electromagnetics project; transition of a high-order fluids code, FDL3DI, to be able to solve Maxwell's Equations using compact differencing; development and demonstration of improved radiation absorbing boundary conditions for high-order CEM; and extension of the high-order CEM solver to address variable material properties. The report also contains a review of work done by the systems engineer.

  18. IMPROVING TACONITE PROCESSING PLANT EFFICIENCY BY COMPUTER SIMULATION, Final Report

    SciTech Connect

    William M. Bond; Salih Ersayin

    2007-03-30

    This project involved industrial scale testing of a mineral processing simulator to improve the efficiency of a taconite processing plant, namely the Minorca mine. The Concentrator Modeling Center at the Coleraine Minerals Research Laboratory, University of Minnesota Duluth, enhanced the capabilities of available software, Usim Pac, by developing mathematical models needed for accurate simulation of taconite plants. This project provided funding for this technology to prove itself in the industrial environment. As the first step, data representing existing plant conditions were collected by sampling and sample analysis. Data were then balanced and provided a basis for assessing the efficiency of individual devices and the plant, and also for performing simulations aimed at improving plant efficiency. Performance evaluation served as a guide in developing alternative process strategies for more efficient production. A large number of computer simulations were then performed to quantify the benefits and effects of implementing these alternative schemes. Modification of makeup ball size was selected as the most feasible option for the target performance improvement. This was combined with replacement of existing hydrocyclones with more efficient ones. After plant implementation of these modifications, plant sampling surveys were carried out to validate findings of the simulation-based study. Plant data showed very good agreement with the simulated data, confirming results of simulation. After the implementation of modifications in the plant, several upstream bottlenecks became visible. Despite these bottlenecks limiting full capacity, concentrator energy improvement of 7% was obtained. Further improvements in energy efficiency are expected in the near future. The success of this project demonstrated the feasibility of a simulation-based approach. Currently, the Center provides simulation-based service to all the iron ore mining companies operating in northern

  19. Numerical simulations of the thermoacoustic computed tomography breast imaging system

    NASA Astrophysics Data System (ADS)

    Kiser, William Lester, Jr.

    A thermoacoustic wave is produced when an object absorbs energy and experiences a subsequent thermal expansion. We have developed a Thermoacoustic Computed Tomography (TACT) breast imaging system to exploit the thermoacoustic phenomenon as a method of soft tissue imaging. By exposing the breast to short pulses of 434 MHz microwaves, ultrasonic pulses are generated and detected with a hemispherical transducer array submersed in a water bath. Filtering and back projecting the transducer signals generates a 3-D image that maps the localized microwave absorption properties of the breast. In an effort to understand the factors limiting image quality, the TACT system was numerically simulated. The simulations were used to generate the transducer signals that would be collected by the TACT system during a scan of an object. These simulated data streams were then fed into the system image reconstruction software to provide images of simulated phantoms. The effects of transducer diameter, transducer response, transducer array geometry and stimulating pulse width on the spatial and contrast resolution of the system were quantified using the simulations. The spatial resolution was highly dependent upon location in the imaging volume. This was due to the off-axis response of transducers of finite aperture. Simulated data were compared with experimental data, obtained by imaging a parallelepiped resolution phantom, to verify the accuracy of the simulation code. A contrast-detail phantom was numerically simulated to determine the ability of the system to image spheres of diameters <1 cm with absorption values on the order of physiologic saline, when located in a background of noise. The results of the contrast-detail analysis were dependent on the location of the spheres in the imaging volume and the diameter of the simulated transducers. This work sets the foundation for the initial image quality studies of the TACT system. Improvements to the current imaging system, based on
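
    The 'filter and back-project' reconstruction described above can be illustrated with a toy two-dimensional delay-and-sum sketch: each detector signal is summed into every image pixel at the corresponding time of flight. The ring geometry, sound speed, and point-absorber phantom are placeholders, not the TACT system's parameters.

```python
import numpy as np

# Toy 2-D delay-and-sum backprojection for a ring of detectors.
c = 1.5         # sound speed, mm/us
n_det = 64
angles = np.linspace(0.0, 2 * np.pi, n_det, endpoint=False)
det_xy = 60.0 * np.column_stack([np.cos(angles), np.sin(angles)])  # ring, mm

src = np.array([10.0, -5.0])                     # point absorber location, mm
t = np.arange(0.0, 120.0, 0.05)                  # time samples, us
dist = np.linalg.norm(det_xy - src, axis=1)
# Idealized Gaussian pressure pulse arriving at the time of flight dist/c.
signals = np.exp(-((t[None, :] - dist[:, None] / c) / 0.3) ** 2)

# Back-project: for every image pixel, sum each detector signal sampled at
# the time of flight from that pixel to the detector.
x = np.linspace(-30, 30, 121)
X, Y = np.meshgrid(x, x)
image = np.zeros_like(X)
for d in range(n_det):
    tof = np.hypot(X - det_xy[d, 0], Y - det_xy[d, 1]) / c
    image += np.interp(tof, t, signals[d])

peak = np.unravel_index(image.argmax(), image.shape)
print("reconstructed peak near (mm):", x[peak[1]], x[peak[0]])
```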

  20. Computer Simulations of Voltage-Gated Cation Channels

    PubMed Central

    Treptow, Werner; Klein, Michael L.

    2012-01-01

    The relentless growth in computational power has seen increasing applications of molecular dynamics (MD) simulation to the study of membrane proteins in realistic membrane environments, which include explicit membrane lipids, water and ions. The concomitant increasing availability of membrane protein structures for ion channels, and transporters -- to name just two examples -- has stimulated many of these MD studies. In the case of voltage-gated cation channels (VGCCs) recent computational works have focused on ion-conduction and gating mechanisms, along with their regulation by agonist/antagonist ligands. The information garnered from these computational studies is largely inaccessible to experiment and is crucial for understanding the interplay between the structure and function as well as providing new directions for experiments. This article highlights recent advances in probing the structure and function of potassium channels and offers a perspective on the challenges likely to arise in making analogous progress in characterizing sodium channels. PMID:22523619

  1. The smart vapor retarder: An innovation inspired by computer simulations

    SciTech Connect

    Kuenzel, H.M.

    1998-12-31

    Water management is the new trend in civil engineering. Since it is difficult to ensure perfect vapor- and watertightness of building components, a limited moisture ingress is acceptable as long as the drying process is effective enough to avoid moisture damage. Recent computer models for the simulation of heat and moisture transport are valuable tools for the risk assessment of structures and their repair or retrofit. Unventilated, insulated assemblies with a vapor-resistant exterior layer can accumulate water because winter condensation and summer drying are not balanced. The balance can be reestablished if the vapor retarder is more permeable in summer than in winter. Parametric computer studies have defined the required properties of such a vapor retarder. Developed according to the computed specifications, the smart vapor retarder shows a seasonal variation in vapor permeability of a factor of ten. The secret of this behavior lies in the humidity-dependent vapor diffusion resistance of the film material.

  2. Computer simulation in the daily practice of orthognathic surgery.

    PubMed

    Schendel, S A

    2015-12-01

    The availability of computers and advances in imaging, especially over the last 10 years, have allowed the adoption of three-dimensional (3D) imaging in the office setting. The affordability and ease of use of this modality has led to its widespread implementation in diagnosis and treatment planning, teaching, and follow-up care. 3D imaging is particularly useful when the deformities are complex and involve both function and aesthetics, such as those in the dentofacial area, and for orthognathic surgery. Computer imaging involves combining images obtained from different modalities to create a virtual record of an individual. In this article, the system is described and its use in the office demonstrated. Computer imaging with simulation, and more specifically patient-specific anatomic records (PSAR), permit a more accurate analysis of the deformity as an aid to diagnosis and treatment planning. 3D imaging and computer simulation can be used effectively for the planning of office-based procedures. The technique can be used to perform virtual surgery and establish a definitive and objective treatment plan for correction of the facial deformity. In addition, patient education and follow-up can be facilitated. The end result is improved patient care and decreased expense.

  3. An FPGA computing demo core for space charge simulation

    SciTech Connect

    Wu, Jinyuan; Huang, Yifei; /Fermilab

    2009-01-01

    In accelerator physics, space charge simulation requires a large amount of computing power. In a particle system, each calculation requires time- and resource-consuming operations such as multiplications, divisions, and square roots. Because of the flexibility of field programmable gate arrays (FPGAs), we implemented this task with efficient use of the available computing resources and completely eliminated non-calculating operations that are indispensable in regular micro-processors (e.g. instruction fetch, instruction decoding, etc.). We designed and tested a 16-bit demo core for computing Coulomb's force in an Altera Cyclone II FPGA device. To save resources, the inverse square-root cube operation in our design is computed using a memory look-up table addressed with the nine to ten most significant non-zero bits. At 200 MHz internal clock, our demo core reaches a throughput of 200 M pairs/s/core, faster than a typical 2 GHz micro-processor by about a factor of 10. Temperature and power consumption of FPGAs were also lower than those of micro-processors. Fast and convenient, FPGAs can serve as alternatives to time-consuming micro-processors for space charge simulation.
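
    The flavor of the table-lookup approach (replacing the expensive inverse square-root-cube with a memory read addressed by the leading bits of r^2) can be sketched in software as below. The table size, covered range, and indexing are simplified stand-ins, not the actual 16-bit core's nine-to-ten-bit addressing or fixed-point format.

```python
import numpy as np

# Pre-tabulate f(s) = s**(-1.5) so the pairwise Coulomb force
# F = q1*q2*r_vec*(r^2)**(-1.5) needs only a lookup plus multiplies,
# mimicking the memory look-up table used in the FPGA core.
table_bits = 10
s_min, s_max = 0.25, 64.0                       # covered range of r^2
s_grid = np.linspace(s_min, s_max, 2**table_bits)
inv_r3_table = s_grid ** -1.5

def coulomb_force(q1, q2, r_vec):
    s = float(np.dot(r_vec, r_vec))             # r^2
    idx = int((s - s_min) / (s_max - s_min) * (2**table_bits - 1))
    idx = min(max(idx, 0), 2**table_bits - 1)    # clamp to the table range
    return q1 * q2 * inv_r3_table[idx] * np.asarray(r_vec)

exact = 1.0 * 1.0 * np.array([3.0, 4.0, 0.0]) / 5.0**3
print("lookup:", coulomb_force(1.0, 1.0, [3.0, 4.0, 0.0]), "exact:", exact)
```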

  4. A framework of modeling detector systems for computed tomography simulations

    NASA Astrophysics Data System (ADS)

    Youn, H.; Kim, D.; Kim, S. H.; Kam, S.; Jeon, H.; Nam, J.; Kim, H. K.

    2016-01-01

    The ultimate development in computed tomography (CT) technology may be a system that can provide images with excellent lesion conspicuity at the lowest possible patient dose. Imaging simulation tools have been used cost-effectively for these developments and will continue to be. For a more accurate and realistic imaging simulation, the signal and noise propagation through a CT detector system has been modeled in this study using the cascaded linear-systems theory. The simulation results are validated in comparisons with the measured results using a laboratory flat-panel micro-CT system. Although the image noise obtained from the simulations at higher exposures is slightly smaller than that obtained from the measurements, the difference between them is reasonably acceptable. According to the simulation results for various exposure levels and additive electronic noise levels, x-ray quantum noise is more dominant than the additive electronic noise. The framework of modeling a CT detector system suggested in this study will be helpful for the development of an accurate and realistic projection simulation model.
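
    The cascaded linear-systems bookkeeping can be sketched by propagating the mean signal and its variance through a chain of stochastic gain stages and then adding electronic noise. The stage gains, gain variances, and noise figure below are illustrative placeholders, not the parameters of the detector modeled in the paper.

```python
# Minimal sketch of cascaded linear-systems analysis for a CT detector:
# each stage is a stochastic gain (x-ray absorption, light emission,
# optical coupling, ...) that propagates the mean signal and its variance.

def gain_stage(mean_in, var_in, g, var_g):
    """Propagate mean and variance through a stochastic gain stage."""
    mean_out = g * mean_in
    var_out = g**2 * var_in + var_g * mean_in
    return mean_out, var_out

# Incident quanta per pixel: Poisson, so variance equals the mean.
q0 = 1.0e4
mean, var = q0, q0

stages = [           # (mean gain, gain variance) for each cascaded stage
    (0.7, 0.21),     # quantum detection efficiency (binomial selection)
    (1500.0, 3.0e5), # conversion to optical photons in the scintillator
    (0.05, 0.0475),  # optical coupling to the photodiode (binomial)
]
for g, var_g in stages:
    mean, var = gain_stage(mean, var, g, var_g)

var += 50.0**2       # additive electronic noise (illustrative)
print("signal:", mean, " SNR:", mean / var**0.5)
```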

  5. Numerical simulation of landfill aeration using computational fluid dynamics.

    PubMed

    Fytanidis, Dimitrios K; Voudrias, Evangelos A

    2014-04-01

    The present study is an application of Computational Fluid Dynamics (CFD) to the numerical simulation of landfill aeration systems. Specifically, the CFD algorithms provided by the commercial solver ANSYS Fluent 14.0, combined with an in-house source code developed to modify the main solver, were used. The unsaturated multiphase flow of air and liquid phases and the biochemical processes for aerobic biodegradation of the organic fraction of municipal solid waste were simulated taking into consideration their temporal and spatial evolution, as well as complex effects, such as oxygen mass transfer across phases, unsaturated flow effects (capillary suction and unsaturated hydraulic conductivity), temperature variations due to biochemical processes and environmental correction factors for the applied kinetics (Monod and 1st order kinetics). The developed model results were compared with literature experimental data. Also, pilot scale simulations and sensitivity analysis were implemented. Moreover, simulation results of a hypothetical single aeration well were shown, while its zone of influence was estimated using both the pressure and oxygen distribution. Finally, a case study was simulated for a hypothetical landfill aeration system. Both static (steadily positive or negative relative pressure with time) and hybrid (following a square wave pattern of positive and negative values of relative pressure with time) scenarios for the aeration wells were examined. The results showed that the present model is capable of simulating landfill aeration and the obtained results were in good agreement with corresponding previous experimental and numerical investigations.

  6. Precision Constraints from Computational Cosmology and Type Ia Supernova Simulations

    NASA Astrophysics Data System (ADS)

    Bernstein, Joseph P.; Kuhlmann, S. E.; Norris, B.; Biswas, R.

    2011-01-01

    The evidence for dark energy represents one of the greatest mysteries of modern science. The research undertaken probes the implications of dark energy via analysis of large scale structure and detonation-based Type Ia supernova light curve simulations. It is presently an exciting time to be involved in cosmology because planned astronomical surveys will effectively result in dark sector probes becoming systematics-limited, making numerical simulations crucial to the formulation of precision constraints. This work aims to assist in reaching the community goal of 1% constraints on the dark energy equation of state parameter. Reaching this goal will require 1) hydrodynamic+N-body simulations with a minimum of a 1 Gpc box size, 2048^3 hydrodynamic cells, and 10^11 dark matter particles, which push the limits of existing codes, and 2) a better understanding of the explosion mechanism(s) for Type Ia supernovae, together with larger, high-quality data sets from present and upcoming supernova surveys. Initial results are discussed from two projects. The first is computational cosmology studies aimed at enabling the large simulations discussed above. The second is radiative transfer calculations drawn from Type Ia supernova explosion simulations aimed at bridging the gap between simulated light curves and those observed from, e.g., the Sloan Digital Sky Survey II and, eventually, the Dark Energy Survey.

  7. Charge-leveling and proper treatment of long-range electrostatics in all-atom molecular dynamics at constant pH

    NASA Astrophysics Data System (ADS)

    Wallace, Jason A.; Shen, Jana K.

    2012-11-01

    Recent development of constant pH molecular dynamics (CpHMD) methods has offered promise for adding pH-stat in molecular dynamics simulations. However, until now the working pH molecular dynamics (pHMD) implementations are dependent in part or whole on implicit-solvent models. Here we show that proper treatment of long-range electrostatics and maintaining charge neutrality of the system are critical for extending the continuous pHMD framework to the all-atom representation. The former is achieved here by adding forces to titration coordinates due to long-range electrostatics based on the generalized reaction field method, while the latter is made possible by a charge-leveling technique that couples proton titration with simultaneous ionization or neutralization of a co-ion in solution. We test the new method using the pH-replica-exchange CpHMD simulations of a series of aliphatic dicarboxylic acids with varying carbon chain length. The average absolute deviation from the experimental pKa values is merely 0.18 units. The results show that accounting for the forces due to extended electrostatics removes the large random noise in propagating titration coordinates, while maintaining charge neutrality of the system improves the accuracy in the calculated electrostatic interaction between ionizable sites. Thus, we believe that the way is paved for realizing pH-controlled all-atom molecular dynamics in the near future.
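
    In pH-replica-exchange simulations of this kind, a titratable site's pKa is commonly obtained by fitting the unprotonated fraction observed at each pH to the generalized Henderson-Hasselbalch (Hill) equation. A sketch of that fit is shown below with made-up fractions, not data from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit unprotonated fractions from pH-replica-exchange CpHMD to the
# generalized Henderson-Hasselbalch (Hill) equation.  The fractions below
# are made-up illustrative values.
def hill(pH, pKa, n):
    return 1.0 / (1.0 + 10.0 ** (n * (pKa - pH)))

pH_values = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
frac_deprot = np.array([0.03, 0.17, 0.55, 0.88, 0.98, 1.00])

(pKa, n), _ = curve_fit(hill, pH_values, frac_deprot, p0=[4.0, 1.0])
print(f"fitted pKa = {pKa:.2f}, Hill coefficient = {n:.2f}")
```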

  8. Simulation of Tailrace Hydrodynamics Using Computational Fluid Dynamics Models

    SciTech Connect

    Cook, Christopher B.; Richmond, Marshall C.

    2001-05-01

    This report investigates the feasibility of using computational fluid dynamics (CFD) tools to investigate hydrodynamic flow fields surrounding the tailrace zone below large hydraulic structures. Previous and ongoing studies using CFD tools to simulate gradually varied flow with multiple constituents and forebay/intake hydrodynamics have shown that CFD tools can provide valuable information for hydraulic and biological evaluation of fish passage near hydraulic structures. These studies, however, are incapable of simulating the rapidly varying flow fields that involve breakup of the free-surface, such as those through and below high flow outfalls and spillways. Although the use of CFD tools for these types of flow is still an active area of research, initial applications discussed in this report show that these tools are capable of simulating the primary features of these highly transient flow fields.

  9. Lightweight computational steering of very large scale molecular dynamics simulations

    SciTech Connect

    Beazley, D.M.; Lomdahl, P.S.

    1996-09-01

    We present a computational steering approach for controlling, analyzing, and visualizing very large scale molecular dynamics simulations involving tens to hundreds of millions of atoms. Our approach relies on extensible scripting languages and an easy to use tool for building extensions and modules. The system is extremely easy to modify, works with existing C code, is memory efficient, and can be used from inexpensive workstations and networks. We demonstrate how we have used this system to manipulate data from production MD simulations involving as many as 104 million atoms running on the CM-5 and Cray T3D. We also show how this approach can be used to build systems that integrate common scripting languages (including Tcl/Tk, Perl, and Python), simulation code, user extensions, and commercial data analysis packages.

  10. Computer simulation of cardiovascular changes during extended duration space flights

    NASA Technical Reports Server (NTRS)

    Srinivasan, R. Srini; Charles, John B.; Leonard, Joel I.

    1990-01-01

    The application of mathematical modeling and computer simulation to the study of spaceflight cardiovascular changes is examined using a multicompartment representation model of the entire human cardiovascular system including its control elements. The model simulates the beat-to-beat dynamic responses of the cardiovascular system to orthostatic stresses. Simulation results pertaining to long-term space flight, the combined effect of +G(z) and blood volume loss, and the effect of anti-G suit inflation are discussed, including past results on the original version of the model which has been used in a number of analysis applications at NASA. New results pertain to analysis of cardiovascular changes in extended duration space flights and demonstrate the use of this model in evaluation of physiological factors that contribute to orthostatic intolerance following an exposure to weightlessness, in particular, blood volume loss and changes in the sensitivity of baroreceptors.

  11. “CHRIS”: A Computer Simulation of Schizophrenia*

    PubMed Central

    Santo, Yoav; Finkel, Andy

    1982-01-01

    “CHRIS” is an experimental computer simulation of a patient with a schizophrenic disorder responding to an initial diagnostic interview with a clinician. The program is designed as a teaching aid in psychiatric interviewing and diagnosis. The user of the simulation assumes the role of the “clinician” conducting a diagnostic interview. Upon completion of the interview, the program checks the user's diagnosis for accuracy; it reports a corrected diagnosis if necessary; and finally, it lists all schizophrenic symptoms displayed during that interview. The program is composed of a dictionary of about 800 words including their functional part of speech, a list of about 200 output sentences, and a parser. The simulation provides the student with training experience in specific aspects of a mental disorder without posing any burden upon a human patient.

  12. Computer simulation of spectrometer magnets for some experimental installations

    NASA Astrophysics Data System (ADS)

    Zhidkov, E. P.; Poljakova, R. V.; Voloshina, I. G.; Perepelkin, E. E.; Rossiyskaya, N. S.; Shavrina, T. V.; Yudin, I. P.

    2009-03-01

    The significance of numerical simulation in the research of magnetic systems is determined not only by the known advantages of the computing experiment, but also by the fact that the measurement of a magnetic field is a labor-intensive and expensive task. Mathematical simulation allows one to investigate those parts of the magnet’s design where measurements of the magnetic field are extremely complicated or even impossible. This work aims to generalize the experience of the mathematical simulation of magnetic systems of various types of physical and electromechanical installations and to work out recommendations for the optimal use of some software products for the numerical modeling of magnetostatic problems. This work also presents some results of a numerical analysis of the magnetic systems of the JINR physical installation MARUSYA, with the purpose of studying the possibility of designing magnetic systems with predetermined characteristics of the magnetic field.

  13. Advanced manned space flight simulation and training: An investigation of simulation host computer system concepts

    NASA Technical Reports Server (NTRS)

    Montag, Bruce C.; Bishop, Alfred M.; Redfield, Joe B.

    1989-01-01

    The findings of a preliminary investigation by Southwest Research Institute (SwRI) in simulation host computer concepts is presented. It is designed to aid NASA in evaluating simulation technologies for use in spaceflight training. The focus of the investigation is on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.

  14. Computer Support of Operator Training: Constructing and Testing a Prototype of a CAL (Computer Aided Learning) Supported Simulation Environment.

    ERIC Educational Resources Information Center

    Zillesen, P. G. van Schaick; And Others

    Instructional feedback given to the learners during computer simulation sessions may be greatly improved by integrating educational computer simulation programs with hypermedia-based computer-assisted learning (CAL) materials. A prototype of a learning environment of this type called BRINE PURIFICATION was developed for use in corporate training…

  15. Multipole Algorithms for Molecular Dynamics Simulation on High Performance Computers.

    NASA Astrophysics Data System (ADS)

    Elliott, William Dewey

    1995-01-01

    A fundamental problem in modeling large molecular systems with molecular dynamics (MD) simulations is the underlying N-body problem of computing the interactions between all pairs of N atoms. The simplest algorithm to compute pair-wise atomic interactions scales in runtime O(N^2), making it impractical for interesting biomolecular systems, which can contain millions of atoms. Recently, several algorithms have become available that solve the N-body problem by computing the effects of all pair-wise interactions while scaling in runtime less than O(N^2). One algorithm, which scales O(N) for a uniform distribution of particles, is called the Greengard-Rokhlin Fast Multipole Algorithm (FMA). This work describes an FMA-like algorithm called the Molecular Dynamics Multipole Algorithm (MDMA). The algorithm contains several features that are new to N-body algorithms. MDMA uses new, efficient series expansion equations to compute general 1/r^n potentials to arbitrary accuracy. In particular, the 1/r Coulomb potential and the 1/r^6 portion of the Lennard-Jones potential are implemented. The new equations are based on multivariate Taylor series expansions. In addition, MDMA uses a cell-to-cell interaction region of cells that is closely tied to worst case error bounds. The worst case error bounds for MDMA are derived in this work also. These bounds apply to other multipole algorithms as well. Several implementation enhancements are described which apply to MDMA as well as other N-body algorithms such as FMA and tree codes. The mathematics of the cell-to-cell interactions are converted to the Fourier domain for reduced operation count and faster computation. A relative indexing scheme was devised to locate cells in the interaction region which allows efficient pre-computation of redundant information and prestorage of much of the cell-to-cell interaction. Also, MDMA was integrated into the MD program SIgMA to demonstrate the performance of the program over

  16. Accelerating Climate and Weather Simulations through Hybrid Computing

    NASA Technical Reports Server (NTRS)

    Zhou, Shujia; Cruz, Carlos; Duffy, Daniel; Tucker, Robert; Purcell, Mark

    2011-01-01

    Unconventional multi- and many-core processors (e.g. IBM (R) Cell B.E.(TM) and NVIDIA (R) GPU) have emerged as effective accelerators in trial climate and weather simulations. Yet these climate and weather models typically run on parallel computers with conventional processors (e.g. Intel, AMD, and IBM) using Message Passing Interface. To address challenges involved in efficiently and easily connecting accelerators to parallel computers, we investigated using IBM's Dynamic Application Virtualization (TM) (IBM DAV) software in a prototype hybrid computing system with representative climate and weather model components. The hybrid system comprises two Intel blades and two IBM QS22 Cell B.E. blades, connected with both InfiniBand(R) (IB) and 1-Gigabit Ethernet. The system significantly accelerates a solar radiation model component by offloading compute-intensive calculations to the Cell blades. Systematic tests show that IBM DAV can seamlessly offload compute-intensive calculations from Intel blades to Cell B.E. blades in a scalable, load-balanced manner. However, noticeable communication overhead was observed, mainly due to IP over the IB protocol. Full utilization of IB Sockets Direct Protocol and the lower latency production version of IBM DAV will reduce this overhead.

  17. Computational simulation of materials notes for lectures given at UCSB, May 1996--June 1996

    SciTech Connect

    LeSar, R.

    1997-01-01

    This report presents information from a lecture given on the computational simulation of materials. The purpose is to introduce modern computerized simulation methods for materials properties and response.

  18. Three Dimensional Computer Graphics Federates for the 2012 Smackdown Simulation

    NASA Technical Reports Server (NTRS)

    Fordyce, Crystal; Govindaiah, Swetha; Muratet, Sean; O'Neil, Daniel A.; Schricker, Bradley C.

    2012-01-01

    The Simulation Interoperability Standards Organization (SISO) Smackdown is a two-year old annual event held at the 2012 Spring Simulation Interoperability Workshop (SIW). A primary objective of the Smackdown event is to provide college students with hands-on experience in developing distributed simulations using High Level Architecture (HLA). Participating for the second time, the University of Alabama in Huntsville (UAHuntsville) deployed four federates: two simulated a communications server and a lunar communications satellite with a radio, and the other two generated 3D computer graphics displays for the communication satellite constellation and for the surface based lunar resupply mission. Using the Light-Weight Java Graphics Library, the satellite display federate presented a lunar-texture mapped sphere of the moon and four Telemetry Data Relay Satellites (TDRS), which received object attributes from the lunar communications satellite federate to drive their motion. The surface mission display federate was an enhanced version of the federate developed by ForwardSim, Inc. for the 2011 Smackdown simulation. Enhancements included a dead-reckoning algorithm and a visual indication of which communication satellite was in line of sight of Hadley Rille. This paper concentrates on these two federates by describing the functions, algorithms, HLA object attributes received from other federates, development experiences and recommendations for future, participating Smackdown teams.

  19. Further developments in cloud statistics for computer simulations

    NASA Technical Reports Server (NTRS)

    Chang, D. T.; Willand, J. H.

    1972-01-01

    This study is a part of NASA's continued program to provide global statistics of cloud parameters for computer simulation. The primary emphasis was on the development of the data bank of the global statistical distributions of cloud types and cloud layers and their applications in the simulation of the vertical distributions of in-cloud parameters such as liquid water content. These statistics were compiled from actual surface observations as recorded in Standard WBAN forms. Data for a total of 19 stations were obtained and reduced. These stations were selected to be representative of the 19 primary cloud climatological regions defined in previous studies of cloud statistics. Using the data compiled in this study, a limited study was conducted of the homogeneity of cloud regions, the latitudinal dependence of cloud-type distributions, the dependence of these statistics on sample size, and other factors in the statistics which are of significance to the problem of simulation. The application of the statistics in cloud simulation was investigated. In particular, the inclusion of the new statistics in an expanded multi-step Monte Carlo simulation scheme is suggested and briefly outlined.

  20. A computationally efficient particle-simulation method suited to vector-computer architectures

    SciTech Connect

    McDonald, J.D.

    1990-01-01

    Recent interest in a National Aero-Space Plane (NASP) and various Aero-assisted Space Transfer Vehicles (ASTVs) presents the need for a greater understanding of high-speed rarefied flight conditions. Particle simulation techniques such as the Direct Simulation Monte Carlo (DSMC) method are well suited to such problems, but the high cost of computation limits the application of the methods to two-dimensional or very simple three-dimensional problems. This research re-examines the algorithmic structure of existing particle simulation methods and re-structures them to allow efficient implementation on vector-oriented supercomputers. A brief overview of the DSMC method and the Cray-2 vector computer architecture are provided, and the elements of the DSMC method that inhibit substantial vectorization are identified. One such element is the collision selection algorithm. A complete reformulation of underlying kinetic theory shows that this may be efficiently vectorized for general gas mixtures. The mechanics of collisions are vectorizable in the DSMC method, but several optimizations are suggested that greatly enhance performance. Also this thesis proposes a new mechanism for the exchange of energy between vibration and other energy modes. The developed scheme makes use of quantized vibrational states and is used in place of the Borgnakke-Larsen model. Finally, a simplified representation of physical space and boundary conditions is utilized to further reduce the computational cost of the developed method. Comparison to solutions obtained from the DSMC method for the relaxation of internal energy modes in a homogeneous gas, as well as single and multiple specie shock wave profiles, are presented. Additionally, a large scale simulation of the flow about the proposed Aeroassisted Flight Experiment (AFE) vehicle is included as an example of the new computational capability of the developed particle simulation method.
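
    The collision-selection step that the thesis reformulates for vectorization can be sketched in array form as below. The no-time-counter style pair count and acceptance-rejection test shown here are one common DSMC variant, not necessarily the scheme developed in the thesis, and the particle count, hard-sphere diameter, weighting, and cell volume are placeholders.

```python
import numpy as np

# Vectorized sketch of DSMC collision-pair selection in one cell using an
# acceptance-rejection (no-time-counter style) test.
rng = np.random.default_rng(1)

n_particles = 500
v = rng.normal(0.0, 300.0, size=(n_particles, 3))     # velocities, m/s
d = 4.17e-10                                           # hard-sphere diameter, m
sigma = np.pi * d**2                                   # collision cross section
F_N, dt, V_cell = 1.0e10, 1.0e-6, 1.0e-9               # weight, time step, m^3
sigma_cr_max = sigma * 2000.0                          # running maximum estimate

# Number of candidate pairs, then a vectorized accept/reject over all pairs.
n_cand = int(0.5 * n_particles * (n_particles - 1) * F_N
             * sigma_cr_max * dt / V_cell)
i = rng.integers(0, n_particles, n_cand)
j = rng.integers(0, n_particles, n_cand)
valid = i != j
cr = np.linalg.norm(v[i] - v[j], axis=1)               # relative speeds
accept = valid & (rng.random(n_cand) < sigma * cr / sigma_cr_max)
print("candidate pairs:", n_cand, " accepted collisions:", int(accept.sum()))
```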

  1. An analysis of the 70-meter antenna hydrostatic bearing by means of computer simulation

    NASA Technical Reports Server (NTRS)

    Bartos, R. D.

    1993-01-01

    Recently, the computer program 'A Computer Solution for Hydrostatic Bearings with Variable Film Thickness,' used to design the hydrostatic bearing of the 70-meter antennas, was modified to improve the accuracy with which the program predicts the film height profile and oil pressure distribution between the hydrostatic bearing pad and the runner. This article presents a description of the modified computer program, the theory upon which its computations are based, the computer simulation results, and a discussion of those results.

  2. Fast computer simulation of reconstructed image from rainbow hologram based on GPU

    NASA Astrophysics Data System (ADS)

    Shuming, Jiao; Yoshikawa, Hiroshi

    2015-10-01

    A fast computer simulation solution for rainbow hologram reconstruction based on GPU is proposed. In the commonly used segment Fourier transform method for rainbow hologram reconstruction, the computation of the 2D Fourier transform on each hologram segment is very time-consuming. GPU-based parallel computing can be applied to improve the computing speed. Simulation results indicate that the proposed GPU computing can reduce the computation time by as much as a factor of eight compared with CPU computing.

  3. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    SciTech Connect

    Mike Bockelie; Dave Swensen; Martin Denison; Zumao Chen; Temi Linjewile; Mike Maguire; Adel Sarofim; Connie Senior; Changguan Yang; Hong-Shig Shim

    2004-04-28

    This is the fourteenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a Virtual Engineering-based framework for simulating the performance of Advanced Power Systems. Within the last quarter, good progress has been made on all aspects of the project. Software development efforts have focused primarily on completing a prototype detachable user interface for the framework and on integrating Carnegie Mellon University's IECM model core with the computational engine. In addition to this work, progress has been made on several other development and modeling tasks for the program. These include: (1) improvements to the infrastructure code of the computational engine, (2) enhancements to the model interfacing specifications, (3) additional development to increase the robustness of all framework components, (4) enhanced coupling of the computational and visualization engine components, (5) a series of detailed simulations studying the effects of gasifier inlet conditions on the heat flux to the gasifier injector, and (6) detailed plans for implementing models for mercury capture for both warm and cold gas cleanup.

  4. Numerical simulation of NQR/NMR: Applications in quantum computing.

    PubMed

    Possa, Denimar; Gaudio, Anderson C; Freitas, Jair C C

    2011-04-01

    A numerical simulation program able to simulate nuclear quadrupole resonance (NQR) as well as nuclear magnetic resonance (NMR) experiments is presented, written using the Mathematica package and aimed especially at applications in quantum computing. The program makes use of the interaction picture to compute the effect of the relevant nuclear spin interactions, without any assumption about the relative size of each interaction. This makes the program flexible and versatile, being useful in a wide range of experimental situations, going from NQR (at zero or under small applied magnetic field) to high-field NMR experiments. Some conditions specifically required for quantum computing applications are implemented in the program, such as the possibility of use of elliptically polarized radiofrequency and the inclusion of first- and second-order terms in the average Hamiltonian expansion. A number of examples dealing with simple NQR and quadrupole-perturbed NMR experiments are presented, along with the proposal of experiments to create quantum pseudopure states and logic gates using NQR. The program and the various application examples are freely available through the link http://www.profanderson.net/files/nmr_nqr.php.

  5. Use of a Computer Simulation To Develop Mental Simulations for Understanding Relative Motion Concepts.

    ERIC Educational Resources Information Center

    Monaghan, James M.; Clement, John

    1999-01-01

    Presents evidence for students' qualitative and quantitative difficulties with apparently simple one-dimensional relative-motion problems, students' spontaneous visualization of relative-motion problems, the visualizations facilitating solution of these problems, and students' memories of the online computer simulation used as a framework for…

  6. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    SciTech Connect

    Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Zumao Chen; Temi Linjewile; Adel Sarofim; Bene Risio

    2003-04-25

    This is the tenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on all aspects of the project. Calculations for a full Vision 21 plant configuration have been performed for two gasifier types. An improved process model for simulating entrained flow gasifiers has been implemented into the workbench. Model development has focused on: a pre-processor module to compute global gasification parameters from standard fuel properties and intrinsic rate information; a membrane based water gas shift; and reactors to oxidize fuel cell exhaust gas. The data visualization capabilities of the workbench have been extended by implementing the VTK visualization software that supports advanced visualization methods, including inexpensive Virtual Reality techniques. The ease-of-use, functionality and plug-and-play features of the workbench were highlighted through demonstrations of the workbench at a DOE sponsored coal utilization conference. A white paper has been completed that contains recommendations on the use of component architectures, model interface protocols and software frameworks for developing a Vision 21 plant simulator.

  7. Gastric flow and mixing studied using computer simulation.

    PubMed

    Pal, Anupam; Indireshkumar, Keshavamurthy; Schwizer, Werner; Abrahamsson, Bertil; Fried, Michael; Brasseur, James G

    2004-12-22

    The fed human stomach displays regular peristaltic contraction waves that originate in the proximal antrum and propagate to the pylorus. High-resolution concurrent manometry and magnetic resonance imaging (MRI) studies of the stomach suggest a primary function of antral contraction wave (ACW) activity unrelated to gastric emptying. Detailed evaluation is difficult, however, in vivo. Here we analyse the role of ACW activity on intragastric fluid motions, pressure, and mixing with computer simulation. A two-dimensional computer model of the stomach was developed with the 'lattice-Boltzmann' numerical method from the laws of physics, and stomach geometry modelled from MRI. Time changes in gastric volume were specified to match global physiological rates of nutrient liquid emptying. The simulations predicted two basic fluid motions: retrograde 'jets' through ACWs, and circulatory flow between ACWs, both of which contribute to mixing. A well-defined 'zone of mixing', confined to the antrum, was created by the ACWs, with mixing motions enhanced by multiple and narrower ACWs. The simulations also predicted contraction-induced peristaltic pressure waves in the distal antrum consistent with manometric measurements, but with a much lower pressure amplitude than manometric data, indicating that manometric pressure amplitudes reflect direct contact of the catheter with the gastric wall. We conclude that the ACWs are central to gastric mixing, and may also play an indirect role in gastric emptying through local alterations in common cavity pressure. PMID:15615685

  8. Simulating Subsurface Reactive Flows on Ultrascale Computers with PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Mills, R. T.; Hammond, G. E.; Lichtner, P. C.; Lu, C.; Smith, B. F.; Philip, B.

    2009-12-01

    To provide true predictive utility, subsurface simulations often must accurately resolve--in three dimensions--complicated, multi-phase flow fields in highly heterogeneous geology with numerous chemical species and complex chemistry. This task is especially daunting because of the wide range of spatial scales involved--from the pore scale to the field scale--ranging over six orders of magnitude, and the wide range of time scales ranging from seconds or less to millions of years. This represents a true "Grand Challenge" computational problem, requiring not only the largest-scale ("ultrascale") supercomputers, but accompanying advances in algorithms for the efficient numerical solution of systems of PDEs using these machines, and in mathematical modeling techniques that can adequately capture the truly multi-scale nature of these problems. We describe some of the specific challenges involved and present the software and algorithmic approaches that are being using in the computer code PFLOTRAN to provide scalable performance for such simulations on tens of thousands of processors. We focus particularly on scalable techniques for solving the large (up to billions of total degrees of freedom), sparse algebraic systems that arise. We also describe ongoing work to address disparate time and spatial scales by both the development of adaptive mesh refinement methods and the use of multiple continuum formulations. Finally, we present some examples from recent simulations conducted on Jaguar, the 150152 processor core Cray XT5 system at Oak Ridge National Laboratory that is currently one of the most powerful supercomputers in the world.

  9. Trace contaminant control simulation computer program, version 8.1

    NASA Technical Reports Server (NTRS)

    Perry, J. L.

    1994-01-01

    The Trace Contaminant Control Simulation computer program is a tool for assessing the performance of various process technologies for removing trace chemical contamination from a spacecraft cabin atmosphere. Included in the simulation are chemical and physical adsorption by activated charcoal, chemical adsorption by lithium hydroxide, absorption by humidity condensate, and low- and high-temperature catalytic oxidation. Means are provided for simulating regenerable as well as nonregenerable systems. The program provides an overall mass balance of chemical contaminants in a spacecraft cabin given specified generation rates. Removal rates are based on device flow rates specified by the user and calculated removal efficiencies based on cabin concentration and removal technology experimental data. Versions 1.0 through 8.0 are documented in NASA TM-108409. TM-108409 also contains a source file listing for version 8.0. Changes to version 8.0 are documented in this technical memorandum and a source file listing for the modified version, version 8.1, is provided. Detailed descriptions for the computer program subprograms are extracted from TM-108409 and modified as necessary to reflect version 8.1. Version 8.1 supersedes version 8.0. Information on a separate user's guide is available from the author.
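
    The overall mass balance the program solves can be illustrated for a single contaminant and a single removal device: the cabin concentration changes at the generation rate minus the device flow times its removal efficiency times the concentration. The volume, generation rate, flow, and efficiency values below are placeholders, not TCCS inputs.

```python
# Single-contaminant cabin mass balance (illustrative values only):
#   V dC/dt = G - Q * eta * C
# with cabin free volume V, generation rate G, device flow Q and single-pass
# removal efficiency eta.  Forward-Euler integration over one day.
V   = 100.0      # cabin free volume, m^3
G   = 0.5        # contaminant generation rate, mg/h
Q   = 15.0       # flow through the removal device, m^3/h
eta = 0.85       # single-pass removal efficiency

dt, hours = 0.1, 24.0
C = 0.0
for _ in range(int(hours / dt)):
    C += dt * (G - Q * eta * C) / V

print("concentration after 24 h (mg/m^3):", round(C, 4))
print("steady state (mg/m^3):", round(G / (Q * eta), 4))
```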

  10. Benchmarking computational fluid dynamics models for lava flow simulation

    NASA Astrophysics Data System (ADS)

    Dietterich, Hannah; Lev, Einat; Chen, Jiangzhi

    2016-04-01

    Numerical simulations of lava flow emplacement are valuable for assessing lava flow hazards, forecasting active flows, interpreting past eruptions, and understanding the controls on lava flow behavior. Existing lava flow models vary in simplifying assumptions, physics, dimensionality, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess existing models and guide the development of new codes, we conduct a benchmarking study of computational fluid dynamics models for lava flow emplacement, including VolcFlow, OpenFOAM, FLOW-3D, and COMSOL. Using the new benchmark scenarios defined in Cordonnier et al. (Geol Soc SP, 2015) as a guide, we model viscous, cooling, and solidifying flows over horizontal and sloping surfaces, topographic obstacles, and digital elevation models of natural topography. We compare model results to analytical theory, analogue and molten basalt experiments, and measurements from natural lava flows. Overall, the models accurately simulate viscous flow with some variability in flow thickness where flows intersect obstacles. OpenFOAM, COMSOL, and FLOW-3D can each reproduce experimental measurements of cooling viscous flows, and FLOW-3D simulations with temperature-dependent rheology match results from molten basalt experiments. We can apply these models to reconstruct past lava flows in Hawai'i and Saudi Arabia using parameters assembled from morphology, textural analysis, and eruption observations as natural test cases. Our study highlights the strengths and weaknesses of each code, including accuracy and computational costs, and provides insights regarding code selection.

  11. Computer simulation on reconstruction of 3-D flame temperature distribution

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Yung, K. L.; Wu, Z.; Li, T.

    To measure non-symmetric, unsteady, three-dimensional temperature distributions in flames by simple, economical, fast, and accurate means, and to apply a priori information to the measurement both sufficiently and efficiently, we conducted computer simulations. Simulation results proved that the finite series-expansion reconstruction method is more suitable for measuring temperature distributions in flames than the transform method widely used in medical scanning and nondestructive testing. By comparing the errors of simulations with different numbers of views, different domain shapes, different numbers of projections per view, different angles of views, different grid shapes, etc., we find that a circular domain, a triangular grid, and a sufficient number of projections per view can improve the accuracy of the reconstruction of the 3-D temperature distribution with limited views. With six views, the errors caused by the reconstruction computation are reduced and are smaller than those caused by measurement. Therefore, a comparatively better means of measuring 3-D temperature distributions in flames with limited projection views by emission tomography is achieved. Experimental results also showed that the method we used was appropriate for measurement of 3-D temperature distributions with a limited number of views [1].
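
    A finite series-expansion reconstruction with a limited number of views can be sketched with a small Kaczmarz/ART iteration on a pixel grid, as below. The grid size, view angles, nearest-bin ray model, and the synthetic 'hot spot' field are illustrative choices, not the configuration studied in the paper.

```python
import numpy as np

# Toy algebraic (Kaczmarz/ART) reconstruction of a small "temperature"
# field from a few projection views.
n = 16
yy, xx = np.mgrid[0:n, 0:n]
phantom = 1.0 + 2.0 * np.exp(-((xx - 9)**2 + (yy - 6)**2) / 8.0)  # hot spot

def projection_matrix(n, angles_deg):
    """Row-sum matrix for parallel rays at a few view angles (nearest-bin)."""
    rows = []
    for ang in np.deg2rad(angles_deg):
        # Signed distance of each pixel centre from the central ray.
        s = (xx - (n - 1) / 2) * np.cos(ang) + (yy - (n - 1) / 2) * np.sin(ang)
        bins = np.round(s + (n - 1) / 2).astype(int)
        for b in range(n):
            rows.append((bins == b).ravel().astype(float))
    return np.array(rows)

A = projection_matrix(n, [0, 30, 60, 90, 120, 150])     # six views
b = A @ phantom.ravel()                                  # simulated projections

x = np.zeros(n * n)
for _ in range(50):                                      # ART sweeps
    for Ai, bi in zip(A, b):
        norm = Ai @ Ai
        if norm > 0:
            x += (bi - Ai @ x) / norm * Ai

err = np.abs(x.reshape(n, n) - phantom).mean()
print("mean absolute reconstruction error:", round(err, 3))
```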

  12. Textbook Multigrid Efficiency for Computational Fluid Dynamics Simulations

    NASA Technical Reports Server (NTRS)

    Brandt, Achi; Thomas, James L.; Diskin, Boris

    2001-01-01

    Considerable progress over the past thirty years has been made in the development of large-scale computational fluid dynamics (CFD) solvers for the Euler and Navier-Stokes equations. Computations are used routinely to design the cruise shapes of transport aircraft through complex-geometry simulations involving the solution of 25-100 million equations; in this arena the number of wind-tunnel tests for a new design has been substantially reduced. However, simulations of the entire flight envelope of the vehicle, including maximum lift, buffet onset, flutter, and control effectiveness have not been as successful in eliminating the reliance on wind-tunnel testing. These simulations involve unsteady flows with more separation and stronger shock waves than at cruise. The main reasons limiting further inroads of CFD into the design process are: (1) the reliability of turbulence models; and (2) the time and expense of the numerical simulation. Because of the prohibitive resolution requirements of direct simulations at high Reynolds numbers, transition and turbulence modeling is expected to remain an issue for the near term. The focus of this paper addresses the latter problem by attempting to attain optimal efficiencies in solving the governing equations. Typically current CFD codes based on the use of multigrid acceleration techniques and multistage Runge-Kutta time-stepping schemes are able to converge lift and drag values for cruise configurations within approximately 1000 residual evaluations. An optimally convergent method is defined as having textbook multigrid efficiency (TME), meaning the solutions to the governing system of equations are attained in a computational work which is a small (less than 10) multiple of the operation count in the discretized system of equations (residual equations). In this paper, a distributed relaxation approach to achieving TME for the Reynolds-averaged Navier-Stokes (RANS) equations is discussed along with the foundations that form the

  13. A Computer-Based Simulation of an Acid-Base Titration

    ERIC Educational Resources Information Center

    Boblick, John M.

    1971-01-01

    Reviews the advantages of computer simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)

  14. Analysis of solvation structure and thermodynamics of ethane and propane in water by reference interaction site model theory using all-atom models

    NASA Astrophysics Data System (ADS)

    Cui, Qizhi; Smith, Vedene H.

    2001-08-01

    Following our previous paper on methane [Cui and Smith, J. Chem. Phys. 113, 10240 (2000)], we study the solvation structures and thermodynamics of ethane and propane in water at the infinite dilution limit by using the hypernetted chain closure reference interaction site model (HNC-RISM) theory with all-atom representations for the solute molecules. At four thermodynamic states (temperature T=283.15, 298.15, 313.15, and 328.15 K, with corresponding bulk water densities ρ=0.9997, 0.9970, 0.9922, and 0.9875 g cm⁻³), all the atomic solute-solvent radial distribution functions are obtained, and the corresponding running coordination numbers and the hydration free energies, energies, enthalpies, and entropies are calculated with the radial distribution functions as input. The hydration structures of ethane and propane are presented and analyzed at the atomic level in terms of the atomic solute-solvent radial distribution functions. With the optimized nonbonded potential parameters based on the CHARMM96 all-atom model for alkanes [Yin and Mackerell, J. Comput. Chem. 19, 334 (1998)], the ethane and propane hydration thermodynamic properties predicted by the HNC-RISM theory are improved in the specified temperature range (10-55 °C).
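
    The running coordination numbers mentioned above follow from the site-site radial distribution functions through the standard integral n(r) = 4πρ ∫₀ʳ g(s) s² ds. A minimal sketch, assuming g(r) is tabulated on a radial grid (for example, as produced by an HNC-RISM calculation):

      import numpy as np

      def running_coordination_number(r, g, rho):
          """Cumulative coordination number n(r) = 4*pi*rho * integral_0^r g(s) s^2 ds.

          r   : radial grid (same length as g), e.g. in angstrom
          g   : site-site radial distribution function g(r)
          rho : solvent site number density (particles per cubic angstrom)
          """
          integrand = g * r**2
          # cumulative trapezoidal integration over the radial grid
          cumulative = np.concatenate(([0.0], np.cumsum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(r))))
          return 4.0 * np.pi * rho * cumulative

    Evaluating n(r) at the first minimum of the corresponding g(r) gives the first-shell hydration number.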

  15. Probabilistic lifetime strength of aerospace materials via computational simulation

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Keating, Jerome P.; Lovelace, Thomas B.; Bast, Callie C.

    1991-01-01

    The results of a second year effort of a research program are presented. The research included development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic phenomenological constitutive relationship, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects of primitive variables. These primitive variables often originate in the environment and may include stress from loading, temperature, chemical, or radiation attack. This multifactor interaction constitutive equation is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the constitutive equation using actual experimental materials data together with the multiple linear regression of that data.
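
    For orientation, one form in which the multifactor interaction equation is commonly quoted in the related NASA work is S/S₀ = Π_i [(A_iF − A_i)/(A_iF − A_i0)]^{a_i}, where A_i are the current values of the primitive variables, A_i0 reference values, A_iF ultimate values and a_i empirical exponents; whether PROMISS uses exactly this form is not stated in the abstract, so the sketch below, including the regression-based calibration of the exponents, is an illustration only.

      import numpy as np

      def mfie_strength_ratio(A, A0, AF, a):
          """One commonly quoted form of the multifactor interaction equation:
          S/S0 = prod_i [(A_iF - A_i) / (A_iF - A_i0)]**a_i."""
          return np.prod(((AF - A) / (AF - A0)) ** a)

      def calibrate_exponents(samples, strengths, A0, AF):
          """Least-squares estimate of the exponents a_i from experimental data.
          Taking logs turns the product into a linear model, so ordinary
          multiple linear regression applies:
          ln(S/S0) = sum_i a_i * ln[(A_iF - A_i)/(A_iF - A_i0)]."""
          X = np.log((AF - samples) / (AF - A0))   # samples: (n_tests, n_variables)
          y = np.log(strengths)                    # measured strength ratios S/S0
          a, *_ = np.linalg.lstsq(X, y, rcond=None)
          return a

    Randomizing the calibrated parameters (the probabilistic part of the methodology) is not shown here.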

  16. Computational strategies in the dynamic simulation of constrained flexible MBS

    NASA Technical Reports Server (NTRS)

    Amirouche, F. M. L.; Xie, M.

    1993-01-01

    This research focuses on the computational dynamics of flexible constrained multibody systems. First, a recursive mapping formulation of the kinematical expressions in minimum dimension, together with the matrix representation of the equations of motion, is presented. The method employs Kane's equation, FEM, and concepts of continuum mechanics. The generalized active forces are extended to include the effects of high-temperature conditions such as creep, thermal stress, and elastic-plastic deformation. The time-variant constraint relations for rolling/contact conditions between two flexible bodies are also studied. The constraints for validation of MBS simulation of gear meshing contact using a modified Timoshenko beam theory are also presented. The last part deals with minimization of the vibration/deformation of an elastic beam in multibody systems by making use of time-variant boundary conditions. The methodologies and computational procedures developed are being implemented in a program called DYAMUS.

  17. Inhibition of quorum sensing in a computational biofilm simulation.

    PubMed

    Fozard, J A; Lees, M; King, J R; Logan, B S

    2012-08-01

    Bacteria communicate through small diffusible molecules in a process known as quorum sensing. Quorum-sensing inhibitors are compounds which interfere with this, providing a potential treatment for infections associated with bacterial biofilms. We present an individual-based computational model for a developing biofilm. Cells are aggregated into particles for computational efficiency, but the quorum-sensing mechanism is modelled as a stochastic process on the level of individual cells. Simulations are used to investigate different treatment regimens. The response to the addition of inhibitor is found to depend significantly on the form of the positive feedback in the quorum-sensing model; in cases where the model exhibits bistability, the time at which treatment is initiated proves to be critical for the effective prevention of quorum sensing and hence potentially of virulence. PMID:22374433
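
    A toy version of the cell-level stochastic switch described above might look like the following; all rate constants are hypothetical, and the coupling from up-regulated cells back to the signal field (which closes the positive-feedback loop) is assumed to be handled elsewhere in the biofilm model.

      import numpy as np

      rng = np.random.default_rng(0)

      def step_cells(up, signal, inhibitor, dt,
                     k_up_base=0.01, k_up_fb=0.5, k_down=0.05, K=1.0):
          """Advance the up/down-regulation state of each cell by one time step dt.

          up        : boolean array, True where a cell is up-regulated
          signal    : local quorum-signal concentration seen by each cell
          inhibitor : local inhibitor concentration (competes with the signal)
          All switching rates are hypothetical placeholders.
          """
          effective = signal / (1.0 + inhibitor)            # inhibitor weakens sensing
          p_up = 1.0 - np.exp(-(k_up_base + k_up_fb * effective / (K + effective)) * dt)
          p_down = 1.0 - np.exp(-k_down * dt)
          flips_up = (~up) & (rng.random(up.size) < p_up)
          flips_down = up & (rng.random(up.size) < p_down)
          return (up | flips_up) & ~flips_down

    With sufficiently strong feedback such a model is bistable, which is exactly the regime in which the abstract reports that the timing of inhibitor addition becomes critical.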

  18. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    SciTech Connect

    Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Zumao Chen; Temi Linjewile; Adel Sarofim; Bene Risio

    2003-01-25

    This is the eighth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on all aspects of the project. Calculations for a full Vision 21 plant configuration have been performed for two coal types and two gasifier types. Good agreement with DOE computed values has been obtained for the Vision 21 configuration under "baseline" conditions. Additional model verification has been performed for the flowing slag model that has been implemented into the CFD based gasifier model. Comparisons of the slag, wall and syngas conditions predicted by our model with values from predictive models published by other researchers show good agreement. The software infrastructure of the Vision 21 workbench has been modified to use a recently released, upgraded version of SCIRun.

  19. Nonlinear simulations with and computational issues for NIMROD

    SciTech Connect

    Sovinec, C.R.

    1998-12-31

    The NIMROD (Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion) code development project was commissioned by the US Department of Energy in February, 1996 to provide the fusion research community with a computational tool for studying low-frequency behavior in experiments. Specific problems of interest include the neoclassical evolution of magnetic islands and the nonlinear behavior of tearing modes in the presence of rotation and nonideal walls in tokamaks; they also include topics relevant to innovative confinement concepts such as magnetic turbulence. Besides having physics models appropriate for these phenomena, an additional requirement is the ability to perform the computations in realistic geometries. The NIMROD Team is using contemporary management and computational methods to develop a computational tool for investigating low-frequency behavior in plasma fusion experiments. The authors intend to make the code freely available, and are taking steps to make it as easy to learn and use as possible. An example application for NIMROD is the nonlinear toroidal RFP simulation--the first in a series to investigate how toroidal geometry affects MHD activity in RFPs. Finally, the most important issue facing the project is execution time, and they are exploring better matrix solvers and a better parallel decomposition to address this.

  20. Flight Simulation of Taketombo Based on Computational Fluid Dynamics and Computational Flight Dynamics

    NASA Astrophysics Data System (ADS)

    Kawamura, Kohei; Ueno, Yosuke; Nakamura, Yoshiaki

    In the present study we have developed a numerical method to simulate the flight dynamics of a small flying body with unsteady motion, where both aerodynamics and flight dynamics are fully considered. A key point of this numerical code is to use computational fluid dynamics and computational flight dynamics at the same time, which is referred to as CFD2, or double CFDs, where several new ideas are adopted in the governing equations, the method to make each quantity nondimensional, and the coupling method between aerodynamics and flight dynamics. This numerical code can be applied to simulate the unsteady motion of small vehicles such as micro air vehicles (MAV). As a sample calculation, we take up Taketombo, or a bamboo dragonfly, and its free flight in the air is demonstrated. The eventual aim of this research is to virtually fly an aircraft with arbitrary motion to obtain aerodynamic and flight dynamic data, which cannot be taken in the conventional wind tunnel.
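
    The essence of coupling aerodynamics and flight dynamics at the same time is that, at every time step, the flow solution supplies the forces and moments that drive the rigid-body equations of motion. The sketch below shows only that loose coupling loop with a stubbed aerodynamic callback; it is not the CFD² formulation itself, and the attitude (quaternion) update is omitted for brevity.

      import numpy as np

      def coupled_flight_step(state, dt, mass, inertia, aero_solver):
          """One loosely coupled step: the flow solver supplies loads, flight dynamics advances the body.

          state       : dict with 'x' (position), 'v' (velocity) and 'omega' (body angular rate)
          aero_solver : callable returning (force, moment) for the current state; in the
                        paper's approach this would be an unsteady CFD solution on the
                        moving body, here it is just an interface stub.
          """
          force, moment = aero_solver(state)
          gravity = np.array([0.0, 0.0, -9.81 * mass])
          # translational dynamics (explicit Euler for brevity)
          state['v'] = state['v'] + dt * (force + gravity) / mass
          state['x'] = state['x'] + dt * state['v']
          # rotational dynamics: I * domega/dt = M - omega x (I * omega)
          omega = state['omega']
          domega = np.linalg.solve(inertia, moment - np.cross(omega, inertia @ omega))
          state['omega'] = omega + dt * domega
          return state

    In a tighter coupling the aerodynamic and flight-dynamic updates would be iterated within each step; the stub above only fixes the data exchanged between the two solvers.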

  1. Using computer simulations to study relativistic heavy ion collisions

    NASA Astrophysics Data System (ADS)

    Murray, Joelle Lynn

    1998-12-01

    One of the most exciting topics in high-energy nuclear physics is the study of the potential phase transition between hadronic and partonic matter. Information about this transition, if it exists and can be experimentally determined, would be vital in understanding confinement of quarks and gluons inside hadrons. New accelerators, RHIC and the LHC, will come online in the next few years and will focus on finding evidence for this transition. RHIC will collide Au on Au at center of mass energies equal to 200 GeV/nucleon and create a high density, high temperature state of matter. To study the large particle multiplicities that will occur in these experiments, computer simulations are being developed. Within this thesis, one type of simulation is detailed and used to study the invariant mass spectrum of lepton pairs measured at the CERN SPS and several hadronic observables that could be measured at RHIC.

  2. Software Development Processes Applied to Computational Icing Simulation

    NASA Technical Reports Server (NTRS)

    Levinson, Laurie H.; Potapezuk, Mark G.; Mellor, Pamela A.

    1999-01-01

    The development of computational icing simulation methods is making the transition from research to commonplace use in design and certification efforts. As such, standards of code management, design validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. This paper discusses these concepts with regard to current and future icing simulation code development efforts as implemented by the Icing Branch of the NASA Lewis Research Center in collaboration with the NASA Lewis Engineering Design and Analysis Division. With the application of the techniques outlined in this paper, the LEWICE ice accretion code has become a more stable and reliable software product.

  3. Computer simulation of macrosegregation in directionally solidified circular ingots

    NASA Technical Reports Server (NTRS)

    Yeum, K. S.; Poirier, D. R.

    1988-01-01

    The formulation and employment of a computer code designed to simulate the directional solidification of lead-rich Pb-Sn alloys in the form of an ingot with a uniform and circular cross-section are described. The formulation is for steady-state solidification in which convection in the all-liquid zone is ignored. Particular attention was given to designing a code to simulate the effect of a subtle variation of temperature in the radial direction. This is important because a very small temperature difference between the center and the surface of the ingot (e.g., less than 0.5 °C) is enough to cause substantial convection within the mushy zone when the solidification rate is approximately 0.001 to 0.0001 cm/s.

  4. Adaptive quantum computation in changing environments using projective simulation

    PubMed Central

    Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

    2015-01-01

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks. PMID:26260263

  5. Adaptive quantum computation in changing environments using projective simulation

    NASA Astrophysics Data System (ADS)

    Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

    2015-08-01

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks.

  6. Automatic computer-aided system of simulating solder joint formation

    NASA Astrophysics Data System (ADS)

    Zhao, Xiujuan; Wang, Chunqing; Zheng, Guanqun; Wang, Gouzhong; Yang, Shiqin

    1999-08-01

    One critical aspect of electronic packaging is fatigue/creep-induced failure in solder interconnections, which is found to be highly dependent on the shape of the solder joints; predicting and analyzing the solder joint shape is therefore warranted. In this paper, an automatic computer-aided system is developed to simulate the formation of solder joints and to analyze the influence of different process parameters on the solder joint shape. The developed system is capable of visually designing the process parameters and calculating the solder joint shape automatically, without any intervention from the user. The automation achieved enables fast shape estimation as process parameters are varied, without time-consuming experiments, and the simulation system provides design and manufacturing engineers with an efficient software tool for designing the soldering process in a design environment. Moreover, a program developed from the system can serve as the preprocessor for a subsequent finite element joint analysis program.

  7. Simulation of computed radiography with imaging plate detectors

    SciTech Connect

    Tisseur, D.; Costin, M.; Mathy, F.; Schumm, A.

    2014-02-18

    Computed radiography (CR) using phosphor imaging plate detectors is taking an increasing place in radiographic testing. CR uses equipment similar to that of conventional radiography, except that the classical X-ray film is replaced by a numerical detector, called an image plate (IP), which is made of a photostimulable layer and is read by a scanning device through photostimulated luminescence. Such digital radiography has already demonstrated important benefits in terms of exposure time, decrease of source energies and thus reduction of the radioprotection area, besides being a solution without effluents. This paper presents a model for the simulation of radiography with image plate detectors in CIVA, together with examples of validation of the model. The study consists of a cross comparison between experimental and simulation results obtained on a step wedge with a classical X-ray tube. Results are presented in particular for wire image quality indicators (IQI) and duplex IQIs.

  8. Computer simulation in template-directed oligonucleotide synthesis

    NASA Technical Reports Server (NTRS)

    Kanavarioti, Anastassia; Benasconi, Claude F.

    1990-01-01

    It is commonly assumed that template-directed polymerizations have played a key role in prebiotic evolution. A computer simulation that models up to 33 competing reactions was used to investigate the product distribution in a template-directed oligonucleotide synthesis as a function of time and concentration of the reactants. The study focuses on the poly(C)-directed elongation reaction of oligoguanylates, and how it is affected by the competing processes of hydrolysis and dimerization of the activated monomer, which have the potential of severely curtailing the elongation and reducing the size and yield of the synthesized polymers. The simulations show that realistic and probably prebiotically plausible conditions can be found where hydrolysis and dimerization are either negligible or where a high degree of polymerization can be attained even in the face of substantial hydrolysis and/or dimerization.
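
    A reduced version of such a kinetic simulation can be written as a small system of rate equations for the activated monomer and the template-bound oligomers, with hydrolysis and dimerization as competing sinks; the rate constants below are hypothetical placeholders, and the full model in the paper tracks up to 33 reactions rather than this handful.

      import numpy as np

      def integrate_elongation(k_elong, k_hyd, k_dim, m0, oligo0, n_max=10,
                               dt=1e-3, t_end=50.0):
          """Euler integration of a reduced template-directed elongation scheme.

          m0      : initial activated-monomer concentration
          oligo0  : initial concentration of the shortest template-bound oligomer
          oligo[n]: concentration of oligomers of length n (index 2..n_max)
          Competing sinks for the monomer: hydrolysis (k_hyd) and dimerization (k_dim).
          """
          m = m0
          oligo = np.zeros(n_max + 1)
          oligo[2] = oligo0
          for _ in range(int(t_end / dt)):
              elong = k_elong * m * oligo[2:n_max]          # length n -> n+1 additions
              dm = -k_hyd * m - 2.0 * k_dim * m * m - elong.sum()
              oligo[3:n_max + 1] += dt * elong
              oligo[2:n_max] -= dt * elong
              m = max(m + dt * dm, 0.0)
          return m, oligo

    The final oligomer length distribution then shows directly how strongly hydrolysis and dimerization curtail elongation for a given choice of rate constants and concentrations.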

  9. Computer simulations for internal dosimetry using voxel models.

    PubMed

    Kinase, Sakae; Mohammadi, Akram; Takahashi, Masa; Saito, Kimiaki; Zankl, Maria; Kramer, Richard

    2011-07-01

    In the Japan Atomic Energy Agency, several studies have been conducted on the use of voxel models for internal dosimetry. Absorbed fractions (AFs) and S values have been evaluated for preclinical assessments of radiopharmaceuticals using human voxel models and a mouse voxel model. Computational calibration of in vivo measurement system has been also made using Japanese and Caucasian voxel models. In addition, for radiation protection of the environment, AFs have been evaluated using a frog voxel model. Each study was performed by using Monte Carlo simulations. Consequently, it was concluded that these data of Monte Carlo simulations and voxel models could adequately reproduce measurement results. Voxel models were found to be a significant tool for internal dosimetry since the models are anatomically realistic. This fact indicates that several studies on correction of the in vivo measurement efficiency for the variability of human subjects and interspecies scaling of organ doses will succeed.

  10. A COMPUTATIONAL WORKBENCH ENVIRONMENT FOR VIRTUAL POWER PLANT SIMULATION

    SciTech Connect

    Mike Bockelie; Dave Swensen; Martin Denison; Connie Senior; Adel Sarofim; Bene Risio

    2002-07-28

    This is the seventh Quarterly Technical Report for DOE Cooperative Agreement No.: DE-FC26-00NT41047. The goal of the project is to develop and demonstrate a computational workbench for simulating the performance of Vision 21 Power Plant Systems. Within the last quarter, good progress has been made on the development of the IGCC workbench. A series of parametric CFD simulations for single stage and two stage generic gasifier configurations have been performed. An advanced flowing slag model has been implemented into the CFD based gasifier model. A literature review has been performed on published gasification kinetics. Reactor models have been developed and implemented into the workbench for the majority of the heat exchangers, gas clean up system and power generation system for the Vision 21 reference configuration. Modifications to the software infrastructure of the workbench have been commenced to allow interfacing to the workbench reactor models that utilize the CAPE-Open software interface protocol.

  11. Computer aided analysis, simulation and optimisation of thermal sterilisation processes.

    PubMed

    Narayanan, C M; Banerjee, Arindam

    2013-04-01

    Although thermal sterilisation is a widely employed industrial process, little work has been reported in the available literature, including patents, on the mathematical analysis and simulation of these processes. In the present work, software packages have been developed for computer-aided optimum design of thermal sterilisation processes. Systems involving steam sparging, jacketed heating/cooling, helical coils submerged in agitated vessels and systems that employ external heat exchangers (double pipe, shell and tube, and plate exchangers) have been considered. Both batch and continuous operations have been analysed and simulated. The dependence of the del factor on system/operating parameters such as the mass or volume of substrate to be sterilised per batch, speed of agitation, helix diameter, substrate-to-steam ratio, rate of substrate circulation through the heat exchanger and that through the holding tube has been analysed separately for each mode of sterilisation. Axial dispersion in the holding tube has also been accounted for through an appropriately defined axial dispersion coefficient. The effect of exchanger characteristics/specifications on the system performance has also been analysed. The multiparameter computer-aided design (CAD) software packages prepared are thus highly versatile and permit the most appropriate choice of operating variables for the processes selected. The computed results have been compared with extensive data collected from a number of industries (distilleries, food processing and pharmaceutical industries) and pilot plants, and satisfactory agreement has been observed between the two, thereby confirming the accuracy of the CAD software developed. No simplifying assumptions have been made during the analysis, and the design of the associated heating/cooling equipment has been performed using the most up-to-date design correlations and computer software. PMID:23294402
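
    The del factor referred to above is the usual sterilisation design criterion ∇ = ln(N₀/N) = ∫ k(T) dt with an Arrhenius death-rate constant k(T). A minimal sketch for evaluating it from a simulated or recorded temperature-time profile follows; the Arrhenius parameters shown are typical textbook magnitudes for bacterial spores, not values taken from the cited work.

      import numpy as np

      def del_factor(times, temps_K, A=1.0e36, E=2.83e5, R=8.314):
          """Del factor (ln N0/N) accumulated over a sterilisation temperature profile.

          times   : time points in seconds
          temps_K : broth temperature at those times, in kelvin
          A, E    : Arrhenius frequency factor (1/s) and activation energy (J/mol)
                    for thermal death of the contaminating spores -- typical
                    textbook magnitudes, not parameters from the cited study.
          """
          k = A * np.exp(-E / (R * temps_K))                       # first-order death-rate constant
          return np.sum(0.5 * (k[1:] + k[:-1]) * np.diff(times))   # trapezoidal integral of k dt

    The total ∇ is accumulated over the heating, holding and cooling phases, and the holding time is chosen so that the sum meets the required sterility criterion; packages of the kind described above essentially automate this bookkeeping for each heating configuration.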

  13. Scientific and Computational Challenges of the Fusion Simulation Program (FSP)

    SciTech Connect

    William M. Tang

    2011-02-09

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP), a major national initiative in the United States whose primary objective is to enable scientific discovery of important new plasma phenomena, with the associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically confined fusion plasmas that is properly validated against experiments in regimes relevant for producing practical fusion energy. The program is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly coupled phenomena in the core plasma, edge plasma, and wall region on the time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond, together with the associated multi-core algorithmic formulations needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e

  14. Computational Materials: Modeling and Simulation of Nanostructured Materials and Systems

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.; Hinkley, Jeffrey A.

    2003-01-01

    The paper provides details on the structure and implementation of the Computational Materials program at the NASA Langley Research Center. Examples are given that illustrate the suggested approaches to predicting the behavior and influencing the design of nanostructured materials such as high-performance polymers, composites, and nanotube-reinforced polymers. Primary simulation and measurement methods applicable to multi-scale modeling are outlined. Key challenges including verification and validation of models are highlighted and discussed within the context of NASA's broad mission objectives.

  15. Tracking Non-rigid Structures in Computer Simulations

    SciTech Connect

    Gezahegne, A; Kamath, C

    2008-01-10

    A key challenge in tracking moving objects is the correspondence problem, that is, the correct propagation of object labels from one time step to another. This is especially true when the objects are non-rigid structures that change shape and merge and split over time. In this work, we describe a general approach to tracking thousands of non-rigid structures in an image sequence. We show how we can minimize memory requirements and generate accurate results while working with only two frames of the sequence at a time. We demonstrate our results using data from computer simulations of a fluid-mix problem.
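
    One standard way of doing two-frame label propagation that fits the description above (only two frames in memory at a time) is to give each connected component in the current frame the label of the previous-frame object it overlaps most; the sketch below uses that overlap rule and is not necessarily the authors' algorithm.

      import numpy as np
      from scipy import ndimage

      def propagate_labels(prev_labels, curr_mask):
          """Carry object labels from the previous frame to the current one.

          prev_labels : integer label image from the previous frame (0 = background)
          curr_mask   : boolean image of the structures detected in the current frame
          Each connected component in the current frame inherits the previous label
          it overlaps most; components with no overlap receive fresh labels (births).
          """
          curr_components, n = ndimage.label(curr_mask)
          out = np.zeros_like(prev_labels)
          next_label = prev_labels.max() + 1
          for comp in range(1, n + 1):
              pixels = curr_components == comp
              overlaps = prev_labels[pixels]
              overlaps = overlaps[overlaps > 0]
              if overlaps.size:
                  out[pixels] = np.bincount(overlaps).argmax()
              else:
                  out[pixels] = next_label
                  next_label += 1
          return out

    Splits and merges show up as several current components inheriting the same parent label, or one component overlapping several parents, which is where the correspondence bookkeeping described in the abstract comes in.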

  16. Computational Strategies for Polymer Coated Steel Sheet Forming Simulations

    SciTech Connect

    Owen, D. R. J.; Andrade Pires, F. M.; Dutko, M.

    2007-05-17

    This contribution discusses current issues involved in the numerical simulation of large scale industrial forming processes that employ polymer coated steel sheet. The need for rigorous consideration of both theoretical and algorithmic issues is emphasized, particularly in relation to the computational treatment of finite strain deformation of polymer coated steel sheet in the presence of internal degradation. Other issues relevant to the effective treatment of the problem, including the modelling of frictional contact between the work piece and tools, low order element technology capable of dealing with plastic incompressibility and thermo mechanical coupling, are also addressed. The suitability of the overall approach is illustrated by the solution of an industrially relevant problem.

  17. Computer Simulation of Einstein-Podolsky-Rosen-Bohm Experiments

    NASA Astrophysics Data System (ADS)

    de Raedt, H.; Michielsen, K.

    2016-07-01

    We review an event-based simulation approach which reproduces the statistical distributions of quantum physics experiments by generating detection events one-by-one according to an unknown distribution and without solving a wave equation. Einstein-Podolsky-Rosen-Bohm laboratory experiments are used as an example to illustrate the applicability of this approach. It is shown that computer experiments that employ the same post-selection procedure as the one used in laboratory experiments produce data that is in excellent agreement with quantum theory.

  18. Computer simulations of human interferon gamma mutated forms

    NASA Astrophysics Data System (ADS)

    Lilkova, E.; Litov, L.; Petkov, P.; Petkov, P.; Markov, S.; Ilieva, N.

    2010-01-01

    In the general framework of the computer-aided drug design, the method of molecular-dynamics simulations is applied for investigation of the human interferon-gamma (hIFN-γ) binding to its two known ligands (its extracellular receptor and the heparin-derived oligosaccharides). A study of 100 mutated hIFN-γ forms is presented, the mutations encompassing residues 86-88. The structural changes are investigated by comparing the lengths of the α-helices, in which these residues are included, in the native hIFN-γ molecule and in the mutated forms. The most intriguing cases are examined in detail.

  19. Computational challenges for beam-beam simulation for RHIC

    SciTech Connect

    Luo, Y.; Fischer, W.

    2010-10-01

    In this article we review the computational challenges in beam-beam simulation for the polarized proton run of the Relativistic Heavy Ion Collider (RHIC). The difficulties in our multi-particle, million-turn tracking to calculate the proton beam lifetime and proton beam emittance growth due to head-on beam-beam interaction and head-on beam-beam compensation are presented and discussed. Solutions to obtain meaningful physics results from these tracking studies are proposed and tested. Finally, we present progress in benchmarking the operational RHIC proton beam lifetime.

  20. Time-partitioning simulation models for calculation on parallel computers

    NASA Technical Reports Server (NTRS)

    Milner, Edward J.; Blech, Richard A.; Chima, Rodrick V.

    1987-01-01

    A technique allowing time-staggered solution of partial differential equations is presented in this report. Using this technique, called time-partitioning, simulation execution speedup is proportional to the number of processors used because all processors operate simultaneously, with each updating the solution grid at a different time point. The technique is limited neither by the number of processors available nor by the dimension of the solution grid. Time-partitioning was used to obtain the flow pattern through a cascade of airfoils, modeled by the Euler partial differential equations. An execution speedup factor of 1.77 was achieved using a two-processor Cray X-MP/24 computer.
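
    The staggering can be pictured as a pipeline: if a time step is assumed to need P relaxation sweeps, processor p always performs sweep p, so at any wall-clock cycle the P processors are updating the grid at P different time levels. The small sketch below of that schedule (not the original Cray implementation) makes the speedup accounting explicit.

      def time_partition_schedule(n_steps, n_procs):
          """Pipeline schedule for time-partitioned integration.

          Each time step is assumed to need n_procs relaxation sweeps to converge.
          Processor p always applies sweep p, so at any wall-clock cycle the
          processors work on n_procs different time levels simultaneously.
          Returns, for each cycle, the list of (processor, time_level, sweep) tasks.
          """
          schedule = []
          total_cycles = n_steps + n_procs - 1        # fill + drain the pipeline
          for cycle in range(total_cycles):
              tasks = []
              for p in range(n_procs):
                  level = cycle - p                   # staggering: later sweeps lag behind
                  if 0 <= level < n_steps:
                      tasks.append((p, level, p))
              schedule.append(tasks)
          return schedule

    With two processors and 100 time steps the pipeline needs 101 cycles instead of 200, a speedup of about 1.98 in this idealised picture; the measured factor of 1.77 presumably reflects real communication and synchronisation overhead on the machine.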